Spend enough time on the command line and you’ll eventually want to perform certain tasks, requiring some intricate commands, over and over again. A good example of this is making thumbnails of photos. The workhorse of this script is not ImageMagick (which provides convert, identify, and mogrify) but the for loop in bash itself. Ready? Grab your pen-knife and let’s whittle out a script. (Read the rest at Freedom Penguin)
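A minimal sketch of the idea, assuming ImageMagick’s convert is installed; the thumbs/ directory name and the 200x200 geometry are arbitrary choices for this example:

```shell
#!/usr/bin/env bash
# Make a thumbnail of every JPEG in the current directory, into thumbs/.
mkdir -p thumbs
for img in *.jpg; do
  # If no files match, bash leaves the literal "*.jpg" in $img; skip it.
  [ -e "$img" ] || continue
  # -thumbnail resizes and strips metadata; keep the original untouched.
  convert "$img" -thumbnail 200x200 "thumbs/${img%.jpg}-thumb.jpg"
done
```

The `${img%.jpg}` parameter expansion strips the extension so the output name becomes, e.g., `thumbs/photo-thumb.jpg`.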
RDO Liberty (beta) DVR Deployment (Controller/Network)+Compute+Compute (ML2&OVS&VXLAN) on CentOS 7.1
Per http://specs.openstack.org/openstack/neutron-specs/specs/juno/neutron-ovs-dvr.html
1. Neutron DVR implements the fip-namespace on every Compute Node where VMs are running, so VMs with floating IPs can forward traffic to the External Network without routing it via the Network Node (north-south routing).
2. Neutron DVR distributes L3 routers across the Compute Nodes, so that intra-tenant VM communication occurs without involving the Network Node (east-west routing).
3. Neutron DVR provides the legacy SNAT behavior as the default SNAT for all private VMs. The SNAT service is not distributed; it remains centralized, hosted on a designated service node.
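As a rough sketch of the configuration this behavior maps to (these are the stock Liberty-era Neutron option names; exact file locations vary by deployment, so treat this as a guide rather than a drop-in config):

```
# neutron.conf on the Controller: make new routers distributed by default
router_distributed = True

# l3_agent.ini on the Controller/Network node: host the centralized SNAT
agent_mode = dvr_snat

# l3_agent.ini on each Compute node: run distributed routers + fip-namespace
agent_mode = dvr

# OVS agent section (ml2_conf.ini / openvswitch_agent.ini) on all nodes
enable_distributed_routing = True
l2_population = True
```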
The complete text of this post may be seen here
Infographic: LXD Machine Containers from Ubuntu Linux
Canonical, through Alexia Emmanoulopoulou, had the great pleasure of publishing what appears to be the first ever infographic of the LXD container hypervisor used in the Ubuntu Linux operating system.
“LXD containers look and act like virtual machines, but have the lightweight performance and scalability of process containers,” says Alexia Emmanoulopoulou. “The infographic attached on the right will help new users to better understand how LXD works, how it performs compared with other similar software,…
Autodesk Open Sources Linux-Based 3D Printer
Autodesk has open sourced the electronics and firmware of its resin- and DLP-based Ember 3D printer, revealing it to run Linux on a BeagleBone Black clone.
In releasing the design of its Ember 3D Printer under open source licensing, Autodesk has revealed a mainboard that runs Linux on a customized spin-off of the BeagleBone Black hacker SBC. In March, the company published the recipe for the printer’s “PR48” Standard Clear Prototyping resin, and in May, it followed through by open sourcing its mechanical files.
Splunk Adds New MINT, Hunk, IoT Support to Platform
Splunk, which provides an analytics platform for machine data, today announced updates to several products including Splunk MINT, Splunk Light and Hunk.
Hunk 6.3 is an integrated analytics platform used to explore, analyze and visualize big data in Hadoop and Amazon S3. Splunk Light is a lighter weight version of the Splunk platform made for smaller IT shops, and MINT is an operational intelligence application that runs on top of Splunk Enterprise and Splunk Cloud.
Google Launches Service for Managing Hadoop, Spark Clusters
Cloud Dataproc will make it easier to administer and manage clusters, the company says.
Big data analytics technologies such as Hadoop and Spark can help organizations extract business value from massive data sets, but they can be very complex to administer and manage. Hoping to reduce some of that complexity, Google on Wednesday announced the launch of a new service dubbed Cloud Dataproc for customers of its cloud platform.
The Companies That Support Linux: DCHQ
DCHQ is a governance, deployment automation, and lifecycle management platform for container-based applications. The company offers out-of-the-box integrations with private and public cloud platforms, which lets development teams automate the provisioning and scaling of virtual infrastructure they’re already using.
DCHQ recently joined The Linux Foundation as a new member. In this profile, Amjad Afanah, founder of DCHQ, tells us more about the company and their open source strategy, including why they joined The Linux Foundation and how they are innovating with Linux and open source.
Can you describe DCHQ.io’s business for us?
DCHQ delivers enterprise discipline to container app lifecycle management. It combines Docker agility with advanced application modeling, lifecycle management, policy and governance controls. Available in hosted and on-premise versions, DCHQ gives infrastructure operators the controls and end-to-end automation they need while still giving app developers the agility they want when moving from dev/test to production.
DCHQ provides an advanced application modeling framework. It ships with enhancements to Docker Compose such as cross-image environment variable bindings, extensible bash script plug-ins that can be invoked at request time or post-provision, application clustering for high availability across multiple hosts or regions, and auto-scaling. It facilitates application deployments on Linux hosts running on-premise or in the public cloud using an agent-based architecture, which supports advanced placement, monitoring of containers, hosts, and clusters, application backups, continuous delivery, container updates, and out-of-the-box alerts/notifications. DCHQ automates the provisioning and auto-scaling of virtual infrastructure on 12 different cloud providers and frameworks. The current list includes OpenStack, CloudStack, DigitalOcean, Microsoft Azure, Amazon Web Services, Google Compute Engine, Rackspace, HP Public Cloud, and IBM SoftLayer. Lastly, our product eliminates “Shadow IT” by enabling granular access controls to data centers, application templates, builds, plug-ins, and Docker repositories (including Docker Hub, Quay, and Red Hat Registry).
Why is open source important to the company? How and why do you use open source and/or Linux?
While DCHQ is commercially licensed software with enterprise-grade support, the company has embraced and contributes to open-source technologies in the Docker ecosystem, including the Open Container Initiative. DCHQ On-Premise runs on Docker containers and may be installed on-premise on Red Hat Enterprise Linux (RHEL), Ubuntu or CentOS. It installs via shell script or automated deployment from DCHQ Hosted PaaS. DCHQ registers a variety of Docker repositories, including Red Hat Container Registry, Docker Hub and Quay. It also integrates with Weave for cross-container communication across different hosts.
What is the company’s open source strategy?
DCHQ will continue to embrace and integrate with complementary open-source technologies. In the near future, the company plans to open source parts of the platform to get the support of the DCHQ community and continue to improve its services.
Why is Docker a core part of your strategy? Why is Docker important?
Our platform today focuses on providing end-to-end automation for Docker-based applications. As more Linux Container technologies become popular, DCHQ plans to be container-agnostic and support other emerging container technologies.
What do you see as the biggest benefit of collaborative development?
The benefits are obvious as on-boarding the help of a development community can always take the platform further — especially as technologies and application frameworks evolve.
Why did you join the Linux Foundation?
We believe in the foundation’s mission and are keen on ensuring the growth of Linux as our platform ultimately depends on a stable and continuously improving Linux.
What interesting or innovative trends are you witnessing and what role does Linux or open source play in them?
The growing adoption of Linux Containers for continuous integration and delivery is the ultimate example of how open-source initiatives can transform development processes and accelerate application development. Our company is now focused on simplifying the containerization of enterprise applications and providing the infrastructure management, governance controls and application life-cycle management needed to fully unleash the power of Linux Containers.
What other open source or collaborative projects do you participate in and why?
We have just started contributing to various Apache projects, and in the near future, our contribution to Docker and other Linux projects will grow.
Anything else important or upcoming that you’d like to share?
The explosive growth and interest in containers needs support from companies that understand enterprise technology consumption models. That’s what DCHQ was founded to deliver for app modeling and lifecycle management.
GitLab One-Ups GitHub With Open Source Enterprise Code Hosting
With a new version of its product in the offing and $4 million in Series A funding in its pocket, GitLab — creator of an open source alternative to code-hosting nexus GitHub — is setting out to expand its reach with enterprise customers.
While it seems nearly impossible for a third party to displace GitHub as a public code-sharing resource, on the enterprise side and behind the firewall GitLab may have a fighting chance…
Read more at IT World
Evolution of Apache Hadoop
The year 2016 will see Americans lining up to elect their new president. While passion and sentiment will dictate the outcome of the elections on the surface, deep down, modern technology will be at play, helping determine who will be the next president. These elections will harness the power of Big Data on a scale never seen before. We have already seen the role that Big Data played in the 2012 elections, and it’s only going to get bigger. This Big Data revolution is led, as expected, by open source, and by Apache Hadoop in particular.
Brief History of Apache Hadoop
Almost a decade ago, Yahoo! asked its developers to build a large-scale web-processing infrastructure in order to modernize the company’s systems. The team created an internal project that could handle what they needed. However, the developers wanted to open source the project so that they could collaborate on it with other players like Facebook and Twitter.
Yahoo! had many patents on search, and their lawyers, for all the right reasons, didn’t want the patents to go into the public domain. It was a deadlock. The team didn’t want to take on such a proprietary project, which would be deprived of collaboration, so they started to look around and found Hadoop, a little component written for the Nutch search engine project by Doug Cutting and Mike Cafarella.
Yahoo! folks realized that their own prototype was ahead of Hadoop at that point, but the fact that it could not be open sourced led them to make a tough decision. “We thought, it’s early enough, even if we are ahead of Hadoop as it was then, it makes no sense to develop proprietary infrastructure. So we abandoned our internal project and decided to adopt Hadoop. We also convinced Doug to join us,” said Arun C Murthy, founder and Architect at Hortonworks, who worked with Yahoo! back then.
The team worked very hard to improve Hadoop. “When we started on Hadoop, it worked on barely 2-3 machines. We did a lot of work, and at some point we had 100-plus people working on software and then we reached the point where in 2008 we went to production with a web search app on Hadoop, WebMap. This was an app where we were trying to grab the entire web,” added Murthy.
Beyond Web Search
Apache Hadoop, however, was meant to do much more than just power web search at Yahoo! Back then, Yahoo! was doing basic customization for users based on the IP address. “We found that they could offer content customized based on different factors, such as usage patterns and search history which goes beyond IP addresses. It not only allowed Yahoo! to serve better, personalized content but also to offer more suited ads, thus leading to monetization,” recalled Murthy.
And, this technology went beyond mere ads and customization; it went beyond Yahoo!
Today, almost every major industry utilizing Big Data is using Hadoop in one form or another, and it has brought a sea change to those industries. Advertising and the content industry clearly benefit from such analytic capabilities. The health industry also gains from it: based on different data, a company can offer medicine at the right place, at the right time, and in the right quantity.
The insurance industry is also taking great advantage of it. By using data gathered through a tracking device installed in cars, for example, companies can offer better rates to careful drivers, and higher rates to reckless ones. The oil industry is using it, governments are using it, and even security agencies are heavy users of Big Data as analytics plays a critical role in national security.
In a nutshell, Hadoop is everywhere.
What Made Hadoop Such a Huge Success?
Many factors contributed to Hadoop’s success. The Apache Software Foundation (ASF) offered the environment it needed to attract the best developers. Doug Cutting, the founder of Hadoop, told me in an interview, “Apache’s collaborative model was critical to Hadoop’s success. As an Apache project, it has been able to gather a diverse set of contributors that together refine and shape the technology to be more useful to a wider audience.”
Hadoop benefited greatly from the infrastructure of the Apache Software Foundation. “…be it communications (mailing lists, bug tracking etc) or hardware resources for helping with software development processes like building/testing the projects,” said Vinod Kumar Vavilapalli, a member of the Hadoop Project Management Committee.
The foundation also offered the project “legal shelter for the said contributions via the non-profit legal entity and the Apache Software License for the code. Besides these structural benefits, the diverse communities that form the foundation also help in fostering a collaborative, meritocratic environment,” added Vinod.
Hadoop Is Like a Solar System
Apache Hadoop, once a tiny component of another Apache project, is now a star in its own right, with different open source components revolving around it.
In talking about the evolution of Apache Hadoop, Vinod said, “It’s been a long and fantastic journey for Apache Hadoop since its modest beginnings.” Apache Hadoop has in fact become much more than a single Big Data project.
“Today, Hadoop together with its sister projects like Apache Hive, Apache Pig, Apache HBase, Apache Tez, Apache Oozie, Apache Spark, and nearly 20 (!) other related projects has spawned an entirely new industry aimed at addressing the big data storage and processing needs of our times,” said Vinod.
A recent addition to Hadoop’s ecosystem is YARN, which stands for Yet Another Resource Negotiator. It sits on top of HDFS (the Hadoop Distributed File System) and essentially acts as the operating system for Hadoop. It has transformed the “Hadoop project from being a single-type data-processing framework (MapReduce) to a much-larger-in-scope cluster-management platform that facilitates running of a wide variety of applications and frameworks all on the same physical cluster resources,” stated Vinod.
“Then there are many data access engines that can plug into YARN such as Spark, Hive (for SQL), or Storm (for Streaming). But that isn’t enough for enterprises – they need security (Apache Ranger), data governance (Apache Atlas) and operations (Apache Ambari) capabilities. We have teams working on each of these projects and many more,” added Murthy.
Communities Are the Leaders of True Open Source Projects
Community-driven projects are always better than company-owned ones; they attract more talent and derive more benefit from it. No matter how large a company is, it can’t hire everyone. The developer with the right skills may be working for a competitor. When you create a community-driven open source project, a developer paid by your competitor actually works to improve your code.
Such a community-driven development model was also pivotal to Hadoop’s success. “There is no leader or primary contributor. We’re all peers, reviewing and refining each other’s contributions, building consensus to move the project forward. Contributors are vital to open source. They provide improvements motivated by need. Contributors direct the project and drive it forward. One initially seeds an open source project with some useful code, but it’s not really alive until folks are contributing,” said Cutting.
“And that’s why ASF is a great place to collaborate,” said Murthy, “I can influence it with my code, but no one owns it except for ASF and they are like Switzerland — you don’t worry about ASF doing anything nasty to your code. That’s why everyone from Microsoft to IBM are comfortable putting their IP in the ASF.”
Community-Driven Capitalism
Many successful open source projects have struck a fine balance between the cathedral and the bazaar; open source is as much about entrepreneurship as it is about community. Apache Hadoop allowed early contributors such as Murthy to create companies like Hortonworks, which now offers open source software support subscriptions and training and consulting services. The company now serves industries including retail, healthcare, financial services, telecommunications, insurance, oil and gas, and manufacturing.
Open Source Is Becoming the Norm in Enterprise
The deep penetration of Apache Hadoop in multi-billion dollar industries is yet another example of open source becoming the norm in the enterprise segment; you hardly hear about proprietary technologies at all anymore.
Jim Zemlin, the executive director of the Linux Foundation, says, “Organizations have discovered that they want to shed what is essentially commodity R&D and software development that isn’t core to their customers and build all of that software in open source.”
This approach allows them to focus on their core business instead of building every single component used in their products. Sam Ramji, CEO of Cloud Foundry, summed it up nicely: “Users want it and it’s a more efficient way to build software. The time for open source is here, even if it has not taken over the world yet. I think 10 years from now we won’t even have the word open source, we will just say software.”
Making the switch from Windows to Linux
I have been helping folks break away from Windows and switch to either Linux Mint or Ubuntu for a while now, and I’m going to share part of an email I got this morning with you. It really exemplifies the reaction I get when people start using a Linux distro for the first time. You just have to get them in front of it and let them experience it for themselves. This fellow’s name is Brandon and he writes:
“I have come a long way since my first and even the last email I sent you. Linux Mint Cinnamon is looking great and thank you for emphasizing its value. I guess I had just never heard of it. I am also considering abandoning my support of Windows. Meaning I will let it live on a spare laptop in the mess that it’s in but from this point forward the mental weight of optimizing it and carrying it along in a reliable form seems like it is not worth it anymore. I will use that large amount of energy and focus towards proactively advancing skill with Linux, enough so that I can depend on it entirely. So rather than multi-boot it looks like I am going to throw in a blank SSD and put Mint 17.2 on it. Many of the configurations I saw you do in your Mint videos cover the points I am looking for. I am confident the other usability tweaks I am seeking in time will fall into place.” (Read the rest)