
Autodesk Open Sources Linux-Based 3D Printer

Autodesk has open sourced the electronics and firmware of its resin- and DLP-based Ember 3D printer, revealing it to run Linux on a BeagleBone Black clone.

In releasing the design of its Ember 3D Printer under open source licensing, Autodesk has revealed a mainboard that runs Linux on a customized spin-off of the BeagleBone Black hacker SBC. In March, the company published the recipe for the printer’s “PR48” Standard Clear Prototyping resin, and in May, it followed through by open sourcing its mechanical files.

Read more at LinuxGizmos

Splunk Adds New MINT, Hunk, IoT Support to Platform

Splunk, which provides an analytics platform for machine data, today announced updates to several products including Splunk MINT, Splunk Light and Hunk.

Hunk 6.3 is an integrated analytics platform used to explore, analyze and visualize big data in Hadoop and Amazon S3. Splunk Light is a lighter weight version of the Splunk platform made for smaller IT shops, and MINT is an operational intelligence application that runs on top of Splunk Enterprise and Splunk Cloud.

Read more at eWeek

Google Launches Service for Managing Hadoop, Spark Clusters

Cloud Dataproc will make it easier to administer and manage clusters, the company says.

Big data analytics technologies such as Hadoop and Spark can help organizations extract business value from massive data sets, but they can be very complex to administer and to manage. Hoping to help reduce some of that complexity, Google Wednesday announced the launch of a new service dubbed Cloud Dataproc for customers of its cloud platform.

Read more at eWeek

The Companies That Support Linux: DCHQ

DCHQ is a governance, deployment automation, and lifecycle management platform for container-based applications. The company offers out-of-the-box integrations with private and public cloud platforms, which let development teams automate the provisioning and scaling of the virtual infrastructure they’re already using.

DCHQ recently joined The Linux Foundation as a new member. In this profile, Amjad Afanah, founder of DCHQ, tells us more about the company and their open source strategy, including why they joined The Linux Foundation and how they are innovating with Linux and open source.

Can you describe DCHQ.io’s business for us?

DCHQ delivers enterprise discipline to container app lifecycle management. It combines Docker agility with advanced application modeling, lifecycle management, and policy and governance controls. Available in hosted and on-premise versions, DCHQ gives infrastructure operators the controls and end-to-end automation they need while still giving app developers the agility they want when moving from dev/test to production.

DCHQ provides an advanced application modeling framework. It ships with enhancements to Docker Compose such as cross-image environment variable bindings, extensible BASH script plug-ins that can be invoked at request time or post-provision, application clustering for high availability across multiple hosts or regions, and auto-scaling. It facilitates application deployments on Linux hosts running on-premise or in the public cloud using an agent-based architecture, which supports advanced placement, monitoring of containers, hosts, and clusters, application backups, continuous delivery, container updates, and out-of-the-box alerts and notifications. DCHQ automates the provisioning and auto-scaling of virtual infrastructure on 12 different cloud providers and frameworks. The current list includes OpenStack, CloudStack, DigitalOcean, Microsoft Azure, Amazon Web Services, Google Compute Engine, Rackspace, HP Public Cloud, and IBM SoftLayer. Lastly, our product eliminates “Shadow IT” by enabling granular access controls to data centers, application templates, builds, plug-ins, and Docker repositories (including Docker Hub, Quay, and Red Hat Registry).

Why is open source important to the company? How and why do you use open source and/or Linux?

While DCHQ is commercially licensed software with enterprise-grade support, the company has embraced and contributes to open-source technologies in the Docker ecosystem, including the Open Container Initiative. DCHQ On-Premise runs on Docker containers and may be installed on-premise on Red Hat Enterprise Linux (RHEL), Ubuntu or CentOS. It installs via shell script or automated deployment from DCHQ Hosted PaaS. DCHQ registers a variety of Docker repositories, including Red Hat Container Registry, Docker Hub and Quay. It also integrates with Weave for cross-container communication across different hosts.

What is the company’s open source strategy?

DCHQ will continue to embrace and integrate with complementary open-source technologies. In the near future, the company plans to open source parts of the platform to get the support of the DCHQ community and continue to improve its services. 

Why is Docker a core part of your strategy? Why is Docker important?

Our platform today focuses on providing end-to-end automation for Docker-based applications. As more Linux Container technologies become popular, DCHQ plans to be container-agnostic and support other emerging container technologies.

What do you see as the biggest benefit of collaborative development?

The benefits are obvious: enlisting the help of a development community can always take the platform further, especially as technologies and application frameworks evolve.

Why did you join the Linux Foundation?

We believe in the foundation’s mission and are keen on ensuring the growth of Linux as our platform ultimately depends on a stable and continuously improving Linux.

What interesting or innovative trends are you witnessing and what role does Linux or open source play in them?

The growing adoption of Linux Containers for continuous integration and delivery is the ultimate example of how open-source initiatives can transform development processes and accelerate application development. Our company is now focused on simplifying the containerization of enterprise applications and providing the infrastructure management, governance controls and application life-cycle management needed to fully unleash the power of Linux Containers.

What other open source or collaborative projects do you participate in and why?

We have just started contributing to various Apache projects, and in the near future, our contribution to Docker and other Linux projects will grow.

Anything else important or upcoming that you’d like to share?

The explosive growth and interest in containers needs support from companies that understand enterprise technology consumption models. That’s what DCHQ was founded to deliver for app modeling and lifecycle management. 

GitLab One-Ups GitHub With Open Source Enterprise Code Hosting

With a new version of its product in the offing and $4 million in Series A funding in its pocket, GitLab — creator of an open source alternative to code-hosting nexus GitHub — is setting out to expand its reach with enterprise customers.

While it seems nearly impossible for a third party to displace GitHub as a public code-sharing resource, on the enterprise side and behind the firewall GitLab may have a fighting chance…

Read more at IT World

Evolution of Apache Hadoop

The year 2016 will see Americans lining up to elect their new president. While passion and sentiment will dictate the outcome of the election on the surface, deep down, modern technology will be at play, helping determine who will be the next president. These elections will harness the power of Big Data on a scale never seen before. We have already seen the role that Big Data played in the 2012 elections, and it’s only going to get bigger. This Big Data revolution is led, as expected, by open source, and by Apache Hadoop in particular.

Brief History of Apache Hadoop

Almost a decade ago, Yahoo! asked its developers to build a large-scale web processing infrastructure to modernize the company’s systems. The team created an internal project that could handle what they needed. However, the developers wanted to open source the project so that they could collaborate on it with other players such as Facebook and Twitter.

Yahoo! had many patents on search, and their lawyers, for all the right reasons, didn’t want the patents to go into the public domain. It was a deadlock. The team didn’t want to take on such a proprietary project, which would be deprived of collaboration, so they started to look around and found Hadoop, a little component written for the Nutch search engine project by Doug Cutting and Mike Cafarella.

The Yahoo! team realized that their own prototype was ahead of Hadoop at that point, but the fact that it could not be open sourced led them to make a tough decision. “We thought, it’s early enough; even if we are ahead of Hadoop as it was then, it makes no sense to develop proprietary infrastructure. So we abandoned our internal project and decided to adopt Hadoop. We also convinced Doug to join us,” said Arun C. Murthy, founder and architect at Hortonworks, who worked at Yahoo! back then.

The team worked very hard to improve Hadoop. “When we started on Hadoop, it worked on barely 2-3 machines. We did a lot of work, and at some point we had 100-plus people working on software and then we reached the point where in 2008 we went to production with a web search app on Hadoop, WebMap. This was an app where we were trying to grab the entire web,” added Murthy.

Beyond Web Search

Apache Hadoop, however, was meant to do much more than just power web search at Yahoo! Back then, Yahoo! was doing basic customization for users based on IP address. “We found that they could offer content customized based on different factors, such as usage patterns and search history, which go beyond IP addresses. It not only allowed Yahoo! to serve better, personalized content but also to offer better-targeted ads, thus leading to monetization,” recalled Murthy.

And, this technology went beyond mere ads and customization; it went beyond Yahoo!

Today, almost every major industry that works with Big Data uses Hadoop in one form or another, and it has brought a sea change to those industries. Advertising and the content industry clearly benefit from such analytic capabilities. The health industry gains from it as well: based on different data, a company can offer medicine in the right place, at the right time, and in the right quantity.

The insurance industry is also taking great advantage of it. By using data gathered through a tracking device installed in cars, for example, companies can offer better rates to careful drivers, and higher rates to reckless ones. The oil industry is using it, governments are using it, and even security agencies are heavy users of Big Data as analytics plays a critical role in national security.

In a nutshell, Hadoop is everywhere.

What Made Hadoop Such a Huge Success?

Many factors contributed to Hadoop’s success. The Apache Software Foundation (ASF) offered the environment it needed to attract the best developers. Doug Cutting, the founder of Hadoop, told me in an interview, “Apache’s collaborative model was critical to Hadoop’s success. As an Apache project, it has been able to gather a diverse set of contributors that together refine and shape the technology to be more useful to a wider audience.”

Hadoop benefitted greatly from the ASF’s infrastructure, “…be it communications (mailing lists, bug tracking, etc.) or hardware resources for helping with software development processes like building/testing the projects,” said Vinod Kumar Vavilapalli, a member of the Hadoop Project Management Committee.

The foundation also offered the project “legal shelter for the said contributions via the non-profit legal entity and the Apache Software License for the code. Besides these structural benefits, the diverse communities that form the foundation also help in fostering a collaborative, meritocratic environment,” added Vinod.

Hadoop Is Like a Solar System

Apache Hadoop, once a tiny component of another Apache project, is now a star in its own right, with different open source components revolving around it.

In talking about the evolution of Apache Hadoop, Vinod said, “It’s been a long and fantastic journey for Apache Hadoop since its modest beginnings.” Apache Hadoop has in fact become much more than a single Big Data project.

“Today, Hadoop together with its sister projects like Apache Hive, Apache Pig, Apache HBase, Apache Tez, Apache Oozie, Apache Spark, and nearly 20 (!) other related projects has spawned an entirely new industry aimed at addressing the big data storage and processing needs of our times,” said Vinod.

A recent addition to Hadoop’s ecosystem is YARN, which stands for Yet Another Resource Negotiator. It sits on top of HDFS (the Hadoop Distributed File System) and essentially acts as the operating system for Hadoop. It has transformed the “Hadoop project from being a single-type data-processing framework (MapReduce) to a much-larger-in-scope cluster-management platform that facilitates running of a wide variety of applications and frameworks all on the same physical cluster resources,” stated Vinod.

“Then there are many data access engines that can plug into YARN such as Spark, Hive (for SQL), or Storm (for Streaming). But that isn’t enough for enterprises – they need security (Apache Ranger), data governance (Apache Atlas) and operations (Apache Ambari) capabilities. We have teams working on each of these projects and many more,” added Murthy.
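
To make that layering concrete, here is a minimal, hedged sketch of handing work to YARN rather than running it directly on any one machine. The commands are standard Hadoop and Spark client tools; the jar path is a placeholder for illustration.

# Submit an example Spark job and let YARN allocate the containers for it
spark-submit --master yarn --deploy-mode cluster \
  --class org.apache.spark.examples.SparkPi \
  /path/to/spark-examples.jar 100

# Ask the YARN ResourceManager what is currently running on the cluster
yarn application -list

# The data itself still lives in HDFS underneath all of these engines
hdfs dfs -ls /user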

Communities Are the Leaders of True Open Source Projects

Community-driven projects tend to be stronger than company-owned ones; they attract more talent and benefit more from it. No matter how large a company is, it can’t hire everyone. The developer with the right skills may be working for a competitor. When you create a community driven by open source, the developer being paid by your competitor actually works to improve your code.

Such a community-driven development model was also pivotal to Hadoop’s success. “There is no leader or primary contributor. We’re all peers, reviewing and refining each other’s contributions, building consensus to move the project forward. Contributors are vital to open source. They provide improvements motivated by need. Contributors direct the project and drive it forward. One initially seeds an open source project with some useful code, but it’s not really alive until folks are contributing,” said Cutting.

“And that’s why ASF is a great place to collaborate,” said Murthy, “I can influence it with my code, but no one owns it except for ASF and they are like Switzerland — you don’t worry about ASF doing anything nasty to your code. That’s why everyone from Microsoft to IBM are comfortable putting their IP in the ASF.”

Community-Driven Capitalism

Many successful open source projects have struck a fine balance between the cathedral and the bazaar; open source is as much about entrepreneurship as it is about community. Apache Hadoop allowed early contributors such as Murthy to create companies like Hortonworks, which now offers open source software support subscriptions and training and consulting services. The company now serves industries including retail, healthcare, financial services, telecommunications, insurance, oil and gas, and manufacturing.

Open Source Is Becoming the Norm in Enterprise

The deep penetration of Apache Hadoop into multi-billion dollar industries is yet another example of open source becoming the norm in the enterprise segment; you hardly hear about proprietary technologies at all anymore.

Jim Zemlin, the executive director of the Linux Foundation, says, “Organizations have discovered that they want to shed what is essentially commodity R&D and software development that isn’t core to their customers and build all of that software in open source.”

This approach allows them to focus on their core business instead of building every single component used in their product. Sam Ramji, CEO of Cloud Foundry, summed it up nicely: “Users want it and it’s a more efficient way to build software. The time for open source is here, even if it has not taken over the world yet. I think 10 years from now we won’t even have the term open source; we will just say software.”

Making the Switch from Windows to Linux

I have been helping folks break away from Windows and switch to either Linux Mint or Ubuntu for a while now, and I’m going to share part of an email I got this morning with you. It really exemplifies the reaction I get when people start using a Linux distro for the first time. You just have to get them in front of it and let them experience it for themselves. This fellow’s name is Brandon and he writes:

“I have come a long way since my first and even the last email I sent you. Linux Mint Cinnamon is looking great and thank you for emphasizing its value. I guess I had just never heard of it. I am also considering abandoning my support of Windows. Meaning I will let it live on a spare laptop in the mess that it’s in but from this point forward the mental weight of optimizing it and carrying it along in a reliable form seems like it is not worth it anymore. I will use that large amount of energy and focus towards proactively advancing skill with Linux, enough so that I can depend on it entirely. So rather than multi-boot it looks like I am going to throw in a blank SSD and put Mint 17.2 on it. Many of the configurations I saw you do in your Mint videos cover the points I am looking for. I am confident the other usability tweaks I am seeking in time will fall into place.” (Read the rest)

SMBs And Cloud Security

For SMBs, using cloud and managed hosting services relieves IT of the need to buy, house, and manage infrastructure, and of many associated costs and tasks. But “going cloud” does not eliminate all in-house IT responsibilities — including security.

To be sure, a cloud/hosting provider must be responsible for many aspects of IT security. How much depends in part on whether you are simply using infrastructure, or also using applications and other services from the provider.

But, in general, whatever your company does using provider services — running on them or connecting to them — it is up to your company to make sure they are properly secured.

In his “Schneier on Security” blog, security/privacy expert Bruce Schneier points out, “Cloud providers have the potential to be far more secure than the corporations whose data they are holding. It is the same economies of scale. For most companies, the cloud provider is likely to have better security than them — by a lot.”

Here’s a look at what aspects of IT security you can — and can’t — look to your cloud vendor to handle, according to Kostyantyn Bezruchenko, CTO of global cloud platform and hosting provider Servers.com.

Security Your Cloud Provider Should Provide

“For Servers.com, cloud and hosting IT security begins at the hardware configuration level,” says Bezruchenko. “For example, at the network level, we have a fully redundant private wide-area network, isolated at the hardware level. The private networks ensure the security of communication among servers and storage for customer processes, such as virtual machines, containers, and clustering, both within and between our data centers.”

“Because a cloud is a mix of different hardware components, which may be highly dependent on one another, physical security is more important for cloud than for typical bare-metal server infrastructure,” says Bezruchenko.

“However, it’s way more important to keep software infrastructure up-to-date, since any security breach can lead to massive data exposure of all virtual machines running on the same host,” says Bezruchenko.

In terms of software, what you can expect depends in part on what services you are purchasing. If you’re buying bare-metal hosting or cloud virtual machines, the provider is responsible for the security of the platform — but security for the applications, data, and interactions with other systems and with users is likely to be up to your company.

“Security common across all service providers includes network firewall, web application firewall (WAF), private networking, and DDoS protection,” says Bezruchenko. “We already have the last two, and are working with various vendors to implement network and web application firewalls.”

Along with security proper, your cloud provider is responsible for some of the regulatory compliance requirements — but check carefully, as your company is likely responsible for ensuring security compliance of your software architecture and your applications.

“Not every enterprise can afford to maintain the same service quality as data centers do,” says Bezruchenko. “Nowadays, keeping any data in a data center is more secure than keeping it on-premises. A data center may be less secure in terms of physical access, but in terms of power and connectivity — which are also part of security — the data center absolutely wins. Take a DDoS attack as an example — each of our data centers has at least 400Gbps of external network capacity, which may help to sustain volumetric DDoS attacks. It will be hard to do that on-premises.”

Cloud-Related Security Your Company Is Responsible For

“We, as a service provider, can only provide a secure infrastructure and some additional instruments, like private networks, DDoS protection, and firewalls,” says Bezruchenko. “However, the most important part is customer application security. We can only suggest that customers run penetration testing before an application goes live, and use qualified sysadmins to secure their servers.”

This includes securing all the applications, and managing passwords and permissions. It may include operating system instances, system images, and virtual machine and container templates, which come “out of the box” needing to be secured. It also includes securing all interaction between your company’s IT and the cloud provider, including APIs and the network connections.

Because your developers and administrators are working “remotely” with cloud resources, you need to provide secure remote-access methods, tools, and procedures — and be sure that all access credentials, and the tools that manage those, are well-secured.
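
As one hedged illustration (the hostname and username below are placeholders, not any provider's defaults), key-based SSH access with password logins disabled is a common baseline for that kind of remote administration:

# Generate a key pair locally and install the public key on the cloud host
ssh-keygen -t ed25519 -C "ops@example.com"
ssh-copy-id -i ~/.ssh/id_ed25519.pub admin@cloud-host.example.com

# On the host, refuse password logins so only key-based access is accepted
sudo sed -i 's/^#\?PasswordAuthentication.*/PasswordAuthentication no/' /etc/ssh/sshd_config
sudo systemctl reload sshd    # the service may be named "ssh" on Debian/Ubuntu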

You also need to make sure that the same level of IT security you use for your own systems and networks is applied to your cloud activity, such as network firewalls and intrusion detection/monitoring.
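
For example, a minimal default-deny host firewall on an Ubuntu-style cloud instance might look like the sketch below; the open ports and the use of ufw are assumptions for illustration, not a prescription:

# Block everything inbound by default, then open only what the workload needs
sudo ufw default deny incoming
sudo ufw default allow outgoing
sudo ufw allow 22/tcp     # SSH; restrict to known admin IP ranges where possible
sudo ufw allow 443/tcp    # HTTPS for the application itself
sudo ufw enable
sudo ufw status verbose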

It’s also advisable to regularly back up data stored with the provider to a separate, third-party service.
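
A simple nightly job of that kind could look like the following sketch; the paths and the backup host are purely illustrative assumptions:

# Archive the application data and copy it to a host at a different provider
tar czf /tmp/app-data-$(date +%F).tar.gz /var/lib/app-data
rsync -avz /tmp/app-data-$(date +%F).tar.gz backup@backup.other-provider.example:/backups/
rm /tmp/app-data-$(date +%F).tar.gz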

Security Questions For Your Cloud Provider

Here are some security questions to ask a prospective cloud provider:

  • Multitenant security (shared environments): How do they ensure that other tenants (i.e., unauthorized users) won’t be able to access your private data?

  • Securing the virtualization layer: Similarly, for servers hosting VMs from multiple customers, how are these secured?

  • Regulatory compliance: How do they help you identify, and comply with, all relevant industry and geographic/political regulations? Which ones is the provider responsible for?

  • How do they prevent unsanctioned “shadow cloud” use of their services by your employees and contractors?

  • Do they offer encryption? Does that include key management? If so, who has access to the encryption keys?

  • Do they offer identity and access management? File integrity monitoring?

  • Do they offer integration points that work with whatever identity and security tooling you are already using?

In general, ask your target cloud provider what security they do — and don’t — provide, and what if any services they offer to help your company fill in those gaps.

A Beginner's Guide to Bash Scripting on Linux

A Brief Introduction
Bash, or the Bourne Again Shell, is a free replacement for the original Bourne shell (sh), which was written by Stephen Bourne at Bell Labs.

It offers vast improvements over the original shell, including:

  • Integer arithmetic

  • Indexed arrays

  • Command-line editing

  • Unlimited command history

Bash is available by default on most Linux distributions.

To find out which version of Bash you are running, type the following command:

[leo@bash101 ~]$ bash --version
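
As a first taste of two of the features listed above (indexed arrays and integer arithmetic), here is a minimal example script; the filename sum.sh is just an illustration:

#!/bin/bash
# sum.sh - add up the elements of an indexed array using Bash arithmetic

numbers=(3 7 12 28)        # an indexed array
total=0

for n in "${numbers[@]}"; do
    (( total += n ))       # integer arithmetic, no external tools needed
done

echo "The sum of ${#numbers[@]} numbers is $total"

Save the script, make it executable with chmod +x sum.sh, and run it with ./sum.sh.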

 

Read More 

Microsoft and DataStax Tie Up Cassandra on Azure Deal As New Titan Graph Database Rolls Out

It’s a big day for Cassandra firm DataStax, with its database offering now on Microsoft Azure, plus the release of the Titan graph database.

After a year’s technical collaboration, Microsoft and DataStax have today unveiled a tie-up that puts the distributed database firm’s enterprise Apache Cassandra offering on the Azure cloud computing platform. The two companies say DataStax Enterprise on Microsoft Azure will help developers create and manage internet-of-things, web and mobile apps across public and private clouds.

Read more at ZDNet News