One key benefit of open source is its ability to enable rapid innovation. Collaborating on non-competitive pieces of technology frees up resources, enabling companies to focus more on developing new products and services.
We are seeing this play out now in the automotive industry as automakers adopt open source software for core technologies like the infotainment operating system. This allows them to devote more resources to the industry-wide race to develop new technologies, mobility services, and autonomous vehicles.
According to the 2017 Autotrader Car Tech Impact Study, 53 percent of consumers expect vehicle technology to be as robust as their smartphone. Unfortunately, the automotive industry has fallen behind the smartphone in terms of features and functionality. Automotive innovation is too slow, time-to-market is too long, and there’s very little software reuse.
In the sometimes-contentious Linux kernel developer community, Greg Kroah-Hartman, a gentle giant of a man, is the friendliest face. When you plug a device into a Linux system and it works out of the box, much of the credit goes to Kroah-Hartman. He travels around the globe, talking to hardware vendors to get Linux working on their devices.
But Kroah-Hartman was not a Linux user from the beginning: He evolved over time into one of the most influential kernel developers.
Kroah-Hartman has been a techie from a very early age. He started programming on his parents’ personal computer, which they had bought for home use.
A few years later, Kroah-Hartman was at his first job, working on firmware for printers. He attended an embedded systems conference and saw a booth for Cygnus, which at the time was offering commercial support for Stallman’s GNU C Compiler (gcc).
There’s been a long-running debate over open source and security, and it goes something like this:
Pro: Open source is awesome! Given enough eyes, all bugs are shallow. This is why open source software is inherently more secure.
Con: Hackers can see the code! They’ll look at the source code and find ways to exploit it. This is why open source software is inherently more insecure.
And on and on… ad nauseam. There are a variety of studies that each side can point to in support of its case. The problem, as I see it, is that we’re not even talking about the same thing. If someone says open source software is more or less secure, what are they actually talking about?
Edge computing is poised to boost the next generation of IoT technology into the mainstream. Here’s how it works with the cloud to benefit business operations in all industries.
Cloud computing has dominated IT discussions for the last two decades, particularly since Amazon popularized the term in 2006 with the release of its Elastic Compute Cloud. In its simplest form, cloud computing is the centralization of computing services in shared data center infrastructure, using economies of scale to reduce costs. However, latency, whether from router hops, packet delays introduced by virtualization, or server placement within a data center, has always been a key concern in cloud migration.
This is where edge computing comes in. Edge computing is essentially the process of decentralizing computing services and moving them closer to the source of the data. It has also been a driver of innovation within OpenStack, the open source cloud computing project.
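To make the latency point concrete, here is a minimal Python sketch that times how long a TCP connection takes to open against two endpoints. The hostnames are hypothetical stand-ins for a distant cloud region and a nearby edge node, not real services:

```python
# A minimal latency probe: time how long a TCP connection takes to open.
# The hostnames below are hypothetical stand-ins, not real services.
import socket
import time

def connect_latency_ms(host: str, port: int = 443) -> float:
    """Return the TCP connect time to host:port in milliseconds."""
    start = time.perf_counter()
    with socket.create_connection((host, port), timeout=5):
        pass
    return (time.perf_counter() - start) * 1000

for host in ("cloud-region.example.com", "edge-node.example.com"):
    print(f"{host}: {connect_latency_ms(host):.1f} ms")
```

In practice, the edge endpoint should show a consistently lower connect time simply because there are fewer router hops between it and the client.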
We’ve been hearing the phrase “year of the Linux desktop” since time immemorial. The FOSS and Linux community tosses up this idea at the beginning of each new year and expects Linux adoption to rise sharply in the months that follow. While complete Linux dominance of the desktop looks like a far-fetched dream, Tux continues to make slow strides.
According to the latest data from NetMarketShare, Linux is running on 3.37 percent of desktop and laptop computers. That market share figure is from August 2017.
I’ve been working with container networking a bunch this week. When learning about unfamiliar stuff (like container networking / virtual ethernet devices / bridges / iptables), I often realize that I don’t fully understand something much more fundamental.
This week, that thing was: network interfaces!!
You know, when you run ifconfig and it lists devices like lo, eth0, br0, docker0, wlan0, or whatever. Those.
This is a thing I thought I understood but it turns out there are at least 2 things I didn’t know about them.
I’m not going to try to give you a crisp definition, instead we’re going to make some observations, do some experiments, ask some questions, and make some guesses.
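As a first experiment: if you want to poke at this yourself without ifconfig, here’s a minimal Python sketch (assuming Linux and Python 3.3+) that asks the kernel for the same list of interfaces:

```python
# Ask the kernel for its list of network interfaces, no ifconfig required.
# socket.if_nameindex() (Python 3.3+, Unix only) returns (index, name) pairs.
import socket

for index, name in socket.if_nameindex():
    print(index, name)  # e.g. 1 lo / 2 eth0 / 3 docker0
```

Run it before and after starting a container and you can watch new interfaces (like veth pairs) appear in the list.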
Linux serves — of that there is no doubt — literally and figuratively. The open source platform serves up websites across the globe, it serves educational systems in numerous ways, and it also serves the medical and scientific communities and has done so for quite some time.
I remember, back in my early days of Linux usage (I first adopted Linux as my OS of choice in 1997), how every Linux distribution included so many tools I would personally never use. Tools used for plotting and calculating on levels I’d not even heard of before. I cannot remember the names of those tools, but I know they were opened once and never again. I didn’t understand their purpose. Why? Because I wasn’t knee-deep in studying such science.
Modern Linux is a far cry from those early days. Not only is it much more user-friendly, it doesn’t include that plethora of science-centric tools. There are, however, still Linux distributions for that very purpose — serving the scientific and medical communities.
Let’s take a look at a few of these distributions. Maybe one of them will suit your needs.
Scientific Linux
You can’t start a listing of science-specific Linux distributions without first mentioning Scientific Linux. This particular take on Linux was developed by Fermilab. Based on Red Hat Enterprise Linux, Scientific Linux aims to offer a common Linux distribution for various labs and universities around the world, in order to reduce duplication of effort. The goal of Scientific Linux is a distribution that is compatible with Red Hat Enterprise Linux and that:
Provides a stable, scalable, and extensible operating system for scientific computing.
Supports scientific research by providing the necessary methods and procedures to enable the integration of scientific applications with the operating environment.
Uses the free exchange of ideas, designs, and implementations in order to prepare a computing platform for the next generation of scientific computing.
Includes all the necessary tools to enable users to create their own Scientific Linux spins.
Because Scientific Linux is based on Red Hat Enterprise Linux, you can select a Security Policy for the platform during installation (Figure 1).
Figure 1: Selecting a security policy for Scientific Linux during installation.
Two famous experiments that work with Scientific Linux are:
Collider Detector at Fermilab — an experimental collaboration that studies high-energy particle collisions at the Tevatron (a circular particle accelerator)
DØ experiment — a worldwide collaboration of scientists conducting research on the fundamental nature of matter
What you might find interesting about Scientific Linux is that it doesn’t actually include all the science-y goodness you might expect. There is no MATLAB equivalent or other such tools preinstalled. The good news is that there are plenty of repositories available that let you install everything you need to create a system that perfectly suits your needs.
Scientific Linux is available to use for free and can be downloaded from the official download page.
Bio-Linux
Now we’re venturing into territory that should make at least one cross section of scientists very happy. Bio-Linux is a distribution aimed specifically at bioinformatics (the science of collecting and analyzing complex biological data such as genetic codes). This very green-looking take on Linux (Figure 2) was developed at the Environmental Omics Synthesis Centre and the Natural Environment Research Council’s Centre for Ecology & Hydrology and includes hundreds of bioinformatics tools, including:
abyss — de novo, parallel, sequence assembler for short reads
Artemis — DNA sequence viewer and annotation tool
bamtools — toolkit for manipulating BAM (genome alignment) files
big-blast — script for annotating long sequences
Galaxy — browser-based biomedical research platform
Fasta — tool for searching DNA and protein databases
Mesquite — used for evolutionary biology
njplot — tool for drawing phylogenetic trees
Rasmol — tool for visualizing macromolecules
Figure 2: The Bio-Linux desktop.
There are plenty of command-line and graphical tools to be found in this niche platform. For a complete list, check out the Bio-Linux included-software page.
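As a taste of the kind of scripting these tools support, here is a minimal sketch using Biopython, a common bioinformatics library (an assumption on my part, not one of the tools listed above), to summarize the sequences in a FASTA file:

```python
# A minimal bioinformatics sketch using Biopython (assumed installed via
# `pip install biopython`). The input file "reads.fasta" is hypothetical.
from Bio import SeqIO

# Read a FASTA file and report each sequence's ID, length, and GC content.
for record in SeqIO.parse("reads.fasta", "fasta"):
    seq = record.seq.upper()
    gc = (seq.count("G") + seq.count("C")) / len(seq) * 100
    print(f"{record.id}: {len(seq)} bp, {gc:.1f}% GC")
```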
Bio-Linux is based on Ubuntu and is available for free download.
Poseidon Linux
This particular Ubuntu-based Linux distribution originally started as a desktop built on open source software and aimed at the international scientific community. Back in 2010, the platform switched direction to focus solely on bathymetry (the measurement of the depth of water in oceans, seas, or lakes), seafloor mapping, GIS, and 3D visualization.
Figure 3: Poseidon Linux with menus (Image: Wikipedia).
Poseidon Linux (Figure 3) is, effectively, Ubuntu 16.04 (complete with Ubuntu Unity, at the moment) with the addition of GMT (a collection of about 80 command-line tools for manipulating geographic and Cartesian data sets), PROJ (a standard UNIX filter that converts geographic longitude and latitude coordinates into Cartesian coordinates), and MB-System (seafloor mapping software).
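To illustrate the kind of conversion PROJ performs, here is a minimal sketch using its Python bindings, pyproj (an assumption on my part; Poseidon Linux itself ships the PROJ command-line tools):

```python
# A sketch of the conversion PROJ performs, via the pyproj Python bindings
# (assumed installed via `pip install pyproj`).
from pyproj import Proj

# Project geographic coordinates (longitude, latitude in degrees) into
# UTM zone 23S Cartesian coordinates in meters; the zone is illustrative.
utm = Proj(proj="utm", zone=23, south=True, ellps="WGS84")
x, y = utm(-43.2, -22.9)  # roughly Rio de Janeiro
print(f"easting = {x:.0f} m, northing = {y:.0f} m")
```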
Yes, Poseidon Linux is a very niche distribution, but if you need to measure the depth of water in oceans, seas, and lakes, you’ll be glad it’s available.
A group of British IT specialists took on the task of tailoring Ubuntu Linux for use as a desktop distribution by the UK National Health Service. NHSbuntu was first released, as an alpha, on April 27, 2017. The goal was to create a PC operating system that could deliver security, speed, and cost-effectiveness, and a desktop distribution that would conform to the needs of the NHS rather than insisting the NHS conform to the needs of the software. NHSbuntu was set up with full disk encryption to safeguard the privacy of sensitive data.
NHSbuntu includes LibreOffice, NHSmail2 (a version of the Evolution groupware suite, capable of connecting to NHSmail2 and Trust email), and Chat (a messenger app able to work with NHSmail2). This spin on Ubuntu can:
Perform as a Clinical OS
Serve as an office desktop OS
Be used in kiosk mode
Function as a real-time dashboard
Figure 4: NHSbuntu main screen.
The specific customizations of NHSbuntu are:
NHSbuntu wallpaper (Figure 4)
A look and feel similar to a well-known desktop
NHSmail2 compatibility
Email, calendar, address book
Messenger, with file sharing
N3 VPN compatibility
RSA token support
Removal of games
Inclusion of Remmina (Remote Desktop client for VDI)
NHSbuntu can be downloaded, for free, for either 32- or 64-bit hardware.
The tip of the scientific iceberg
Even if you cannot find a Linux distribution geared toward your specific branch of science or medicine, chances are you will find software perfectly capable of serving your needs. There are even organizations (such as the Open Science Project and NeuroDebian) dedicated to writing and releasing open source software for the scientific community.
Learn more about Linux through the free “Introduction to Linux” course from The Linux Foundation and edX.
“The Compliance Industrial Complex” is a term that evokes dystopian imagery of organizations engaging in elaborate and highly expensive processes to comply with open source license terms. As life often imitates art, many organizations do engage in this practice, sadly robbing themselves of many of the benefits of the open source model. This article presents an economically efficient approach to open source software license compliance.
Open source licenses generally impose three requirements on a distributor of code licensed from a third party (a first-pass automated check is sketched after the list):
Provide a copy of the open source license(s)
Include copyright notices
For copyleft licenses (like GPL), make the corresponding source code available to the distributees
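The first two requirements lend themselves to a cheap automated sanity check before any release. The following Python sketch is purely illustrative: the directory layout ("third_party/") and the file-name patterns are assumptions, not a legal standard, and real compliance still needs human and legal review:

```python
# Purely illustrative first-pass compliance scan. The "third_party/" layout
# and the file-name patterns are assumptions, not a legal standard; real
# compliance work still needs human and legal review.
from pathlib import Path

LICENSE_NAMES = {"LICENSE", "LICENSE.txt", "LICENSE.md", "COPYING", "NOTICE"}

def has_license_file(component: Path) -> bool:
    """Check whether a component directory contains a recognizable license file."""
    return any(p.name in LICENSE_NAMES for p in component.rglob("*"))

missing = [
    c.name
    for c in Path("third_party").iterdir()
    if c.is_dir() and not has_license_file(c)
]
print("Components missing license files:", ", ".join(missing) or "none")
```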
If you explore the Wiki pages of EdgeX Foundry, you will see several references to the project’s architectural “tenets”. These are the principles that guide how the project’s contributors and technical steering committee decide what changes are accepted into the project, what features will be pursued, and ultimately what technology the group will advance together.
The tenets of EdgeX were not established overnight. They are not some sort of religious doctrine or set of commandments (although some of us would like to see them carved in stone someday). We didn’t adopt them blindly because they fit the software mantra of the day.
No, the EdgeX tenets evolved from industry-wide collaboration around the use cases and challenges of edge computing. More specifically, they evolved through trial and practice in Project Fuse, which Dell started more than two years ago and donated to The Linux Foundation earlier this year to seed EdgeX Foundry. These tenets embody the lessons learned while building EdgeX Foundry, and they are the bedrock that will allow the EdgeX community and the commercial ecosystem around the project to continue to grow and thrive.
Understanding the impact and expanding influence of DevOps culture, and how to apply DevOps principles to make your digital operations more performant and productive.
A few years ago, I wrote that DevOps is the movement that doesn’t want to be defined. That’s still true, though being cryptic doesn’t help executives who need to understand what’s going on in their industries, or who want to make their digital operations more productive. You may already have people in your company who are “doing DevOps,” or who want to. What are they doing? What do they want?
Let’s start with origins. Back in 2007, a Belgian engineer named Patrick Debois became frustrated by the friction between developer and operations teams. As a developer and a member of the Agile community, Debois saw an opportunity to use Agile methodologies to manage the infrastructure process, much as developers manage the development process. He initially described this concept as Agile Infrastructure but later coined the term DevOps, a portmanteau of development and operations.