Among all the changes, Torvalds highlights the one concerning the implementation of the SMB protocol in the kernel: the in-kernel CIFS client in 4.13 defaults to SMB3, as opposed to SMB1, which was the default in previous kernels.
SMB is a protocol used to access and share files, printers, and other services over a network, and the reason for the switch is that SMB1 has aged horribly and is rife with vulnerabilities. The number of servers that still use it was one of the reasons the WannaCry ransomware spread like wildfire back in May. However, SMB1 is still accessible from kernel 4.13 for those that really, really have to use it. If you can’t make the change (although you are highly encouraged to find a way to do so), you may need to add an explicit
vers=1.0
to the mount options in your /etc/fstab file.
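For example, a CIFS entry in /etc/fstab pinned to the legacy dialect might look like the line below (the server, share, mount point, and credentials file are placeholders, not real paths):

```
# /etc/fstab — force the legacy SMB1 dialect for an old server
//server/share  /mnt/share  cifs  vers=1.0,credentials=/root/.smbcreds  0  0
```

Without the vers option, the 4.13 cifs module will try to negotiate SMB3 and the mount will fail against an SMB1-only server.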
Another security-related feature that has found its way into 4.13 is a kernel-based TLS implementation. TLS, which stands for Transport Layer Security, provides privacy over a network when, for example, you connect to a web server. TLS encrypts the data flowing between the client and the server; it authenticates the server, so you can be sure you are really connecting to what you think you are connecting to; and it ensures integrity, meaning that each message carries a code that either side can use to check that there has been no data loss or tampering along the way.
With all this encrypting and checking going on, using TLS is much more CPU-hungry than the old insecure way of sending and receiving messages. By building TLS into the kernel, you get better performance for HTTPS and other protocols that use TLS.
Other stuff that’s new in Kernel 4.13
The kernel now supports HDMI Stereo 3D output courtesy of the new Nouveau drivers. To be able to enjoy 3D Stereo output, you will of course need hardware (a video card and a display) that supports 3D. Kudos to the Nouveau team.
The EXT4 file system now has the largedir feature. This means a single EXT4 directory can now support around 2 billion entries, up from the limit of 10 million entries in prior kernels.
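If you want to experiment with the flag without touching a real disk, you can format a file-backed scratch image; the path and size below are arbitrary, and enabling large_dir at mkfs time requires a reasonably recent e2fsprogs:

```shell
# Create a small scratch image (no root or block device needed)
dd if=/dev/zero of=/tmp/largedir.img bs=1M count=16

# Format it as ext4 with the large_dir feature enabled
mkfs.ext4 -F -O large_dir /tmp/largedir.img

# Verify that the feature flag was set
dumpe2fs -h /tmp/largedir.img | grep -i features
```

On an existing (unmounted) filesystem, the same feature can be switched on with tune2fs -O large_dir.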
As usual, there’s a whole slew of new ARM devices that get native support in kernel 4.13, including the NanoPi NEO2, Orange Pi Prime, LicheePi Zero dock board, Orange Pi Zero Plus 2, SoPine SoM and the NanoPi M1 Plus. Especially interesting is the support for the BeagleBone Blue, a single board computer developed by Texas Instruments which is specially designed for use in robotics and drones.
To find out more and get a full list of changes and what they mean, you can check out the writeups at Kernel Newbies and Phoronix.
In today’s rapidly changing system administration landscape, skills and credentials count for a lot, but professional certification can also make a difference. With that in mind, let’s take a look at five valuable types of certification for sysadmins along with relevant training options.
Linux credentials
As mentioned previously, Linux provides the foundation for many servers and cloud deployments, as well as mobile devices. And, several salary studies have shown that Linux-savvy sysadmins are better compensated than others.
Meanwhile, training options for Linux-focused sysadmins are expanding. For professional certification, CompTIA Linux+ is an option, as are certifications from Linux Professional Institute. The Linux Foundation’s Linux Foundation Certified System Administrator (LFCS) is another good choice. These educational options delve into everything from managing file permissions and partitioning storage devices to troubleshooting filesystem issues.
Sysadmins without much previous experience may want to consider the Introduction to Linux online course, which is delivered through a partnership between The Linux Foundation and edX. The course is hugely popular and can help with basic preparation for the LFCS exam.
Platform-specific certification
Many organizations are in need of sysadmins who have specialized and specific skillsets surrounding the core technology platforms that they run. For example, organizations based on Red Hat’s platform technology may prefer to hire a Red Hat Certified System Administrator. This credential is earned after successfully passing the Red Hat Certified System Administrator (RHCSA) Exam (EX200). Likewise, training and certification are available for SUSE Certified Administrators, for Microsoft-focused administrators, for VMware administrators, and for numerous other platforms.
CompTIA Server+ hardware, network and security certification
Today’s IT environments demand more planning, better security, and more maintenance than ever before, and CompTIA offers an array of entry-level certifications, including A+ for hardware technicians, Network+ for network admins, and Security+ for security specialists. These certifications have earned recognition among hiring managers, and they can help a sysadmin land a job or serve as a good platform for obtaining a more targeted certification. HP, Intel, and the U.S. Department of Defense are among the organizations that employ CompTIA Server+-certified staffers.
Cloud certification
Salary studies show that sysadmins fluent with the cloud command more pay. As mentioned previously, 51 percent of surveyed hiring managers said that knowledge of cloud platforms has a big impact on open source hiring decisions, according to the 2016 Linux Foundation/Dice Open Source Jobs Report. If you search recruitment sites for sysadmin positions that demand cloud skills, you’ll see that opportunities abound. There are positions that require strong cloud monitoring skills, and jobs that demand facility with both open source and popular public cloud platforms.
A sysadmin who holds the Red Hat Certified System Administrator in Red Hat OpenStack credential has demonstrated the skills needed to create, configure, and manage private clouds using Red Hat OpenStack Platform. Red Hat’s training for this certification covers configuring and managing images, adding compute nodes, and managing storage using Swift and Cinder.
Many sysadmins have experience with scripting, and some have experience with full-blown application development. For those with some scripting and development skills, Cloud Foundry Developer Certification is an emerging credential worth looking into. It’s a professional cloud-native developer certification that can be earned through a performance-based exam that evaluates knowledge of the Cloud Foundry platform.
Next time, we’ll consider some non-technical skills that are equally important for sysadmins looking to advance their careers.
One key benefit of open source is its ability to enable rapid innovation. Collaborating on non-competitive pieces of technology frees up resources, enabling companies to focus more on developing new products and services.
We are seeing this play out now in the automotive industry as automakers are adopting open source software for core technologies like the infotainment operating system. This allows them to focus more resources towards the industry-wide race to develop new technologies, mobility services, and autonomous vehicles.
According to the 2017 Autotrader Car Tech Impact Study, 53 percent of consumers expect vehicle technology to be as robust as their smartphone. Unfortunately, the automotive industry has fallen behind the smartphone in terms of features and functionality. Automotive innovation is too slow, time-to-market is too long, and there’s very little software reuse.
In the sometimes-contentious Linux kernel developer community, Greg Kroah-Hartman, a gentle giant of a man, is the friendliest face. When you plug a device into a Linux system and it works out of the box, the credit often goes to Kroah-Hartman. He travels around the globe, talking to hardware vendors to get Linux to work on their devices.
But Kroah-Hartman was not a Linux user from the beginning: He evolved over time into one of the most influential kernel developers.
Kroah-Hartman has been a techie from a very early age. He started programming on his parents’ personal computer, which they bought for home use.
A few years later, Kroah-Hartman was at his first job, working on firmware for printers. He attended an embedded systems conference and saw a booth for Cygnus, which at the time was offering support for Stallman’s GNU Compiler Collection (gcc).
There’s been a long-running debate over open source and security, and it goes something like this:
Pro: Open source is awesome! Given enough eyes, all bugs are shallow. This is why open source software is inherently more secure.
Con: Hackers can see the code! They’ll look at the source code and find ways to exploit it. This is why open source software is inherently more insecure.
And on and on… ad nauseam. There are a variety of studies that each side can point to in support of their case. The problem, as I see it, is that we’re not even talking about the same thing. If someone says open source software is more or less secure, what are they actually talking about?
Edge computing is poised to boost the next generation of IoT technology into the mainstream. Here’s how it works with the cloud to benefit business operations in all industries.
Cloud computing has dominated IT discussions for the last two decades, particularly since Amazon popularized the term in 2006 with the release of its Elastic Compute Cloud. In its simplest form, cloud computing is the centralization of computing services to take advantage of a shared data center infrastructure and economies of scale to reduce costs. However, latency, influenced by the number of router hops, packet delays introduced by virtualization, or server placement within a data center, has always been a key issue of cloud migration.
This is where edge computing comes in. Edge computing is essentially the process of decentralizing computing services and moving them closer to the source of the data. Edge computing has also been a driver of innovation within OpenStack, the open source cloud computing project.
We’ve been hearing the phrase “year of the Linux desktop” since time immemorial. The FOSS and Linux community tosses up this idea at the beginning of each new year and expects Linux adoption to rise exponentially in the months that follow. While complete Linux dominance of the desktop scene looks like a far-fetched dream, Tux continues to make slow strides.
According to the latest data from NetMarketShare, Linux is running on 3.37% of desktop computers and laptops. This market share number is from August 2017.
I’ve been working with container networking a bunch this week. When learning about new unfamiliar stuff (like container networking / virtual ethernet devices / bridges / iptables), I often realize that I don’t fully understand something much more fundamental.
This week, that thing was: network interfaces!!
You know, when you run ifconfig and it lists devices like lo, eth0, br0, docker0, wlan0, or whatever. Those.
This is a thing I thought I understood but it turns out there are at least 2 things I didn’t know about them.
I’m not going to try to give you a crisp definition; instead, we’re going to make some observations, do some experiments, ask some questions, and make some guesses.
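A first observation: every interface the kernel knows about shows up under /sys/class/net, so you can poke at them without ifconfig at all (the exact interface names you see will depend on your machine):

```shell
# List every network interface the kernel currently knows about
ls /sys/class/net

# The loopback device "lo" exists on essentially every Linux system;
# its sysfs entries expose the same facts ifconfig prints
cat /sys/class/net/lo/address   # hardware address (all zeros for loopback)
cat /sys/class/net/lo/mtu       # MTU, typically 65536 for lo
```

The modern replacement for ifconfig is ip from iproute2; `ip -br link show` prints the same list in brief form.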
Linux serves — of that there is no doubt — literally and figuratively. The open source platform serves up websites across the globe, it serves educational systems in numerous ways, and it also serves the medical and scientific communities and has done so for quite some time.
I remember, back in my early days of Linux usage (I first adopted Linux as my OS of choice in 1997), how every Linux distribution included so many tools I would personally never use. Tools used for plotting and calculating on levels I’d not even heard of before. I cannot remember the names of those tools, but I know they were opened once and never again. I didn’t understand their purpose. Why? Because I wasn’t knee-deep in studying such science.
Modern Linux is a far cry from those early days. Not only is it much more user-friendly, it doesn’t include that plethora of science-centric tools. There are, however, still Linux distributions for that very purpose — serving the scientific and medical communities.
Let’s take a look at a few of these distributions. Maybe one of them will suit your needs.
Scientific Linux
You can’t start a listing of science-specific Linux distributions without first mentioning Scientific Linux. This particular take on Linux was developed by Fermilab. Based on Red Hat Enterprise Linux, Scientific Linux aims to offer a common Linux distribution for various labs and universities around the world, in order to reduce duplication of effort. The goal of Scientific Linux is to have a distribution that is compatible with Red Hat Enterprise Linux, that:
Provides a stable, scalable, and extensible operating system for scientific computing.
Supports scientific research by providing the necessary methods and procedures to enable the integration of scientific applications with the operating environment.
Uses the free exchange of ideas, designs, and implementations in order to prepare a computing platform for the next generation of scientific computing.
Includes all the necessary tools to enable users to create their own Scientific Linux spins.
Because Scientific Linux is based on Red Hat Enterprise Linux, you can select a Security Policy for the platform during installation (Figure 1).
Figure 1: Selecting a security policy for Scientific Linux during installation.
Two famous experiments that work with Scientific Linux are:
Collider Detector at Fermilab — an experimental collaboration that studies high-energy particle collisions at the Tevatron (a circular particle accelerator)
DØ experiment — a worldwide collaboration of scientists that conducts research on the fundamental nature of matter.
What you might find interesting about Scientific Linux is that it doesn’t actually include all the science-y goodness you might expect. There is no Matlab equivalent pre-installed, or other such tools. The good news is that there are plenty of repositories available that allow you to install everything you need to create a distribution that perfectly suits your needs.
Scientific Linux is available to use for free and can be downloaded from the official download page.
Bio-Linux
Now we’re venturing into territory that should make at least one cross section of scientists very happy. Bio-Linux is a distribution aimed specifically at bioinformatics (the science of collecting and analyzing complex biological data such as genetic codes). This very green-looking take on Linux (Figure 2) was developed at the Environmental Omics Synthesis Centre and the Natural Environment Research Council (NERC) Centre for Ecology & Hydrology and includes hundreds of bioinformatics tools, including:
abyss — de novo, parallel, sequence assembler for short reads
Artemis — DNA sequence viewer and annotation tool
bamtools — toolkit for manipulating BAM (genome alignment) files
Big-blast — the big-blast script for annotation of long sequences
Galaxy — browser-based biomedical research platform
Fasta — tool for searching DNA and protein databases
Mesquite — used for evolutionary biology
njplot — tool for drawing phylogenetic trees
Rasmol — tool for visualizing macromolecules
Figure 2: The Bio-Linux desktop.
There are plenty of command-line and graphical tools to be found in this niche platform. For a complete list, check out the project’s included software page.
Bio-Linux is based on Ubuntu and is available for free download.
Poseidon Linux
This particular Ubuntu-based Linux distribution originally started as a desktop, based on open source software, aimed at the international scientific community. Back in 2010, the platform switched directions to focus solely on bathymetry (the measurement of depth of water in oceans, seas, or lakes), seafloor mapping, GIS, and 3D visualization.
Figure 3: Poseidon Linux with menus (Image: Wikipedia).
Poseidon Linux (Figure 3) is, effectively, Ubuntu 16.04 (complete with Ubuntu Unity, at the moment) with the addition of GMT (a collection of about 80 command-line tools for manipulating geographic and Cartesian data sets), PROJ (a standard UNIX filter that converts geographic longitude and latitude coordinates into Cartesian coordinates), and MB-System (seafloor mapping software).
Yes, Poseidon Linux is a very niche distribution, but if you need to measure the depth of water in oceans, seas, and lakes, you’ll be glad it’s available.
NHSbuntu
A group of British IT specialists took on the task to tailor Ubuntu Linux to be used as a desktop distribution by the UK National Health Service. NHSbuntu was first released, as an alpha, on April 27, 2017. The goal was to create a PC operating system that could deliver security, speed, and cost-effectiveness and to create a desktop distribution that would conform to the needs of the NHS — not insist the NHS conform to the needs of the software. NHSbuntu was set up for full disk encryption to safeguard the privacy of sensitive data.
NHSbuntu includes LibreOffice, NHSmail2 (a version of the Evolution groupware suite, capable of connecting to NHSmail2 and Trust email), and Chat (a messenger app able to work with NHSmail2). This spin on Ubuntu can:
Perform as a Clinical OS
Serve as an office desktop OS
Be used in kiosk mode
Function as a real-time dashboard
Figure 4: NHSbuntu main screen.
The specific customizations of NHSbuntu are:
NHSbuntu wallpaper (Figure 4)
A look and feel similar to a well-known desktop
NHSmail2 compatibility
Email, calendar, address book
Messenger, with file sharing
N3 VPN compatibility
RSA token support
Removal of games
Inclusion of Remmina (Remote Desktop client for VDI)
NHSbuntu can be downloaded, for free, for either 32- or 64-bit hardware.
The tip of the scientific iceberg
Even if you cannot find a Linux distribution geared toward your specific branch of science or medicine, chances are you will find software perfectly capable of serving your needs. There are even organizations (such as the Open Science Project and Neurodebian) dedicated to writing and releasing open source software for the scientific community.
“The Compliance Industrial Complex” is a term that evokes dystopian imagery of organizations engaging in elaborate and highly expensive processes to comply with open source license terms. As life often imitates art, many organizations engage in this practice, sadly robbing them of the many benefits of the open source model. This article presents an economically efficient approach to open source software license compliance.
Open source licenses generally impose three requirements on a distributor of code licensed from a third party:
Provide a copy of the open source license(s)
Include copyright notices
For copyleft licenses (like GPL), make the corresponding source code available to the distributees