For the last 10 years, I have had the chance to work for companies that embraced the cloud, and in particular AWS. This two-part blog post is an attempt to share that experience with you. I hope you enjoy it! Please do not hesitate to give feedback, share your own stories, or simply hit like.
EMBRACE FAILURE
“It is not failure itself that holds you back; it is the fear of failure that paralyses you.” (Brian Tracy)
Let me start by saying that scared developers won’t:
* try things out
* innovate as fast as your business needs them to
* dare to jump in and fix things when (pardon my French) shit hits the fan
But what if you’re still largely, if not entirely, mired in manual, legacy processes? Taking the first meaningful, results-oriented steps toward increasing automation can be a significant challenge.
We’re here to help. We asked a variety of automation experts for actionable, results-oriented advice on getting started with automation. Let’s take a closer look at their tips:
Microsoft CEO Satya Nadella, who headed up Microsoft’s cloud division before he became CEO, has been vocal about converging the Azure cloud platform with Linux. In fact, he has noted that about a third of the Azure platform is Linux-based. Moreover, Microsoft has made clear that more than 60 percent of Azure Marketplace images are Linux-based, as ZDNet has reported.
The convergence of Linux and the Azure platform spells big opportunities and is giving rise to many new jobs. On this front, The Linux Foundation has announced the availability of a new training course, Administering Linux on Azure (LFS205). It is more important than ever for Linux and Azure professionals to make sure they know how to manage Linux workloads in an Azure environment, and this $299 course provides the requisite knowledge.
“As shown by The Linux Foundation and Dice’s Open Source Jobs Report, cloud computing skills are by far the most in demand by employers,” said Linux Foundation General Manager for Training & Certification, Clyde Seepersad. “This shouldn’t be a surprise to anyone, as the world today is run in the cloud. Azure is one of the most popular public clouds, and a huge portion of its instances run on Linux. That’s why we feel this new course is essential to give Azure professionals the Linux skills they need, give Linux professionals the Azure skills they need, and train new professionals to ensure industry has the talent it needs to meet the growing demand for Linux on Azure.”
Not only are many Linux workloads running in Azure environments, but you can choose from the most popular Linux distributions to run in this context. Distributions such as Red Hat Enterprise Linux, CentOS, SUSE Linux Enterprise, Debian, Ubuntu, CoreOS, RancherOS, FreeBSD, and more are in the Azure Marketplace.
The new LFS205 course covers how to deploy virtual machines in Azure, discussing different deployment scenarios. Once the VMs are available in Azure, students need to know how to manage them efficiently, which is covered next. The last part of the course teaches how to troubleshoot Linux in Azure and how to monitor it using various open source tools. Importantly, the course also delves into container management.
As noted here, experience with cloud infrastructure tools and open source technologies can make a substantial compensation difference for everyone from sysadmins to C-suite technology leaders. Dice data scientist Yuri Bykov has said, “as businesses have begun relying more upon open source solutions to support their business needs … employers are looking for individuals with cloud computing and networking experience and a strong working knowledge of configuration management tools.”
The new LFS205 course is taught by Sander van Vugt, author of many Linux-related video courses and books as well as course developer for The Linux Foundation. He is also a managing partner of ITGilde, a large co-operative in which about a hundred independent Linux professionals in the Netherlands have joined forces. The $299 course fee provides unlimited access to the course for one year, including all content and lab materials.
To find more open source focused training and certification opportunities, check out this post for ways to fast-track your education and certification. The Linux Foundation also offers much coursework for extending your Linux-specific skills, ranging from Developing Applications for Linux to Linux Performance Tuning.
With virtualization, organizations began to realize greater utilization of physical hardware. That trend continued with the cloud, as organizations moved their machines onto pay-as-you-go services. Cloud computing further evolved when Amazon Web Services (AWS) launched its Lambda service in 2014, introducing a new paradigm in cloud computing that has become commonly referred to as serverless computing. In the serverless model, organizations pay for functions as a service, without paying for an always-on, stateful virtual machine.
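To make the model concrete, here is a minimal sketch of a function as it might be written for AWS Lambda’s Python runtime. The (event, context) handler signature is what that runtime expects; the payload field and the HTTP-style response shape are assumptions for the example, not part of any specific application.

```python
import json

def handler(event, context):
    """Entry point the platform invokes; there is no server to manage.

    `event` carries the request payload (its shape depends on the trigger,
    e.g. API Gateway, S3, or a queue); `context` carries runtime metadata.
    You are billed only for the time this function actually runs.
    """
    name = event.get("name", "world")  # illustrative payload field
    return {
        "statusCode": 200,
        "body": json.dumps({"message": f"Hello, {name}!"}),
    }
```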
The most popular Docker base container image is either busybox or scratch. This is driven by a movement that is equal parts puritanical and pragmatic. The puritan asks, “Why do I need to run init(1) just to run my process?” The pragmatist asks, “Why do I need a 700 MB base image to deploy my application?” And both, seeking immutable deployment units, ask, “Is it a good idea that I can ssh into my container?” But let’s step back for a second and look at the history of how we got to the point where questions like this are even a thing.
In the very beginning, there were no operating systems. Programs ran one at a time with the whole machine at their disposal. While efficient, this created a problem for the keepers of these large and expensive machines. To maximise their investment, the time between one program finishing and another starting had to be kept to an absolute minimum; hence, monitor programs and batch processing were born.
Any intelligent system, regardless of complexity, needs to be powered by data. At the heart of any intelligent system, we have one or more algorithms, based on machine learning, deep learning, or statistical methods, which consume this data to gather knowledge and provide intelligent insights over time. Algorithms are pretty naive by themselves and cannot work out of the box on raw data. Hence, engineering meaningful features from raw data, features these algorithms can understand and consume, is of utmost importance.
Any intelligent system basically consists of an end-to-end pipeline: it starts by ingesting raw data, then leverages data processing techniques to wrangle, process, and engineer meaningful features and attributes from this data. We then usually apply techniques such as statistical or machine learning models to model these features, and deploy the model, if necessary, for future usage based on the problem at hand. A typical standard machine learning pipeline based on the CRISP-DM industry-standard process model is depicted below.
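As a rough, code-level illustration of such a pipeline, here is a minimal scikit-learn sketch that chains feature engineering and modeling into one deployable unit. The dataset, column names, and model choice are all invented for the example.

```python
import pandas as pd
from sklearn.compose import ColumnTransformer
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import Pipeline
from sklearn.preprocessing import OneHotEncoder, StandardScaler

# Hypothetical raw data: one numeric and one categorical attribute.
df = pd.DataFrame({
    "age":     [22, 35, 58, 41, 29, 47],
    "plan":    ["basic", "pro", "pro", "basic", "basic", "pro"],
    "churned": [0, 1, 0, 1, 0, 1],
})

# Feature engineering: algorithms cannot consume raw strings or wildly
# scaled numbers, so we scale the numeric column and one-hot encode the
# categorical one.
features = ColumnTransformer([
    ("numeric", StandardScaler(), ["age"]),
    ("categorical", OneHotEncoder(handle_unknown="ignore"), ["plan"]),
])

# The end-to-end pipeline: data preparation and modeling chained into a
# single unit that can be fit, evaluated, and deployed together.
pipeline = Pipeline([
    ("features", features),
    ("model", LogisticRegression()),
])
pipeline.fit(df[["age", "plan"]], df["churned"])

# Score a new, unseen record with the fitted pipeline.
new_user = pd.DataFrame({"age": [33], "plan": ["pro"]})
print("churn probability:", pipeline.predict_proba(new_user)[0, 1])
```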
The data landscape is changing right in front of our eyes. We are seeing gargantuan growth in the total volume of data; we are generating and consuming massive amounts of video, images, and sensor inputs of all sorts.
Moreover, “the type of data that’s growing most rapidly are not the data sets we think of historically as part of the legacy enterprise IT stack,” said Crystal Valentine, vice president of technology strategy at MapR Technologies, in this newest edition of The New Stack Makers podcast.
JavaScript UI frameworks and libraries work in cycles. Every six months or so, a new one pops up, claiming that it has revolutionized UI development. Thousands of developers adopt it into their new projects, blog posts are written, Stack Overflow questions are asked and answered, and then a newer (and even more revolutionary) framework pops up to usurp the throne.
Using the Stack Overflow Trends tool and some of our internal traffic data, we decided to take a look at some of the more prominent UI frameworks: Angular, React, Vue.js, Backbone, Knockout, and Ember.
Framework lifecycle
Stack Overflow Trends lets us examine how each of these technologies has been asked about over time. We can start by looking at some of the larger frameworks.
Commonly, a mixture of open source and expensive proprietary tools is shoehorned into a pipeline to run tests on nightly as well as ad hoc builds. However, anyone who has used such tests soon realizes that a smaller number of mature, time-honored tests is sometimes much more valuable than the extra detail you get by cramming too many tests into the pipe and then waiting three hours for a nightly build to complete. The maturity of your battle-hardened tests is key.
The tests you require might involve interrogating the quality of code from developers or checking code for licensing issues. A continuous testing strategy can be onerous to set up but brings unparalleled value to your end product, including improvements in uptime, performance, compliance, and security.
To make any of the tests you run within your pipeline useful, you should be able to integrate them with existing tools and fire them via simple event-based hooks or triggers.
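As a rough sketch of that idea, the snippet below wires a test run to a simple HTTP hook using only Python’s standard library. The hook path and test command are placeholders; a real pipeline would more likely sit behind a CI server or a Git host’s webhooks.

```python
import subprocess
from http.server import BaseHTTPRequestHandler, HTTPServer

class HookHandler(BaseHTTPRequestHandler):
    """Runs the test suite whenever an event (e.g. a push) POSTs to the hook."""

    def do_POST(self):
        if self.path != "/hooks/push":  # illustrative hook path
            self.send_response(404)
            self.end_headers()
            return
        length = int(self.headers.get("Content-Length", 0))
        self.rfile.read(length)  # consume the event payload (unused here)
        # Fire a small, battle-hardened subset of tests rather than the
        # whole suite; the command is a placeholder for your own pipeline.
        result = subprocess.run(["pytest", "tests/smoke", "-q"])
        self.send_response(200 if result.returncode == 0 else 500)
        self.end_headers()

if __name__ == "__main__":
    HTTPServer(("", 8000), HookHandler).serve_forever()
```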
Twenty years ago, when I first started using Linux, finding a distribution that worked out of the box was an impossible feat. Not only did the installation take some serious mental acuity, but configuring the software and getting connected to the Internet was often a challenge users were reluctant to attempt.
Today, things are quite different. Linux now offers distributions that anyone can use, right out of the box. But, even among those distros that “just work,” some rise to the top to stand as best in breed. These particular flavors of Linux are perfect for users hoping to migrate away from Windows or macOS who don’t want to spend hours getting up to speed on how the platform works or (more importantly) making the system perform as expected.
In this article, I highlight the three distributions I believe are the best bets for anyone to use, without having to put in any extra “post-install” time for configuration or problem solving.
So, without further ado, let’s take a look at the distributions that qualify as the best in the “just works” category.
Ubuntu
For the longest time, Ubuntu was considered the distribution for new users. It was also the single most popular distribution. But then Canonical abandoned GNOME for Unity, and things took a downward turn. Don’t get me wrong; I was a big fan of Unity (the HUD was well ahead of its time), but the average user … not so much. Ubuntu has now returned to GNOME, which should go a long way toward winning back some of the users it lost with Unity.
One of the great things about the latest releases of GNOME (Figure 1) is that they just work. Of every desktop on the market, you’d be hard-pressed to find one more reliable and hassle-free than GNOME. Once you understand the components of the desktop, everything works without a hitch. GNOME development is among the strongest of any desktop, so issues are resolved very quickly, and the resultant interface is incredibly stable. Since the release of GNOME 3.26, I’ve yet to experience a single issue. That’s impressive.
Figure 1: The GNOME Dash in action.
Desktop interface aside, one of the things Ubuntu has enjoyed for years is some of the best hardware recognition of any distribution. Install Ubuntu, and the odds are very high that everything will work flawlessly: wireless, sound, video … everything. Unless you’re looking at peripheral hardware designed for a specific operating system, chances are everything will work under Ubuntu.
Ubuntu contains just the right amount of software (such as Firefox, Thunderbird, and LibreOffice) to help users get their work done. Should there be a title missing from the mix, the Ubuntu Software Center (a rebrand of GNOME Software) is there to help users find (and easily install) the tools they need.
Linux Mint
Out of the box, Linux Mint benefits from an Ubuntu base. Because of this, it enjoys the same level of hardware recognition. However, whereas Ubuntu defaults to GNOME, Mint defaults to the Cinnamon desktop (although you can download spins with MATE, Xfce, or KDE). For some, this is ideal, as it closely resembles a very familiar interface metaphor (think Windows XP/7). The Cinnamon desktop (Figure 2) does a great job of making interaction with the operating system and installed applications easy. Although to some the interface will seem a bit outdated, it’s as straightforward a UI as you’ll find on a modern operating system.
Figure 2: Linux Mint running the Cinnamon desktop.
One area where Linux Mint improves over Ubuntu is the selection of software titles installed out of the box. Whereas Ubuntu keeps things on the minimal side, Linux Mint adds a few more titles to the mix that make it even easier for users to get to work without having to install third-party software.
For me, the one glaring issue with Linux Mint is the inclusion of the Synaptic front end for the package manager. It’s not that Synaptic isn’t a solid tool … it is. But with the likes of the included Software Manager (which is more in line with something like GNOME Software), the addition of Synaptic is redundant. I understand why they might include Synaptic (for those users who prefer the flexibility of the older tool), but Software Manager is far more user-friendly and should be considered the only option. And considering some software managers (such as Elementary’s AppCenter) can do both installs and upgrades in one location, it would behoove the Mint team to fold that feature into Software Manager. As it stands, the install/update/upgrade process is handled in two tools, which isn’t nearly as efficient as it could be.
Elementary OS
And now we get to my personal desktop of choice, as well as my winner for best distribution of 2017 and 2018. Elementary OS is another distribution based on Ubuntu (sense a theme here?) that makes Linux as easy to use as any operating system on the planet. And just as Elementary OS is easy to use, it works incredibly well. I’ve been using Elementary OS as my daily driver for three years now and have rarely experienced an issue. In fact, of every operating system I have used over the years, Elementary OS has been, hands down, the most trouble-free.
Upon installation, Elementary OS includes everything you need to get your work done, with one caveat: out of the box, Elementary uses the Epiphany web browser. The reason for defaulting to this particular browser is to keep things on the lighter side. However, any browser power user will understand that Epiphany simply doesn’t cut it for everyday usage.
That being said, the developers of Elementary OS have created their own app store (called the AppCenter) that makes installing Firefox Quantum (or any other piece of software) incredibly easy. In fact, Elementary’s AppCenter is, without question, on par with GNOME Software for ease of use and stability.
The true highlight of Elementary OS, however, is the Pantheon desktop. If you like the macOS interface, you’ll love Pantheon (Figure 3). It offers a similar layout, as well as one of the most consistent design schemes to be found on a Linux desktop.
Figure 3: My personal Elementary OS desktop in action.
Elementary OS is a bit light on out-of-the-box software. Because of this, users will need to venture into the AppCenter to install the tools they need to work. Fortunately, the AppCenter is as user-friendly as they get: open the tool, type “office,” then scroll down and click to install LibreOffice.
Elementary OS “just works” on every level, for every level of Linux user.
You be the judge
I’ve been using Linux for two decades now, and in that time I have never known a better crop of distributions that work right out of the box than the ones you’ll find here. Unlike the distros of the late 1990s and early 2000s, these choices make using the flagship open source platform both a breeze and a joy. If you’re looking for one of the best distributions to get started with, you cannot go wrong with these three.
Learn more about Linux through the free “Introduction to Linux” course from The Linux Foundation and edX.