
10 Lessons from 10 Years of AWS (part 1)

I recently presented a talk at the AWS Community Day in Bangalore. The tweet following the talk became my most popular tweet ever, and I received quite a few requests for more details.

For the last 10 years, I have had the chance to work in companies that embraced the cloud, and in particular AWS. This two-part blog post is an attempt to share that experience with you. Hope you enjoy! Please do not hesitate to give feedback, share your own stories, or simply like 🙂

EMBRACE FAILURE

“It is not failure itself that holds you back; it is the fear of failure that paralyses you.” – Brian Tracy

Let me start by saying that scared developers won’t:

* try things out

* innovate as fast as your business needs them to

* dare to jump in and fix things when (pardon my French) the shit hits the fan

* do more than what’s asked of them

* stay long in the job

Read more at HackerNoon

Getting Started with Automation: 6 Tips

With forward-looking CIOs and their teams embracing automation instead of treating it like a boogeyman, 2018 appears to be an important year for this trend. Red Hat chief technology strategist E.G. Nadhan recently examined six ways automation is likely to impact the enterprise in the year ahead: Think customer experience, for starters.

But what if you’re still largely – if not entirely – mired in manual, legacy processes? Taking the first meaningful, results-oriented steps toward increasing automation can be a significant challenge.

We’re here to help. We asked a variety of automation experts for actionable, results-oriented advice on getting started with automation. Let’s take a closer look at their tips:

Read more at EnterprisersProject

New Linux on Azure Training Course Addresses Demand for Skills

Microsoft CEO Satya Nadella, who headed up Microsoft’s cloud division before he became CEO, has been vocal about converging the Azure cloud platform with Linux. In fact, he has noted that about a third of the Azure platform is Linux-based. Moreover, Microsoft has made clear that more than 60 percent of Azure Marketplace images are Linux-based, as ZDNet has reported.

The convergence of Linux and the Azure platform spells big opportunities and is giving rise to many new jobs. On this front, The Linux Foundation has announced the availability of a new training course, Administering Linux on Azure (LFS205). It is more important than ever for Linux and Azure professionals to make sure they know how to manage Linux workloads in an Azure environment, and this $299 course provides the requisite knowledge.

“As shown by The Linux Foundation and Dice’s Open Source Jobs Report, cloud computing skills are by far the most in demand by employers,” said Linux Foundation General Manager for Training & Certification, Clyde Seepersad. “This shouldn’t be a surprise to anyone, as the world today is run in the cloud. Azure is one of the most popular public clouds, and a huge portion of its instances run on Linux. That’s why we feel this new course is essential to give Azure professionals the Linux skills they need, give Linux professionals the Azure skills they need, and train new professionals to ensure industry has the talent it needs to meet the growing demand for Linux on Azure.”

Not only are many Linux workloads running in Azure environments, but you can choose from the most popular Linux distributions to run in this context. Distributions such as Red Hat Enterprise Linux, CentOS, SUSE Linux Enterprise, Debian, Ubuntu, CoreOS, RancherOS, FreeBSD, and more are available in the Azure Marketplace.

The new LFS205 course covers how to deploy virtual machines in Azure, discussing different deployment scenarios. Once the VMs are available in Azure, students need to know how to manage them efficiently, which the course covers next. The last part of the course teaches how to troubleshoot Linux in Azure and how to monitor it using various open source tools. Importantly, the course also delves into container management.
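
To give a flavor of what managing Linux VMs in Azure looks like in practice, here is a minimal sketch using the Azure SDK for Python. This is not course material, and the subscription ID, resource group, and VM names are hypothetical placeholders:

```python
# pip install azure-identity azure-mgmt-compute
from azure.identity import DefaultAzureCredential
from azure.mgmt.compute import ComputeManagementClient

# Hypothetical subscription ID and resource names; replace with your own.
compute = ComputeManagementClient(
    DefaultAzureCredential(),
    subscription_id="00000000-0000-0000-0000-000000000000")

# List the VMs in a resource group along with their power states.
for vm in compute.virtual_machines.list(resource_group_name="demo-rg"):
    view = compute.virtual_machines.instance_view("demo-rg", vm.name)
    print(vm.name, [s.display_status for s in view.statuses])

# Restart one VM; begin_restart returns a poller, and .result() blocks until done.
compute.virtual_machines.begin_restart("demo-rg", "web-01").result()
```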

As noted here, experience with cloud infrastructure tools and open source technologies can make a substantial compensation difference for everyone from sysadmins to c-suite technology leaders. Dice data scientist Yuri Bykov has said, “as businesses have begun relying more upon open source solutions to support their business needs… employers are looking for individuals with cloud computing and networking experience and a strong working knowledge of configuration management tools.”

The new LFS205 course is taught by Sander van Vugt, author of many Linux-related video courses and books as well as course developer for The Linux Foundation. He is also a managing partner of ITGilde, a large co-operative in which about a hundred independent Linux professionals in the Netherlands have joined forces. The $299 course fee provides unlimited access to the course for one year, including all content and lab materials.

To find more open source focused training and certification opportunities, check out this post for ways to fast-track your education and certification. The Linux Foundation also offers many courses for extending your Linux-specific skills, ranging from Developing Applications for Linux to Linux Performance Tuning.

Learn more about the Administering Linux on Azure (LFS205) course and sign up here.

7 Open-Source Serverless Frameworks Providing Functions as a Service

With virtualization, organizations began to realize greater utilization of physical hardware. That trend continued with the cloud, as organizations began to move their machines onto pay-as-you-go services. Cloud computing evolved further when Amazon Web Services (AWS) launched its Lambda service in 2014, introducing a new paradigm in cloud computing that has become commonly referred to as serverless computing. In the serverless model, organizations pay for functions as a service, without the need to pay for an always-on, stateful virtual machine.
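
To make the functions-as-a-service idea concrete, here is a minimal sketch of a handler as the AWS Lambda Python runtime would invoke it; the event shape (a JSON payload with a "name" field) is an invented example, not any particular service’s contract:

```python
import json

def lambda_handler(event, context):
    # 'event' carries the trigger payload (for example, an API Gateway request);
    # 'context' exposes runtime metadata such as the remaining execution time.
    name = event.get("name", "world")
    return {
        "statusCode": 200,
        "body": json.dumps({"message": "Hello, " + name + "!"}),
    }
```

You pay only while the handler runs; there is no always-on machine to keep warm between invocations.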

Read more at eWeek

Containers versus Operating Systems

The most popular Docker base container image is either busybox or scratch. This is driven by a movement that is equal parts puritanical and pragmatic. The puritan asks, “Why do I need to run init(1) just to run my process?” The pragmatist asks, “Why do I need a 700 meg base image to deploy my application?” And both, seeking immutable deployment units, ask, “Is it a good idea that I can ssh into my container?” But let’s step back for a second and look at the history of how we got to the point where questions like this are even a thing.

In the very beginning, there were no operating systems. Programs ran one at a time with the whole machine at their disposal. While efficient, this created a problem for the keepers of these large and expensive machines. To maximise their investment, the time between one program finishing and another starting had to be kept to an absolute minimum; hence, monitor programs and batch processing were born.

Read more at DaveCheney.net

Understanding Feature Engineering (Part 1) — Continuous Numeric Data

Any intelligent system, regardless of complexity, needs to be powered by data. At the heart of any intelligent system, we have one or more algorithms based on machine learning, deep learning, or statistical methods, which consume this data to gather knowledge and provide intelligent insights over a period of time. Algorithms are pretty naive by themselves and cannot work out of the box on raw data. Hence, engineering meaningful features from raw data, features that these algorithms can understand and consume, is of utmost importance.

Any intelligent system basically consists of an end-to-end pipeline: it starts by ingesting raw data and leverages data processing techniques to wrangle that data and engineer meaningful features and attributes from it. We then usually apply techniques like statistical models or machine learning models to model these features, and deploy the model, if necessary, for future usage based on the problem to be solved at hand. A typical standard machine learning pipeline based on the CRISP-DM industry-standard process model is depicted in the full article.
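
To make this concrete, here is a small illustration (my own sketch, not code from the article) of three common transformations for continuous numeric features, using pandas and scikit-learn on made-up data:

```python
import numpy as np
import pandas as pd
from sklearn.preprocessing import StandardScaler

# Made-up raw data: user ages and page views (views are heavily skewed).
df = pd.DataFrame({"age":   [23, 35, 61, 44, 18],
                   "views": [10, 10000, 250, 3, 120000]})

# A log transform tames the skew in a heavy-tailed feature.
df["log_views"] = np.log1p(df["views"])

# Standard (z-score) scaling puts features on a comparable range.
df["age_scaled"] = StandardScaler().fit_transform(df[["age"]])

# Binning turns a raw continuous value into coarse ordinal ranges.
df["age_bin"] = pd.cut(df["age"], bins=[0, 18, 30, 45, 100],
                       labels=["minor", "young", "middle", "senior"])
print(df)
```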

Read more at Towards Data Science

MapR: How Next-Gen Applications Will Change the Way We Look at Data

The data landscape is changing right in front of our eyes. We are seeing gargantuan growth in the total volume of data; we are generating and consuming massive amounts of video, images, and sensor inputs of all sorts.

Moreover, “the type of data that’s growing most rapidly are not the data sets we think of historically as part of the legacy enterprise IT stack,” said Crystal Valentine, vice president of technology strategy at MapR Technologies, in this newest edition of The New Stack Makers podcast.

Read more at The New Stack

The Brutal Lifecycle of JavaScript Frameworks

JavaScript UI frameworks and libraries work in cycles. Every six months or so, a new one pops up, claiming that it has revolutionized UI development. Thousands of developers adopt it into their new projects, blog posts are written, Stack Overflow questions are asked and answered, and then a newer (and even more revolutionary) framework pops up to usurp the throne.

Using the Stack Overflow Trends tool and some of our internal traffic data, we decided to take a look at some of the more prominent UI frameworks: Angular, React, Vue.js, Backbone, Knockout, and Ember.

Framework lifecycle

Stack Overflow Trends lets us examine how each of these technologies has been asked about over time. We can start by looking at some of the larger frameworks.

Read more at StackOverflow

ZAP Provides Automated Security Tests in Continuous Integration Pipelines

Commonly, a mixture of open source and expensive proprietary tools is shoehorned into a pipeline to perform tests on nightly as well as ad hoc builds. However, anyone who has used such tests soon realizes that the maturity of a smaller number of time-honored tests is sometimes much more valuable than the extra detail you get by shoehorning too many tests into the pipeline and then waiting three hours for a nightly build to complete. The maturity of your battle-hardened tests is key.

The tests you require might involve interrogating the quality of code from developers or checking code for licensing issues. A continuous testing strategy can be onerous to set up but brings unparalleled value to your end product, including improvements in uptime, performance, compliance, and security.

To make any of the tests you run within your pipeline useful, you should be able to integrate them with existing tools and fire them following simple event-based hooks or triggers.
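
As one concrete pattern, a CI step might drive ZAP through its official Python client and fail the build on high-risk findings. The sketch below assumes a ZAP daemon is already running on port 8090 with the API key shown, and the staging URL is a placeholder:

```python
# pip install python-owasp-zap-v2.4
import time
from zapv2 import ZAPv2

# Assumes a ZAP daemon was started earlier in the pipeline, e.g.:
#   zap.sh -daemon -port 8090 -config api.key=changeme
zap = ZAPv2(apikey="changeme",
            proxies={"http": "http://127.0.0.1:8090",
                     "https": "http://127.0.0.1:8090"})

target = "http://staging.example.com"  # placeholder build under test

# Crawl the application, polling until the spider finishes.
scan_id = zap.spider.scan(target)
while int(zap.spider.status(scan_id)) < 100:
    time.sleep(2)

# Fail the build if any high-risk alerts were raised.
high = [a for a in zap.core.alerts(baseurl=target) if a["risk"] == "High"]
if high:
    raise SystemExit("%d high-risk findings; failing the build" % len(high))
```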

Read more at ADMIN

Top 3 Linux Distributions That ‘Just Work’

Twenty years ago, when I first started using Linux, finding a distribution that worked out of the box was an impossible feat. Not only did the installation take some serious mental acuity, but configuring the software and getting connected to the Internet was often a challenge many users were reluctant to attempt.

Today, things are quite different. Linux now offers distributions that anyone can use, right out of the box. But even among those distros that “just work,” some rise to the top to stand as the best in breed. These particular flavors of Linux are perfect for users hoping to migrate away from Windows or macOS who don’t want to spend hours getting up to speed on how the platform works or (more importantly) making the system perform as expected.

In this article, I highlight the three distributions I believe are the best bets for anyone to use, without having to put in any extra “post install” time for configuration or problem solving.

So, without further ado, let’s take a look at those distributions that qualify as the best in the “just works” category.

Ubuntu

For the longest time, Ubuntu was considered the distribution for new users. It was also the single most popular distribution. But then Canonical abandoned GNOME for Unity, and things took a downward turn. Don’t get me wrong, I was a big fan of Unity (the HUD was well ahead of its time), but the average user… not so much. Ubuntu has now returned to GNOME, which should go a long way to winning back some of the users it lost with Unity.

One of the great things about the latest releases of GNOME (Figure 1) is that they just work. Of every desktop on the market, you’d be hard-pressed to find one more reliable and hassle-free than GNOME. Once you understand the components of the desktop, everything works without a hitch. GNOME development is among the strongest of any desktop, so issues are resolved very quickly, and the resulting interface is incredibly stable. Since the release of GNOME 3.26, I’ve yet to experience a single issue. That’s impressive.

Figure 1: The GNOME Dash in action.

Desktop interface aside, one of the things that Ubuntu has enjoyed, for years, is some of the best hardware recognition of any distribution. Install Ubuntu and the odds are very high everything will work flawlessly: wireless, sound, video… everything. Unless you’re looking at peripheral hardware designed for a specific operating system, chances are all will work under Ubuntu.

Ubuntu contains just the right amount of software (such as Firefox, Thunderbird, and LibreOffice) to help users get their work done. Should a title be missing from the mix, the Ubuntu Software Center (a rebrand of GNOME Software) is there to help users find (and easily install) the tools they need.

Linux Mint

Out of the box, Linux Mint benefits from an Ubuntu base. Because of this, it enjoys the same level of hardware recognition. However, whereas Ubuntu defaults to GNOME, Mint defaults to the Cinnamon desktop (although you can download spins with MATE, Xfce, or KDE). For some, this is ideal, as it closely resembles a very familiar interface metaphor (think Windows XP/7). The Cinnamon desktop (Figure 2) does a great job of making interaction with the operating system and installed applications easy. Although the interface may seem a bit outdated to some, it’s as straightforward a UI as you’ll find on a modern operating system.

Figure 2: Linux Mint running the Cinnamon desktop.

One area where Linux Mint improves over Ubuntu is the set of software titles installed out of the box. Whereas Ubuntu keeps things on the minimal side, Linux Mint adds a few more titles that make it even easier for users to function without having to install third-party software.

For me, the one glaring issue with Linux Mint is the addition of the Synaptic front end for the package manager. It’s not that Synaptic isn’t a solid tool… it is. But with the likes of the included Software Manager (which is more in line with something like GNOME Software), the addition of Synaptic is redundant. I understand why they might include Synaptic (for those users who prefer the flexibility of the older tool), but Software Manager is far more user-friendly and should be considered the only option. And considering some software managers (such as Elementary’s AppCenter) can handle both installs and upgrades in one location, it would behoove the Mint team to fold that feature into Software Manager. As it stands, the install/update/upgrade process is handled in two tools, which isn’t nearly as efficient as it could be.

Elementary OS

And now we get to my personal desktop of choice, as well as my winner for best distribution of 2017 and 2018. Elementary OS is another distribution based on Ubuntu (sense a theme here?) that makes Linux as easy to use as any operating system on the planet. Not only is Elementary OS an easy distribution to use, it also works incredibly well. I’ve been using Elementary OS as my daily driver for three years now and have rarely experienced an issue. In fact, of every operating system I have used over the years, Elementary OS has been, hands down, the most trouble-free.

Upon installation, Elementary OS includes everything you need to get your work done—with one caveat. Out of the box, Elementary uses the Epiphany Web Browser. The reason for defaulting to this particular browser is to keep things on the lighter side. However, any browser power user will understand that Epiphany simply doesn’t cut it for everyday usage.

That being said, the developers of Elementary OS have created their own app store (called AppCenter) that makes installing Firefox Quantum (or any other piece of software) incredibly easy. In fact, Elementary’s AppCenter is, without question, on par with GNOME Software for ease of use and stability.

The true highlight of Elementary OS, however, is the Pantheon desktop. If you like the macOS interface, you’ll love Pantheon (Figure 3). It offers a similar layout as well as one of the most consistent design schemes to be found on a Linux desktop.

Figure 3: My personal Elementary OS desktop in action.

Elementary OS is a bit light on out-of-the-box software. Because of this, users will need to venture into the AppCenter to install the tools they need to work. Fortunately, the AppCenter is as user-friendly as they get. Open the tool, type “office,” then scroll down and click to install LibreOffice.

Elementary OS “just works” on every level, for every level of Linux user.

You be the judge

I’ve been using Linux for two decades now, and in that time I have never known a better crop of distributions that work right out of the box than those you’ll find here. Unlike the distros of the late 1990s and early 2000s, these choices make using the flagship open source platform both a breeze and a joy. If you’re looking for one of the best distributions to get started with, you cannot go wrong with any of these three.

Learn more about Linux through the free “Introduction to Linux” course from The Linux Foundation and edX.