
Demand for Certified SysAdmins and Developers Is On the Rise

Even with a shortage of IT workers, some employers are still discerning in their hiring requirements and are either seeking certified candidates or offering to pay for their employees to become certified.

The Linux Foundation’s 2017 Open Source Jobs Report finds that half of hiring managers are more likely to hire a certified professional, while 47 percent of companies are willing to help pay for employees’ certifications. Meanwhile, 89 percent of hiring managers find it difficult to find open source talent.

The demand for skills relating to cloud administration, DevOps, and continuous integration/continuous delivery is fueling interest in training and certifications related to open source projects and tools that power the cloud, according to the report. Workers find certification important, too. In fact, 76 percent of open source pros say certifications are useful to their careers.

Existing cloud certifications, such as the Certified Kubernetes Administrator exam, are expected to help address the growing demand for these skills, the report states.

Why certifications are important

Demand for certifications is on the rise. Half of hiring managers prioritize finding certified pros; 50 percent also say they are more likely to hire a certified candidate than one without a certification, up from 44 percent in 2016, according to the Open Source Jobs report. And there’s been a big jump in companies willing to pay for employees to become certified. Nearly half say they’re willing to pay, up from one-third a year ago.

Here’s why. Some employers believe training alone is not enough and perceive that overall, certified IT pros make great employees, according to a CompTIA report. In fact, the study finds 91 percent of employers view IT certifications as a differentiator and say they play a key role in the hiring process.

Then there is perception. “Certifications make a good first impression,” the CompTIA report observes, and there is the belief that certified IT employees are more confident, knowledgeable and reliable, and perform at a higher level. While not specific to a particular technology, a whopping 95 percent of employers in the CompTIA study agree IT certifications provide a baseline set of knowledge for certain IT positions.

Only 21 percent say they definitely would not pay for certifications, down from 30 percent last year, the Open Source Jobs report finds.

The good news for IT professionals is that, with certification, pay premiums have grown consistently over the past year. Areas in which IT pros receive higher pay include information security; application development/programming languages; databases; networking and communications; and systems administration/engineering – skill sets that are among the hardest to fill.

Dice’s annual salary survey finds salaries for Linux professionals are in line with last year, at over $100,000 annually – higher than the average $92,000 for tech pros nationally.

Formal training and/or certifications are a priority for hiring managers looking for developers (55 percent, compared to 47 percent who said so in 2016) and for Systems Administrators (53 percent vs. 47 percent last year).

Trending skills and certifications in high demand

In its 2018 Salary Guide, Robert Half Technology lists the most highly sought tech skills and certifications in North America. Among them are .NET, Agile, and Scrum certifications. For application development work, businesses are seeking certifications and skills in areas including PHP and LAMP (Linux, Apache, MySQL, and Perl/Python). In networking and telecommunications, Linux/Unix administration is in high demand, as it is in technical services, help desk, and technical support.

Meanwhile, 76 percent of professionals find certifications are useful to their careers, mainly to demonstrate technical knowledge to potential employers (reported by 47 percent of respondents), and 31 percent say that certifications generally make them more employable.

Although salary, not surprisingly, is the biggest incentive for switching jobs (82 percent), certification opportunities are an incentive for 65 percent of respondents in the Open Source Jobs report.

Download the full 2017 Open Source Jobs Report now.

Watch Keynote Videos from OS Summit and ELC Europe 2017 Including a Conversation with Linus Torvalds

If you weren’t able to attend Open Source Summit and Embedded Linux Conference (ELC) Europe last week, don’t worry! We’ve recorded keynote presentations from both events and all the technical sessions from ELC Europe to share with you here.

Check out the on-stage conversation with Linus Torvalds and VMware’s Dirk Hohndel, opening remarks from The Linux Foundation’s Executive Director Jim Zemlin, a special presentation from 11-year-old CyberShaolin founder Reuben Paul, and more.

Read more at The Linux Foundation

Node.js 8 Moves into Long-Term Support and Node.js 9 Becomes the New Current Release Line

We are super excited to announce that later today Node.js 8 will be ready for production as it transitions into the Long-Term Support (LTS) release line, opening it up to a larger user base that demands stability and security. (Node.js 8.9.0 is the first official Node.js 8 release to reach LTS status.) Node.js 8 is one of the biggest release lines from the Node.js community to date, with features and add-ons like Async/Await and V8 JavaScript Engine 6.1. It is up to 20 percent faster than its predecessor, Node.js 6, in typical web applications (source: nearForm). An early tester found that Node.js 8 cut its web response times by 70 percent:

“Node.js 8 cut our web response by 70 percent across the board and unlocks ES7 features to let us write simpler, more maintainable code,” said Connor Peet, Senior Software Engineer at Mixer, a live streaming website.

Read more at Node.js

7 Non-Technical Skills You Need To Succeed In A DevOps Career

The core creed of DevOps revolves around the idea that inter-departmental collaboration, communication, and constant improvement are the keys to a successful and efficient software development cycle. Teams of DevOps engineers or managers, then, are simply engineers and managers who combine their role-specific skills with DevOps best practices.

However, working on a DevOps team is not for everyone. Engineers who prefer long stints of working alone may be frustrated by the constant feedback exchanges. It takes a specific type of person to succeed in one of these positions. These seven attributes will serve you well in a future DevOps role.

1. Self-directed learning

Since DevOps is such a constantly evolving field, the ability and motivation to teach yourself new skills is critical. Anant Agarwal, CEO of edX, says, “It’s hard to learn something that seems to evolve as quickly as the lessons are taught. Self-learners are the perfect candidates for embracing and pursuing DevOps adoption, as it requires a roll-up-your-sleeves, trial-and-error, do-it-yourself, continuous learning approach.”

Read more at Forbes

Cloud-Native, Seven Years On…

The high-level concept of cloud-native is simple: systems that give users a better experience by virtue of operating in the cloud in a genuinely cloud-centric way. In other words, the cloud may make an existing database easier to start up, but if the database doesn’t support elasticity then it can’t take advantage of the scaling capabilities of the cloud.

The motivation for defining cloud-native was driven by two distinct aspects. First, we wanted to capture the thinking and architecture that went into creating properly “cloudy” systems. Secondly, we wanted to highlight that not every system that has been rebranded “cloud” was (or is) actually taking proper advantage of cloud.

Fast forward to today and we have a new definition of cloud-native from the CNCF. The new definition is much simpler, offering three main characteristics:

  • Containerized
  • Dynamically orchestrated
  • Microservices oriented

Read more at The New Stack

New Network Security Standards Will Protect Internet’s Routing

Electronic messages traveling across the internet are under constant threat from data thieves, but new security standards created with the technical guidance of the National Institute of Standards and Technology (NIST) will reduce the risk of messages being intercepted or stolen. These standards address a security weakness that has been a part of the internet since its earliest days.

The set of standards, known as Secure Inter-Domain Routing (SIDR), has been published by the Internet Engineering Task Force (IETF) and represents the first comprehensive effort to defend the internet’s routing system from attack. The effort has been led by a collaboration between NIST and the Department of Homeland Security (DHS) Science and Technology Directorate, working closely with the internet industry. The new specifications provide the first standardized approach for global defense against sophisticated attacks on the internet’s routing system.

Read more at NIST

Starting Out In Development – Subversion

This is an entry in a series about Starting Out In Development. The goal of this series is to provide brief introductions to critical tools, concepts, and skills you’ll need as a developer.

By now you should be familiar with what version control is. If you’re unsure, check out my article introducing it. Now that you know what version control is in general, it’s time to get familiar with some of its specific implementations. In this article, we’ll discuss Subversion, its take on version control, and how to use it.

Subversion (often abbreviated as SVN) is a software implementation of version control. It was created by CollabNet and is now a major Apache project. It’s been around since the year 2000 and is fairly actively developed and updated. There are also many tools that can make using SVN a bit easier and more convenient. Among the most popular of those tools is TortoiseSVN. I’ll be using that later in the examples that follow.
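As a rough taste before those TortoiseSVN examples (this sketch is not from the article), an everyday workflow with the command-line svn client looks something like the following; the repository URL and file names are placeholders.

    # Get a working copy of the repository (the URL is a placeholder).
    svn checkout https://svn.example.com/repos/myproject/trunk myproject
    cd myproject

    # Make a change, tell Subversion about any new files, and review what changed.
    echo "notes" > notes.txt
    svn add notes.txt
    svn status
    svn diff

    # Send the change to the central repository, then pull in everyone else's changes.
    svn commit -m "Add a notes file"
    svn update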

Read more at Dev.to

Understanding Shared Libraries in Linux

In programming, a library is an assortment of pre-compiled pieces of code that can be reused in a program. Libraries simplify life for programmers in that they provide reusable functions, routines, classes, data structures and so on (written by another programmer), which they can use in their programs.

For instance, if you are building an application that needs to perform math operations, you don’t have to create a new math function for that, you can simply use existing functions in libraries for that programming language.
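As a rough sketch of that math example (not from the article), here is how a tiny C program might be built against the shared math library on a typical Linux system; the file name is made up, and ldd is used to list the shared libraries the resulting binary depends on.

    # Write a tiny C program that calls sqrt() from the math library.
    printf '%s\n' \
      '#include <stdio.h>' \
      '#include <math.h>' \
      'int main(void) { printf("sqrt(2) = %f\n", sqrt(2.0)); return 0; }' \
      > demo.c

    gcc demo.c -o demo -lm   # -lm links the program against the shared math library (libm)
    ldd ./demo               # list the shared libraries the binary depends on (libm, libc, ...)
    ./demo                   # prints "sqrt(2) = 1.414214"; sqrt() is resolved from libm at run time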

Examples of libraries in Linux include libc (the standard C library) or glibc (GNU version of the standard C library), libcurl (multiprotocol file transfer library), libcrypt (library used for encryption, hashing, and encoding in C) and many more.

Read more at Tecmint

Migrating to Linux: An Introduction

Computer systems running Linux are everywhere. Linux runs our Internet services, from Google search to Facebook, and more. Linux also runs in a lot of devices, including our smartphones, televisions, and even cars. Of course, Linux can also run on your desktop system. If you are new to Linux, or you would just like to try something different on your desktop computer, this series of guides will briefly cover the basics and help you in migrating to Linux from another system.

Switching to a different operating system can be a challenge because every operating system provides a different way of doing things. What is second nature on one system can take a frustrating amount of time on another as we look up how to do things online or in books.

Vive la différence

As you get started with Linux, one thing you’ll likely notice is that Linux is packaged differently. In other operating systems, many things are bundled together and are just a part of the package. In Linux, however, each component is called out separately. For example, under Windows, the graphical interface is just a part of Windows. With Linux, you can choose from multiple graphical environments, like GNOME, KDE Plasma, Cinnamon, and MATE, to name a few.

At a high level, a Linux installation includes the following things:

  1. The kernel

  2. System programs and files residing on disk

  3. A graphical environment

  4. A package manager

  5. Applications

The Kernel

The core of the operating system is called the kernel. The kernel is the engine under the hood. It allows multiple applications to run simultaneously, and it coordinates their access to common services and devices so everything runs smoothly.

System programs and files

System programs reside on disk in a standard hierarchy of files and directories. These system programs and files include services (called daemons) that run in the background, utilities for various operations, configuration files, and log files.

Instead of running inside the kernel, these system programs are applications that perform tasks for basic system operation — for example, setting the date and time and connecting to the network so you can get on the Internet.

Included here is the init program – the very first application that runs. This program is responsible for starting all the background services (like a web server), starting networking, and starting the graphical environment. This init program will launch other system programs as needed.

Other system programs provide facilities for simple tasks like adding users and groups, changing your password, and configuring disks.
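As a rough illustration of those last examples, here are a few such utilities as you might run them on a typical distribution; the user and group names are hypothetical, and most of these commands need root privileges (hence sudo).

    sudo useradd -m alice                 # add a user (with a home directory); "alice" is made up
    sudo passwd alice                     # set or change that user's password
    sudo groupadd developers              # add a group
    sudo usermod -aG developers alice     # add the user to the new group
    lsblk                                 # list the disks and partitions attached to the system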

Graphical Environment

The graphical environment is really just more system programs and files. The graphical environment provides the usual windows with menus, a mouse pointer, dialog boxes, status indicators, and more.

Note that you aren’t stuck with the graphical environment that was originally installed. You can change it out for others, if you like. Each graphical environment will have different features. Some look more like Apple OS X, some look more like Windows, and others are unique and don’t try to mimic other graphical interfaces.

Package Manager

The package manager used to be difficult for people coming from a different system to grasp, but nowadays there is a similar system that people are very familiar with — the app store. The packaging system is really an app store for Linux. Instead of installing this application from that website and the other application from a different site, you can use the package manager to select which applications you want. The package manager then installs the applications from a central repository of pre-built open source applications.
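As a rough sketch, on a Debian- or Ubuntu-based distribution the command-line side of this looks something like the following (other distributions use tools such as dnf, zypper, or pacman instead of apt); GIMP is just an example package.

    sudo apt update          # refresh the list of packages available in the repositories
    apt search gimp          # search the repositories for an application
    sudo apt install gimp    # download and install it, along with any libraries it needs
    sudo apt remove gimp     # uninstall it again if you change your mind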

Applications

Linux comes with many pre-installed applications, and you can get more from the package manager. Many of the applications are quite good, while others need work. Sometimes the same application will have different versions that run on Windows, Mac OS, or Linux.

For example, you can use Firefox browser and Thunderbird (for email). You can use LibreOffice as an alternative to Microsoft Office and run games through Valve’s Steam program. You can even run some native Windows applications on Linux using WINE.

Installing Linux

Your first step is typically to install a Linux distribution. You may have heard of Red Hat, Ubuntu, Fedora, Arch Linux, and SUSE, to name a few. These are different distributions of Linux.

Without a Linux distribution, you would have to install each component separately. Many components are developed and provided by different groups of people, so to install each component separately would be a long, tedious task. Luckily, the people who build distros do this work for you. They grab all the components, build them, make sure they work together, and then package them up under a single installation.

Various distributions may make different choices and use different components, but it’s still Linux. Applications written to work in one distribution frequently run on other distributions just fine.

If you are a Linux beginner and want to try out Linux, I recommend installing Ubuntu. There are other distros you can look into as well: Linux Mint, Fedora, Debian, Zorin OS, elementary OS, and many more. In future articles, we will cover additional facets of a Linux system and provide more information on how to get started using Linux.

Learn more about Linux through the free “Introduction to Linux” course from The Linux Foundation and edX.

Deploy Atomically with Travis & npm

I think I am a software developer because I am lazy.

The second or third time I have to perform the same exact task, I find myself saying, “Ugh, can’t I tell the computer how to do it?” 

So imagine my reaction when our team’s deployment process started looking like this:

  1. git pull
  2. npm run build to create the minified packages
  3. git commit -am "Create Distribution" && git push
  4. Navigate to GitHub
  5. Create a new release

I was thinking steps 1–3 are easy enough to put in a shell script (a rough sketch follows below) and steps 4–5 are probably scriptable, but is that all? What else needs to be done?

  • The version in package.json was never getting updated and it would be nice to have that in sync with the GitHub release.
  • Can this script be run after the CI build without having to task a human to manually run it?
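Here is one possible sketch of steps 1–3 plus a version bump — assuming a clean working tree, an "origin" remote, and a "build" script defined in package.json; npm version patch is used because it updates package.json and creates a matching git tag in one step.

    #!/usr/bin/env bash
    # Rough sketch of the manual steps 1-3 plus a version bump (not the final Travis setup).
    set -euo pipefail

    git pull --ff-only                    # step 1: sync with the remote
    npm run build                         # step 2: create the minified packages
    git add -A                            # stage the generated files
    git commit -m "Create Distribution"   # step 3: commit the build output
    npm version patch -m "Release %s"     # bump package.json and create a matching git tag
    git push --follow-tags                # push the commits and the new tag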

Read more at Dev.to