
GitOps: ‘Git Push’ All the Things

While the idea of cloud-native computing promises to change how modern IT operations work, the idea remains vague for many who work in the profession. “GitOps,” an idea that generated some buzz at the KubeCon + CloudNativeCon EU conference in Copenhagen last week, could bring some much-needed focus to this area.

GitOps is all about “pushing code, not containers,” said Alexis Richardson, Weaveworks CEO and chair of the Cloud Native Computing Foundation’s Technical Oversight Committee, in his KubeCon keynote. The idea is to “make git the center of control” of cloud-native operations, for both the developer and the system administrator, Richardson said.

“The key point of the developer experience is ‘git-push,’” he said, alluding to the git command used to submit code as finished. He added that this approach is “entirely code-centric.”

Read more at The New Stack

PacVim – A CLI Game To Learn Vim Commands

Howdy, Vim users! Today, I stumbled upon a cool utility to sharpen your Vim skills. Vim is a great editor for writing and editing code. However, some of you (including me) are still struggling with its steep learning curve. Not anymore! Meet PacVim, a CLI game that helps you learn Vim commands. PacVim is inspired by the classic game PacMan, and it gives you plenty of practice with Vim commands in a fun and interesting way. Simply put, PacVim is a fun, free way to learn Vim commands in depth. Please do not confuse PacMan with pacman (the Arch Linux package manager). PacMan is a classic, popular arcade game released in the 1980s.

In this brief guide, we will see how to install and use PacVim in Linux.

Install PacVim

First, install the Ncurses library.

On Arch-based systems, install Ncurses using the following command:

$ sudo pacman -S ncurses
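On Debian- or Ubuntu-based systems, the equivalent step would typically be the following (the exact package names can vary between releases):

$ sudo apt-get install libncurses5-dev libncursesw5-dev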

Read more at OSTechNix

Learn to Create Bootable Linux Flash Drive (using Ubuntu)

To install Ubuntu, any other Linux OS, or even another OS such as Windows, we need either a bootable flash drive or a DVD of the OS. In this tutorial, we will discuss how to create a bootable Linux flash drive using an Ubuntu system.

There are many tools available for almost all operating systems, most of them third-party, but Ubuntu ships with a built-in tool for creating a bootable flash drive, called ‘Startup Disk Creator’. We will be using that tool to create a bootable Linux flash drive.

But before we start, we will need a flash drive with a capacity of more than 2 GB and a system with Ubuntu installed on it. Once you have the flash drive and the system, you can proceed with the tutorial.
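For reference, a common command-line alternative (not covered in the linked tutorial) is dd. In the sketch below, ubuntu.iso and /dev/sdX are placeholders; double-check the device name before running it, because dd will overwrite whatever it points to:

$ sudo dd if=ubuntu.iso of=/dev/sdX bs=4M status=progress && sync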

Read more at LinuxTech Lab

New Technologies Lead to New Linux and Cloud Training Options

At KubeCon + CloudNativeCon in Copenhagen, Denmark recently, I met many users, technologists, and business leaders — all aware of the dramatic pace of innovation and eager to learn about the new technologies coming out of the open source space. With these rapid changes, many companies are now also worried about finding skilled developers who are well versed in the latest technologies.

As the adoption of these cloud native technologies increases, there will be an even greater demand for skilled developers, and I wondered where all those new developers would come from. Who will train and certify them?

New Courses

The answer lies in various training programs and courses, including those from The Linux Foundation, which, in partnership with edX, has been steadily working to close skills gaps by offering online courses — many of which are free — on open source platforms, tools, and practices. The Foundation recently passed the one million mark for people enrolled in courses on edX  and just announced a new Certified Kubernetes Application Developer (CKAD) Exam and corresponding Kubernetes for Developers course through the Cloud Native Computing Foundation.

Many other organizations can also help developers and sysadmins achieve the skills they need. At KubeCon, I talked with Amy Marrich, a course author at the Linux Academy, which offers courses for all major technologies — including Kubernetes, Docker, DevOps, Big Data, and OpenStack — to help developers hone the skills needed to master exams and become certified developers.

In addition to being a course author, Marrich is a core reviewer of the OpenStack-Ansible project and one of the leaders of the Women of OpenStack. She is extremely active in the OpenStack community and has also worked on the latest user survey and user committee elections.

Community Experts

Marrich said this connection with the community brings credibility and validity. “Since we are experts in OpenStack and part of the community, if a student has an issue, we know people in the community who could help that student,” said Marrich. “Even if it’s not my project, I am able to find the right person and get help for our students.”

Often, students come up with questions that are not part of the course, because some of these technologies are so large and their use cases go beyond what any course can cover. “Being an active member of the community provides us with the ability to answer the questions they have or at least help them track them down – if it’s a bug, that adds to their learning experience,” Marrich said.

Instructors also come from different technological backgrounds, so there is a lot of cross-pollination. There are experts who work on AWS, specific OpenStack distributions, Kubernetes, Docker, and Linux. As a result, they are able to solve a wide gamut of problems that students may come across. They also keep an eye on trends and hot topics, such as serverless and cloud native, so they can add courses for these new technologies.

However, teaching is as much about how you teach as it is about the technology. One of the main goals is to make it easy for students to access course material. The videos are accompanied by study guides, which follow the videos closely.

“While the wording may not be the same, the exercise is the same. We want to make sure that anything in the study guide is the same in the video,” she said. “If I type in a command, you are going to see the exact same output in the study guide as in the videos.” Additionally, Linux Academy’s iOS and Android apps allow users to download those videos for offline use, so they can study even when they have no connectivity.

If you are looking to hone your open source skills, check out the courses from Linux Academy as well as those from The Linux Foundation to help you reach your goals.

Learn more about the Kubernetes for Developers training course and certification.

Linux-Friendly Arduino Simplifies IoT Development

Arduino’s support for Linux IoT devices and single-board computers (SBCs), announced at the Embedded Linux Conference + Open IoT Summit NA in March, cemented Arduino’s focus on cloud-connected IoT development and extended its reach into edge computing. This move was likely driven by multiple factors: primarily the increased complexity of IoT solutions and, secondarily, growing interest in Arduino boards running Linux.

In a “blending” of development communities — Arduino, Raspberry Pi, and BeagleBone — Arduino’s support for Linux-based boards lowers the barrier to IoT development by combining Arduino’s sensor and actuator nodes with more powerful boards like the Raspberry Pi and BeagleBone. Top this off with a user-friendly web wizard that connects the Linux boards via the cloud, and the entire process becomes much simpler.

Read more at EETimes

How to Kill a Process from the Command Line


Interested in learning more about Linux administration? Explore these Linux training courses.

Learn how to kill errant processes in this tutorial from our archives.

Picture this: You’ve launched an application (be it from your favorite desktop menu or from the command line) and you start using that launched app, only to have it lock up on you, stop performing, or unexpectedly die. You try to run the app again, but it turns out the original never truly shut down completely.

What do you do? You kill the process. But how? Believe it or not, your best bet most often lies within the command line. Thankfully, Linux has every tool necessary to empower you, the user, to kill an errant process. However, before you immediately launch that command to kill the process, you first have to know what the process is. How do you take care of this layered task? It’s actually quite simple…once you know the tools at your disposal.

Let me introduce you to said tools.

The steps I’m going to outline will work on almost every Linux distribution, whether it is a desktop or a server. I will be dealing strictly with the command line, so open up your terminal and prepare to type.

Locating the process

The first step in killing the unresponsive process is locating it. There are two commands I use to locate a process: top and ps. Top is a tool every administrator should get to know. With top, you get a full listing of currently running processes. From the command line, issue top to see a list of your running processes (Figure 1).

Figure 1: The top command gives you plenty of information.

From this list you will see some rather important information. Say, for example, Chrome has become unresponsive. According to our top display, we can discern there are four instances of chrome running with Process IDs (PID) 3827, 3919, 10764, and 11679. This information will be important to have with one particular method of killing the process.

Although top is incredibly handy, it’s not always the most efficient means of getting the information you need. Let’s say you know the Chrome process is what you need to kill, and you don’t want to have to glance through the real-time information offered by top. For that, you can make use of the ps command and filter the output through grep. The ps command reports a snapshot of the current processes, and grep prints lines matching a pattern. The reason we filter ps through grep is simple: If you issue the ps command by itself, you will get a snapshot listing of all current processes. We only want the listing associated with Chrome. So the command would look like:

ps aux | grep chrome

The aux options are as follows:

  • a = show processes for all users

  • u = display the process’s user/owner

  • x = also show processes not attached to a terminal

The x option is important when you’re hunting for information regarding a graphical application.

When you issue the command above, you’ll be given more information than you need (Figure 2) for the killing of a process, but it is sometimes more efficient than using top.

Figure 2: Locating the necessary information with the ps command.
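One quirk of the ps-plus-grep approach is that the grep command itself shows up in the output, because its own command line contains the word chrome. A common trick (a convenience, not a step from the article) is to bracket the first character of the pattern so grep’s own entry no longer matches:

ps aux | grep [c]hrome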

Killing the process

Now we come to the task of killing the process. We have two pieces of information that will help us kill the errant process:

  • Process name

  • Process ID

Which you use will determine the command used for termination. There are two commands used to kill a process:

  • kill – Kill a process by ID

  • killall – Kill a process by name

There are also different signals that can be sent with both kill commands. Which signal you send is determined by what result you want from the kill command. For instance, you can send the HUP (hangup) signal, which many daemons treat as an instruction to reload their configuration, effectively restarting the process. This is a wise choice when you need the process to restart immediately (such as in the case of a daemon). You can get a list of all the available signals by issuing kill -l. You’ll find quite a large number of signals (Figure 3).

Figure 3: The available kill signals.

The most common kill signals are:

Signal Name   Signal Value   Effect
SIGHUP        1              Hangup
SIGINT        2              Interrupt from keyboard
SIGKILL       9              Kill signal
SIGTERM       15             Termination signal
SIGSTOP       17, 19, 23     Stop the process

What’s nice about this is that you can use the Signal Value in place of the Signal Name, so you don’t have to memorize the names of all the various signals.
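For example, these two commands are equivalent; both send the termination signal to a process (1234 is a made-up PID for illustration):

kill -TERM 1234

kill -15 1234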
So, let’s now use the kill command to kill our instance of chrome. The structure for this command would be:

kill SIGNAL PID

Where SIGNAL is the signal to be sent and PID is the Process ID to be killed. We already know, from our ps command, that the IDs we want to kill are 3827, 3919, 10764, and 11679. So to send the kill signal, we’d issue the commands:

kill -9 3827

kill -9 3919

kill -9 10764

kill -9 11679

Once we’ve issued the above commands, all of the chrome processes will have been successfully killed.
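A quick aside, not part of the original steps: signal 9 cannot be caught or ignored by the process, so a gentler first attempt is the default SIGTERM, which gives the application a chance to shut down cleanly before you escalate to -9:

kill 3827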

Let’s take the easy route! If we already know the process we want to kill is named chrome, we can make use of the killall command and send the same signal to the process like so:

killall -9 chrome

The only caveat to the above command is that it may not catch all of the running chrome processes. If, after running the above command, you issue the ps aux | grep chrome command and see remaining processes running, your best bet is to go back to the kill command and send signal 9 to terminate the process by PID.

Ending processes made easy

As you can see, killing errant processes isn’t nearly as challenging as you might have thought. When I wind up with a stubborn process, I tend to start off with the killall command as it is the most efficient route to termination. However, when you wind up with a really feisty process, the kill command is the way to go.

Learn more about Linux through the free “Introduction to Linux” course from The Linux Foundation and edX.

What Is an API? Application Programming Interfaces Explained

API, short for application programming interface, is one of those acronyms that is used everywhere, from command-line tools to enterprise Java code to Ruby on Rails web apps. Unless you write every single line of code from scratch, you’re going to be interacting with external software components, each with its own API. Even if you do write something entirely from scratch, a well-designed software application will have internal APIs to help organize code and make components more reusable.

Diving a little deeper, an API is a specification of possible interactions with a software component. For example, if a car were a software component, its API would include information about the ability to accelerate, brake, and turn on the radio. It would also include information about how to accelerate: Put your foot on the gas pedal and push. The “what” and “how” information come together in the API definition, which is abstract and separate from the car itself.
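To make the “what” and “how” concrete at the command line, here is a purely hypothetical REST API sketch (api.example.com, the /cars path, and the parameters are invented for illustration and do not come from the article). The endpoints specify what you can do; the HTTP methods and parameters specify how to do it:

curl "https://api.example.com/cars/42/speed"

curl -X POST "https://api.example.com/cars/42/accelerate" -d "amount=10"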

Let’s dig in by looking at the Java API and the Twitter API as examples. First, we’ll get a quick picture of these two APIs and how they fulfill the definition of “what” and “how.” 

Read more at InfoWorld

People Are Freaking Out That PGP Is ‘Broken’—But You Shouldn’t Be Using It Anyway

Hackers who can intercept your encrypted emails, or steal your emails from your computer or a server, may be able to decrypt them by taking advantage of new vulnerabilities found in the way some email clients treat HTML.

On Monday, the world was reminded once again that the almost 30-year-old encryption protocol PGP does still exist, and, yes, it still kinda sucks.

Mind you, the protocol itself is not really the problem. The crypto is solid. The problem is the way it’s implemented, and the ecosystem around it. What’s new is that a group of researchers has found a clever way for hackers to decrypt some PGP-encrypted emails….

Read more at Motherboard

Using the Command Line to Decrypt a Message on Linux

If you have disabled the PGP plugin from your mail client and saved a copy of an encrypted email to your desktop, this guide will help you read that message in as safe a way as possible given what we know about the vulnerability described by EFAIL.

Note that the first three steps (opening the terminal) will vary between desktop environments.

  1. Open the Activities view by clicking all the way in the top left corner of your screen.

  2. Type “terminal” into the search bar, and press Enter. This will open the command prompt.
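The decryption itself is typically done with GnuPG on Linux. As a rough sketch only (the filename below is a placeholder for wherever you saved the encrypted message, and the exact invocation in the EFF guide may differ):

gpg --decrypt encrypted-message.asc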

Read more at EFF

The First 10 Years of Software Defined Networking

In 2008, if you wanted to build a network, you had to build it from the same switch and router equipment that everyone else had, according to Nick McKeown, co-founder of Barefoot Networks, speaking as part of a panel of networking experts at Open Networking Summit North America.

Equipment was closed, proprietary, and vertically integrated with features already baked in, McKeown noted. And, “network management was a dirty word. If you wanted to manage a network of switches, you had to write your own scripts over a lousy, cruddy CLI, and everybody had their own way of doing it in order to try to make their network different from everybody else’s.”

All this changed when Stanford University Ph.D. student Martin Casado had the bold idea to rebuild the Stanford network out of custom-built switches and access points, he said.

Read more at The Linux Foundation