
10 Skills to Land Your Open Source Dream Job

In the past two years, as we’ve seen open source move even further into the mainstream of practically every organization from the large to the small, I’ve thought a bit about how the landscape for open source job skills has changed, and what, if anything, might be added to the list of proficiencies to find a career in open source.

So, in the spirit of open source, I’ve remixed Jason’s original look at seven open source skills for career readiness and added three more of my own.

“Work on stuff that matters” is a famous call to action from founder and CEO of O’Reilly Media, Tim O’Reilly. But, how about working on stuff that matters while getting paid for it? There are an abundance of open source-related jobs out there if you’ve got the right skills….

Read more at OpenSource.com

Managing Encrypted Backups in Linux: Part 1

Encrypted backups are great, but what if something goes wrong and you can’t read your encrypted files? In this two-part series, I’ll show how to use rsync and duplicity as your belt-and-suspenders protection against data loss. Part 1 shows how to create and automate simple backups. In part 2, I’ll go into more details on file selection and backing up encryption keys.

My personal backup plan uses both encrypted and unencrypted backups, because my paranoia extends to worrying about broken encryption. If something goes wrong, like a corrupted file system, good luck recovering encrypted data.

I want to always have access to my files. My risk assessment is pretty simple: The most likely cause for losing access is file system corruption or hardware failure. Other possible, but less likely, hazards are fire or theft. I don’t need to encrypt files on my main PC (though in a future installment, I’ll look at backup options for encrypted volumes). My most important files are encrypted and uploaded to remote servers. I’m stuck with capped mobile broadband, so I can’t just encrypt and stuff everything into a remote server.

These backups are all automated except for one step. I use rsync, duplicity, and GnuPG. It works like this:

  • Nightly unencrypted dump of everything to a portable hard drive.
  • Nightly selective encrypted dump to a remote server.
  • Continuous encrypted upload to SpiderOak of my most important files.
  • Weekly rotation of unencrypted drives to my bank safety deposit box.

I rotate two unencrypted portable hard drives to my safe deposit box, so if my house ever burns down I’ll lose, at most, a week’s worth of files. Sometimes I dream of losing the whole lot; what do I need all that junk for? But, I save it anyway.

Simple Unencrypted Backups

Simple unencrypted backups are easy with good old rsync. Hard drives are huge and cheap, so it is feasible to back up everything on my PC every night. I use an rsync exclude file to avoid copying crud I know I’ll never need, such as some dotfiles and certain directories. This is a brief example of an exclude file:

.adobe/
.dbus/
.macromedia/
.Xauthority
.xsession-errors
downloads/
Videos/

This command performs the backup — remember here to mind your trailing slashes. A trailing slash on the source directory copies only the contents of the directory, and not the directory. No trailing slash copies the directory and its contents. The file paths in your exclude file are relative to the directory you are copying:

$ rsync -av -e "ssh -i /home/carla/.ssh/backup_rsa" --exclude-from=exclude.txt \
   /home/carla/ carla@backup:/home/carla/

I use passphrase-less SSH key authentication to log in to my remote server. Key-based authentication is not vulnerable to brute-force password attacks, and omitting the passphrase lets cron run the backup unattended.
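A dedicated backup key can be generated like this (a sketch; the scratch directory is only for illustration, and in practice you would write the key to ~/.ssh/backup_rsa):

```shell
# Create a passphrase-less key pair (-N "") reserved for backup jobs,
# kept separate from your everyday SSH key.
keydir=$(mktemp -d)
ssh-keygen -t rsa -b 4096 -N "" -C "nightly backups" -f "$keydir/backup_rsa" -q

# In practice, install the public half on the backup server once:
#   ssh-copy-id -i ~/.ssh/backup_rsa.pub carla@backup
ls "$keydir"
```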

The rsync command goes into a script, ~/bin/nightly-plain:

#!/bin/bash
rsync -av -e "ssh -i /home/carla/.ssh/backup_rsa" --exclude-from=exclude.txt \
  /home/carla/ carla@backup:/home/carla/

Remember to make it executable and limit read-write permissions to you only:

$ chmod 0700 nightly-plain

I added ~/bin/ permanently to my path by adding these lines to my ~/.profile:

# set PATH so it includes user's private bin if it exists
if [ -d "$HOME/bin" ] ; then
    PATH="$HOME/bin:$PATH"
fi

Putting a directory in your path means you can call scripts in that directory without having to spell out the full path. Create your personal cron job like this example, which runs every night at 11:05 PM:

$ crontab -e
05 23 * * * nightly-plain

Encrypted Backups with duplicity

duplicity goes to work 30 minutes later. Of course, you can adjust this interval to fit your own setup.
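Assuming the nightly-plain job above runs at 11:05 PM, the matching crontab entry might look like this (nightly-encrypted is a hypothetical wrapper script around the duplicity command developed below):

```shell
$ crontab -e
35 23 * * * nightly-encrypted
```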

Ubuntu users should install duplicity from the duplicity PPA, because the version in Main is old and buggy. You also need python-paramiko so you can use SCP to copy your files.

duplicity uses GPG keys, so you must create a GPG key:

$ gpg --gen-key

It is OK to accept the defaults. When you are prompted to enter your name, email address, and comment, use the comment field to give your key a useful label, such as “nightly encrypted backups.” Write down your passphrase, because you will need it to restore and decrypt your files. The worst part of creating a GPG key is generating enough entropy while it is building your key. The usual way is to wiggle your mouse for a couple of minutes. An alternative is to install rng-tools to create your entropy. After installing rng-tools, open a terminal and run this command to create entropy without having to sit and wiggle your mouse:

$ sudo rngd -f -r /dev/random

Now create your GPG key in a second terminal window. When it is finished, go back to the rngd window and stop it with Ctrl+C. Return to your GPG window and view your keys with gpg --list-keys:

$ gpg --list-keys
pub   2048R/30BFE75D 2016-07-12
uid                  Carla Schroder (nightly encrypted backups) <carla@example.com>
sub   2048R/6DFAE9E8 2016-07-12

Now you can make a trial duplicity run. This example encrypts and copies a single directory to a server on my LAN. Note the SCP syntax for the target directory; the remote somefiles directory in this example is /home/carla/somefiles. SCP and SSH paths are relative to the user's home directory, so you don't need to spell out the full path; if you do, you will create a new directory. Use the second part of the pub key ID to specify which GPG key to use:

$ duplicity --encrypt-key 30BFE75D /home/carla/somefiles \
    scp://carla@backupserver/somefiles

A successful run shows a bunch of backup statistics. You can view a file list of your remote files:

$ duplicity list-current-files \
   scp://carla@backupserver/somefiles

Test your ability to restore and unencrypt your files by reversing the source and target directories. You need your passphrase. This example decrypts and downloads the backups to the current directory:

$ PASSPHRASE="password" duplicity \
    scp://carla@backupserver/somefiles .

Or, restore a single file, which in this example is logfile. The file’s path is relative to the target URL, and the directory that you restore it to does not have to exist:

$ PASSPHRASE="password" duplicity --file-to-restore logfile \
  scp://carla@backupserver/somefiles logfiledir/

If you’re encrypting and backing up a single directory like the above example, you can put your duplicity command in a script, and put the script in a cron job. In part 2, I’ll show you how to fine-tune your file selection.
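A minimal wrapper script (call it ~/bin/nightly-encrypted; the name, key ID, and paths are illustrative) could look like this. Encrypting to the public key requires no passphrase at backup time; you only need the passphrase to restore:

```shell
#!/bin/bash
# Hypothetical nightly wrapper for the duplicity command shown earlier.
duplicity --encrypt-key 30BFE75D /home/carla/somefiles \
    scp://carla@backupserver/somefiles
```

As with nightly-plain, make it executable with chmod 0700 and add it to your crontab.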

SpiderOak for Continual Encrypted Backups

I use SpiderOak to encrypt and upload my most important files as I work on them. This has saved my day many times, from power outages to fat-fingered delete escapades. SpiderOak provides zero-knowledge offsite encrypted file storage, which means that if you lose your encryption key, you lose access to your files, and SpiderOak cannot help you. If a vendor can recover your files, it can also snoop in them, get hacked, or hand them over to law enforcement, so zero-knowledge is your strongest protection.

Come back for part 2 to learn more about file selection and backing up your keys.

Learn more skills for sysadmins in the Essentials of System Administration course from The Linux Foundation.

Mint 18 Review: “Just Works” Linux Doesn’t Get Any Better Than This

New themes and moving from GNOME/GTK 3.10 to 3.18 means two good years of Mint 18.x ahead.

The newly released Mint 18 is a major upgrade. Not only has the Linux Mint project improved Mint’s dueling desktops (Cinnamon and MATE), but the group’s latest work impacts all underlying systems. With Mint 18, Linux Mint has finally moved its base software system from Ubuntu 14.04 to the new Ubuntu 16.04.

Upgrading to the latest long-term support (LTS) release of Ubuntu means, as with the Mint 17.x series, the Mint 18.x release cycle is now locked to its base for two years. Rather than tracking alongside Ubuntu, Mint 18 and all subsequent releases will stick with Ubuntu 16.04. Mint won’t necessarily get as out of date as Ubuntu LTS releases tend to by the end of their two-year cycle, but this setup does mean nothing major is going to change for quite a while.

Read more at Ars Technica

Severe Vulnerabilities Discovered in HTTP/2 Protocol

Four high-profile bugs have been found in the protocol, potentially placing 85 million websites at risk.

On Wednesday at Black Hat USA, cybersecurity firm Imperva released new research into a number of high-profile flaws found within the latest version of HTTP, HTTP/2, which underpins the worldwide web’s underlying protocols and communication systems.

The report, HTTP/2: In-depth analysis of the top four flaws of the next generation web protocol (.PDF), details four main vulnerabilities and attack vectors related to HTTP/2, of which adoption is steadily increasing.

Read more at ZDNet

How to Deal with COTS Products in a DevOps World

The primary objective of DevOps is to increase the speed of delivery at reliable quality. To achieve this, good configuration management is crucial as the level of control at higher speed of delivery becomes more and more important (while riding a bike you might take your hands off the handle bar once in a while, but a formula one driver is practically glued to the steering wheel). Yet commercial-off-the-shelf (COTS) products often don’t provide any obvious ways to manage them like you manage your custom software. This is a real challenge for large organisations who deal with a mixed technology landscape.

In this article I will explore ways to apply modern DevOps practices when dealing with COTS products.

Read more at InfoQ

Kaminsky Aims to Secure the Internet

In a rambling, hourlong keynote address at the Black Hat USA conference here today, security luminary Dan Kaminsky detailed the risks and the opportunities inherent in the internet.

Kaminsky first shot to fame in 2008, when he revealed a flaw at that year’s Black Hat USA event in the pervasive Domain Name System (DNS) protocol. At the time, he said that a web doomsday had been averted and the continued safe operation of the internet could continue. Now in 2016, with threats of government surveillance and an ongoing public debate about the use of encryption, Kaminsky once again sees the basic fabric of the internet to be at risk.

“We have work to do to keep the internet working,” he said. “I’m here to encourage everyone to notice what is wrong, how it can get worse and what we can do about it.”

Read more at eWeek

Mesosphere’s ‘Container 2.0’ Unites Stateless and Stateful Workloads

The argument over the viability of stateful container-based applications versus stateless ones is long settled. In real-world multi-tenant production environments, applications need access to persistent data stores and volumes. It's ridiculous to make developers jump through hoops, even open source, device-agnostic, standardized hoops, so that they can send messages or record entries in a key/value store or a log file.

Mesosphere has worked out a way to manage both stateful and stateless container workloads, along with workloads not using containers at all, on the same shared infrastructure, using DC/OS (Mesosphere's Data Center Operating System) in both its commercial and open source editions.

The trick is to allow some distributed programs to handle their own scheduling. Container orchestrators, such as Kubernetes and the Docker Engine, use a single "monolithic" scheduler, noted Florian Leibert, Mesosphere's CEO, in a blog post. "Because there is no single scheduler that can optimize for all workloads, users end up with non-optimal operating constraints, including being forced to create separate clusters for each service," he wrote.

Read more at The New Stack

Enterprises Increasingly Joining Open Source Ecosystem – Wikibon

A new wave of open source participation is growing among large traditional enterprises not normally considered technology developers, writes Wikibon Lead Cloud Analyst Brian Gracely. Companies like Capital One Financial Corp., Nike Inc., Deere & Co. and General Electric Co. are joining open source consortia both as users of and contributors to major initiatives.

They are doing this for the same basic reason that IT vendors such as IBM, Google and Intel have become major drivers of Apache open source projects – it allows them to participate with outside teams on developing software they need, creating better solutions to their needs faster and at less cost.

Read more at Silicon Angle

Using cPanel for Managing Services

WHM (Web Hosting Manager) is the core program that has administrative access to the back end of cPanel. With WHM, you can create cPanel accounts and set global settings that apply to them or to the server. You can also create reseller accounts and offer hosting services to clients, or use it for your own personal needs. cPanel, on the other hand, is the website management panel; it allows you to create databases, email accounts, FTP users, addon domains, subdomains, etc. Each individual cPanel account controls the settings for that particular account.

WHM/cPanel, with its easy-to-use interface, is perfect for managing services such as MySQL, BIND, and Apache. You can create databases, add domains to Apache's configuration, create and modify DNS records for your domains, configure your services, manage the PHP version and modules, implement security rules, and much more with just a few clicks. The control panel, up to a point, replaces the system administrator you would otherwise have to hire to configure and manage your server.

So, in this article, we will scratch the surface of cPanel and explain some basic features provided by services such as Apache, MySQL, and Exim + Dovecot (cPanel's trusted mail server duo).

Probably at this point you are wondering: OK, how can I get cPanel? There are two ways to acquire a cPanel account. One is to purchase WHM and then create your cPanel account; the other is to purchase shared hosting from a company that offers cPanel as its control panel. We at RoseHosting offer both shared and fully managed VPS hosting with cPanel, and our support team is online 24/7.

How To Log Into cPanel?

Logging into cPanel can be accomplished by navigating your favorite web browser to either https://your_server_IP:2083 or https://your_domain.com:2083. You've probably noticed that both URLs use the HTTPS protocol. Accessing cPanel through HTTPS is recommended, even though some servers use a self-signed certificate. If you wish to access cPanel via plain HTTP, use port 2082 in the URL instead.

I Am Logged In. What Now?

As you can see from the interface, there are several sections corresponding to the respective services on your VPS: Files, Databases, Domains, Email, Metrics, Security, etc. Of course, these sections can vary, since some companies use a custom cPanel setup.

Below is a picture showing some of the sections described above, in a cPanel account that uses the paper_lantern theme.

sections.png

Most of the options are pretty straightforward. You can check the account’s disk usage by clicking on (you’ve probably guessed it, yes) Disk Usage.

Need to use FTP to upload data to your server? Click on FTP Accounts, create an account, and use FileZilla or any other FTP client to connect to your server. If you don't know how to configure FileZilla, CoreFTP, or Cyberduck, cPanel has a solution for you: next to each FTP account is a "Configure FTP Client" action menu, and clicking it lets you choose from three ready-made configuration files for FTP clients to suit your needs.

Another great feature cPanel offers is the File Manager, which is exactly what the name suggests: a manager for organizing and editing your account's files and directories. Think of it as a kind of FTP client GUI whose ease of use lets you modify files, change permissions, extract and compress files and directories, upload and download data, and so on.

What if you want to password-protect some directories in the account? Use Directory Privacy. It lets you restrict access to directories of your choice, so when a user tries to view content in a protected directory, they are prompted for a username and password.

And of course, we cannot forget one of the most important things when running a server: backups. Using cPanel's Backup Wizard, you can make a partial or a full backup of the account in question, so when something goes horribly wrong, you can easily restore the account, website, or app from the latest working backup.

Once you are in the backup wizard you will find two options as shown below:

backup wizard

Clicking the Backup button starts the backup process; in the second step, you can choose whether the backup should be full or partial and then download it to your local machine. The backup is stored in the /home directory on your server.

Restoring a backup is easy-peasy. Just click on Restore, select the restore type and then upload the backup from your local machine.

How To Create A Database From cPanel?

It is pretty simple, actually. Locate the Databases section and click on 'MySQL Database Wizard'. A window will open where you can create a new database and database user, modify the user's privileges over a database, and add the user to its respective database. The steps are presented in the images below:

2.png

3.png

4.png

When you need to edit a database, use phpMyAdmin. Just click on phpMyAdmin and you will be redirected to this very useful tool's interface.

How Can I Create Another Domain?

Let's move on to our next topic: Domains. This section is all about the names that represent your website. Every cPanel account has one main domain, which is configured during account creation.

Now what if you want to host another domain (website) in the same cPanel account? The name of this feature pretty much says it all: Addon Domains. Using Addon Domains you can create another domain and configure the document root for the data according to your needs.

You can also create subdomains if that is needed.

Aliases serve cases where you want an alternative domain, one with no content of its own, to point to another domain that you use: for example, holding a domain that you want to sell later, or redirecting its traffic to another domain.

Redirects are similar to an extent. Using them, you can send all your domain's visitors to a particular page or a different URL. For example, if you have a long URL but want visitors to type a shorter one, configure a redirect from the short URL to the long one.

There are two types of redirects:

  • Permanent (301);
  • Temporary (302).

This feature is very useful for webmasters. To a user, these two redirect types seem to work the same way, but to search engines they are very different. A 301 redirect means the page has permanently moved to a new location, while a 302 redirect is only temporary. Search engines need to know whether to keep the old page or to replace it with the one hosted elsewhere. SEO-wise, a 301 redirect is almost always the recommended choice.
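Behind the scenes, the Redirects interface typically writes Apache rules into the account's .htaccess file; a hand-written equivalent might look like this (paths and domain are hypothetical):

```apache
# Permanent (301) redirect: search engines update their index.
Redirect 301 /old-page https://example.com/new-page

# Temporary (302) redirect: search engines keep the original URL.
Redirect 302 /promo https://example.com/landing
```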

DNS

When managing domains, DNS is essential. The DNS records for a given domain define which server the domain points to, which server handles the domain's email, whether the domain has DKIM and SPF records, and so on. cPanel provides the Simple Zone Editor, which can manage these records if the domain's authoritative DNS servers are set to those of the server hosting the cPanel account.

Email

Creating and using email accounts for your domain has never been easier. cPanel ships with a default mail server, which is very useful for users who are not that Linux savvy; configuring a mail server on your own can cause some headaches along the way.

email.png

From the Email section, you can create email accounts for your domain(s). Your users can create autoresponders for when they cannot be reached and set email filters for specific accounts. What a great and easy way to have email for your domain, right?

Metrics

The Metrics section is useful when the webmaster or user needs to view the domain's logs, especially when debugging a problem. You can also monitor the bandwidth the cPanel account uses and access raw log files if needed.

Security

Good security is a must, considering the many exploits and attacks happening daily. In the Security section, you can configure a firewall to block malicious IP addresses, and leech-protect a directory, which lets you detect unusual levels of activity in password-restricted directories on your site.

An SSL certificate for the domains hosted in the cPanel account can be installed using the SSL/TLS option, where you can also manage existing SSL certificates and sites.

 

This article clearly shows why cPanel is so popular and why it is used by users all around the world. There are a lot of resources and documentation you can read on the official website, or you can just ask your hosting provider for help.

7 Essential Open Source DevOps Projects

As more and more enterprises adopt a “cloud-like” infrastructure, a shift in work culture and practices — known as DevOps — is also occurring. According to Puppet’s 2016 State of DevOps report, the number of people working in DevOps teams has increased from 16 percent in 2014 to 22 percent in 2016.

That said, it’s difficult to give a true definition of DevOps because the market dynamics are changing along with the emergence of new technologies, and the term is still evolving. It may not be easy to define DevOps, but it’s quite easy to pinpoint some of the core focus areas of the DevOps movement, including automation, continuous integration, continuous deployment, and, of course, collaboration between development and operations. 

Given the diversity of organizations building their IT infrastructure, it’s hard to find one project or tool that does it all. Different projects may seem to be doing the same thing but are actually trying to solve different problems. As usual, there can be some overlapping of functionality, but, diversity is the beauty of open source, and the basic idea behind all these projects is to assist DevOps pros in doing their jobs more accurately and efficiently.

I spoke with some industry players to learn about the open source DevOps projects they like or use. What follows is a handpicked list of such projects. Please bear in mind this is not a comprehensive list of all DevOps projects out there, but those often cited as essential to DevOps teams.

The participants included: Sam Guckenheimer, Product Owner and Group Product Planner at Microsoft; Mike Fiedler, Director of Technical Operations at Datadog; Thomas Hatch, SaltStack CTO; Amit Nayar, VP of Engineering at Media Temple; Amar Kapadia, Senior Director, Product Marketing at Mirantis; Jason Hand, DevOps Evangelist and Incident & Alerting specialist at VictorOps; and Greg Bruno, VP Engineering, Co-Founder at StackIQ.

Without further ado, here are the top seven DevOps projects mentioned by these experts:

Chef

Chef is a powerful configuration management tool that automates the process of configuring, deploying, and managing applications across a network. Chef does this through repeatable scripts, aptly called "recipes," and "cookbooks" that bundle related recipes into pluggable configuration modules. Chef works across platforms including AWS, Google Cloud Platform, OpenStack, IBM SoftLayer, Microsoft Azure, and Rackspace. Users of Chef include Facebook, Disney, Airbnb, and Mozilla.

Puppet

Puppet is a popular DevOps project that competes with Chef. Puppet Enterprise is an automation software that automates the provisioning, configuration, and management of servers, networks and storage devices. Puppet is used by CERN, Wikimedia, Mozilla, Reddit, Oracle, and PayPal.

Ansible

Ansible is simpler IT automation software. According to its GitHub page, "Ansible handles configuration management, application deployment, cloud provisioning, ad hoc task-execution, and multinode orchestration — including trivializing things like zero downtime rolling updates with load balancers." Ansible offers an agentless approach (all you need is SSH access or APIs), eliminating the need for third-party agent software. Ansible was recently acquired by Red Hat and is seen as Red Hat's answer to Puppet and Chef, giving the company its own tool for the stack. Recently, Ansible gained the capability to also automate network infrastructure using SSH and APIs.

SaltStack

SaltStack (Salt) competes with all three products mentioned above. Salt treats infrastructure as code and automates the management and configuration of any infrastructure or application at scale. Thomas Hatch, founder of SaltStack, said, "SaltStack software is used for data-driven, intelligent orchestration of converged infrastructure at scale and to configure the most complex application environments." SaltStack also offers support subscriptions and professional services to SaltStack Enterprise customers and Salt Open users.

Docker

Container technology has been around for quite some time, but Docker popularized it to an extent that it has sort of become a revolution. Go to any Linux conference these days, and you’ll hear container talk everywhere. Docker allowed developers to package, ship, and run any application as a lightweight container that can easily move across platforms. Docker containers are hardware and platform agnostic, which means you can run them anywhere — from your dirt cheap laptop to your monstrous mainframe.

“Docker, and containerization in general, is going to significantly change how DevOps teams work. Containers will be the new package format, and the CI/CD pipelines will change accordingly,” said Guckenheimer.

Kubernetes

Kubernetes is a great example of a big company turning a byproduct of its operations into a product. Kubernetes grew out of the systems Google uses internally to manage clusters of containers spread across multiple nodes. As a container management solution, Kubernetes enables DevOps by controlling containerized applications across nodes, and it provides a very efficient mechanism to deploy, maintain, and scale applications.

Jenkins

The widely known Jenkins project is a continuous integration tool that automates integrating commits made to the current code base into the mainline. It's widely used for building new projects, running tests to detect bugs, code analysis, and deployment.

Conclusion

This is a very short list of projects in the DevOps space; many other projects are available, each catering to a certain use case. What's most impressive is that all of these projects are fully open source. It has more or less become a phenomenon: the success of the Linux development model has made even hard-core proprietary companies comfortable with the idea of open sourcing such projects.

When you talk about the DevOps movement, open source is the de facto development model. It has become so commonplace that no one even really mentions it. We have started to take it for granted that “it has to be open source.”