
Mint 18 Review: “Just Works” Linux Doesn’t Get Any Better Than This

New themes and moving from GNOME/GTK 3.10 to 3.18 means two good years of Mint 18.x ahead.

The newly released Mint 18 is a major upgrade. Not only has the Linux Mint project improved Mint’s dueling desktops (Cinnamon and MATE), but the group’s latest work impacts all underlying systems. With Mint 18, Linux Mint has finally moved its base software system from Ubuntu 14.04 to the new Ubuntu 16.04.

Upgrading to the latest long-term support (LTS) release of Ubuntu means, as with the Mint 17.x series, the Mint 18.x release cycle is now locked to its base for two years. Rather than tracking alongside Ubuntu, Mint 18 and all subsequent releases will stick with Ubuntu 16.04. Mint won’t necessarily get as out of date as Ubuntu LTS releases tend to by the end of their two-year cycle, but this setup does mean nothing major is going to change for quite a while.

Read more at Ars Technica

Severe Vulnerabilities Discovered in HTTP/2 Protocol

Four high-profile bugs have been found in the protocol, potentially placing 85 million websites at risk.

On Wednesday at Black Hat USA, cybersecurity firm Imperva released new research into a number of high-profile flaws found within the latest version of HTTP, HTTP/2, which underpins the web's protocols and communication systems.

The report, HTTP/2: In-depth analysis of the top four flaws of the next generation web protocol (.PDF), details four main vulnerabilities and attack vectors related to HTTP/2, whose adoption is steadily increasing.

Read more at ZDNet

How to Deal with COTS Products in a DevOps World

The primary objective of DevOps is to increase the speed of delivery at reliable quality. To achieve this, good configuration management is crucial as the level of control at higher speed of delivery becomes more and more important (while riding a bike you might take your hands off the handle bar once in a while, but a formula one driver is practically glued to the steering wheel). Yet commercial-off-the-shelf (COTS) products often don’t provide any obvious ways to manage them like you manage your custom software. This is a real challenge for large organisations who deal with a mixed technology landscape.

In this article I will explore ways to apply modern DevOps practices when dealing with COTS products.

Read more at InfoQ

Kaminsky Aims to Secure the Internet

In a rambling, hourlong keynote address at the Black Hat USA conference here today, security luminary Dan Kaminsky detailed the risks and the opportunities inherent in the internet.

Kaminsky first shot to fame in 2008, when he revealed a flaw at that year’s Black Hat USA event in the pervasive Domain Name System (DNS) protocol. At the time, he said that a web doomsday had been averted and the internet could continue to operate safely. Now in 2016, with threats of government surveillance and an ongoing public debate about the use of encryption, Kaminsky once again sees the basic fabric of the internet as being at risk.

“We have work to do to keep the internet working,” he said. “I’m here to encourage everyone to notice what is wrong, how it can get worse and what we can do about it.”

Read more at eWeek

Mesosphere’s ‘Container 2.0’ Unites Stateless and Stateful Workloads

The argument over the viability of stateful container-based applications versus stateless ones is long settled. In real-world multi-tenant production environments, applications need access to persistent data stores and volumes. It's ridiculous to make developers jump through hoops (even open source, device-agnostic, standardized hoops) just so they can send messages or record entries in a key/value store or a log file.

Mesosphere has worked out a way to manage both stateful and stateless container workloads, along with workloads that don't use containers at all, on the same shared infrastructure, using DC/OS (Mesosphere's Data Center Operating System) in both its commercial and open source editions.

The trick is to allow some distributed programs to handle their own scheduling. Container orchestrators, such as Kubernetes and the Docker Engine, use a single "monolithic" scheduler, noted Florian Leibert, Mesosphere's CEO, in a blog post. "Because there is no single scheduler that can optimize for all workloads, users end up with non-optimal operating constraints, including being forced to create separate clusters for each service," he wrote.

Read more at The New Stack

Enterprises Increasingly Joining Open Source Ecosystem – Wikibon

A new wave of open source participation is growing among large traditional enterprises not normally considered technology developers, writes Wikibon Lead Cloud Analyst Brian Gracely. Companies like Capital One Financial Corp., Nike Inc., Deere & Co. and General Electric Co. are joining open source consortia both as users of and contributors to major initiatives.

They are doing this for the same basic reason that IT vendors such as IBM, Google and Intel have become major drivers of Apache open source projects – it allows them to collaborate with outside teams on developing software they need, producing better solutions faster and at lower cost.

Read more at Silicon Angle

Using cPanel for Managing Services

WHM (WebHost Manager) is the core program with administrative access to the back end of cPanel. With WHM you can create cPanel accounts and set global settings that apply to them or to the server. You can also create reseller accounts and offer hosting services to clients, or you can use it for your own personal needs. cPanel, on the other hand, is the website management panel: it allows you to create databases, email accounts, FTP users, add-on domains, subdomains, etc. Each individual cPanel account controls the settings for that particular account.

WHM/cPanel, with its easy-to-use interface, is perfect for managing services such as MySQL, BIND, and Apache. You can create databases, add domains to Apache's configuration, create and modify DNS records for your domains, configure your services, manage the PHP version and modules, implement security rules, and use many more features with just a few clicks. The control panel, to a point, replaces the system administrator you would otherwise have to hire to configure and manage your server.

So in this article we will scratch the surface of cPanel and explain some basic features of the services it manages, such as Apache, MySQL, and Exim+Dovecot (cPanel's trusted mail server companions).

Probably at this point you are wondering: OK, how can I get cPanel? Well, there are two ways of acquiring a cPanel account. One is to purchase WHM and then create your cPanel account; the other is to purchase shared hosting from a company that offers cPanel as its control panel. We at RoseHosting offer both shared and fully managed VPS hosting with cPanel, and our support team is online 24/7.

How To Log Into cPanel?

Logging into cPanel can be accomplished by navigating your favorite web browser to either https://your_server_IP:2083 or https://your_domain.com:2083. You've probably noticed that I've provided only URLs that use the HTTPS protocol. Accessing cPanel through HTTPS is recommended, even though some servers use a self-signed certificate. However, if you wish to access cPanel via plain HTTP, then use port 2082 in the URL.

I Am Logged In. What Now?

As you can see from the interface, there are several sections, each corresponding to a service on your VPS. You have Files, Databases, Domains, Email, Metrics, Security, etc. Of course, these sections can vary, since some companies use a customized cPanel setup.

Below is a picture of some of the sections we described, in a cPanel account that uses the paper_lantern theme.

sections.png

Most of the options are pretty straightforward. You can check the account’s disk usage by clicking on (you’ve probably guessed it, yes) Disk Usage.

Need to use FTP to upload data to your server? Click on FTP Accounts, create one, and use FileZilla or any other FTP client to connect to your server. If you don't know how to configure FileZilla, CoreFTP, or Cyberduck, cPanel has a solution for you: next to each FTP account there is a "Configure FTP Client" action menu, and clicking on it lets you choose from three ready-made FTP client configuration files to suit your needs.

Another great feature cPanel offers is the File Manager, which is exactly what the name suggests: a manager for organizing and editing your account's files and directories. Think of it as a kind of FTP client GUI whose ease of use lets you modify files, change permissions, extract and compress files/directories, upload and download data, and so on.

What if you want to password-protect some directories in the account? Then use Directory Privacy. It allows you to restrict access to directories of your choice, so when a user tries to view content in a protected directory, they will be prompted for a username and password.
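Under the hood, Directory Privacy relies on standard Apache basic-auth files; if you have shell access, the hand-rolled equivalent looks roughly like this (the directory name, username, password, and file paths are just examples):

```shell
# Create a demo directory to protect (path is illustrative)
mkdir -p protected

# Apache reads these directives from .htaccess when AllowOverride permits it
cat > protected/.htaccess <<'EOF'
AuthType Basic
AuthName "Restricted Area"
AuthUserFile /home/username/protected/.htpasswd
Require valid-user
EOF

# Generate an APR1 (htpasswd-style) password entry for user "alice"
printf 'alice:%s\n' "$(openssl passwd -apr1 'S3cret!')" > protected/.htpasswd
```

cPanel does exactly this for you behind its dialog, which is why a protected directory prompts for credentials in the browser.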

And of course we cannot forget one of the most important things to have when running a server: backups. Using cPanel's Backup Wizard you can make a partial or a full backup of the account in question, so when something goes horribly wrong, you can easily restore the account/website/app from the latest working backup.

Once you are in the backup wizard you will find two options as shown below:

backup wizard

Clicking on the Backup button starts the backup process; in the second step you can choose whether the backup should be full or partial, and then download it to your local machine. The backup will be stored in the /home directory on your server.

Restoring a backup is easy-peasy. Just click on Restore, select the restore type and then upload the backup from your local machine.
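A cPanel full backup is, in essence, a gzipped tarball of the account's home directory and settings. As a sketch (the layout below is a simplified stand-in, not cPanel's exact structure), you can see why inspecting one from the shell is straightforward:

```shell
# Simulate a tiny slice of a full backup's layout
mkdir -p demo/homedir/public_html
echo '<h1>hello</h1>' > demo/homedir/public_html/index.html

# Pack it the way the Backup Wizard would: a gzipped tar archive
tar -czf backup-demo.tar.gz -C demo homedir

# List the archive's contents without extracting anything
tar -tzf backup-demo.tar.gz
```

On a real server, the same `tar -tzf` against the file in /home lets you confirm what a backup contains before restoring it.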

How To Create A Database From cPanel?

It is pretty simple, actually. Locate the Databases section and click on ‘MySQL Database Wizard’. A window will open where you can create a new database and a database user, modify the user’s privileges over a database, and add the user to its respective database. The steps are presented in the images below:

2.png

3.png

4.png

When you need to edit a database, use phpMyAdmin. Just click on phpMyAdmin and you will be redirected to this very useful tool's interface.
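Behind the wizard's steps sits ordinary SQL. Assuming shell and root MySQL access, the equivalent statements would look something like this (the database, user, and password names are invented; cPanel normally prefixes database and user names with the account name, as `acct_` does here):

```shell
# The SQL the MySQL Database Wizard effectively runs for you
cat > create_db.sql <<'EOF'
CREATE DATABASE acct_mydb;
CREATE USER 'acct_dbuser'@'localhost' IDENTIFIED BY 'S3cret!';
GRANT ALL PRIVILEGES ON acct_mydb.* TO 'acct_dbuser'@'localhost';
FLUSH PRIVILEGES;
EOF

# With root access to MySQL you would then run:
# mysql -u root -p < create_db.sql
```

The wizard's "add user to database" step corresponds to the GRANT line: it decides which privileges the user holds on that one database.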

How Can I Create Another Domain?

Let’s go to our next topic: Domains. This section is all about the names that represent your website. Every cPanel account has one main domain, which is configured during account creation.

Now what if you want to host another domain (website) in the same cPanel account? The name of this feature pretty much says it all: Addon Domains. Using Addon Domains you can create another domain and configure the document root for the data according to your needs.

You can also create subdomains if that is needed.

Aliases are for cases when you want an alternative domain with no content of its own to point to a domain you actually use, for example when holding a domain you plan to sell later, or when redirecting its traffic to another domain.

Redirects are similar to an extent. Using them you can send all your domain's visitors to a particular page or a different URL. For example, if you have a long URL but want your visitors to be able to enter a shorter one, you can configure a redirect from the short URL to the long one.

There are two types of redirects:

  • Permanent (301);
  • Temporary (302).

This feature is very useful for webmasters. To a user, these two types of redirects seem to work the same way, but as far as search engines are concerned they are very different. A 301 redirect means the page has permanently moved to a new location, while a 302 redirect is only temporary. Search engines need to know whether to keep the old page or replace it with the one hosted elsewhere. SEO-wise, it is almost always recommended to use a 301 redirect.
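cPanel typically stores these redirects as Apache rules in the domain's .htaccess file; the hand-written equivalent would look roughly like this (paths and URLs are examples):

```shell
# The document root of the site (directory name is illustrative)
mkdir -p public_html

# One permanent and one temporary redirect, as Apache directives
cat > public_html/.htaccess <<'EOF'
# 301: search engines replace the old URL with the new one
Redirect 301 /old-page https://example.com/new-page
# 302: search engines keep the old URL indexed
Redirect 302 /promo https://example.com/current-offer
EOF
```

The only difference between the two lines is the status code, which is exactly the permanent-versus-temporary signal described above.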

DNS

When managing domains, DNS is essential. The DNS records for a given domain define which server the domain points to, which server handles the domain's email, whether the domain has DKIM and SPF records, and so on. cPanel provides the Simple Zone Editor, which can be used to manage these records, provided the domain's authoritative DNS servers are set to those of the server hosting the cPanel account.
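For reference, the entries the Simple Zone Editor manages correspond to ordinary DNS zone records; a minimal fragment might look like this (names, addresses, and TTLs are illustrative), and you can always check what resolvers actually see with dig:

```shell
# A sketch of a zone fragment for a domain
cat > example.com.zone <<'EOF'
example.com.       14400  IN  A      203.0.113.10
www.example.com.   14400  IN  CNAME  example.com.
example.com.       14400  IN  MX 10  mail.example.com.
example.com.       14400  IN  TXT    "v=spf1 +a +mx ~all"
EOF

# Query the live records from any machine with dig installed:
# dig +short A example.com
# dig +short MX example.com
```

The A record is what "points the domain at a server," the MX record is what routes its email, and the TXT record here carries the SPF policy mentioned above.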

Email

Creating and using email accounts for your domain has never been easier. cPanel comes with a default mail server in place, which is very useful for users who are not that Linux savvy, since configuring a mail server on your own can cause some headaches along the way.

email.png

From the Email section you can create email accounts for your domain(s). Your users can create autoresponders for when they cannot be reached and set email filters for specific accounts. What a great and easy way to have email for your domain, right?

Metrics

The Metrics section is useful when the webmaster or user needs to view the logs of the domain and especially useful when debugging a problem. One can also monitor the bandwidth that the cPanel account uses and access some raw log files if needed.

Security

Good security is a must, considering the many exploits and attacks happening on a daily basis. In the Security section you can configure a firewall and block malicious IP addresses, or leech-protect a directory, which lets you detect unusual levels of activity in password-restricted directories on your site.
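As a sketch of what blocking an address boils down to, cPanel's IP blocking tools ultimately write deny rules that the web server enforces. With Apache 2.4 syntax the hand-written equivalent is roughly this (the directory and IP address are examples):

```shell
# A demo site directory (path is illustrative)
mkdir -p site

cat > site/.htaccess <<'EOF'
# Refuse requests from one malicious address, allow everyone else
<RequireAll>
    Require all granted
    Require not ip 198.51.100.23
</RequireAll>
EOF
```

A blocked visitor receives a 403 Forbidden response, while everyone else is unaffected.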

An SSL certificate for the domains that are hosted in the cPanel account can be installed using the SSL/TLS option where you also manage the existing SSL certificates and sites.

 

This article shows why cPanel is so popular and used by people all around the world. There are a lot of resources and documentation you can read on the official cPanel website, or you can simply ask your hosting provider for help.

7 Essential Open Source DevOps Projects

As more and more enterprises adopt a “cloud-like” infrastructure, a shift in work culture and practices — known as DevOps — is also occurring. According to Puppet’s 2016 State of DevOps report, the number of people working in DevOps teams has increased from 16 percent in 2014 to 22 percent in 2016.

That said, it’s difficult to give a true definition of DevOps because the market dynamics are changing along with the emergence of new technologies, and the term is still evolving. It may not be easy to define DevOps, but it’s quite easy to pinpoint some of the core focus areas of the DevOps movement, including automation, continuous integration, continuous deployment, and, of course, collaboration between development and operations. 

Given the diversity of organizations building their IT infrastructure, it's hard to find one project or tool that does it all. Different projects may seem to be doing the same thing but are actually trying to solve different problems. As usual, there can be some overlap in functionality, but diversity is the beauty of open source, and the basic idea behind all these projects is to assist DevOps pros in doing their jobs more accurately and efficiently.

I spoke with some industry players to learn about the open source DevOps projects they like or use. What follows is a handpicked list of such projects. Please bear in mind this is not a comprehensive list of all DevOps projects out there, but those often cited as essential to DevOps teams.

The participants included: Sam Guckenheimer, Product Owner and Group Product Planner at Microsoft; Mike Fiedler, Director of Technical Operations at Datadog; Thomas Hatch, SaltStack CTO; Amit Nayar, VP of Engineering at Media Temple; Amar Kapadia, Senior Director, Product Marketing at Mirantis; Jason Hand, DevOps Evangelist and Incident & Alerting specialist at VictorOps; and Greg Bruno, VP Engineering, Co-Founder at StackIQ.

Without further ado, here are the top seven DevOps projects mentioned by these experts:

Chef

Chef is a powerful configuration management tool that automates the process of configuring, deploying, and managing applications across a network. Chef does this through repeatable scripts, aptly called "recipes," which are bundled into "cookbooks" as pluggable configuration modules. Chef works across platforms including AWS, Google Cloud Platform, OpenStack, IBM SoftLayer, Microsoft Azure, and Rackspace. Users of Chef include Facebook, Disney, Airbnb, and Mozilla.

Puppet

Puppet is a popular DevOps project that competes with Chef. Puppet Enterprise automates the provisioning, configuration, and management of servers, networks, and storage devices. Puppet is used by CERN, Wikimedia, Mozilla, Reddit, Oracle, and PayPal.

Ansible

Ansible is simpler IT automation software. According to its GitHub page, "Ansible handles configuration management, application deployment, cloud provisioning, ad hoc task-execution, and multinode orchestration — including trivializing things like zero downtime rolling updates with load balancers." Ansible offers an agentless approach (all you need is SSH access or APIs), eliminating the need for third-party agent software. Ansible was recently acquired by Red Hat and is seen as Red Hat's answer to Puppet and Chef, giving the company its own tool for the stack. Recently Ansible gained the capability to also automate network infrastructure using SSH and APIs.
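To make the agentless point concrete, here is a minimal sketch: an inventory file plus an ad hoc command (the hostnames are invented; all Ansible needs is SSH access to them, with nothing installed on the targets):

```shell
# A two-host inventory in Ansible's INI format
cat > hosts.ini <<'EOF'
[web]
web1.example.com
web2.example.com
EOF

# With Ansible installed locally, ping every host in the group over SSH:
# ansible -i hosts.ini web -m ping
```

Contrast this with agent-based tools, where each managed node runs its own daemon that phones home to a central server.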

SaltStack

SaltStack (Salt) competes with all three products mentioned above. Salt treats infrastructure as code and automates the management and configuration of any infrastructure or application at scale. Thomas Hatch, founder of SaltStack, said, "SaltStack software is used for data-driven, intelligent orchestration of converged infrastructure at scale and to configure the most complex application environments. SaltStack also offers support subscriptions and professional services to SaltStack Enterprise customers and Salt Open users."

Docker

Container technology has been around for quite some time, but Docker popularized it to the extent that it has become something of a revolution. Go to any Linux conference these days, and you'll hear container talk everywhere. Docker lets developers package, ship, and run any application as a lightweight container that moves easily across platforms. Docker containers are hardware- and platform-agnostic, which means you can run them anywhere: from your dirt-cheap laptop to your monstrous mainframe.

“Docker, and containerization in general, is going to significantly change how DevOps teams work. Containers will be the new package format, and the CI/CD pipelines will change accordingly,” said Guckenheimer.
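A minimal sketch of that package-ship-run flow (the image contents here are purely illustrative):

```shell
# Describe the image: a small base OS layer plus our "application"
cat > Dockerfile <<'EOF'
FROM alpine:3.4
COPY hello.sh /hello.sh
CMD ["/bin/sh", "/hello.sh"]
EOF

echo 'echo "hello from a container"' > hello.sh

# With Docker installed, build once and run anywhere Docker runs:
# docker build -t hello .
# docker run --rm hello
```

The Dockerfile is the "new package format" in Guckenheimer's phrase: the same artifact moves unchanged from a laptop through CI to production.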

Kubernetes

Kubernetes is a great example of a big company turning a byproduct of their operations into a product. Kubernetes is what Google internally uses to manage a cluster of containers spread across multiple nodes. As a container management solution, Kubernetes enables DevOps by controlling containerized applications across nodes. It provides a very efficient mechanism to deploy, maintain, and scale applications.
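As a sketch of that deploy-maintain-scale model: you declare the desired state (how many replicas of which image) and Kubernetes keeps the cluster matching it. A minimal Deployment manifest might look like this (the names, image, and API version are illustrative and depend on your cluster):

```shell
cat > nginx-deploy.yaml <<'EOF'
apiVersion: apps/v1
kind: Deployment
metadata:
  name: web
spec:
  replicas: 3
  selector:
    matchLabels:
      app: web
  template:
    metadata:
      labels:
        app: web
    spec:
      containers:
      - name: nginx
        image: nginx:1.11
        ports:
        - containerPort: 80
EOF

# With a cluster configured, apply it and let Kubernetes converge:
# kubectl apply -f nginx-deploy.yaml
# kubectl get pods -l app=web
```

If a node dies and takes a pod with it, the scheduler starts a replacement elsewhere to get back to three replicas; scaling is a one-line change to the manifest.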

Jenkins

The widely known Jenkins project is a continuous integration tool that automates integrating commits made against the current code base into the mainline. It's widely used for building new projects, running tests to detect bugs, code analysis, and deployment.

Conclusion

This is a very short list of projects in the DevOps space; many other projects are available, with each one catering to a certain use case. What's most impressive is that all of these projects are fully open source. It's more or less become a phenomenon. The success of the Linux development model has made even hard-core proprietary companies comfortable with the idea of open sourcing such projects.

When you talk about the DevOps movement, open source is the de facto development model. It has become so commonplace that no one even really mentions it. We have started to take it for granted that “it has to be open source.”

 

Howdy, Ubuntu on Windows! Writing for Compiled Languages

Microsoft’s addition of the Bash shell and Ubuntu user space in Windows 10 is a real win for developers everywhere. Dozens of compilers and thousands of libraries are now immediately available on Windows desktops. In this article, we’re going to write the classic “hello world” application in several different compiled languages, install any necessary dependencies, and execute our compiled code.

If you’d like to follow along with this article and try out these examples, you can grab all of the source code from git:

$ sudo apt update

$ sudo apt install -y git

$ git clone https://github.com/dustinkirkland/howdy-windows.git

$ cd howdy-windows

$ make

Now, let’s look at each language:

  1. C

  • Installation

$ sudo apt install -y gcc

  • Code: c/howdy.c

#include <stdio.h>

int main() {

       printf("    ====> C: Howdy, Windows!\n");

       return 0;

}

  • Compilation

$ gcc -o c/howdy c/howdy.c

  • Execution

        $ ./c/howdy

   ====> C: Howdy, Windows!

  2. C++

  • Installation

$ sudo apt install -y g++

  • Code: cpp/howdy.cpp

        #include <iostream>

int main() {

       std::cout << "    ====> C++: Howdy, Windows!\n";

}

  • Compilation

$ g++ -o cpp/howdy cpp/howdy.cpp

  • Execution

        $ ./cpp/howdy

   ====> C++: Howdy, Windows!

  3. Golang

  • Installation

$ sudo apt install -y golang

  • Code: golang/howdy.go

        package main

import "fmt"

func main() {

       fmt.Printf("    ====> Golang: Howdy, Windows!\n")

}

  • Compilation

$ go build -o golang/howdy golang/howdy.go

  • Execution

        $ ./golang/howdy

   ====> Golang: Howdy, Windows!

  4. Fortran

  • Installation

$ sudo apt install -y gfortran

  • Code: fortran/howdy.f90

            program howdy

 print *, "    ====> Fortran: Howdy, Windows!"

end program howdy

  • Compilation

$ gfortran fortran/howdy.f90 -o fortran/howdy

  • Execution

        $ ./fortran/howdy

    ====> Fortran: Howdy, Windows!

  5. Pascal

  • Installation

$ sudo apt install -y fp-compiler

  • Code: pascal/howdy.pas

        program Howdy;

Begin

       writeln('    ====> Pascal: Howdy, Windows!');

end.

  • Compilation

$ fpc pascal/howdy.pas

  • Execution

        $ ./pascal/howdy

   ====> Pascal: Howdy, Windows!

  6. Erlang

  • Installation

$ sudo apt install -y erlang-base

  • Code: erlang/howdy.erl

        -module(howdy).

-export([howdy/0]).

howdy() -> io:fwrite("    ====> Erlang: Howdy, Windows!\n").

  • Compilation

$ erlc erlang/howdy.erl

  • Execution

        $ erl -noshell -s howdy howdy -s init stop

   ====> Erlang: Howdy, Windows!

 

Cheers,

Dustin

 

Read the next article: Howdy, Ubuntu on Windows! How Fast Is It?

Read previous articles in the series:

Howdy, Ubuntu on Windows! An Intro From Canonical’s Dustin Kirkland

Howdy, Ubuntu on Windows! Getting Started

Howdy, Ubuntu on Windows! Ubuntu Commands Every Windows User Should Learn

Howdy, Ubuntu on Windows! Write and Execute Your First Program

Learn more about Running Linux Workloads on Microsoft Azure in this on-demand webinar with guest speaker Ian Philpot of Microsoft. Watch Now >> 

When to Containerize Legacy Applications — And When Not to

Sitting across the table in early talks with new customers, I often find myself thinking of the Blendtec marketing campaign that started almost 10 years ago, in which the creator of Blendtec blenders discovers, through his YouTube channel, what objects will blend in his line of blenders. The opening tagline was always, “Will it blend?” He then goes on to show that the chosen object of the day will, in fact, blend in the Blendtec. What results is usually a smoky, soupy mess, but it did, in fact, blend!

In many of BoxBoat's new engagements, we are presented with legacy applications and asked a simple question: can it be containerized? With few exceptions, my answer is almost always yes. Our goal is then to demonstrate a viable path forward to migrate these applications and deliver the incredible benefits of containerization without the hot, soupy mess.

Anything can be containerized. Just because it can be, however, doesn't mean it should be.

An application running in a container is, at the end of the day, still a Linux process managed by the host. There are now fairly robust mechanisms for handling networking, monitoring, and persistent storage for stateful containerized applications. Here are some areas that container technologies handle fairly well:

Read more at The New Stack