
Lightweight Docker Images in 5 Steps

Deploying your services packaged in lightweight Docker images has many practical benefits. In a container, your service usually comes with all the dependencies it needs to run, it’s isolated from the rest of the system, and deployment is as simple as running a docker run command on the target system.

However, most of the benefits of dockerized services can be negated if your Docker images are several gigabytes in size and/or they take several minutes to boot up. Caching Docker layers can help, but ideally you want to have small and fast containers that can be deployed and booted in a matter of minutes, or even seconds.

The first time we used Docker at Rendered Text to package one of Semaphore’s services, we made many mistakes that resulted in a huge Docker image that was painful to deploy and maintain. However, we didn’t give up, and, step by step, we improved our images.
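
A common first step toward smaller images is to build from a minimal base such as Alpine and install only what the service actually needs. Here is a minimal sketch, assuming a simple Python service; the app.py file is a hypothetical placeholder, not something from the Semaphore article:

# Start from a minimal base image instead of a full distribution
FROM alpine:3.4

# Install only the runtime dependency; --no-cache skips storing the apk index
RUN apk add --no-cache python3

# Copy just the application code, not the whole build context
COPY app.py /app/app.py

CMD ["python3", "/app/app.py"]

An image built this way typically weighs in at tens of megabytes rather than gigabytes, because no compilers, package caches, or unused system packages are baked into it.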

Read more at Semaphore Blog

Systems We Love: How the Past Informs Our Present


It started with a July tweet asking if there was interest, and after dozens of responses (and 161 “likes”), it was on. By mid-October, organizers for the first-ever “Systems We Love” conference had received 162 submissions for just 19 speaking slots, and “I marked 70 as ‘would love to see’,” committee member and Joyent software engineer Ryan Zezeski posted on Twitter.

This Tuesday, the conference finally happened… Inspired by Papers We Love, where people give talks on their favorite computer science papers, this event featured talks on favorite systems, conference organizer and Joyent chief technology officer Bryan Cantrill explained in a blog post. While papers are nice, “it is ultimately the artifacts that we develop, the systems themselves, that represent the tangible embodiment of our ideas.”

It began with Ryan Zezeski, a software engineer at Joyent, fondly remembering his interactions with Roger Faulkner, one of the pioneering programmers from the days when Unix was still spelled with all capital letters, “an engineer’s engineer” and the creator of its /proc filesystem.

Read more at The New Stack

Open Source Diversity Efforts Gain Momentum in 2016

If software is pervasive, shouldn’t the people building it be from everywhere and represent different voices? The broadly accepted answer is yes, that we need a diverse set of developers and technologists to build the new digital world. Further, when you look at communities that thrive, they are those that evolve and grow and bring in new voices and perspectives. Because much of the software innovation happening today involves open source software, the open source community can be an entry point for new people in technology roles. This means that the open source community must evolve to stay relevant. There has never been a better time for the open source community to welcome new community members from underrepresented groups than now, and the community is rising to the challenge. Efforts to increase diversity in open source are showing results, so let’s look at a few examples:

Read more at OpenSource.com

Docker containerd Ups the Open Source Container Management Ante

Enterprise container users may see improved Docker container management tools in the cloud once the Docker containerd utility is open sourced next quarter. Access to Docker containerd, which includes methods to transfer container images, container execution and supervision, low-level local storage and network interfaces for both Linux and Windows, could allow cloud provider partners to improve container orchestration service products, analysts say.

“It may result in more options for customization,” said Jay Lyman, analyst with 451 Research. “It may make it easier for, say, AWS, to connect Docker and containers to some of its other services.”

Read more at TechTarget

Five Key Takeaways from KubeCon 2016

Now that KubeCon 2016 is over, we have some time to reflect on the state of the Kubernetes project and communities, the event itself, and the marketplace going forward into 2017. Red Hat has been a part of this community since well before it was launched, but it’s incredibly important to understand how a community evolves over time. If you weren’t able to attend the events in Seattle, the Cloud Native Computing Foundation (CNCF) has posted all of the videos online.

Looking back on the events in Seattle, here are five key takeaways that will continue to shape the community and market for the next few years.

The communities that develop around open source projects are always interesting to watch. Sometimes they are driven almost entirely by a single person or company. Other times they are highly fragmented. The Kubernetes community is neither of those. …

Read more at OpenShift

4 Hindrances in Implementing DevOps

DevOps is becoming the new craze everywhere. According to Puppet’s 2014 State of DevOps Report, DevOps organizations deploy code up to 30 times more frequently than their counterparts, which shows the increasing adoption of DevOps. However, mistakes and errors do occur while implementing DevOps, especially at new businesses and small companies. Nothing new is easy to achieve, and challenges are part and parcel of anything you wish to deploy.

Gradually, DevOps is becoming an effective and efficient way to develop and deploy cloud-hosted applications, but adoption is still in its early days. DevOps implementation helps eliminate the barrier between operations teams and development teams, thereby reducing the backlog of enterprise applications. Organizations deploying DevOps tools and processes frequently realize too late that they have made mistakes, many of which require them to stop, back up, and start again. So where are organizations going wrong? Mistakes vary from organization to organization, but some are commonly found causes of DevOps failure.

Putting Technology before People

Removing the barrier between operations staff and developers is the core purpose of deploying DevOps. One common mistake companies make while deploying DevOps is concentrating on technology too soon instead of on processes and people. As a result, organizations choose DevOps tools that do not have a long life. Overlooking staff training and changes to IT processes can be lethal. Due investment should be made in training programs focused on making the best use of the technology and on adopting continuous development, integration, testing, deployment, and operations. At some point the DevOps tools will change, but processes and people will remain.

Ignoring Security and Governance

Organizations often fail to consider security and governance, which are intrinsic to their applications. Security can no longer be separated from applications; make sure to integrate security into every process carried out, including continuous testing and development. Gone are the days of erecting walls around applications. Governance should be systematic for cloud application development and for everything involved in each step of the DevOps process, including policies that restrict the use of services or APIs, as well as service dependencies and service discovery.

Resistance to Change

Implementing DevOps means changing how applications are developed, tested, deployed, and operated. Technology, tools, and processes will all change, and organizations should set metrics to measure the productivity of those changes. It is inevitable that DevOps itself will evolve along with changing ideas and technology, so DevOps processes should be designed with change in mind.

Work Disruption

New workflows will emerge when DevOps is implemented, bringing change throughout the organization. Supporting the new workflows requires investment in new tools, as existing tools will become obsolete. Regulatory compliance and security requirements will apply to the new, more complex workflows. For that reason, it is important that the DevOps implementation team has a broad skill set and that the interests of all stakeholders are represented during planning. A holistic view of the complete software delivery value chain is required, from conception through to monitoring in production. It is a good exercise for the DevOps implementation team to perform value-stream mapping for the existing and proposed processes, then overlay the resulting maps to identify areas of friction.

DevOps is still a work in progress, whether you are an enterprise development shop or a vendor. The lessons we learn while implementing DevOps in the coming years will enable us to refine the process and thereby launch better applications.

5 Best Screen Recorder Apps For Linux

There can be many reasons why you need to record your screen. When I started using Linux, I used to record my screen whenever I had a problem and upload the recording to Linux communities. Today I sometimes record my screen for demonstration purposes in my tutorials. You can record your screen in Linux just as easily. In this article, I’m going to cover the 5 best screen recording apps for Linux. So let’s get started!
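
Before reaching for a dedicated app, it’s worth knowing that plain ffmpeg can already capture an X11 desktop in one line; this sketch assumes an X display at :0.0 and a 1920x1080 screen, so adjust both for your setup:

# Record the X11 display :0.0 at 25 fps into an MP4 file
# (requires an ffmpeg build with x11grab support; stop with q or Ctrl+C)
ffmpeg -video_size 1920x1080 -framerate 25 -f x11grab -i :0.0 output.mp4

The dedicated apps covered in the article mostly wrap this kind of capture in a friendlier interface.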

Read More At LinuxAndUbuntu

Creating Your Own Webserver and Hosting A Website from Your Linux Box

Most people assume that running your own webserver requires an incredible set of skills, something that only a cast member out of Mr. Robot would be capable of doing. Not true. It’s relatively straightforward; assuming you have the right equipment (and you don’t need much!), you can get your very own web server up and running in just a couple of hours.

While this is technically a ‘no experience in managing servers necessary’ tutorial, it is useful to already know your way around an average Ubuntu system. The entire process falls into the intermediate category.

Ready to begin? The process is split up into three major parts:

  • System requirements (does your box have what it takes?)
  • Setting up the required web server software (MySQL, PHP, Apache)
  • Connecting your server to the world

System Requirements

The exact system requirements depend on which particular OS you decide to run. We recommend you go with Ubuntu, but to be honest, you can use any Linux distribution. We’ve simply found Ubuntu to be the most user-friendly and stable option out there; it’s down to personal preference, and your installation will work fine regardless.

The beauty of using Linux is that it doesn’t require much when it comes to hardware. The official Ubuntu website recommends the following:

  • 2 GHz dual-core processor
  • 25 GB of (free) hard drive space
  • 2 GB RAM
  • USB port or DVD drive for installation purposes
  • Internet access

To guarantee a smooth ride, we recommend going a little higher than the minimum requirements, but you don’t need to spend much to get something that more than fits the bill.

Setting Up Web Server Software

If you have any experience in setting up a development environment, you’ve probably heard of the LAMP (Linux, Apache, MySQL, PHP) stack – this is the set of applications you’ll need to get your server up and running (a sketch of the installation commands follows the list):

  • Linux: We’re going to assume you can get Ubuntu OS loaded on your own (if not, check out the official guide).
  • Apache2: The most popular HTTP server out there – and no surprise, it’s open-source! The Ubuntu website has an excellent installation tutorial.
  • MySQL: your data will be going into your run-of-the-mill MySQL database. If you need it, you can install phpMyAdmin to make it easier to manage your data.
  • PHP and PHP SQLite component: PHP is the server-side language that will interact with your databases. Again, Ubuntu’s website comes in handy when installing PHP. Go for PHP 7; it’s easily the best option – if you want to run the latest open-source solutions such as WordPress or Drupal, it’s key. There are also various other reasons why PHP 7 is our recommended route.
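
For reference, on a recent Ubuntu release the whole stack can usually be installed from the default repositories in one go; this is a rough sketch rather than a substitute for the linked tutorials, and exact package names (notably the PHP version suffix) vary between Ubuntu releases:

# Refresh the package index, then install Apache, MySQL, PHP, and the SQLite component
sudo apt-get update
sudo apt-get install apache2 mysql-server php libapache2-mod-php php-mysql php-sqlite3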

Think you’ve cracked installing LAMP? If you’ve ticked all the boxes, it’s time to test your work. Create the following file:

/var/www/html/info.php

Within it, paste the following code:

<?php
phpinfo();
?>

Restart Apache, and then open the file in your web browser. You should see your standard PHP information page, giving you the specs of your installation. Get an error? Go back to the beginning and make sure you have followed all of the steps.
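
For the restart step itself, on Ubuntu it comes down to a single command; the systemctl form assumes a systemd-based release (15.04 or later), while older releases use service:

# systemd-based Ubuntu releases
sudo systemctl restart apache2

# older releases
sudo service apache2 restart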

Connecting Your Server to the World

You’re almost there! It’s now time to connect your creation with the world. The first step you’ll need to cover is fixing the pesky changing IP issue. There are ways to do this that will have you pulling your hair out, but we can recommend an alternative route.

To ensure localhost is available online without interruption, we’re going to use ngrok, a nifty little client you can download for free. Ngrok will allow you to create a ‘tunnel’ to your web server, giving access to anyone who loads your unique URL. Going into the nitty-gritty of how to achieve this with ngrok is beyond the scope of this tutorial, but you can find the exact steps on the client’s website.
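
That said, the basic case is a single command; this assumes you’ve unpacked the ngrok binary into the current directory and Apache is listening on the default port 80:

# Tunnel the local web server on port 80 to a public ngrok URL
./ngrok http 80

Ngrok then prints a randomly generated public URL that forwards to your machine for as long as the command keeps running.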

Once you’ve done that, it’s time to crack open your favorite bottle. You’ve done it, congratulations!

4 Ways to Send Email Attachment from Linux Command Line

Once you get familiar with using the Linux terminal, you’ll want to do everything on your system by simply typing commands, including sending emails, and one of the important aspects of sending emails is attachments.

Sysadmins, especially, can attach a backup file, a log file, or a system operation report to an email and send it to a remote machine or a workmate.

In this post, we will learn ways of sending an email with attachments from the Linux terminal. Importantly, there are several command-line email clients for Linux that you can use to process emails with simple features.
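
As a taste of what those clients look like, here is a hedged sketch using mutt, one of the common command-line mail clients; the file path and address are placeholders, and mutt must already be installed and configured to deliver mail:

# Send a message with a subject line and an attached backup file
# (the -- separates attachment arguments from recipient addresses)
echo "Backup attached." | mutt -s "Nightly backup" -a /tmp/backup.tar.gz -- admin@example.com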

Read the complete article at Tecmint


Thwarting Unknown Bugs: Hardening Features in the Mainline Linux Kernel

This presentation by kernel developer Mark Rutland at Embedded Linux Conference will cover hardening features available in the mainline Linux kernel, what they protect against, and their limitations.
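
If you’re curious which of these features your own kernel was built with, the kernel build configuration is the place to look; this sketch assumes your distribution ships the config under /boot, and the option names are just a sample of mainline hardening features from around this time:

# List a few hardening-related options in the running kernel's config
grep -E 'CONFIG_(HARDENED_USERCOPY|RANDOMIZE_BASE|CC_STACKPROTECTOR)' /boot/config-$(uname -r)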