
5 Tricks for Using the sudo Command

The sudoers file can provide detailed control over user privileges, but with very little effort, you can still get a lot of benefit from sudo. In this post, we’re going to look at some simple ways to get a lot of value out of the sudo command in Linux.

Trick 1: Nearly effortless sudo usage

The default sudoers file on most Linux distributions makes it very simple to give select users the ability to run commands as root. In fact, you don’t even have to edit the /etc/sudoers file in any way to get started. Instead, you just add the users to the sudo or admin group on the system and you’re done.

Adding users to the sudo or admin group in the /etc/group file gives them permission to run commands using sudo.
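As a minimal sketch (assuming a Debian-style system and a hypothetical user named “chris”), granting sudo rights really is a single group change:

```shell
# "chris" is a hypothetical user name. On Debian-style systems, the
# stock /etc/sudoers rule  %sudo ALL=(ALL:ALL) ALL  grants full
# rights to everyone in the "sudo" group, so one root-only command
# is enough (use the "wheel" group on Fedora/RHEL/Arch instead):
#
#   usermod -aG sudo chris
#
# Any user can list their own group memberships without root:
id -nG
```

Note that the new group membership only takes effect at the user’s next login, so log out and back in before testing with sudo -v.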

Read more at Network World

Top 10 Moments in 2017 Linux Foundation Events

See the Top 10 moments of 2017 Linux Foundation events, including a conversation with Linus Torvalds, a video created by actor Joseph Gordon-Levitt through his collaborative production company, the Diversity Empowerment Summit, and Automotive Grade Linux in the new Toyota Camry.

And, you can look forward to more exciting events in 2018. Check out the newly released 2018 Events calendar and make plans now to attend or to speak at an upcoming conference.

Read more at The Linux Foundation

Finding Files with mlocate: Part 3

In the previous articles in this short series, we introduced the mlocate (or just locate) command, and then discussed some ways the updatedb tool can be used to help you find that one particular file in a thousand.

You are probably aware of xargs, as well as the find command. Our trusty friend locate can also play nicely with the --null option of xargs by outputting all of its results separated by null characters rather than newlines (which isn’t great if you want to read it yourself, but ideal for piping), using the -0 switch like this:

# locate -0 .bash
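To see why the null-separated output matters, here’s a small sketch. The real-world pipeline would be something like the locate command shown in the comment; the printf stand-in below demonstrates the same mechanics without needing a populated locate database:

```shell
# locate -0 separates results with null bytes, which xargs -0 splits
# on, so filenames containing spaces (or even newlines) arrive intact:
#
#   locate -0 .bash | xargs -0 ls -l
#
# The same mechanics with a stand-in generator instead of locate
# (prints "file one" and "file two" on separate lines):
printf 'file one\0file two\0' | xargs -0 -n1 echo
```

Had the two filenames been passed on a whitespace-separated line instead, xargs would have seen four arguments rather than two.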

An option I like to use (if I remember to use it — because the locate command rarely needs to be queried twice thanks to its simple syntax) is the -e option.

# locate -e .bash

For the curious, that -e switch means “existing.” And, in this case, you can use -e to ensure that any files returned by the locate command do actually exist at the time of the query on your filesystems.

It’s almost magical that, even on a slow machine, the modern locate command can query its file database and then check the actual existence of many files in seemingly no time whatsoever. Let’s try a quick test with a file search that’s going to return a zillion results, using the time command to see how long it takes both with and without the -e option enabled.

I’ll choose files with the compressed .gz extension. Starting with a count, you can see there’s not quite a zillion, but a fair number of files ending in .gz on my machine (note the -c for “count”):

# locate -c .gz

7539

This time, we’ll output the list but time it and see the abbreviated results as follows:

# time locate .gz

real    0m0.091s

user    0m0.025s

sys     0m0.012s

That’s pretty swift, but it’s only reading from the overnight-run database. Let’s get it to do a check against those 7,539 files, too, to see if they truly exist and haven’t been deleted or renamed since last night:

# time locate -e .gz

real    0m0.096s

user    0m0.028s

sys     0m0.055s

As you can see, the speed difference is negligible. There’s no point in talking about lightning or blink-and-you-miss-it, because those aren’t suitable yardsticks. Relative to the other indexing service I mentioned previously, let’s just say that’s pretty darned fast.

If you need to move the database file used by the locate command (in my version it lives at /var/lib/mlocate/mlocate.db), that’s also easy to do. You may wish to do this, for example, because you’ve generated a massive database file (mine is only 1.1MB, so really tiny in reality) that needs to be put onto a faster filesystem.

Incidentally, the mlocate utility appears to have created a slocate group of users on my machine, so don’t be too alarmed if you see something similar, as shown here in a standard file listing:

-rw-r-----. 1 root slocate 1.1M Jan 11 11:11 /var/lib/mlocate/mlocate.db

Back to the matter at hand. If you want to move the database away from /var/lib/mlocate, you can use this command syntax (you’ll have to become the “root” user with sudo -i or su - for at least the first command to work correctly):

# updatedb -o /home/chrisbinnie/my_new.db

# locate -d /home/chrisbinnie/my_new.db SEARCH_TERM

Obviously, replace your database name and path. The SEARCH_TERM element is the fragment of the filename that you’re looking for (wildcards and all).

If you remember, I mentioned that you need to run the updatedb command as the superuser to reach all the areas of your filesystems.

This next example should cover two useful scenarios in one. According to the manual, you can also create a “private” database for standard users as follows:

# updatedb -l 0 -o DATABASE -U source_directory

Here, the previously seen -o option means that we output our database to a file (obviously called DATABASE). The -l 0 addition affects the “visibility” of the database file: it means (if I’m reading the docs correctly) that my user can read it directly, whereas without that option only the locate command can.

The second useful scenario for this example is that we can create a little database file specifying exactly which path its top-level should be. Have a look at the database-root or -U source_directory option in our example. If you don’t specify a new root file path, then the whole filesystem(s) is scanned instead.
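As a concrete sketch of both scenarios (the paths here are hypothetical, and the updatedb/locate calls assume mlocate is installed):

```shell
# Build a tiny tree, then index only that tree (-U) into a
# user-readable (-l 0) private database, and query it with -d:
mkdir -p /tmp/locate-demo/src
touch /tmp/locate-demo/src/notes.txt

if command -v updatedb >/dev/null 2>&1; then
  updatedb -l 0 -o /tmp/locate-demo/files.db -U /tmp/locate-demo/src
  locate -d /tmp/locate-demo/files.db notes
else
  echo "mlocate is not installed here; only the sample tree was created"
fi
```

Because the database was written with -l 0, any user who can read that file can list everything under the indexed directory, which is exactly the caveat the manual raises.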

If you want to get clever and chuck a couple of top-level source directories into one command, you can manage that by creating two separate databases. Very useful for scripting, methinks.

You can achieve that with this command:

# locate -d /home/chrisbinnie/database_one -d /home/chrisbinnie/database_two SEARCH_TERM

The manual dutifully warns however that ALL users that can read the DATABASE file can also get the complete list of files in the subdirectories of the chosen source_directory. So use these commands with some care.

Priced To Sell

Back to the mind-blowing simplicity of the locate command in use on a day-to-day basis. There are many times when newbies may be confused by case sensitivity on Unix-type systems. Simply use the conventional -i option to ignore case entirely when using the flexible locate command:

# locate -i ChrisBinnie.pdf

If you have a file structure that has a number of symlinks holding it together, then there might be occasion when you want to remove broken symlinks from the search results. You can do that with this command:

# locate -Le chrisbinnie_111111.xml

If you need to limit the number of search results, in a script for example (similar to the -c option for counting), you can do so like this:

# locate -l 25 "*.gz"

This command simply stops after outputting the first 25 files that were found. When piped through the grep command, it’s very useful on a super busy system.
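For instance, the grep pairing might look like the commented command below (quote the glob so the shell doesn’t expand it before locate sees it); the stand-in file list demonstrates the same limit-then-filter shape without needing a populated locate database:

```shell
# Real-world shape (assumes a populated mlocate database):
#
#   locate -l 25 "*.gz" | grep '^/var/'
#
# Same limit-then-filter pattern with a stand-in file list
# (prints the two paths under /var/):
printf '/var/log/a.gz\n/tmp/b.gz\n/var/cache/c.gz\n' | head -n 25 | grep '^/var/'
```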

Popular Area

We briefly touched upon performance earlier, and I happened to see this nicely written blog entry, where the author discusses thoughts on the trade-offs between the database size becoming unwieldy and the speed at which results are delivered.

What piqued my interest are the comments on how the original locate command was written and which limiting factors were considered during its creation. Namely, disk space isn’t quite so precious any longer, and neither is the speed of delivering results, even when 700,000 files are involved.

I’m certain that the author(s) of mlocate and its forebears would have something to say in response to that blog post. I suspect that holding onto the file permissions to give us the “secure” and “slocate” functionality in the database might be a fairly big hit in terms of overhead. And, as much as I enjoyed the post, I won’t be writing a Bash script to replace mlocate any time soon. I’m more than happy with the locate command and extol its qualities at every opportunity.

Sold

I hope you’ve acquired enough insight into the superb locate command to prune, tweak, adjust, and tune it to your unique set of requirements. As we’ve seen, it’s fast, convenient, powerful, and efficient. Additionally, you can ignore the “root” user demands and use it within scripts for very specific tasks.

My favorite aspect, however, is when I’m awakened in the middle of the night because of an emergency. It’s not a good look, having to remember the complex find command and typing it slowly with bleary eyes (and managing to add lots of typos):

# find . -type f -name "*.gz"

Instead of that, I can just use the simple locate command:

# locate "*.gz"

As has been said, any fool can create something bigger, bolder, and tougher, but it takes a bit of genius to create something simpler. And, in terms of introducing more people to the venerable Unix-type command line, there’s little argument that the locate command welcomes them with open arms.

Learn more about essential sysadmin skills: Download the Future Proof Your SysAdmin Career ebook now.

Chris Binnie’s latest book, Linux Server Security: Hack and Defend, shows how hackers launch sophisticated attacks to compromise servers, steal data, and crack complex passwords, so you can learn how to defend against these attacks. In the book, he also talks you through making your servers invisible, performing penetration testing, and mitigating unwelcome attacks. You can find out more about DevSecOps and Linux security via his website (http://www.devsecops.cc).

Why the Open Source Community Needs a Diverse Supply Chain

At this year’s Opensource.com Community Moderator’s meeting in Raleigh, North Carolina, Red Hat CEO Jim Whitehurst made a comment that stuck with me.

“Open source’s supply chain is source code,” he said, “and the people making up that supply chain aren’t very diverse.”

Diversity and inclusivity in the technology industry—and in open source communities more specifically—have received a lot of coverage, both on Opensource.com and elsewhere. One approach to the issue foregrounds arguments about concepts that are more abstract—like human decency, for example.

But the “supply chain” metaphor works, too. And it can be an effective argument for championing greater inclusivity in our open organizations, especially when people dismiss arguments based on appeals to abstract concepts. Open organizations require inclusivity, which is a necessary input to get the diversity that reduces the risk in our supply chain.

Read more at OpenSource.com


Exploring the Linguistics Behind Regular Expressions

Little did I know that learning about Chomsky would drag me down a rabbit hole back to regular expressions, and then magically cast regular expressions into something that fascinated me. What enchanted me about regular expressions was the homonymous linguistic concept that powered them.

I hope to spellbind you, too, with the linguistics behind regular expressions, a backstory unknown to most programmers. Though I won’t teach you how to use regular expressions in any particular programming language, I hope that my linguistic introduction will inspire you to dive deeper into how regular expressions work in your programming language of choice.

To begin, let’s return to Chomsky: what does he have to do with regular expressions? Hell, what does he even have to do with computer science?

Read more at Dev.to

Introducing BuildKit

BuildKit is a new project under the Moby umbrella for building and packaging software using containers. It’s a new codebase meant to replace the internals of the current build features in the Moby Engine.

BuildKit emerged from the discussions about improving the build features in Moby Engine. We received a lot of positive feedback for the multi-stage build feature introduced in April and had proposals and user requests for many similar additions. But before that, we needed to make sure that we have capabilities to continue adding such features in the future and a solid foundation to extend on. Quite soon it was clear that we would need to redefine most of the fundamentals about how we even define a build operation and needed a clean break from the current codebase.

Read more at Moby Project

Introducing Fn: “Serverless Must Be Open, Community-Driven, and Cloud-Neutral”

Fn, a new open source serverless project, was announced at this year’s JavaOne. There’s no risk of cloud lock-in, and you can write functions in your favorite programming language. “You can make anything, including existing libraries, into a function by packaging it in a Docker container.” We invited Bob Quillin, VP for the Oracle Container Group, to talk about Fn, its best features, next milestones, and more.

JAXenter: Oracle’s Mike Lehmann told us recently that “Oracle sees serverless as a natural next step from where the industry has gone from app server-centric models to containers and microservices and more recently with serverless.” At JavaOne 2017, Mark Cavage discussed Java’s pervasiveness in the cloud and the need to support container-centric microservices and serverless architectures. Why the sudden interest in serverless?

Bob Quillin: Developer efficiency, economics, and ease of use will drive serverless forward.  We believe serverless technology will drive a new, more efficient economic model – for both development teams and cloud providers while making a developer’s life that much easier.  

Read more at Jaxenter

AT&T Wants White Box Routers with an Open Operating System

AT&T says it’s not enough to deploy white box hardware and to orchestrate its networks with the Open Network Automation Platform (ONAP) software. “Each individual machine also needs its own operating system,” writes Chris Rice, senior vice president of AT&T Labs, Domain 2.0 Architecture, in a blog post. To that end, AT&T announced its newest effort — the Open Architecture for a Disaggregated Network Operating System (dNOS).

“If we want to take full advantage of the benefits of white box routers and other hardware, we need an equally open and flexible operating system for those machines,” writes Rice.

DNOS appears to be in the visionary phase. “Our goal is to start an industry discussion on technical feasibility … and determine suitable vehicles (standards bodies, open source efforts, consortia, etc.) for common specification and architectural realization,” according to an AT&T white paper, introducing dNOS.

Read more at SDxCentral

LiFT Scholarship Recipients Advance Open Source Around the World

Fifteen people from 13 different countries have received Linux Foundation Training Scholarships (LiFT) in the category of Linux Newbies. This year, 27 people received scholarships across all categories, the most ever awarded by the Foundation.

Now in its seventh year, the program awards training scholarships to current and aspiring IT professionals worldwide who may not otherwise have the means for specialized training. The Foundation has awarded 75 scholarships worth more than $168,000 since the program began.

This year, The Linux Foundation received a record 1,238 applications for 14 scholarships, which are typically given to two people in seven categories. However, the quality of the applications in the Linux Newbies category was so high that the Foundation chose to award an unprecedented 15 scholarships in that category alone.

To qualify, candidates must demonstrate they want to contribute to the advancement of Linux and open source software and help influence their future. Applicants can be located anywhere in the world and must show a passion for these technologies and a proven interest in becoming an open source professional. 

Linux Newbies

The 15 recipients in the Linux Newbies category are new to Linux but have learned the basics by completing the Intro to Linux online course. These recipients have been awarded scholarships to take the next course in this career-focused series: Essentials of System Administration along with the Linux Foundation Certified System Administrator exam.

They are:

Alexander Anderson, 28, United Kingdom

Alexander started coding in PHP at 14 years old, and currently runs Debian as his primary operating system. He hopes this scholarship will help him establish a career in open source so he can better care for his wife, who is disabled.

Fatma Aymaz, 28, Turkey

Fatma was raised in a region where most people believed women should not receive an education. She fought for the ability to attend school and eventually received a university degree in international relations. She is interested in programming and Linux and hopes to take this opportunity to make a move and establish a career in open source computer science. Her dream is to study and change herself and the world.

Jules Bashizi Irenge, 36, United Kingdom

Jules is seeking asylum from Congo. He has earned a Master’s in Computer Science from the University of Liverpool. He is a long-time Linux user, having completed his undergraduate project on CentOS 6. He hopes to go on to complete a Ph.D. program in computer science and use Linux for future research projects. Jules says he is passionate about Linux and in the future he wants to learn about kernel programming.

Cruzita Thalia Cabunoc, 21, Philippines

Cruzita is a student at the Technological Institute of the Philippines-Quezon City, where she is currently studying Computer Engineering. She found shell scripting intriguing while taking Intro to Linux and now has the goal of becoming a Linux systems administrator after she completes her studies. Cruzita has no experience developing open source software but is interested in taking additional Linux training courses and is open to new knowledge.

Dimitris Grendas, 27, Greece

Dimitris has studied informatics and cybersecurity at two different universities and hopes to eventually earn a Master’s degree in cybersecurity. He is preparing to start an internship where he will compile a custom Linux system based on systemd from source code instead of using pre-compiled binary packages. His interests in IT are cybersecurity and networking, and he looks forward to advancing his IT skills with more education and practice. “I can’t wait to move to my next education step!” he said.

Valentin Ilco, 25, Moldova 

Valentin works at the Center of Space Technologies of the Technical University of Moldova, where he has helped with development of the first Moldovan satellite, which utilized open source software. He hopes to use even more open source in his future projects. The most interesting part of Intro to Linux for him was learning how Linux can make complex operations and interactions with hardware easier. “The Intro to Linux course put everything in its own place and made a really good base for future development,” he said.

Andreea Ilie, 28, Romania

Andreea studied Japanese and East Asian culture during her university years, but had a strong interest in IT. She taught herself using free online resources in her free time, and eventually managed to secure an IT job despite her lack of formal training and experience. She has a few Python projects hosted on GitHub, and she hopes this scholarship provides her with a stronger knowledge of Linux, and the certification to demonstrate it.

Andreea said she liked that the Linux course “tried to introduce a wide set of topics geared towards a beginner background, explained in a concise way that was easy to understand and not daunting to someone not accustomed to Linux.” She found the background information about the history of Linux and the different distributions interesting “and a welcome addition, which made appreciating the origins and usefulness of this often-dreaded operating system much easier.”

Carlo Martini, 27, Italy

Carlo is pursuing a computer science degree at night school. He is active in the Venice Linux Users Group, volunteers to write documentation for the Mozilla Developer Network, and is part of the Amara Translating Team for GitHub. His day job is working for a government-sponsored program for Italian youth, but he hopes this course will help him become a full-time open source professional.

Emmanuel Ongogo, 24, Kenya

Emmanuel holds a computer science degree and is a fan of Ubuntu, having used it as his primary operating system since 2013. He has seen an increase in the use of open source in Kenya, including the use of CentOS to run the recent elections, and he hopes to encourage it to spread further in the country. “Apart from advancing my career, I would wish to support others who might also have an interest in Linux,” he said.

Darius Palmer, 47, United States

Darius is a ward of the state of California, currently living in a halfway house with other formerly incarcerated men. He always had an interest in computers when he was younger, and after completing the free Intro to Linux course, he wants to learn more and eventually become active as a contributor to the Linux community.

Darius said the most interesting part of the Intro to Linux course was “the growth and impact that Linux has had on society.” He noted that development of the operating system has provided millions of people with careers and a livelihood. “It’s been involved in the creation of multitudes of extremely profitable entrepreneurial endeavors, such as Google. The Linux kernel has been with this company from day one. But, despite all the influence that Linux has had on the world, anyone can download, use and become involved with this software for free. I want to be a part of this phenomenon.”

Andi Rowley, 25, United States

Andi has a drive to learn more about open source as she wants to live in a world where individuals collaborate to improve technology. Since completing Intro to Linux, she has wanted to become a Linux systems administrator and hopes this scholarship will help her accomplish that goal. In her scholarship application, Andi said the most interesting section of the course was how the Linux kernel generates random numbers. “My experience using open source software has been awesome!” she said.

Sara Santos, 46, Portugal

Sara recently completed a specialized course in managing computer systems and networks but has been unable to secure a job. She is passionate about open source and would like to work in systems administration, especially with web servers, so she expects this scholarship will help her achieve that. “I like the concept of open source software and the community also. I intend to continue for many many years and forward my studies to become an expert,” she said.

Sokunrotanak Srey, 28, Cambodia

Sokunrotanak didn’t study computers but he decided after school to go into the field anyway. He works as an IT technician at non-profit Asian Hope, where he first encountered open source in the form of Ubuntu. He hopes this scholarship will help him improve his skills to better serve Asian Hope and the people it helps. “I love how the Linux community collaborates and helps each other out. I have received lots of help from the community to solve problems in my organization, and I’m looking forward to improving my skill so that I can contribute to the community,” he said.

George Udosen, 42, Nigeria

George originally studied biochemistry, but now has a passion for open source. He would like to learn more and become certified in Linux, so he can pursue a career teaching it and encouraging more people to join the open source community.

Glenda Walter, 28, Dominica

Glenda studied building and civil engineering in college, but now wants to become a Linux systems administrator. She has started running CentOS 7 but knows she needs more training before she can make a big career change into open source.

Learn more about the LiFT Scholarship program from The Linux Foundation.

Introducing container-diff, a Tool for Quickly Comparing Container Images

The Google Container Tools team originally built container-diff, a new project to help uncover differences between container images, to aid our own development with containers. We think it can be useful for anyone building containerized software, so we’re excited to release it as open source to the development community.

Containers and the Dockerfile format help make customization of an application’s runtime environment more approachable and easier to understand. While this is a great advantage of using containers in software development, a major drawback is that it can be hard to visualize what changes in a container image will result from a change in the respective Dockerfile. This can lead to bloated images and make tracking down issues difficult.

Imagine a scenario where a developer is working on an application, built on a runtime image maintained by a third-party. During development someone releases a new version of that base image with updated system packages. The developer rebuilds their application and picks up the latest version of the base image, and suddenly their application stops working; it depended on a previous version of one of the installed system packages, but which one? What version was it on before? 

Read more at Google Open Source Blog