
Review of Container-to-Container Communications in Kubernetes

This article was originally posted at TheNewStack.

By Matt Zand and Jim Sullivan

Kubernetes is a container orchestration platform. It provides virtualized runtime environments called Pods, each of which houses one or more containers. An important aspect of Kubernetes is communication between containers within a Pod. A key part of managing the Kubernetes network is forwarding container ports internally and externally to make sure containers within a Pod can communicate with one another properly. To manage such communications, Kubernetes offers the following four networking models:

  • Container-to-Container communications
  • Pod-to-Pod communications
  • Pod-to-Service communications
  • External-to-Internal communications

In this article, we dive into Container-to-Container communications by showing ways in which containers within a Pod can network and communicate.

Communication Between Containers in a Pod

Having multiple containers in a single Pod makes it relatively straightforward for them to communicate with each other, and they can do this using several different methods. In this article, we discuss two of these methods in more detail: (i) shared volumes and (ii) inter-process communications.

I- Shared Volumes in a Kubernetes Pod

In Kubernetes, you can use a shared Kubernetes Volume as a simple and efficient way to share data between containers in a Pod. For most cases, it is sufficient to use a directory on the host that is shared with all containers within a Pod.

A Kubernetes Volume enables data to survive container restarts, but it has the same lifetime as the Pod. This means that the volume (and the data it holds) exists exactly as long as that Pod exists. If that Pod is deleted for any reason, even if an identical replacement is created, the shared Volume is destroyed and created from scratch.

A standard use case for a multi-container Pod with a shared Volume is when one container writes logs or other files to the shared directory and another container reads from it. For example, we can create a Pod like so:
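
A minimal manifest matching the description that follows might look like the sketch below (the Pod and container names are illustrative, not taken from the original article):

apiVersion: v1
kind: Pod
metadata:
  name: mc1                      # illustrative Pod name
spec:
  volumes:
    - name: html
      emptyDir: {}               # shared volume, initially empty
  containers:
    - name: nginx-container
      image: nginx
      volumeMounts:
        - name: html
          mountPath: /usr/share/nginx/html
    - name: debian-container
      image: debian
      volumeMounts:
        - name: html
          mountPath: /html
      command: ["/bin/sh", "-c"]
      args:
        - while true; do date >> /html/index.html; sleep 1; done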

In this example, we define a volume named html. Its type is emptyDir, which means that the Volume is first created when a Pod is assigned to a node and exists as long as that Pod is running on that node; as the name says, it is initially empty. The first container runs the Nginx server and has the shared Volume mounted to the directory /usr/share/nginx/html. The second container uses the Debian image and has the shared Volume mounted to the directory /html. Every second, the second container adds the current date and time into the index.html file, which is located in the shared Volume. When the user makes an HTTP request to the Pod, the Nginx server reads this file and transfers it back to the user in response to the request. Here is a good article for reading more on similar Kubernetes topics.

You can check that the pod is working either by exposing the nginx port and accessing it using your browser, or by checking the shared directory directly in the containers:
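
For example, using kubectl exec with the illustrative names from the sketch above, you can read the shared file from either container:

$ kubectl exec mc1 -c nginx-container -- cat /usr/share/nginx/html/index.html
$ kubectl exec mc1 -c debian-container -- cat /html/index.html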

II- Inter-Process Communications (IPC)

Containers in a Pod share the same IPC namespace, which means they can also communicate with each other using standard inter-process communication mechanisms such as SystemV semaphores or POSIX shared memory. Because containers in a Pod also share the same network namespace, they can reach one another via localhost.

In the following example, we define a Pod with two containers, using the same Docker image for both. The first container is a producer that creates a standard Linux message queue, writes a number of random messages, and then writes a special exit message. The second container is a consumer, which opens that same message queue for reading and reads messages until it receives the exit message. We also set the restart policy to “Never”, so the Pod stops after both containers have terminated.
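
The manifest for such a Pod is not reproduced here, but its general shape would resemble the sketch below; the image name and the producer/consumer commands are placeholders for whichever program actually creates and reads the message queue:

apiVersion: v1
kind: Pod
metadata:
  name: mc2                      # illustrative Pod name
spec:
  restartPolicy: Never           # the Pod stops once both containers have exited
  containers:
    - name: producer
      image: example/ipc-demo    # placeholder image containing the queue program
      command: ["./ipc", "producer"]
    - name: consumer
      image: example/ipc-demo
      command: ["./ipc", "consumer"]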

To check this out, create the pod using kubectl create and watch the Pod status:
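
For example (the manifest file name here follows the illustrative sketch above):

$ kubectl create -f mc2.yaml
$ kubectl get pods --watch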

Now you can check logs for each container and verify that the second container received all messages from the first container, including the exit message:
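
Again using the illustrative names from the sketch above:

$ kubectl logs mc2 -c producer
$ kubectl logs mc2 -c consumer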

There is one major problem with this Pod, however, and it has to do with how containers start up: Kubernetes does not guarantee the order in which a Pod’s containers start, so the consumer could start before the producer has created the message queue.

Conclusion

The primary reason that Pods can have multiple containers is to support helper applications that assist a primary application. Typical examples of helper applications are data pullers, data pushers and proxies. An example of this pattern is a web server with a helper program that polls a git repository for new updates.

The Volume in this exercise provides a way for containers to communicate during the life of the Pod. If the Pod is deleted and recreated, any data stored in the shared Volume is lost. In this article, we also discussed Inter-Process Communications among containers within a Pod, which is an alternative to the shared Volume approach. Now that you have learned how containers inside a Pod can communicate and exchange data, you can move on to the other Kubernetes networking models, such as Pod-to-Pod or Pod-to-Service communications. Here is a good article for learning more advanced topics on Kubernetes development.

About the Authors

Matt Zand
Matt is a serial entrepreneur and the founder of three successful tech startups: DC Web Makers, Coding Bootcamps and High School Technology Services. He is a leading author of the Hands-on Smart Contract Development with Hyperledger Fabric book from O’Reilly Media.
Jim Sullivan
Jim has a bachelor’s degree in Electrical Engineering and a Master’s Degree in Computer Science. Jim also holds an MBA. Jim has been a practicing software engineer for 18 years. Currently, Jim leads an expert team in Blockchain development, DevOps, Cloud, application development, and the SAFe Agile methodology. Jim is an IBM Master Instructor.


How to Create and Manage Archive Files in Linux

By Matt Zand and Kevin Downs

In a nutshell, an archive is a single file that contains a collection of other files and/or directories. Archive files are typically used to transfer a collection of files and directories (locally or over the internet) or to make a backup copy of them, allowing you to work with a single file instead of many (and, if compressed, one that is smaller than the sum of the files within it). Likewise, archives are used for packaging software applications. This single file can easily be compressed for ease of transfer, while the files in the archive retain the structure and permissions of the original files.

We can use the tar tool to create, list, and extract files from archives. Archives made with tar are normally called “tar files,” “tar archives,” or—since all the archived files are rolled into one—“tarballs.”

This tutorial shows how to use tar to create an archive, list the contents of an archive, and extract the files from an archive. Two common options used with all three of these operations are ‘-f’ and ‘-v’: to specify the name of the archive file, use ‘-f’ followed by the file name; use the ‘-v’ (“verbose”) option to have tar output the names of files as they are processed. While the ‘-v’ option is not necessary, it lets you observe the progress of your tar operation.

For the remainder of this tutorial, we cover 3 topics: 1- Create an archive file, 2- List contents of an archive file, and 3- Extract contents from an archive file. We conclude this tutorial by surveying 6 practical questions related to archive file management. What you take away from this tutorial is essential for performing tasks related to cybersecurity and cloud technology.

1- Creating an Archive File

To create an archive with tar, use the ‘-c’ (“create”) option, and specify the name of the archive file to create with the ‘-f’ option. It’s common practice to use a name with a ‘.tar’ extension, such as ‘my-backup.tar’. Note that unless specifically mentioned otherwise, all commands and command parameters in the remainder of this article are lowercase. Also keep in mind that when typing the commands in this article in your terminal, you need not type the $ prompt sign that appears at the beginning of each command line.

Give as arguments the names of the files to be archived; to create an archive of a directory and all of the files and subdirectories it contains, give the directory’s name as an argument.

 To create an archive called ‘project.tar’ from the contents of the ‘project’ directory, type:

$ tar -cvf project.tar project

This command creates an archive file called ‘project.tar’ containing the ‘project’ directory and all of its contents. The original ‘project’ directory remains unchanged.

Use the ‘-z’ option to compress the archive as it is being written. This yields the same output as creating an uncompressed archive and then using gzip to compress it, but it eliminates the extra step.

 To create a compressed archive called ‘project.tar.gz’ from the contents of the ‘project’ directory, type:

$ tar -zcvf project.tar.gz project

This command creates a compressed archive file, ‘project.tar.gz’, containing the ‘project’ directory and all of its contents. The original ‘project’ directory remains unchanged.

NOTE: While using the ‘-z’ option, you should specify the archive name with a ‘.tar.gz’ extension and not a ‘.tar’ extension, so the file name shows that the archive is compressed. Although not required, it is a good practice to follow.

Gzip is not the only form of compression; there are also bzip2 and xz. When we see a file with an .xz extension, we know it has been compressed using xz, and when we see a file with a .bz2 extension, we can infer it was compressed using bzip2. We are going to steer away from bzip2, as it is becoming unmaintained, and focus on xz. Compressing with xz takes longer, but it is typically worth the wait: the compression is much more effective, so the resulting file is usually smaller than with other compression methods. Even better, decompression (expanding the file) takes about the same amount of time regardless of which compression method was used. Below is an example of how to use xz when compressing a file with tar:

  $ tar -Jcvf project.tar.xz project

We simply switch the ‘-z’ flag used for gzip to an uppercase ‘-J’ for xz. To see the differences between the forms of compression, time each command and compare the sizes of the resulting files.
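
A quick, illustrative way to make that comparison (using the same ‘project’ directory from earlier):

$ time tar -zcf project.tar.gz project
$ time tar -jcf project.tar.bz2 project
$ time tar -Jcf project.tar.xz project
$ ls -lh project.tar.gz project.tar.bz2 project.tar.xz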

xz generally takes the longest to compress, but it does the best job of reducing file size, so it is worth the wait. The larger the file, the better the compression becomes, too!

2- Listing Contents of an Archive File

To list the contents of a tar archive without extracting them, use tar with the ‘-t’ option.

 To list the contents of an archive called ‘project.tar’, type:

$ tar -tvf project.tar  

This command lists the contents of the ‘project.tar’ archive. Using the ‘-v’ option along with the ‘-t’ option causes tar to output the permissions and modification time of each file, along with its file name—the same format used by the ls command with the ‘-l’ option.

 To list the contents of a compressed archive called ‘project.tar.gz’, type:

$ tar -ztvf project.tar.gz

3- Extracting Contents from an Archive File

To extract (or unpack) the contents of a tar archive, use tar with the ‘-x’ (“extract”) option.

 To extract the contents of an archive called ‘project.tar’, type:

$ tar -xvf project.tar

This command extracts the contents of the ‘project.tar’ archive into the current directory.

If an archive is compressed, which usually means it will have a ‘.tar.gz’ or ‘.tgz’ extension, include the ‘-z’ option.

 To extract the contents of a compressed archive called ‘project.tar.gz’, type:

$ tar -zxvf project.tar.gz

NOTE: If there are files or subdirectories in the current directory with the same name as any of those in the archive, those files will be overwritten when the archive is extracted. If you don’t know what files are included in an archive, consider listing the contents of the archive first.

Another reason to list the contents of an archive before extracting them is to determine whether the files in the archive are contained in a directory. If not, and the current directory contains many unrelated files, you might confuse them with the files extracted from the archive.

To extract the files into a directory of their own, make a new directory, move the archive to that directory, and change to that directory, where you can then extract the files from the archive.
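
For example, assuming an archive named ‘project.tar’ and a new directory named ‘project-files’ (both names are illustrative):

$ mkdir project-files
$ mv project.tar project-files
$ cd project-files
$ tar -xvf project.tar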

Now that we have learned how to create an archive file and list/extract its contents, we can move on to discuss the following six practical questions that are frequently asked by Linux professionals.

  • Can we add content to an archive file without unpacking it?

Unfortunately, once an archive has been compressed, there is no way to add content to it directly. You would have to “unpack” it or extract the contents, edit or add content, and then compress the file again. If it’s a small file, this process will not take long; if it’s a larger file, be prepared for it to take a while.
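
As a rough sketch of that workflow, assuming a gzip-compressed archive named ‘file.tar.gz’ and a new file named ‘newfile.txt’ (tar’s ‘-r’ option, which appends files to an uncompressed archive, is not covered above but avoids a full extract-and-repack):

$ gunzip file.tar.gz              # decompress, leaving file.tar
$ tar -rvf file.tar newfile.txt   # append the new file to the uncompressed archive
$ gzip file.tar                   # recompress, producing file.tar.gz again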

  • Can we delete content from an archive file without unpacking it?

This depends on the version of tar being used. Newer versions of tar support a ‘--delete’ option.

For example, let’s say we have files file1 and file2. They can be removed from file.tar with the following:

$ tar -vf file.tar --delete file1 file2

To remove a directory dir1:

$ tar -f file.tar --delete dir1/*

  • What are the differences between compressing a folder and archiving it?

The simplest way to look at the difference between archiving and compressing is to look at the end result. When you archive files, you are combining multiple files into one, so if we archive ten 100 KB files we will end up with one 1,000 KB file. If we compress those files instead, we could end up with a file that is only a few KB, or close to 100 KB.
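
As an illustrative demonstration (the files here are filled with zeros, which compress extremely well, so the effect is exaggerated):

$ mkdir demo && cd demo
$ for i in $(seq 1 10); do dd if=/dev/zero of=file$i bs=1K count=100; done
$ cd ..
$ tar -cf demo.tar demo           # archiving only: roughly the sum of the file sizes
$ tar -czf demo.tar.gz demo       # archiving plus compression: far smaller
$ ls -lh demo.tar demo.tar.gz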

  • How to compress archive files?

As we saw above, you can create archive files using the tar command with the ‘cvf’ options. To compress the archive file, there are two options: run the archive file through a compression tool such as gzip, or use a compression flag with the tar command. The most common compression flags are ‘-z’ for gzip, ‘-j’ for bzip2 and ‘-J’ for xz. We can see the first method below:

$ gzip file.tar

Or we can just use a compression flag when using the tar command; here we’ll see the gzip flag ‘z’:

$ tar -cvzf file.tar.gz /some/directory

  • How to create archives of multiple directories and/or files at one time?

It is not uncommon to be in situations where we want to archive multiple files or directories at once. And it’s not as difficult as you think to tar multiple files and directories at one time. You simply supply which files or directories you want to tar as arguments to the tar command:

$ tar -cvzf file.tar.gz file1 file2 file3

or

$ tar -cvzf file.tar.gz /some/directory1 /some/directory2

  • How to skip directories and/or files when creating an archive?

You may run into a situation where you want to archive a directory or file, but you don’t need certain files to be archived. To avoid archiving those files, or “exclude” them, use the ‘--exclude’ option with tar:

$ tar --exclude '/some/directory' -cvf file.tar /home/user

So in this example, /home/user would be archived, but /some/directory would be excluded if it was under /home/user. It’s important that you put the ‘--exclude’ option before the source and destination, and that you enclose the file or directory being excluded in single quotation marks.

Summary

The tar command is useful for creating backups or compressing files you no longer need. It’s good practice to back up files before changing them; if something doesn’t work as intended after the change, you can always revert to the old file. Compressing files no longer in use helps keep systems clean and lowers disk space usage. There are other utilities available, but tar has reigned supreme for its versatility, ease of use and popularity.

Resources

If you would like to learn more about Linux, reading the following articles and tutorials is highly recommended:

About the Authors

Matt Zand is a serial entrepreneur and the founder of 3 tech startups: DC Web Makers, Coding Bootcamps and High School Technology Services. He is a leading author of the Hands-on Smart Contract Development with Hyperledger Fabric book from O’Reilly Media. He has written more than 100 technical articles and tutorials on blockchain development for the Hyperledger, Ethereum and Corda R3 platforms. At DC Web Makers, he leads a team of blockchain experts for consulting and deploying enterprise decentralized applications. As chief architect, he has designed and developed blockchain courses and training programs for Coding Bootcamps. He has a master’s degree in business management from the University of Maryland. Prior to blockchain development and consulting, he worked as a senior web and mobile app developer and consultant, angel investor, and business advisor for a few startup companies. You can connect with him on LinkedIn: https://www.linkedin.com/in/matt-zand-64047871

Kevin Downs is a Red Hat Certified System Administrator (RHCSA). At his current job as a systems administrator at IBM, he is in charge of administering hundreds of servers running different Linux distributions. He is a Lead Linux Instructor at Coding Bootcamps, where he has authored 5 self-paced courses.


Prepr Partners with the Linux Foundation to Provide Digital Work-Integrated Learning through the F.U.N.™ Program

December 14th, 2020 – Toronto, Canada – Prepr is excited to announce a new partnership with The Linux Foundation, the nonprofit organization enabling mass innovation through open source, that will give work-integrated learning experiences to youth facing employment barriers. The new initiative, the Flexible Upskilling Network (F.U.N.™) program, launches in collaboration with the Magnet Network and the Network for the Advancement of Black Communities (NABC). The F.U.N.™ program is a blended learning program, where participants receive opportunities to combine valuable work experience with digital skill development over a 16-week journey. The objective of the F.U.N.™ program is to support youth, with a focus on women and visible minority groups who are involuntarily not in employment, education, or training (NEET) in Ontario, by helping them gain employability skills, including soft skills like communication, collaboration, and problem-solving.

Caitlin McDonough, Chief Education Officer at Prepr, says about the F.U.N.™ program, “Digital skills are essential for the workforce of the future. We at Prepr are looking forward to the opportunity to support youth capacity development for the future of work.”

With The Linux Foundation, Prepr is committed to supporting over 180 youth participants in enrolling in and completing the F.U.N.™ program between July 2020 and March 2021. Prepr will be using its signature PIE® method to train the participants in Project Leadership, Innovation, and Entrepreneurship and to expose them to real-world business challenges. The work-integrated learning experience Prepr provides will support participants in developing both soft and hard skills, with a focus on digital skills, to help them secure gainful employment for the uncertain future of work.

“In this day and age, it is essential to have a good educational foundation in technology to maximize your chances of career success,” said Clyde Seepersad, SVP and GM, Training & Certification at The Linux Foundation. “We are thrilled to partner with Prepr to bring The Linux Foundation’s vendor-neutral, expert training in the open source technologies that serve as the backbone of modern technologies to communities that will truly benefit from it. I look forward to seeing how these promising students perform and hope to partner with Prepr on future initiatives to train even more in the future.”

The program will explore digital career pathways through multiple work-related challenges. These work challenges will bring creative approaches to gaining innovative skills that are invaluable in today’s new normal of remote work and learning, while allowing individuals to become more competitive in today’s digital workforce.

Stephen Crawford, MPP for Oakville, spoke about the government’s commitment to supporting youth facing employment barriers: “This government is committed to supporting our youth, notably visible minorities, as they prepare to enter the workforce. The youth of today will be the leaders of tomorrow.” The Ontario government funding for the F.U.N. program is part of a $37 million investment in training initiatives across the province.

Through the program’s blended learning approach, participants will learn how to use Prepr’s signature PIE ® tool, which addresses three essential skills gaps facing the business services sector today: expertise in innovation, project management, and business development (entrepreneurship, sales, and commercialization). At the end of the program, participants will gain a certification, along with 12 weeks of hands-on work experience, which will foster valuable, future-proof skills to secure gainful employment.

The Linux Foundation will also support participants through an introductory course to Linux and related tools: LFS101x: Introduction to Linux. The program will help to develop the digital skills essential for our new normal of work, with beginner-level challenges to fill obvious skills gaps and foster a mentality of problem-solving. With the support of open Linux Foundation resources, these challenges will be an opportunity for participants to ideate and create project solutions ready for real-world implementation.

About Prepr

Prepr provides the tools, resources, and technology to empower individuals to become lifelong problem solvers. Through triangular cooperation between the public and private sectors as well as government, Prepr aims to strengthen the collaboration on challenges that affect individuals, communities, businesses, and infrastructure to create a more sustainable future for everyone.

About The Linux Foundation

Founded in 2000, the Linux Foundation is supported by more than 1,000 members and is the world’s leading home for collaboration on open source software, open standards, open data, and open hardware. Linux Foundation’s projects are critical to the world’s infrastructure including Linux, Kubernetes, Node.js, and more. The Linux Foundation’s methodology focuses on leveraging best practices and addressing the needs of contributors, users, and solution providers to create sustainable models for open collaboration. For more information, please visit us at linuxfoundation.org.

The Linux Foundation has registered trademarks and uses trademarks. For a list of trademarks of The Linux Foundation, please see its trademark usage page: www.linuxfoundation.org/trademark-usage. Linux is a registered trademark of Linus Torvalds.


Open Source Jobs Remain Secure During COVID-19 Pandemic and More Findings From Linux Foundation and Laboratory for Innovation Science at Harvard Report

A new report from The Linux Foundation and Laboratory for Innovation Science at Harvard (LISH) has found that 56% of survey respondents reported involvement in open source projects was important in getting their current job, and 55% feel that participating in open source projects has increased their salary or otherwise improved their job prospects. The “Report on the 2020 FOSS Contributor Survey” compiled the answers of 1,196 contributors to free and open source software (FOSS), and also found that 81% stated the skills and knowledge gained by working on open source were valuable to their employer.

One highlight of the report was the finding that, “[d]espite the survey being administered during the economic downturn resulting from the COVID-19 pandemic, very few respondents were out of the workforce.” This aligns with our 2020 Open Source Jobs Report from earlier this year, in which only 4% of hiring managers reported they have laid off open source professionals due to the pandemic, and a further 2% furloughed open source staff.

In terms of why these individuals contribute to open source projects, respondents were, unsurprisingly, most likely to say it is because they use open source software and need certain features added, so they build and contribute those features themselves. The next top answers, however, provided more insight into what motivates these open source professionals: “I enjoy learning” and “Contributing allows me to fulfill a need for creative, challenging, and/or enjoyable work”. This aligns with the recent jobs report, where open source pros reported they decided to work in the open source community because “Open source runs everything” and “I am passionate about open source”. Both reports suggested that compensation, while important, is not a dominant source of motivation.

Focusing more on what open source projects can do to be successful, the new report goes on to suggest that, “FOSS projects could also provide some educational materials (such as tutorials or getting started guides) about their projects to help those motivated by a desire to learn.” This gets to the heart of our mission at LF Training & Certification – to make quality training materials about open source technologies accessible to everyone. 

One area of opportunity for projects, employers and open source pros, according to the report, is secure development practices. Survey respondents overwhelmingly reported that they spend little time focusing on security issues, despite both the quantity and sophistication of attacks increasing year after year, and the report goes on to identify “a free online course on how to develop secure software as a desirable contribution from an external source.” LF Training & Certification recently released just such a training program in the form of our Secure Software Development Fundamentals Professional Certificate program, created in partnership with the Open Source Security Foundation and hosted by the non-profit learning platform edX. The program consists of three courses, all of which can be audited for free; those who wish to obtain the Professional Certificate may do so by paying a fee and passing a series of tests aligned to each course. Employers concerned about software development security issues should consider mandating that staff take training like this, and projects should consider requiring it of maintainers as well.

This is just the tip of the iceberg in terms of the findings of the FOSS Contributor Survey; we encourage you to download and review the full document for even more insight and recommendations.


Tips for Starting Your New IT Career in 2021!

2020 was a difficult year for all of us, and for many it continues in 2021. Jobs have been lost, and whole industries have been forced to revamp their entire business models, leaving many out of work or facing new ways of working. While significant challenges remain, think of this as an opportunity to consider a new career in the new year. 

Pick the right path for you

The first thing to consider when looking at moving into an IT career is deciding which area of IT to pursue. The 2020 Open Source Jobs Report found the most in-demand position to be DevOps practitioner, followed by developer. The top areas of expertise being sought by hiring managers are Linux, cloud, and security. While it’s good to consider what skills are in demand, it’s just as important to figure out which subject areas will interest you most. If you find a role that not only offers great career opportunities but that you will also enjoy, you are that much more likely to be successful. Our Career Path Quiz is a great place to start, and can point you in the direction of a technology focus that aligns with your existing interests.

Start with free training to ensure there’s a fit

Before jumping head first into a training and/or certification program, take advantage of free training courses to gain baseline knowledge and to ensure this path is really one you want to pursue. Our Plan Your Training page outlines suggested courses and certifications depending on the subject area you’ve chosen to pursue. Many paths, including System Administration, Cloud & Containers, and DevOps & Site Reliability Engineering, start with LFS101 – Introduction to Linux, which is a good starting point for just about anyone looking to start an IT career. Other popular free courses include LFS151 – Introduction to Cloud Infrastructure Technologies, LFS158 – Introduction to Kubernetes, and LFS162 – Introduction to DevOps & Site Reliability Engineering.

Begin learning about intermediate and advanced topics

Once you’ve selected a path and taken some free courses to confirm it’s right for you, it’s now time to move into intermediate and advanced training courses. The Plan Your Training page is still a great resource as it lists the courses that will be most beneficial to learn about a particular topic area. Keep in mind that you typically will not need to complete every single course in a given area to be ready to begin working; concentrate on ensuring that you have the basic skills needed and you can always come back later in your career to pursue more advanced courses.

Think about certifications

While planning the training courses you wish to complete, keep certifications top of mind as well. Especially for those who are new to IT and do not have past experience to fall back on, holding a certification gives potential employers confidence that you have the skills needed to succeed in a given role. Many Linux Foundation training courses complement and help prepare for specific certification exams, so work both into your learning plan. We offer certifications for those just starting out, like the Linux Foundation Certified IT Associate (LFCA), in addition to more specialized certifications like the Certified Kubernetes Administrator (CKA). Be sure to take advantage of the digital badges awarded for successfully completing a certification, which can be linked to social media profiles like LinkedIn and can be independently verified, giving employers confidence in your skills. The Open Source Jobs Report also found that a majority of hiring managers give preference to certified candidates, so these certifications really can open doors.

More structured options

For those who want a bit more structure and support in achieving their learning goals, we also offer two bootcamps. If you’re just getting started and are interested in pursuing a cloud career, the Cloud Engineer Bootcamp meets all your training and certification needs in one organized package. One major benefit of the bootcamps is they include instructor office hours five days per week, enabling you to actually speak to one of our expert instructors to answer questions and get tips on how to be most successful. 

As we move forward into 2021, countless new career opportunities will be available for those who take the steps to pursue them. Get started today and enroll in training to gain the skills you need to be successful in an IT career, then take those skills and gain the certification to prove it!


New, Free Training Course Covering the Basics of WebAssembly Now Available

Introduction to WebAssembly is the newest training course from The Linux Foundation! This course, offered on the non-profit edX learning platform, can be audited by anyone at no cost. The course is designed for web developers, Dweb, cloud, and blockchain developers, architects, and CTOs interested in learning about the strengths and limitations of WebAssembly, the fourth “official” language of the web (alongside JavaScript, HTML and CSS), and its potential applications in blockchain, serverless, edge/IoT, and more. WebAssembly has been rapidly growing in popularity thanks to its security, simplicity and the lightweight nature of the runtime. It is also language-agnostic, being a suitable compilation target for a wide range of modern languages.

The six-hour course uses video content, written material and hands-on labs to delve into how WebAssembly runs ‘under the hood’, and how you can leverage its capabilities in and beyond the browser. It also explores a series of potential applications in different industries, and takes a quick peek at upcoming features. Enrollees will walk away from the course with an understanding of what the WebAssembly runtime is, and how it provides a secure, fast and efficient compilation target for a wide range of modern programming languages, allowing them to target the browser and beyond.

The course was developed by Colin Eberhardt, the Technology Director at Scott Logic, a UK-based software consultancy which creates complex applications for financial services clients. Colin is an avid technology enthusiast, spending his evenings contributing to open source projects, writing blog posts and learning as much as he can.

“WebAssembly is one of the most exciting technologies I have come across for years,” said Eberhardt. “Its initial promise was a fast and efficient multi-language runtime for the web, but it has the potential to be so much more. We are already seeing this runtime being used for numerous applications beyond the browser, including serverless and blockchain, with more novel uses and applications appearing each week!”

The course is available for immediate enrollment. Those requiring a verified certificate of completion may upgrade their enrollment for $149. Start gaining skills in WebAssembly today!


Kubernetes Security Essentials Course Now Available

Today Linux Foundation Training & Certification and the Cloud Native Computing Foundation are announcing the availability of our newest training course, LFS260 – Kubernetes Security Essentials. The course provides skills and knowledge on a broad range of best practices for securing container-based applications and Kubernetes platforms during build, deployment and runtime. It is also a great way to prepare to take the recently launched Certified Kubernetes Security Specialist (CKS) certification exam. 

As production environments become more decoupled and agile, keeping the entire environment secure has become more complex. This challenge will only become more acute as cloud adoption accelerates. Additionally, we saw from the 2020 Open Source Jobs Report that cloud and security skills have the biggest and third biggest impact on hiring decisions respectively, further highlighting the talent gap for these skills. All of these are primary reasons that CNCF and The Linux Foundation are launching this course. By making training and certification related to cloud and container security widely accessible, the hope is to help close that talent gap.

The 30-hour self-paced course is conducted online, and includes learning from industry experts and hands-on labs to give participants the experience they need to secure their container-based applications. It covers more than just container security, exploring topics from before a cluster has been configured through deployment, and ongoing and agile use, including where to find ongoing security and vulnerability information.

By the end of the course, participants will understand security concerns for cloud production environments and be able to harden systems and clusters, secure the container supply chain, monitor and log security events, and more. 

The course was developed by Tim Serewicz, Senior Instructor and courseware developer at The Linux Foundation. Tim is responsible for writing and updating the Kubernetes Fundamentals, Kubernetes for Developers, and Kubernetes Security Essentials courses for The Linux Foundation, among others, and was involved in creation of the CKS exam.

Enroll today and begin bolstering your cloud security chops!


Open Source Management & Strategy Training Program Launched by The Linux Foundation

Program consists of seven modular courses, and can be tailored to suit the needs of different audiences within an organization

SAN FRANCISCO, January 12, 2021 – The Linux Foundation, the nonprofit organization enabling mass innovation through open source, today announced the availability of Open Source Management & Strategy, a new training program designed to introduce open source best practices to management and technical staff within organizations.

This 7-module course series is designed to help executives, managers, software developers and engineers understand and articulate the basic concepts for building effective open source practices within their organization. It is also helpful for a leadership audience responsible for setting up effective program management of open source in their organization, and explains how to create an Open Source Program Office (OSPO).

The program builds on the accumulated wisdom of many previous training modules on open source best practices, while adding fresh and updated content to explain all of the critical elements of working effectively with open source in enterprises. The courses are designed to be self-paced, and reasonably high-level, but with enough detail to get new open source practitioners up and running quickly.

The courses in the program are designed to be modular, so participants only need to take those of relevance to them. The courses included are:

  • LFC202 – Open Source Introduction – covers the basic components of open source and open standards
  • LFC203 – Open Source Business Strategy – discusses the various open source business models and how to develop practical strategies and policies for each
  • LFC204 – Effective Open Source Program Management – explains how to build an effective OSPO and the different types of roles and responsibilities needed to run it successfully
  • LFC205 – Open Source Development Practices – talks about the role of continuous integration and testing in a healthy open source project
  • LFC206 – Open Source Compliance Programs – covers the importance of effective open source license compliance and how to build programs and processes to ensure safe and effective consumption of open source
  • LFC207 – Collaborating Effectively with Open Source Projects – discusses how to work effectively with upstream open source projects and how to get the maximum benefit from working with project communities
  • LFC208 – Creating Open Source Projects – explains the rationale and value for creating new open source projects as well as the required legal, business and development processes needed to launch new projects

The courses were developed by Guy Martin, Executive Director of OASIS Open, an internationally recognized standards development and open source projects consortium.

Guy has a unique blend of 25+ years’ experience as both software engineer and open source strategist. He has built open source programs for companies like Red Hat, Samsung and Autodesk and was instrumental in founding the Academy Software Foundation while Director of the Open Source Office at Autodesk. He was also a founding member of the team that built the Open Connectivity Foundation while at Samsung, and has contributed to several best practices and learning guides from the Linux Foundation’s TODO Group, a resource for OSPO personnel.

“Open source is not only commonplace in enterprises today, but actually is impossible to avoid as much modern technology including the cloud and networking systems are based on it,” said Chris Aniszczyk, co-founder of the TODO Group and VP of Developer Relations at The Linux Foundation. “This means organizations must prepare their teams to use it properly, ensuring compliance with licensing requirements, how to implement continuous delivery and integration, processes for working with and contributing to the open source community, and related topics. This program provides a structured way to do that which benefits everyone from executive management to software developers.”

The Open Source Management & Strategy program is available to begin immediately. The $499 enrollment fee provides unlimited access to all seven courses for one year, as well as a certificate upon completion. Interested individuals may enroll here. The program is also included in all corporate training subscriptions.

About the Linux Foundation

Founded in 2000, the Linux Foundation is supported by more than 1,000 members and is the world’s leading home for collaboration on open source software, open standards, open data, and open hardware. Linux Foundation’s projects are critical to the world’s infrastructure including Linux, Kubernetes, Node.js, and more. The Linux Foundation’s methodology focuses on leveraging best practices and addressing the needs of contributors, users and solution providers to create sustainable models for open collaboration. For more information, please visit us at linuxfoundation.org.

The Linux Foundation has registered trademarks and uses trademarks. For a list of trademarks of The Linux Foundation, please see its trademark usage page: www.linuxfoundation.org/trademark-usage. Linux is a registered trademark of Linus Torvalds.

# # #


Start 2021 Off With a New Career in the Cloud! Cloud Engineering Bootcamps are on Sale

The 2020 Open Source Jobs Report found that cloud skills are the most in demand by hiring managers, with 70% reporting they are more likely to hire someone with a solid foundation in cloud and container technologies. Additionally, a D2iQ study found that “only 23% of organizations believe they have the talent required to successfully complete their cloud native journey”. If you’re looking to move to a new career this year, gaining cloud skills and knowledge is the place to start, and now is the time to make it happen.

Last summer, The Linux Foundation and Cloud Native Computing Foundation launched our first ever bootcamp programs to help individuals become trained and certified in cloud technologies in a structured, supported way. The Cloud Engineer Bootcamp and Advanced Cloud Engineer Bootcamp contain the training courses and exams to get you prepared with the knowledge and skills to succeed in a career as a cloud administrator or engineer in as little as six months, and the verifiable, industry-leading certifications to demonstrate those skills. 

The Cloud Engineer Bootcamp is designed for relative newbies, who want to start an IT career with little or no prior experience. As cloud technologies are underpinned by Linux, the program starts with two courses giving you the Linux skills you need to get started, and the Linux Foundation Certified System Administrator (LFCS) exam to help you prove it. The program then continues with three courses focused on cloud and container technologies, finishing with the highly-sought Certified Kubernetes Administrator (CKA) exam.

The Advanced Cloud Engineer Bootcamp assumes you already possess the requisite Linux knowledge to make a start with learning about cloud technologies, so jumps right in with two cloud and containers courses and the CKA exam. This bootcamp then moves into more advanced cloud concepts and technologies, including service mesh, monitoring, logging and application management, in four additional courses. 

Both bootcamps include office hours with live instructors daily via Zoom, giving you the opportunity to ask questions and get help understanding the course material. There are also bootcamp forums, providing the chance to interact with fellow enrollees and discuss lessons and topics. Upon completion, you will receive a verifiable, digital badge for completing the bootcamp, as well as badges for passing each certification exam.

Through January 19, both bootcamps are reduced in price from their usual price of $999 (already a substantial discount from the $2,300 list price of the bootcamp components) to only $599. Those wishing to take both bootcamps can choose that option for only $899. Take advantage of this limited time offer to propel yourself into a new, highly lucrative career in 2021!
