Learn how to diagnose and address routine SELinux policy violations that may be causing problems with your web server.
Read More at Enable Sysadmin
At last week’s Open Source Summit North America, Robin Ginn, Executive Director of the OpenJS Foundation, relayed a principle her mentor taught her: “1+1=3”. No, this isn’t ‘new math’; it illustrates the principle that, working together, we are more impactful than working apart. Or, as my wife and I say all the time, teamwork makes the dream work.
This principle is really at the core of open source technology. Turns out it is also how I look at the Open Programmable Infrastructure project.
Stepping back a bit, as “the new guy” around here, I am still constantly running across projects where I want to dig in more and understand what they do, how they do it, and why they are important. I had that very thought last week as we launched another new project, the Open Programmable Infrastructure Project. As I was reading up on it, there was a lot of talk about data processing units (DPUs) and infrastructure processing units (IPUs), and I thought, I need to know what these are and why they matter. In the timeless words of The Bobs, “What exactly is it you do here?”
First – and this is important – they are basically the same thing; they just have different names. Here is my oversimplified explanation of what they do.
In most personal computers, you have a separate graphics processing unit (GPU) that helps the central processing unit (CPU) handle the tasks related to processing and displaying graphics. The GPU offloads that work from the CPU, allowing the CPU to spend more time on the tasks it does best. So, working together, they can achieve more than each can separately.
Servers powering the cloud also have CPUs, but they have other tasks that can consume tremendous computing power, say data encryption or network packet management. Offloading these tasks to separate processors enhances the performance of the whole system, as each processor focuses on what it does best.
In other words, 1+1=3.
While separate processing units have been around for some time, like your PC’s GPU, their functionality was primarily dedicated to a particular task. DPUs/IPUs, by contrast, combine multiple offload capabilities that are highly customizable through software. That means a hardware manufacturer can ship these units out, and each organization can use software to configure them according to its specific needs. And they can do this on the fly.
Core to the cloud and its continued advancement and growth is the ability to quickly and easily create and dispose of the “hardware” you need. It wasn’t too long ago that if you wanted a server, you spent thousands of dollars on one, built all kinds of infrastructure around it, and hoped it was what you needed at the time. Now, pretty much anyone can set up a virtual server in a matter of minutes for virtually no initial cost.
DPUs/IPUs bring this same type of flexibility to your own datacenter because they can be configured to be “specialized” with software rather than having to literally design and build a different server every time you need a different capability.
OPI is focused on utilizing open software and standards, as well as frameworks and toolkits, to allow for the rapid adoption and use of DPUs/IPUs. The OPI Project is both hardware and software companies coming together to establish and nurture an ecosystem to support these solutions. It “seeks to help define the architecture and frameworks for the DPU and IPU software stacks that can be applied to any vendor’s hardware offerings. The OPI Project also aims to foster a rich open source application ecosystem, leveraging existing open source projects, such as DPDK, SPDK, OvS, P4, etc., as appropriate.”
In other words, competitors are coming together to agree on a common, open ecosystem they can build together and innovate, separately, on top of. They are living out 1+1=3.
I, for one, can’t wait to see the innovation.
A special thanks to Yan Fisher of Red Hat for helping me understand open programmable infrastructure concepts. He and his colleague, Kris Murphy, have a more technical blog post on Red Hat’s blog. Check it out.
For more information on the OPI Project, visit their website and start contributing at https://github.com/opiproject/opi.
The post Open Programmable Infrastructure: 1+1=3 appeared first on Linux Foundation.
Learn how to install software with RHEL’s package manager using the dnf command or the GNOME Software app.
Read More at Enable Sysadmin
Make your system boot the way you want it to by editing your Grand Unified Bootloader (GRUB) file.
Read More at Enable Sysadmin
In a new white paper, the Cardea Project at Linux Foundation Public Health demonstrates a complete, decentralized, open source system for sharing medical data in a privacy-preserving way with machine readable governance for establishing trust.
The Cardea Project began as a response to the global Covid-19 pandemic and the need for countries and airlines to admit travelers. As Covid shut down air travel and presented an existential threat to countries whose economies depended on tourism, SITA Aero, the largest provider of IT to the air transport sector, saw decentralized identity technology as the ideal solution for managing proof of Covid test status for travel.
With a verifiable credential, a traveler could hold their health data and not only prove they had a specific test at a specific time, but also use it—or a derivative credential—to prove their test status to enter hotels and hospitality spaces without having to divulge any personal information. Entities that needed to verify a traveler’s test status could, in turn, avoid the complexity of direct integrations with healthcare providers and the challenge of complying with onerous health data privacy laws.
Developed by Indicio with SITA and the government of Aruba, the technology was successfully trialed in 2021 and the code specifically developed for the project was donated to Linux Foundation Public Health (LFPH) as a way for any public health authority to implement an open source, privacy-preserving way to manage Covid test and vaccination data. The Cardea codebase continues to develop at LFPH as Indicio, SITA, and the Cardea Community Group extend its features and applications beyond Covid-related data.
On May 22, 2022, at the 15th KuppingerCole European Identity and Cloud Conference in Berlin, SITA won the Verifiable Credentials and Decentralized Identity Award for its implementation of decentralized identity in Aruba.
The new white paper from the Cardea Project provides an in-depth examination of the background to Cardea, the transformational power of decentralized identity technology, how it works, the implementation in Aruba, and how it can be deployed to authenticate and share multiple kinds of health data in privacy-preserving ways. As the white paper notes:
“…Cardea is more than a solution for managing COVID-19 testing; it is a way to manage any health-related process where critical and personal information needs to be shared and verified in a way that enables privacy and enhances security. It is able to meet the requirements of the 21st Century Cures Act and Europe’s General Data Protection Regulation, and in doing so enable use cases that range from simple proof of identity to interoperating ecosystems encompassing multiple cloud services, organizations, and sectors, where data needs to be, and can be, shared in immediately actionable ways.
Open source, interoperable decentralized identity technology is the only viable way to manage both the challenges of the present—where entire health systems can be held at ransom through identity-based breaches—and the opportunities presented by a digital future where digital twins, smart hospitals, and spatial web applications will reshape how healthcare is managed and delivered.”
The white paper is available here. The community development group meets weekly on Thursdays at 9:00am PST—please join us!
This article was originally published on the Linux Foundation Public Health project’s blog.
The post Sharing Health Data while Preserving Privacy: The Cardea Project appeared first on Linux Foundation.
So, I am old enough to remember when the U.S. Congress temporarily intervened in a patent dispute over the technology that powered BlackBerries. A U.S. federal judge ordered the BlackBerry service to shut down until the matter was resolved, and Congress determined that BlackBerry service was too integral to commerce to be allowed to be turned off. Eventually, RIM settled the patent dispute, and the BlackBerry rode off into technology oblivion.
I am not here to argue the merits of this nearly 20-year-old case (in fact, I coincidentally had friends on both legal teams), but it was when I was introduced to the idea of companies that purchase patents with the goal of using this purchased right to extract money from other companies.
Patents are an important legal protection to foster innovation, but, like all systems, it isn’t perfect.
At this week’s Open Source Summit North America, we heard from Kevin Jakel with Unified Patents. Kevin is a patent attorney who saw the damage being done to innovation by patent trolls – more kindly known as non-practicing entities (NPEs).
Kevin points out that patents are intellectual property designed to protect inventions, granting a time-bound legal monopoly, but they are only a sword, not a shield. You can use them to stop others, but they don’t give you the right to do anything yourself. He emphasizes, “You are vulnerable even if you invented something. Someone can come at you with other patents.”
Kevin has watched a whole industry develop in which patents are purchased by other entities, who then go after successful individuals or companies they claim are infringing on patents the entities now legally own (but did not invent). In fact, 88% of all high-tech patent litigation is brought by an NPE.
NPEs are rational actors using the legal system to their advantage, and they are driven by the fact that almost all of the time the defendant decides to settle to avoid the costs of defending the litigation. This perpetuates the problem by both reducing the risk to the NPEs and also giving them funds to purchase additional patents for future campaigns.
When it comes to open source software, the problem is on the rise and is only going to get worse without strategic, consistent action to combat it.
Kevin started Unified Patents with the goal of solving this problem without incentivizing further NPE activity. He wants to increase the risk for NPEs so that they are incentivized to not pursue non-existent claims. Because NPEs are rational actors, they are going to weigh risks vs. rewards before making any decisions.
How does Unified Patents do this? They use a three-step process:
Detect – Patent Troll Campaigns
Disrupt – Patent Troll Assertions
Deter – Further Patent Troll Investment
Unified Patents works on behalf of 11 technology areas (they call them Zones). They added an Open Source Zone in 2019 with the help of the Linux Foundation, Open Invention Network, and Microsoft. They look for demands being filed in court, selectively pick patent trolls out of the group, and challenge them, attempting to disrupt the process. They take the patent back to the U.S. Patent and Trademark Office to see whether it should ever have existed in the first place. Typically, patent trolls look for broad patents so they can sue lots of companies, making their investment more profitable and less risky. Patents that broad often should never have been awarded at all.
The result: Unified Patents ends up killing a lot of patents that should never have been issued but are being exploited by patent trolls, stifling innovation. The goal is to slow the trolls down and bring them to a stop as quickly as possible. Then, the next time they go looking for a patent to assert, they look somewhere else.
And it is working. The image below shows some of the open source projects that Unified Patents has actively protected since 2019.
The Linux Foundation participates in Unified Patents’ Open Source Zone to help protect the individuals and organizations innovating every day. We encourage you to join the fight and create a true deterrence for patent trolls. It is the only way to extinguish this threat.
Learn more at unifiedpatents.com/join.
And if you are a die-hard fan of the BlackBerry’s iconic keyboard, my apologies for dredging up the painful memory of your loss.
The post Ensuring Patents Foster Innovation in Open Source appeared first on Linux Foundation.
You can use the numeric codes returned by shell scripts or Ansible playbooks to identify problems and test the code.
Read More at Enable Sysadmin
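As a quick illustration of the exit-code idea in the article above, here is a minimal, hypothetical Python sketch (not taken from the article) that runs a command, inspects its numeric return code, and branches on the result:

    import subprocess

    # Run a command; the path used here is purely illustrative.
    result = subprocess.run(["ls", "/nonexistent-path"], capture_output=True, text=True)

    if result.returncode == 0:
        print("Command succeeded")
    else:
        # A nonzero exit code signals a problem; test harnesses and playbooks can key off it.
        print(f"Command failed with exit code {result.returncode}: {result.stderr.strip()}")

The same convention applies in a shell script or an Ansible playbook: an exit code of 0 means success, and any nonzero value indicates a failure you can detect and handle.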
If you are interested in online and in-person training and certifications in open source software development and key open source software, such as Linux and Kubernetes, see our special discount just for readers of this post. Scroll to the end.
Tomorrow night, in the skies over Congress Bridge in Austin, Texas, 300 drones will work in concert to put on a light show that entertains but also informs about the power of open source software to drive innovation in our world, making an impact in every life, every day.
Backing up a bit, open source software often conjures up visions and presumptions that just aren’t true. No need to conjure those up – we all know what they are. The reality is that open source software (OSS) has transformed our world and become the backbone of our digital economy and the foundation of our digital world.
Some quick, fun facts:
In vertical software stacks across industries, open source penetration ranges from 20 to 85 percent of the overall software used
Linux fuels 90%+ of web servers and Internet-connected devices
The Android mobile operating system is built on the Linux kernel
Immensely popular libraries and tools for building web applications, such as AMP, Appium, Dojo, jQuery, Marko, Node.js, and so many more, are open source
The world’s top 100 supercomputers run Linux
100% of mainframe customers use Linux
The major cloud service providers – AWS, Google, and Microsoft – all utilize open source software to run their services and host open source solutions delivered through the cloud
Open source software is about organizations coming together to collectively solve common problems so they can separately innovate and differentiate on top of the common baseline. They see they are better off pooling resources to make the baseline better. Sometimes it is called “coopetition.” It generally means that while companies may be in competition with each other in certain areas, they can still cooperate on others.
I borrowed from a well-known tagline from my childhood in the headline – open source does bring good things to life.
Drones were introduced to the world through military applications and then as toys we could all easily fly (well, my personal track record is abysmal). But the reality is that drones are seeing a variety of commercial applications, such as energy facility inspection for oil, gas, and solar, search and rescue, firefighting, and more, with new uses coming online all of the time. We aren’t at The Jetsons level yet, but they are making our lives easier and safer (and capturing some really cool aerial shots).
Much of that innovation comes from open source coopetition.
The Linux Foundation hosts the Dronecode Foundation, which fosters open source code and standards critical to the worldwide drone industry. In a recent blog post, the general manager, Ramón Roche, discusses some of the ways open source has created an ecosystem of interoperability, which leads to users having more choice and flexibility.
Ramón recounts how it all started with the creation of Pixhawk, open standards for drone hardware, with the goal of making drones fly autonomously using computer vision. Working to overcome the lack of computing power and technology in 2008, Lorenz Meier, then a student, set out to build the necessary flight control software and hardware. Realizing the task’s scale, he sought the help of fourteen fellow students, many of whom were more experienced than him, to make it happen. They built Pixhawk and kick-started an open source community around various technologies. It “enabled talented people worldwide to collaborate and create a full-scale solution that was reusable and standardized. By giving their technology a permissive open source license, they opened it to everyone for use and collaboration.”
The innovation and technological backbone we see in drones is thanks to open software, hardware, and standards. Dronecode’s blog features an interview with Max Tubman of Freefly Systems, who talks about how open standards are enabling interoperability of various payloads amongst partners in the Open Ecosystem. Bobby Watts of Watts Innovation also explains the power of standardization and how it has streamlined their interoperability with other ecosystem partners like Gremsy and Drone Rescue Systems.
Check out both interviews here and read about what is next.
The story of open source driving innovation in the drone industry is just one of thousands of examples of how open source is driving global innovation. Whether you know it or not, you use open source software every minute of every hour of every day.
Use promo code DRONE25 here to receive up to 25% off Linux Foundation training, taken by millions of students around the world. The offer expires on June 30, 2022. View the whole catalog; from AI and blockchain to web and application development, we have something for you.
The post Open Source Brings Good Things to Life appeared first on Linux Foundation.
In recent years, DevOps, which aligns incentives and the flow of work across the organization, has become the standard way of building software. By focusing on improving the flow of value, the software development lifecycle has become much more efficient and effective, leading to positive outcomes for everyone involved. However, software development and IT operations aren’t the only teams involved in the software delivery process. With increasing cybersecurity threats, it has never been more important to unify cybersecurity and other stakeholders into an effective and united value stream aligned toward continuous delivery.
At the most basic level, there is nothing separating DevSecOps from the DevOps model. However, security, and a culture designed to put it at the forefront, has often been an afterthought for many organizations. But in a modern world, as costs and concerns mount from increased security attacks, security must become more prominent. It is possible to provide continuous delivery in a secure fashion; in fact, CD enhances the security profile. Getting there takes a dedication to people, culture, process, and, lastly, technology, breaking down silos and unifying multi-disciplinary skill sets. In doing so, organizations can optimize and align their value streams toward continuous improvement across the entire organization.
To help educate and inform program managers and software leaders on secure and continuous software delivery, the Linux Foundation is releasing a new, free online training course, Introduction to DevSecOps for Managers (LFS180x) on the edX platform. Pre-enrollment is now open, though the course material will not be available to learners until July 20. The course focuses on providing managers and leaders with an introduction to the foundational knowledge required to lead digital organizations through their DevSecOps journey and transformation.
LFS180x starts off by discussing what DevSecOps is and why it is important. It then provides an overview of DevSecOps technologies and principles using a simple-to-follow “Tech like I’m 10” approach. Next, the course covers topics such as value stream management, platform as product, and engineering organization improvement, all driving towards defining Continuous Delivery and explaining why it is so foundational for any organization. The course also focuses on culture, metrics, cybersecurity, and agile contracting. Upon completion, participants will understand the fundamentals required in order to successfully transform any software development organization into a digital leader.
The course was developed by Dr. Rob Slaughter and Bryan Finster. Rob is an Air Force veteran and the CEO of Defense Unicorns, a company focused on secure air gap software delivery. He is the former co-founder and Director of the Department of Defense’s DevSecOps platform team, Platform One, co-founder of the United States Space Force Space CAMP software factory, and a current member of the Navy software factory Project Blue. Bryan is a software engineer and value stream architect with over 25 years of experience as a software engineer and leader of development teams delivering highly available systems for large enterprises. He founded and led the Walmart DevOps Dojo, which focused on a hands-on, immersive learning approach to helping teams solve the problem of “why can’t we safely deliver today’s changes to production today?” He is the co-author of “Modern Cybersecurity: Tales from the Near-Distant Future”, the author of the “5 Minute DevOps” blog, and one of the maintainers of MinimumCD.org. He is currently a value stream architect at Defense Unicorns working on Platform One.
Enroll today to start your journey to mastering DevSecOps practices on July 20!
The post Learn the Principles of DevSecOps in New, Free Training Course appeared first on Linux Foundation.
Many software projects are not prepared to build securely by default, which is why the Linux Foundation and Open Source Security Foundation (OpenSSF) partnered with technology industry leaders to create Sigstore, a set of tools and a standard for signing, verifying and protecting software. Sigstore is one of several innovative technologies that have emerged to improve the integrity of the software supply chain, reducing the friction developers face in implementing security within their daily work.
To make it easier to use Sigstore’s toolkit to its full potential, OpenSSF and Linux Foundation Training & Certification are releasing a free online training course, Securing Your Software Supply Chain with Sigstore (LFS182x). This course is designed with end users of Sigstore tooling in mind: software developers, DevOps engineers, security engineers, software maintainers, and related roles. To make the best use of this course, you will need to be familiar with Linux terminals and using command line tools. You will also need to have intermediate knowledge of cloud computing and DevOps concepts, such as using and building containers and CI/CD systems like GitHub Actions, many of which can be learned through other free Linux Foundation Training & Certification courses.
Upon completing this course, participants will be able to inform their organization’s security strategy and build software more securely by default. The hope is this will help you address attacks and vulnerabilities that can emerge at any step of the software supply chain, from writing to packaging and distributing software to end users.
Enroll today and improve your organization’s software development cybersecurity best practices.
The post Free Training Course Teaches How to Secure a Software Supply Chain with Sigstore appeared first on Linux Foundation.