Some great insight into how a kernel sta
Click to Read More at Oracle Linux Kernel Development
How to get started with scripting in Python
Image: WOCinTech Chat, CC BY 2.0
Learn to use functions, classes, loops, and more in your Python scripts to simplify common sysadmin tasks.
Posted: March 30, 2022, by Peter Gervase (Red Hat, Sudoer) and Bart Zhang (Red Hat)
Topics: Programming, Python
Read the full article on redhat.com
Read More at Enable Sysadmin
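As a purely illustrative sketch (not taken from the article itself), here is the kind of function-plus-loop script the teaser describes, reporting disk usage for a list of mount points:

```python
import shutil

def disk_usage_report(paths):
    """Map each path to the percentage of its filesystem that is used."""
    report = {}
    for path in paths:
        usage = shutil.disk_usage(path)  # named tuple: total, used, free (bytes)
        report[path] = round(100 * usage.used / usage.total, 1)
    return report

if __name__ == "__main__":
    # Check the root filesystem; add /home, /var, etc. as needed.
    for path, pct in disk_usage_report(["/"]).items():
        print(f"{path}: {pct}% used")
```

Wrapping the check in a function like this makes it easy to reuse across scripts or schedule from cron.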
Why it makes sense to write Kubernetes webhooks in Golang
When to choose Golang versus Python and YAML for writing Kubernetes webhooks.
Posted by lseelye on Mon, 3/21/2022 at 10:16pm
Image: Photo by Rachel Claire from Pexels
Towards the end of 2019, OpenShift Dedicated site reliability engineers (SREs) on the SRE-Platform (SREP) team had a problem ahead of a new feature release: Kubernetes role-based access control (RBAC) wasn’t working. Or, rather, it wasn’t working for us.
Topics: Kubernetes, Programming, OpenShift
Read More at Enable Sysadmin
A software bill of materials (SBOM) is a way of summarizing key facts about the software on a system. At its heart, it describes the set of software components, and the dependency relationships between them, that are connected together to make up a system.
Modern software consists of modular components that get reused in different configurations. Components can be open source libraries, source code, or other external, third-party developed software. This reuse lets innovation flourish, especially as a large percentage of the components connected together to form a system may be open source. Each of these components may have different limitations, support infrastructure, and quality levels. Some may be obsolete versions with known defects or vulnerabilities. When software runs a critical safety system, such as life support, traffic control, fire suppression, or chemical application, full transparency about what software is part of the system is an essential first step toward effective analysis of safety claims.
When a system incorporates functionality whose failure could have serious consequences for a person’s well-being or cause significant loss, the details matter. The level of transparency and traceability may need to vary based on the seriousness of those consequences.
Source: NTIA’s Survey of Existing SBOM Formats and Standards
Safety standards, and the claims made against them, come in a variety of forms. The standards themselves mostly vary according to the industry they target: automotive uses ISO 26262, aviation uses DO 178C for software and DO 254 for hardware, industrial uses IEC 61508 or ISO 13849, agriculture uses ISO 25119, and so on. From a software perspective, all of these standards work from the same premise: the full details of all software are known, and the software is developed according to a software quality perspective, with additional measures added for safety. In some instances these additional safety measures come in the form of a software FMEA (Failure Modes and Effects Analysis), but in all of them there are specific code coverage metrics to demonstrate that as much of the code as possible has been tested and that the code complies with its requirements.
Another item that all safety standards have in common is the expectation that the system configuration is going to be managed as part of any product release. Configuration management (CM) is an inherent expectation in software already, but with safety this becomes even more crucial because of the need to track exactly what the configuration of a system (and its software) is if there is a subsequent incident in the field while the system is being used. From a software perspective, this means several things must be captured at release time.
The goal, then, is to be able to rebuild exactly what the executable or binary was at the time of release.
From the above, it is clear how the SBOM fits into the need for CM. The safety standards’ CM requirements, from a source code and configuration standpoint, are greatly simplified by following an effective SBOM process. An SBOM supports capturing the details of what is in a specific release, and supports determining what went wrong if a failure occurs.
Because software often relies on reusable components written by someone other than the author of the main system or application, the safety standards also have specific expectations and criteria for software that you include in your final product. This can range from something as simple as a run-time library to something as extensive as middleware that manages communication between components. While the safety standards do not always require that the included software be developed in accordance with a safety standard, there is still an expectation that you can prove the software was developed in compliance with at least a quality management framework, so that you can demonstrate it fulfills its requirements. This is still predicated on knowing all of the details about the software component and confirming that it fulfills its intended purpose.
The included software components can come from a variety of sources.
Regardless of the source or current usage of the software, the SBOM should describe all of the included software in the release.
To this end, the safety standards expect that a defined set of information is available for each software component included in your project.
At a minimum, the SBOM describes the software component, supplier and version number, with an enumeration of the included dependent components. This is what is being called for in the minimum viable definition of an SBOM to support cyber security[1] or safety critical software[2].
Having a minimum level of information, while better than nothing, is not sufficient for the level of analysis that safety claims expect. Knowing exactly which source files were included in the build is a better starting point. Better still is knowing the configuration options used to create the image (and being able to reproduce it). Being able to verify, via some form of integrity check such as a hash, that the built components haven’t changed is key to a sound foundation for the safety case. SBOMs need to scale from the minimum up to the level of detail necessary to satisfy the safety analysis.
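As a minimal sketch of these ideas (the record layout below is hypothetical; real SBOMs use standard formats such as SPDX), a build step could record each component's identity alongside a SHA-256 hash, so that the built artifact can later be checked for tampering:

```python
import hashlib
from pathlib import Path

def sbom_entry(name, supplier, version, artifact_path):
    """Build a minimal, illustrative SBOM record: component identity plus
    a SHA-256 digest for later integrity checking."""
    digest = hashlib.sha256(Path(artifact_path).read_bytes()).hexdigest()
    return {
        "name": name,
        "supplier": supplier,
        "version": version,
        "sha256": digest,
    }

def verify_entry(entry, artifact_path):
    """Re-hash the artifact and compare against the recorded digest."""
    current = hashlib.sha256(Path(artifact_path).read_bytes()).hexdigest()
    return current == entry["sha256"]
```

If `verify_entry` returns False at audit time, the deployed binary is not the one that was released, which is exactly the kind of fact a safety investigation needs to establish first.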
While SBOM tooling may not be able to populate all of this information today, the tools are continuing to evolve so that the facts necessary to support safety analysis can be made available. An international open SBOM standard, such as SPDX,[3] can become the baseline for modern configuration management and effective documentation of safety critical systems.
[1] The Minimum Elements For a Software Bill of Materials (SBOM) from NTIA
[2] ISO 26262:2018, Part 8, Clause 12 – Qualification of Software Components
[3] ISO/IEC 5962:2021 – Information technology — SPDX® Specification V2.2.1
Peter Brink, Functional Safety Engineering Leader, kVA by UL, Underwriters Laboratories (UL)
Kate Stewart, VP Dependable Embedded Systems, The Linux Foundation
Find out what’s stopping you from accessing a server, printer, or another network resource with these four Linux troubleshooting commands.
Read More at Enable Sysadmin
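The article itself covers four command-line tools; as an illustrative stdlib alternative (an assumption of this sketch, not the article's method), a few lines of Python can test whether a server port is reachable at all:

```python
import socket

def can_reach(host, port, timeout=3.0):
    """Return True if a TCP connection to host:port succeeds within timeout."""
    try:
        with socket.create_connection((host, port), timeout=timeout):
            return True
    except OSError:
        return False

if __name__ == "__main__":
    # Hypothetical targets: check SSH and HTTP on a local machine.
    for host, port in [("localhost", 22), ("localhost", 80)]:
        status = "reachable" if can_reach(host, port) else "unreachable"
        print(f"{host}:{port} is {status}")
```

A failed connection here narrows the problem to the network path or the service itself, which is the same triage logic the command-line tools support.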
Install, configure, and test a very basic web server deployment in just eight steps.
Read More at Enable Sysadmin
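The article's eight steps presumably target a production web server such as Apache; as a quick, hypothetical way to stand up and smoke-test a throwaway server, Python's standard library is enough:

```python
import threading
import urllib.request
from http.server import HTTPServer, SimpleHTTPRequestHandler

def start_test_server(port=0):
    """Serve the current directory on localhost; port=0 picks a free port."""
    server = HTTPServer(("127.0.0.1", port), SimpleHTTPRequestHandler)
    thread = threading.Thread(target=server.serve_forever, daemon=True)
    thread.start()
    return server

if __name__ == "__main__":
    server = start_test_server()
    host, port = server.server_address
    # Smoke-test the deployment the same way you would a real one: request a page.
    with urllib.request.urlopen(f"http://{host}:{port}/") as resp:
        print(resp.status)
    server.shutdown()
```

The final request-a-page check mirrors the "test" step of any web server deployment, whatever server software you use.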
Arresting climate change is no longer an option but a must to save the planet for future generations. The key to doing so is to transition off fossil fuels to renewable energy sources and to do so without tanking economies and our very way of life.
The energy industry sits at the epicenter of change because energy makes everything else run. Within it is the need for a rapid transition to electrification and the modernization of our vast power grids. Like it or not, utilities face existential decisions about transforming themselves while delivering ever more power to more people without making energy unaffordable or unavailable.
The challenges are daunting:
How do we move away from fossil fuels without crashing a global economy that is fueled by energy?
Is it possible to speed up the modernization of the electric grid without spending trillions of dollars?
Can this be done while ensuring that power is safe, reliable, and affordable for all?
These are all significant problems to solve and represent 75% of the problem in combating climate change through decarbonization. In the Linux Foundation’s latest case study, Paving the Way to Battle Climate Change: How Two Utilities Embraced Open Source to Speed Modernization of the Electric Grid, LF Energy explores the opportunities for digital transformation within electric utility providers and the role of open source technologies in accelerating the transition.
The growth of renewable energy sources is making the challenge of modernizing the grid more complicated. In the past, energy flowed from coal and gas generating plants onto the big Transmission System Operator (TSO) lines and then to the smaller Distribution System Operator (DSO) lines, where it was transformed into a lower voltage suitable for homes and businesses.
But now, with solar panels and wind turbines increasingly feeding electricity back into the grid, the flow of power is two-way.
This seismic shift requires a new way of thinking about generating, distributing, and consuming energy. And it’s one that open source can help us navigate.
Today, energy travels in all directions, from homes and businesses, and from wind and solar farms, through the DSOs to the TSOs, and back again. This fundamental change in how power is generated and consumed has resulted in a much more complicated system that utilities must administer. They’ll require new tools to guarantee grid stability and manage the greater interaction between TSOs and DSOs as renewables grow.
Open source software allows utilities to keep up with the times while lowering expenses, and the communities developing LF Energy’s various software projects provide the tools to do it. Open source is helping utilities speed up the modernization of the grid while reducing costs, and it’s giving them the ability to collaborate on shared challenges rather than operate in silos.
Two European utility providers, the Netherlands’ Alliander and France’s RTE, are leading the change by upgrading their systems – markets, controls, infrastructure, and analytics – with open source technology.
RTE (a TSO) and Alliander (a DSO) joined forces initially (as members of the Linux Foundation’s LF Energy projects) because they faced the same problem: accommodating more renewable energy sources in infrastructures not originally designed for them, and doing it at the speed and scale required. And while their grids are not geographically connected, the problems they are tackling apply to all TSOs and DSOs worldwide.
The way that Alliander and RTE collaborated via LF Energy on a project known as Short Term Forecasting, or OpenSTEF, illustrates the benefits of open source collaboration to tackle common problems.
“Short-term forecasting, for us, is the core of our existence,” says Arjan Stam, Alliander’s Director of System Operations. “We need to know what will be happening on the grid. That’s the only way to manage the power flows” and to configure the grid to meet customer needs. The same is true for RTE and “every grid operator across the world,” says Lucian Balea, RTE’s Director of Open Source.
Alliander has five people devoted to OpenSTEF, and RTE has two.
Balea says that without joining forces, OpenSTEF would develop far less quickly, and RTE may not have been able to work on such a solution in the near term.
Since their original collaboration on OpenSTEF, they have worked together on additional LF Energy projects: CoMPAS and SEAPATH.
CoMPAS (Configuration Modules for Power industry Automation Systems) addresses a core need to develop open source software components for profile management and configuration of a power industry protection, automation, and control system. CoMPAS is critical for the digital transformation of the power industry and its ability to move quickly to new technologies. It will enable a wide variety of utilities and technology providers to work together on developing innovative new solutions.
SEAPATH (Software Enabled Automation Platform and Artifacts THerein) aims to develop a reference design for an open source platform, built on a virtualized architecture, that automates the management and protection of electricity substations. The project is led by Alliander, with RTE and other consortium members contributing.
As we move to a decarbonized future, open source will play an increasingly important role in helping utilities meet their goals. It’s already helping them speed up the grid’s modernization, reduce costs, and collaborate on shared challenges. And it’s only going to become essential as we move toward a cleaner, more sustainable energy system.
Read Paving the Way to Battle Climate Change: How Two Utilities Embraced Open Source to Speed Modernization of the Electric Grid to see how it works and how you and your organization can leverage open source. Together, we can develop solutions.
The post LF Energy: Solving the Problems of the Modern Electric Grid Through Shared Investment appeared first on Linux Foundation.
Last year’s Jobs Report generated interesting insights into the nature of the open source jobs market and informed priorities for developers and hiring managers alike. The big takeaway was that hiring open source talent is a priority, and that cloud computing skills are among the top requested by hiring managers, beating out Linux for the first time in the report’s 9-year history at the Linux Foundation.
Now in its 10th year, the jobs survey and report will uncover current market data in a post-COVID (or what could soon feel like it) world.
This year, in addition to determining which skills job seekers should develop to improve their overall employability prospects, we also seek to understand the nature and impact of the “Great Resignation.” Did such a staffing exodus occur in the IT industry in 2021, and do we expect to feel additional effects of it in 2022? And what can employers do to retain their employees under such conditions? Can we hire to meet our staffing needs, or do we have to increase the skill sets of our existing team members?
The jobs market has changed, and in open source it feels hotter than ever! We’re seeing the formation of new OSPOs and the acceleration of open source projects and standards across the globe. In this environment, we’re especially excited to uncover what the data will tell us this year, to confirm or dispel our hypothesis that open source talent is much in demand, and that certain skills are more sought after than others. But which ones? And what is it going to take to keep skilled people on the job?
Only YOU can help us to answer these questions. By taking the survey (and sharing it so that others can take it, too!) you’ll contribute to a valuable dataset to better understand the current state of the open source jobs market in 2022. The survey will only take a few minutes to complete, with your privacy and confidentiality protected.
Thank you for participating!
The project will be led by Clyde Seepersad, SVP & General Manager of Linux Foundation Training & Certification, and Hilary Carter, VP Research at the Linux Foundation.
The post Looking to Hire or be Hired? Participate in the 10th Annual Open Source Jobs Report and Tell Us What Matters Most appeared first on Linux Foundation.
The Linux Foundation’s Board of Directors represents a cross-section of our membership, from different industries with different backgrounds and expertise. This broad, diverse group works hard to ensure the Linux Foundation is achieving its mission to unlock the power of open technology to drive shared innovation for the collective benefit. Their expertise, passion, and work are essential to our joint successes.
Some Board members are elected by the other members, and their terms are limited. The Board also turns over as executives at our member companies change roles. This year we welcome five new members to the Board, and we are excited about the breadth of experience they bring, which will make the work we all do more impactful. Read more about each one:
Suzanne Ambiel is an 11-year veteran of VMware and an “experienced traveler” in the technology space. She caught the open source bug late in life, but now considers herself “all in” thanks to a few inspiring, influential, and patient leaders. During work hours, you’ll find Suzanne playing dual roles — behind the scenes of VMware’s Open Source Program Office and in VMware’s Brand & Creative team. But when the whistle blows, she’s likely out riding the trails, walking her two dachshunds, or pondering why her sourdough didn’t rise (again).
Tim Bird is a longtime Linux kernel developer with over 25 years’ experience in the Linux kernel and open source community. He is a principal software engineer and general open source technologist at Sony Corporation. Over the last two decades he has been involved with many projects in the Linux Foundation and other trade associations to enhance Linux for use in embedded and consumer electronics products. Tim is the founder of the Embedded Linux Conference and the elinux wiki. He recently served on the Linux Foundation Technical Advisory Board, and was previously the CTO of Lineo, an early embedded Linux company.
Ben Maurer is a software engineer at Meta focusing on privacy and security. He joined Meta in 2010 as a member of the infrastructure team where he played a key role in driving the performance and reliability of Meta’s products. Over the course of his time at the company, Ben has worked on several technologies that Meta has open sourced, including jemalloc, Folly, Thrift, and HHVM. He has also built deep partnerships with the open source community such as bringing Restartable Sequences to the Linux kernel and building a team within Meta dedicated to contributing to open source web browsers. Ben is one of the co-creators of the Diem blockchain and led Meta’s technical contributions to the project.
Ben also worked at the White House in 2014 as part of the U.S. Digital Service where he improved the communication tools used by the President and his staff.
Before joining Meta, Ben was an engineer at Google after the company acquired the startup he co-founded, reCAPTCHA, a system that determines if a user is human while simultaneously digitizing books. Ben has also contributed to the Mono and GNOME open source projects.
Shojiro Nakao is a general manager of the R&D Division of the Automotive Company of Panasonic, responsible for the development and management of automotive software platforms. He has been working with Linux for over 15 years on a variety of products, including mobile, IoT, and automotive devices. In his platform development role, he has been promoting Panasonic’s collaboration with various open source communities. In addition, he is a steering committee member of Automotive Grade Linux.
Phil Robb is the Acting Head of Ericsson Software Technology (EST), where he leads a passionate group of engineers developing open source software across a wide range of projects including Linux, OpenStack, Kubernetes, and ONAP among many others.
Prior to Ericsson, Phil was the V.P. of Operations for the Networking Projects at the Linux Foundation including ORAN, ONAP, OpenDaylight, and Anuket. In that role, Phil led a team of technical staff who oversaw community software development based on DevOps and open source best practices. Prior to the Linux Foundation, Phil spent 12 years with Hewlett Packard working on Linux and Open Source starting in 2001. There, Phil formed and led HP’s Open Source Program Office responsible for open source strategy, tools, processes, and investments as HP transitioned from Unix to Linux in the Enterprise Server market.
The post The Linux Foundation Welcomes New Board Members from Ericsson, Fujitsu, Meta, Panasonic, Sony, and VMWare appeared first on Linux Foundation.
Every historical moment is associated with its technology. The social, political, and economic practice of human beings depends mainly on the tools at their disposal. Their relationship with the environment around them, their knowledge of the world, and their worldview depend on technological prostheses that enable them to improve their perception, to understand the past, and above all, to foresee a plan for the future.
Moreover, technological development is the backbone of our understanding of the 21st century. New technologies define our era and make it possible to understand the creations of contemporary human beings: their science, their way of living, their cultural habits, and so on.
Contemporary humankind cannot be understood without the constant company of technology, which allows people to improve and expand their senses. Today, technology has become an extension of the human body.
The human being today is a subject made by technology. Men and women have thus shown themselves to be irremediably permeable to technology, a technology that goes beyond them. The main question is: who is in charge of this factory? With what interests? Who is making us? Who is designing us?
Utopia and dystopia
Since August 1991, when the World Wide Web was made public as a service, the Internet has spread all over the planet and become ubiquitous. Those first heroic times saw the birth of an autonomous, free zone outside the control of governments and large companies: a place in which to create and communicate without restrictions, a vast blank page on which to start building.
However, the idea of collective intelligence was gradually eroded by the advance of a simplifying globalization, ultimately capable of fostering unidirectional and one-dimensional currents of communication, contrary to the possibilities it started from. At the same time, we witnessed the triumphal beginning of a new kind of “democracy,” a power based on the control (monopoly) of specific tools.
In the end, the medium prevailed over the messages, and the “wall” of the social networks ended up being a torrent of disinformation and banalities. The Internet thus became a way of looking at oneself in a mirror, of forming a continuous selfie based on not seeing beyond prejudice and the saturation of information and images—a tool of individualism and separation from reality. And, what is even worse, an infinite source of personal data to be traded.
Collaboration against individualism
Against the main currents determined by the tech giants, there are still many possibilities for using these tools in alternative ways and for understanding the vast opportunities opened up by technology, some of them usually forgotten.
Facing the idea of an Internet of massive and monolithic movements, there is another open possibility: the return to the collective, to the community, and its needs. Open source thus becomes the counterweight to the dangers of indiscriminate use and the generation of information noise on the Internet.
Open-source projects refer to software developed under a license that gives access to the source code so that programmers can freely use, modify, and redistribute the code. Open source is developed collaboratively and distributed over the Internet, allowing a product to improve continuously at the hands of a dedicated community.
AsyncAPI is a project built on the view that human beings have naturalized the tools that enhance their capacities. Technology can guide us and define our steps; it is not merely daily practice. We should be more aware that we humans make technology, and that we are the ones who create and determine the future it holds for us. We are the ones who build the future.
The light between the walls
Transparency and horizontality are the axes on which this movement is settled. The full exposure of a project’s or company’s work, economy, and politics, in a society with such opaque foundations, represents a return to the initial roots of the Internet. In this autonomous zone, projects are built from the ground up.
Doing things following the open-source philosophy promotes a working system based on the open exchange of ideas, teamwork based on flexibility, and constant innovation.
This working model clashes head-on with a system based mainly on competitiveness and concealment. It appears as a stepping stone away from opaque and oppressive governance models and a step towards a freer, more transparent, and more equitable society.
Collaborating on an open-source project offers you the opportunity to join a community of people involved in and enthusiastic about the same goal. At AsyncAPI, everyone is welcome, regardless of their sector or experience… all you need is enthusiasm. This enthusiasm is based on the idea that another world is possible, even more so in times of climate emergency and global crisis. Technology is the only tool in our hands to shape it. It is the key element of cultural, social, and political struggle. It cannot be left in other hands. It must be built collectively, with the community’s needs and users in mind.
You don’t take technology; you make it.
The author, Barbaño González, is Education Program Manager at AsyncAPI.