AI is better with open source

Open Source Software (OSS) is a proven model that delivers tangible benefits to businesses, including improved time-to-market, reduced costs, and increased flexibility. OSS is pervasive in the technology landscape and beyond it, with adoption across multiple industries. In a 2022 survey by Red Hat, 95 percent of IT leaders said they use open source in their IT infrastructure, and that usage is only expected to grow.

Artificial intelligence (AI) is no different from any other technology domain where OSS dominates. A recent Linux Foundation Research paper by Dr. Ibrahim Haddad, General Manager of the LF AI & Data Foundation, identifies over 300 critical open source AI projects comprising more than 500 million lines of code, contributed by more than 35,000 developers who work side by side to advance the state of the technology in an open, collaborative, and transparent way.

As in other industries, this success story has driven OSS adoption in the AI field: greater use of open source in products and services, more contributions to existing projects, the creation of new projects that foster collaboration, and the development of new technologies.

The paper shows that while open source AI has followed a model similar to other industries embracing the methodology, Dr. Haddad has some observations unique to the field, including:

An incubation model for AI open source projects is effective when appropriately executed by neutral organizations that can scale them, such as the Linux Foundation.

Consolidation is bound to happen around platforms, frameworks, and libraries that address similar challenges. Unlike typical fragmentation scenarios, where there are winning and losing projects, Dr. Haddad believes the net result will be a win-win as successful projects grab their share of contributors.

License choices can affect a project’s growth — and licenses approved by the Open Source Initiative (OSI) are most preferred because developers and enterprises are already familiar with them.

Open data licenses such as the Community Data License Agreement (CDLA) have begun to commoditize training data. These license terms will help democratize the overall AI marketplace by lowering the barriers to entry when offering an AI-backed service. Proprietary datasets will continue to exist, but data availability under the CDLA licenses (two versions exist) should allow everyone, including smaller players, to build credible products.

So what does this mean for the future of AI? It means that businesses will continue to rely on open source software to power their AI initiatives and that collaboration will be key to success. The open source model has been successful in AI because it allows developers to come together and share code, data, and ideas. This type of collaboration is essential for advancing any technology, and we can expect to see even more impressive innovations come out of the AI community in the years to come. Ultimately, we are faster and more innovative together.

The post AI is better with open source appeared first on Linux Foundation.

An introduction to pvpanic

An introduction to the paravirtualized pvpanic device, which lets a guest notify its host about kernel panics.

Click to Read More at Oracle Linux Kernel Development

What sysadmins need to know about using Bash

You’ve logged into Linux; now what? Here’s how to use Bash, the command-line interpreter on most modern Linux machines.

Read More at Enable Sysadmin

How to find out what a Linux command does

Learn how to locate, read, and use Linux system documentation with man, info, and /usr/share/doc files.

Read More at Enable Sysadmin

How to use ‘podman save’ to share container images

The `podman save` and `podman load` commands let you share images across multiple servers and systems when they aren’t available locally or remotely.

Read More at Enable Sysadmin

Question from the New Guy!

“Here’s a question from the new guy.” I have been using that line a lot over the past few weeks since starting here at the Linux Foundation as the lead editor and content manager. How long can I pull that off?

The reality is that I am new to working professionally in open source software – and really to the software/technology industry. But it has been a long-time passion of mine. I spent my formative years in the 1980s and had a drive to learn to program computers. When I was 12, I asked my mom for a computer. Her response: “You have to learn to type first.”

I went to the library, checked out typing books, and taught myself on our electronic typewriter. We couldn’t afford a computer, but I received a hand-me-down TI-99/4A and then a Commodore 64 with a tape drive. I taught myself BASIC and also dialed into bulletin board systems (BBSs) at a mind-blowing 300 bps. If you have never experienced 300 bps, imagine yourself reading at 10% of your normal pace.

I mention BBSs because, in many ways, they were the precursor to open source software. Someone dedicated their PC and a phone line for others to dial in, share messages, exchange software, answer technical questions, etc. 

Fast forward a bit – I taught myself to code well enough to get a couple of coding jobs in high school but ended up getting a business degree in college and then working in politics for 15+ years. My passion for software and technology didn’t lapse, but I was mostly a tech hobbyist – taking classes in front-end web development, writing a couple of basic web apps, teaching myself some PHP, Python, and WordPress development, and reading and writing about software development. And, for the record, I already had a GitHub repo before starting here.

With that bit of background, let me say that I am very excited about working at the Linux Foundation and diving into the open source community. I am a self-driven, lifelong learner, and I want to take you along on my journey here to learn about what we do, all of our projects, what open source is, how to advance it, and more.

At LF, we embrace what we call the three H’s: humble, helpful, and hopeful. It isn’t just lip service. I see it lived out every day, in every interaction I have with my coworkers. My goal with this journey is to be: 

Humble: There is so much I don’t know about the open source community and the LF. I am learning every day. 
Helpful: I want to be helpful by sharing what I am learning. Much you may already know, but some you may not.
Hopeful: My hope is two-fold: I hope others learn too; I am hopeful that our community will continue to grow and thrive and solve some of the world’s toughest challenges. 

The three H’s are perfectly aligned with the general culture of open source. One of the LF’s onboarding tasks for new employees is to take a class entitled Open Source 101. Within that class, they teach us the Ten Open Source Culture Cores:

Be open. Openness breeds authenticity. Be consistently authentic in all of your work. 
Be pragmatic. Action > talk. Work towards measurable value, not obscure, abstract, or irrelevant ideas. (Side note: when I worked in politics, my go-to line when speaking to groups was that I was a bit of an anomaly in Washington: long on action and short on talk.)
Be personal. Always focus on a personal level of service and interaction. People don’t join open source communities to talk to computers. 
Be positive. Highly positive environments generate positive engagement.
Be collaborative. Involve people, gather their feedback, get a gut check, and validate your ideas. The only problem silos solve is how to store grain. 
Be a leader. Be open and collaborative, and focus on the other nine Culture Cores too.
Be a role model. Be the person you want to be and you will be the leader other people want you to be. 
Be empathetic. Don’t just be empathetic in the privacy of your own mind. Say it, demonstrate it visibly. This all builds trust. Empathy is a powerful driver for building inclusion, which is a powerful driver for innovation.
Be down-to-earth. Leave your ego at the door. 
Be imperfect. We all make mistakes. Acknowledge them, share them, and learn from them. 

What a great synopsis of the culture of open source technology. 

With that, let me close out this week by first stating the obvious – a lot has transpired in technology since my first TI-99/4A (never mind the fact that my network speed is literally one million times faster). I hope you will join me on my “Questions from the New Guy” journey. Look for weekly-ish blog posts diving into all aspects of the Linux Foundation, our projects, and open source technology.

The post Question from the New Guy! appeared first on Linux Foundation.

Classic SysAdmin: Writing a Simple Bash Script

This is a classic article written by Joe “Zonker” Brockmeier from the Linux.com archives. For more great SysAdmin tips and techniques check out our free intro to Linux course.

The first step is often the hardest, but don’t let that stop you. If you’ve ever wanted to learn how to write a shell script but didn’t know where to start, this is your lucky day.

If this is your first time writing a script, don’t worry — shell scripting is not that complicated. That is, you can do some complicated things with shell scripts, but you can get there over time. If you know how to run commands at the command line, you can learn to write simple scripts in just 10 minutes. All you need is a text editor and an idea of what you want to do. Start small and use scripts to automate small tasks. Over time you can build on what you know and wind up doing more and more with scripts.

Starting Off

Each script starts with a “shebang” and the path to the shell that you want the script to use, like so:

#!/bin/bash

The "#!" combo is called a shebang by most Unix geeks. When you execute the script directly, the system uses this line to decide which interpreter should run the rest of the script; the interpreter itself treats the line as a comment and ignores it. Confused? Scripts can be written for all kinds of interpreters: bash, tcsh, zsh, or other shells, or for Perl, Python, and so on. You could even omit that line if you only wanted to run the script by sourcing it at the shell, but let's save ourselves some trouble and add it so the script can also be run non-interactively.
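To make that concrete, here is a quick sketch (hello.sh is just an illustrative name). The shebang only matters when you execute the file directly; if you invoke the interpreter yourself or source the script, the line is simply treated as a comment:

#!/bin/bash
echo "Hello from bash"

bash hello.sh      # you chose the interpreter yourself; the shebang line is ignored as a comment
./hello.sh         # the system reads the shebang and launches /bin/bash for you (the file must be executable, which is covered below)
source hello.sh    # runs the commands in your current shell; again, the shebang is just a comment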

What’s next? You might want to include a comment or two about what the script is for. Preface comments with the hash (#) character:

#!/bin/bash
# A simple script

Let’s say you want to run an rsync command from the script, rather than typing it each time. Just add the rsync command to the script that you want to use:

#!/bin/bash
# rsync script
rsync -avh --exclude="*.bak" /home/user/Documents/ /media/diskid/user_backup/Documents/

Save your file, and then make sure that it’s set executable. You can do this using the chmod utility, which changes a file’s mode. To set it so that a script is executable by you and not the rest of the users on a system, use “chmod 700 scriptname” — this will let you read, write, and execute (run) the script — but only your user. To see the results, run ls -lh scriptname and you’ll see something like this:

-rwx------ 1 jzb jzb 21 2010-02-01 03:08 echo

The first column of rights, rwx, shows that the owner of the file (jzb) has read, write, and execute permissions. The other columns with a dash show that other users have no rights for that file at all.
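Putting that together, a minimal sequence might look like this (backup.sh is only an illustrative name for the rsync script above):

chmod 700 backup.sh   # owner may read, write, and execute; everyone else gets no access
ls -lh backup.sh      # confirm the new permissions
./backup.sh           # run the script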

Variables

The above script is useful, but it has hard-coded paths. That might not be a problem, but if you want to write longer scripts that reference paths often, you probably want to utilize variables. Here’s a quick sample:

#!/bin/bash
# rsync using variables

SOURCEDIR=/home/user/Documents/
DESTDIR=/media/diskid/user_backup/Documents/

rsync -avh --exclude="*.bak" $SOURCEDIR $DESTDIR

There’s not a lot of benefit if you only reference the directories once, but if they’re used multiple times, it’s much easier to change them in one location than changing them throughout a script.

Taking Input

Non-interactive scripts are useful, but what if you need to give the script new information each time it’s run? For instance, what if you want to write a script to modify a file? One thing you can do is take an argument from the command line. So, for instance, when you run “script foo” the script will take the name of the first argument (foo):

#!/bin/bash

echo $1

Here bash will read the command line and echo (print) the first argument — that is, the first string after the command itself.
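Bash also provides a few related variables for free. Here is a small sketch (args.sh is a made-up name) showing them side by side:

#!/bin/bash
# args.sh - demonstrate positional parameters
echo "Script name: $0"      # the name the script was invoked with
echo "First argument: $1"   # the first word after the command
echo "Argument count: $#"   # how many arguments were passed
echo "All arguments: $@"    # every argument, in order

Running "./args.sh foo bar" would print ./args.sh, foo, 2, and foo bar.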

You can also use read to accept user input. Let’s say you want to prompt a user for input:

#!/bin/bash

echo -e "Please enter your name: "
read name
echo "Nice to meet you $name"

That script will wait for the user to type in their name (or any other input, for that matter) and use it as the variable $name. Pretty simple, yeah? Let’s put all this together into a script that might be useful. Let’s say you want to have a script that will back up a directory you specify on the command line to a remote host:

#!/bin/bash

echo -e "What directory would you like to back up?"
read directory

# user@remotehost below is a placeholder; substitute the login for your own backup server
DESTDIR=user@remotehost:$directory/

rsync --progress -avze ssh --exclude="*.iso" $directory $DESTDIR

That script will read in the input from the command line and substitute it as the destination directory at the target system, as well as the local directory that will be synced. It might look a bit complex as a final script, but each of the bits that you need to know to put it together are pretty simple. A little trial and error and you’ll be creating useful scripts of your own.
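As a rough illustration of how a run might look (the directory shown is only an example, and user@remotehost is the placeholder from the script above):

./backup.sh
What directory would you like to back up?
/home/user/Photos

rsync would then copy /home/user/Photos over ssh to the corresponding location on the remote host, showing progress and skipping any .iso files along the way.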

Of course, this just scratches the surface of bash scripting. If you’re interested in learning more, be sure to check out the Bash Guide for Beginners.

Ready to continue your Linux journey? Check out our free intro to Linux course!

The post Classic SysAdmin: Writing a Simple Bash Script appeared first on Linux Foundation.