
Challenges and Solutions in Edge Computing: The Future

This article was sponsored by Intel and written by Linux.com.

This year’s Open Source Summit Europe (formerly LinuxCon) took place in Prague. The conference is part of a series of annual events that are always popular in the open source community, and the lineup featured many different tracks, reflecting the upsurge in the number of open source projects and adoptions. One of the more popular topics there was edge computing and IoT.

Earlier we spoke with Imad Sousou, vice president of the Software and Services Group and general manager of the Open Source Technology Center at Intel Corporation, about his thoughts on edge computing. Following Open Source Summit Europe, we spoke with Sousou again to learn more about the future of edge networks and the new technologies needed to handle the demands of the growing number of connected devices.

Linux.com: Earlier, you talked about pushing intelligence to the edge of the network. What benefits does edge computing offer?

Sousou: Billions of new connected devices are creating and collecting data that needs to be processed. Imagine if all of that processing took place in the cloud. The network would very quickly get overwhelmed, regardless of how powerful it is, and we would run into issues with network congestion and latency. With edge computing, we push some processing closer to the devices, which helps eliminate the latency and congestion problems and improves the performance of the applications running on those devices. This is a bit of an oversimplification, but you can see the benefit. Another advantage has to do with the availability of services on devices. Because slow response times and outages are unacceptable, moving the computation close to the device, or even onto the device itself, improves availability.
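As a rough, back-of-the-envelope illustration of that latency argument, here is a minimal sketch; every number in it is an invented assumption, since real round-trip times vary enormously by network:

```python
# Back-of-the-envelope latency budget: cloud round trip vs. edge round trip.
# Every number here is an illustrative assumption, not a measurement.

CLOUD_RTT_MS = 80.0   # device -> remote data center -> device
EDGE_RTT_MS = 5.0     # device -> nearby edge node -> device
PROCESSING_MS = 10.0  # time to run the workload itself

def response_time(rtt_ms, processing_ms=PROCESSING_MS):
    """Total time the device waits for one request."""
    return rtt_ms + processing_ms

print(f"cloud: {response_time(CLOUD_RTT_MS):.0f} ms per request")
print(f"edge:  {response_time(EDGE_RTT_MS):.0f} ms per request")
```

With these assumed figures the edge path spends most of its budget on actual processing rather than transit, which is the whole point of moving the computation closer to the device.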

Linux.com:  What is driving this move to the edge? Why does it matter?

Sousou: That’s a great question. You can boil it down to four key reasons. The first is speed: I touched on this before, but edge computing reduces latency because data doesn’t have to travel over a network to a remote data center or the cloud for processing. The second is security: we could see improved security at the edge because the data stays closer to where it was created. The third is scalability: edge computing is fundamentally distributed computing, which improves resiliency, reduces network load, and scales more easily. And finally, cost: data transmission costs are lower because less data is transferred back to a central location for storage.

Linux.com: What kinds of technologies are needed to make edge computing successful?

Sousou: At Intel, we think applying current cloud technologies for use at the edge is the right approach. For example, we can use existing cloud container solutions such as Intel Clear Containers. Today, applications at the edge run on bare metal, which creates security concerns if an application gets compromised. With Intel Clear Containers, we can provide hardware-based isolation using virtualization technology. If another application on the device gets hacked, yours will still be safe: the compromised application can’t read or write your memory or data packets.
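As a concrete sketch of what that can look like in practice, the snippet below launches a workload under an alternative OCI runtime using Docker's --runtime flag. The runtime name cc-runtime is what Clear Containers installations typically registered, but it is an assumption here; the exact name depends on your version and setup:

```python
# Minimal sketch: launch a container under a hardware-isolated runtime.
# Assumes Docker is installed and a Clear Containers OCI runtime has been
# registered with the daemon as "cc-runtime" (the name may differ).
import subprocess

def run_isolated(image, command, runtime="cc-runtime"):
    """Run `command` in `image` under the given OCI runtime."""
    subprocess.run(
        ["docker", "run", "--rm", "--runtime", runtime, image] + command,
        check=True,
    )

if __name__ == "__main__":
    # Each container gets its own lightweight VM, so a compromised
    # application cannot read or write another container's memory.
    run_isolated("ubuntu:16.04", ["echo", "hello from an isolated container"])
```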

That is just one example. Of course we’ll innovate to address new use cases. We can use advancements in machine learning and artificial intelligence at the edge. It will really be a mix of new and existing technologies that will deliver edge computing.

Linux.com: You mentioned artificial intelligence and machine learning. Can you provide more detail on how that relates to edge computing?

Sousou: Given the amount of data being generated and accessed, it’s becoming more important for edge devices to know what data is relevant and what isn’t. Devices must be more than smart. They must also be powerful enough to both train themselves and infer direction on the same small device or within a sensor. New artificial intelligence and machine learning technologies are making this possible. Machine learning algorithms that connect multiple points of input require powerful processing that supports the data movement needed to best take advantage of information. At Intel, we want to ensure machine learning frameworks are optimized on Intel architecture.
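A minimal, dependency-free sketch of that "know what data is relevant" idea: an edge device keeps a running estimate of what normal sensor readings look like and forwards only the unusual ones upstream. The threshold and sample data are invented for illustration:

```python
# Minimal sketch: an edge-side filter that learns what "normal" sensor
# readings look like and forwards only outliers to the cloud.
# The threshold and readings are illustrative assumptions.
import math

class RelevanceFilter:
    """Online mean/variance (Welford's algorithm) with a z-score gate."""

    def __init__(self, z_threshold=3.0):
        self.n = 0
        self.mean = 0.0
        self.m2 = 0.0
        self.z_threshold = z_threshold

    def observe(self, x):
        """Update the model; return True if x should be sent upstream."""
        if self.n >= 2:
            std = math.sqrt(self.m2 / (self.n - 1))
            relevant = std > 0 and abs(x - self.mean) / std > self.z_threshold
        else:
            relevant = False  # not enough history yet
        # Welford's online update
        self.n += 1
        delta = x - self.mean
        self.mean += delta / self.n
        self.m2 += delta * (x - self.mean)
        return relevant

f = RelevanceFilter()
readings = [20.1, 20.3, 19.9, 20.2, 20.0, 20.1, 35.7, 20.2]  # one anomaly
to_cloud = [x for x in readings if f.observe(x)]
print(to_cloud)  # only the anomalous reading is transmitted
```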

Linux.com:  How does 5G wireless technology help enable edge computing?

Sousou: 5G networks support growing data rates, the increasing number of terminals, the need for higher service availability, and the desire for enhanced edge network coverage. To support new use cases, the new standard has identified three primary requirements: massive machine-to-machine (M2M) communications for Internet of Things (IoT) applications; ultra-low latency, enabling life-saving car-to-car connectivity, for example; and gigabit speeds (high-bandwidth mobile broadband). No single wireless technology will meet these characteristics, so 5G goes beyond a single air interface and will be defined by a heterogeneous network that integrates 5G, 4G, Wi-Fi, and other wireless technologies.

Linux.com: We have heard a bit about time-sensitive networking (TSN). Can you explain what it is and how it relates to the edge?

Sousou: Intel has invested in time-sensitive networking for more than five years. TSN is a set of technologies that allows a network to deliver low latency or guaranteed bandwidth to the applications that require it, while simultaneously supporting other, less demanding applications. TSN uses packet prioritization, filtering, and network virtualization to support edge compute use cases on existing networking infrastructure. It is already being used in industrial infrastructure, autonomous devices, data centers, and communications infrastructure, where open implementations of the core TSN infrastructure will help companies lower costs, simplify maintenance, and provide the scale and accessibility needed for broad market acceleration and adoption. We are working with the industry, including groups like the Avnu Alliance, to deliver a maintainable, deterministic, open source network stack and associated hardware that can provide coordinated time synchronization from cloud to fog to edge.
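To give a feel for the prioritization piece, here is a deliberately simplified sketch in the spirit of TSN's scheduled-traffic mechanism, where a repeating cycle of time windows reserves bandwidth for critical traffic. Real TSN enforces this in hardware against synchronized clocks, so treat this purely as an illustration; the schedule and traffic classes are invented:

```python
# Deliberately simplified sketch of TSN-style scheduled traffic: a repeating
# cycle of time windows, each admitting only certain traffic classes.
# Real TSN enforces this in hardware against a synchronized network clock.
from collections import deque

# Gate control list: (window length in microseconds, classes allowed).
# The schedule itself is an invented example.
GATE_SCHEDULE = [
    (250, {"control"}),                 # reserved slot: critical traffic only
    (750, {"control", "best_effort"}),  # remaining time is shared
]

queues = {
    "control": deque(["c1", "c2"]),
    "best_effort": deque(["b1", "b2", "b3"]),
}

def run_cycle(t):
    """Run one schedule cycle, sending one frame per window."""
    for window_us, allowed in GATE_SCHEDULE:
        # Transmit from the highest-priority admitted class with traffic.
        for cls in ("control", "best_effort"):
            if cls in allowed and queues[cls]:
                print(f"t={t}us: sent {cls}/{queues[cls].popleft()}")
                break
        t += window_us
    return t

t = 0
for _ in range(4):
    t = run_cycle(t)
```

The output shows the key property: control traffic never waits behind best-effort frames, yet best-effort traffic still drains in the shared windows.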

Linux.com: Is there anything else that you would like to share with readers that we haven’t already covered?

Sousou: There is a lot of excitement about this connected world. It has the potential to change how we create, consume, and take advantage of information and could radically change how we live. Open source software is front and center in driving this change and provides the foundation for making edge computing a reality. I look forward  to continuing to work with the community to innovate and help achieve a safer, smarter world.

Linux Foundation Continues to Emphasize Diversity and Inclusiveness at Events

This has been a pivotal year for Linux Foundation events. Our largest gatherings, which include Open Source Summit, Embedded Linux Conference, KubeCon + CloudNativeCon, Open Networking Summit, and Cloud Foundry Summit, attracted a combined 25,000 people from 4,500 different organizations globally. Attendance was up 25 percent over 2016.

Over the past few years, one of our core objectives has been to work with projects and communities to promote diversity and inclusiveness in open source. We’re relentlessly focused on this not only because more diverse teams make smarter decisions and generate better business results, but because they create more productive open source projects. Most important, we think supporting diversity and inclusiveness in open source is simply the right thing to do.

While there’s still progress to be made, we’ve made remarkable headway at our events this year. Here are a few of our initiatives:

Read more at The Linux Foundation

Mint 18.3: The Best Linux Desktop Takes Big Steps Forward

I run many operating systems every day, from macOS, to Windows 7 and 10, to more Linux desktop distributions than you can shake a stick at. And, once more, as a power-user’s power user, I’ve found the latest version of Linux Mint to be the best of the best.

If you’ve never installed Mint before, you can download its ISO files from the Mint Downloads page. There are still both 64-bit and 32-bit versions for the Cinnamon desktop, but unless you’re running a really old system, just download the 64-bit version. Then burn the ISO image to a DVD using a tool such as ImgBurn, or put it on a bootable USB stick with a program like Rufus.
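One step worth taking between downloading and burning is verifying the image against the published checksum; a minimal sketch follows, where the filename and expected hash are placeholders you would replace with the values from the download mirror:

```python
# Minimal sketch: verify a downloaded ISO against its published SHA-256
# sum before burning it. The filename and expected hash are placeholders;
# use the values from the sha256sum.txt on your download mirror.
import hashlib

ISO_PATH = "linuxmint-18.3-cinnamon-64bit.iso"
EXPECTED_SHA256 = "<hash published alongside the ISO>"

def sha256_of(path, chunk_size=1 << 20):
    """Hash the file in 1 MiB chunks so large ISOs don't fill memory."""
    h = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(chunk_size), b""):
            h.update(chunk)
    return h.hexdigest()

actual = sha256_of(ISO_PATH)
print("OK" if actual == EXPECTED_SHA256 else f"MISMATCH: {actual}")
```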

Then, boot your computer using the DVD or stick and make sure Mint works with your computer. If it does — and I’ve never met a PC it wouldn’t work on — you can then install it. 

Read more at ZDNet

Overcoming Challenges When Building Great Global Communities

Today’s open source communities include people from all around the world. What challenges can you expect when establishing an online community, and how can you help overcome them?

People contributing to an open source community share a commitment to the software they’re helping to develop. In the past, people communicated by meeting in person at a set place and time, or through letters or phone calls. Today, technology has fostered growth of online communities—people can simply pop into a chat room or messaging channel and start working together. You might work with someone in Morocco in the morning, for example, and with someone in Hawaii that evening.

Global communities: 3 common challenges

Anyone who’s ever worked in a group knows that differences of opinion can be difficult to overcome. In online communities, language barriers, different time zones, and cultural differences can also create challenges.

Read more at Opensource.com

Kubeflow: Bringing Together Kubernetes and Machine Learning

Kubeflow brings composable, easier-to-use stacks, with more control and portability, to Kubernetes deployments for all kinds of ML, not just TensorFlow.

Introducing Kubeflow, a new project that makes machine learning on Kubernetes easy, portable, and scalable. Kubeflow should be able to run anywhere Kubernetes runs. Rather than recreating existing services, it distinguishes itself by spinning up the best available solutions for Kubernetes users.

Why switch to Kubeflow?

Kubeflow is intended to make ML easier for Kubernetes users. How? By letting the system take care of the details (within reason) and supporting the kind of tooling ML practitioners want and need.
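For a flavor of what "letting the system take care of the details" means, here is a minimal sketch that submits a one-off training job to a Kubernetes cluster with the official kubernetes Python client. Kubeflow layers ML-specific resources on top of primitives like this; the image and command below are placeholders:

```python
# Minimal sketch: submit a one-off training job to Kubernetes with the
# official Python client (pip install kubernetes). The image and command
# are placeholders; Kubeflow builds ML-specific resources (e.g., TFJob)
# on top of primitives like this plain Job.
from kubernetes import client, config

config.load_kube_config()  # authenticate using your local kubeconfig

container = client.V1Container(
    name="trainer",
    image="example.com/my-training-image:latest",      # placeholder image
    command=["python", "train.py", "--epochs", "10"],  # placeholder command
)
job = client.V1Job(
    api_version="batch/v1",
    kind="Job",
    metadata=client.V1ObjectMeta(name="train-demo"),
    spec=client.V1JobSpec(
        template=client.V1PodTemplateSpec(
            spec=client.V1PodSpec(containers=[container], restart_policy="Never")
        ),
        backoff_limit=2,  # retry a failed training pod at most twice
    ),
)
client.BatchV1Api().create_namespaced_job(namespace="default", body=job)
print("submitted Job train-demo")
```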

Read more at Jaxenter

SysAdmins and Kernel Developers Advance Linux Skills with LiFT

Bhumika Goyal, age 22, of India, has had more than 340 patches accepted into the Linux kernel – an accomplishment that contributed in no small part to her receiving one of two Linux Kernel Guru scholarships from The Linux Foundation.

Bhumika Goyal

Goyal served as an Outreachy intern earlier this year, focused on the Linux kernel, where she worked on reducing the kernel’s attack surface by making kernel structures read-only. Since her internship, Goyal has continued this work with the support of The Linux Foundation’s Core Infrastructure Initiative.

“Having contributed to the kernel for a year now, I have developed a keen interest in learning the internals of the kernel,” Goyal explained in her scholarship application. “This training will definitely help me to develop my skills so that I can contribute to the kernel community more effectively.” Her goal is to become a full-time kernel engineer after completing her current project.

Mohammed Al-Samman

Mohammed Al-Samman, 25, of Egypt, is the other recipient of a Linux Kernel Guru scholarship from The Linux Foundation. Al-Samman has spent the past year working on the Linux kernel, doing analysis, debugging, and compiling. He also built an open source Linux firewall, and a kernel module to monitor power supply electrical current status (AC/DC) by using the Linux kernel notifier.

Al-Samman is studying the Linux kernel network subsystem. “I hope to start a Linux kernel community in my country, secure a job as a Linux kernel developer, and contribute to the community,” he said.

SysAdmin Super Star

Omar Aaziz, 39, United States, is one of two recipients of the Foundation’s SysAdmin Super Star scholarship. Aaziz, who is originally from Iraq, now administers the computer science clusters at New Mexico State University. He manages the Linux firewall to prevent and detect cyberattacks and transfer data safely. He also administers a 180TB supercomputer storage system and performs backups.  

Omar Aaziz

Aaziz has five years of experience with high-performance computing (HPC) and is also pursuing a Ph.D. “As a research assistant, I learned how to build a complete HPC cluster from scratch and install CentOS operating systems, versions 6.5 and 7,” he wrote in his scholarship application. “I have many duties, such as helping the users to install different open-source software, creating job scripts, and more. I managed the clusters [in a] Linux firewall to prevent and detect cyberattacks and transfer data safely.”

Aaziz also had two internships at Sandia National Laboratories, during which time he served as administrator of three supercomputers. His extensive admin work required him to use Linux and open source software heavily. “I used CentOS, Ubuntu, and I built my own Linux version with customized security modules to prevent security breaches.”

His goal is to become a high performance computing engineer, helping develop the next generation of supercomputers.

Leonardo Goncalves da Silva

Leonardo Goncalves da Silva, 41, Brazil, who also received a SysAdmin Super Star scholarship, has worked with Linux for 20 years and recently shifted his career toward cloud development based on a Linux and Kubernetes framework. He currently contributes to several open source projects, and is planning to start contributing to Hyperledger, The Linux Foundation’s open source project focused on blockchains and other tools.

“The various systems I’ve designed were developed using open source tools and frameworks, with great results for my employers in terms of cost reduction and productivity,” he wrote in his scholarship application. His career shift, he explained, “is helping my customers to create great solutions with security and agility.” Da Silva plans to use the scholarship to take the Kubernetes Fundamentals course to provide better service to his clients.

The annual Linux Foundation Training (LiFT) Scholarships provide advanced open source training to existing and aspiring IT professionals from all over the world. Twenty-seven recipients received scholarships this year – the highest number ever awarded by the Foundation. Scholarship recipients receive a Linux Foundation training course and certification exam at no cost.  

Learn more about the LiFT Scholarship program from The Linux Foundation.

With OPNFV, Orange Plans a Full-Scale Rollout of Network Functions Virtualization

Learn how Orange leverages open source software via OPNFV to solve several important issues along the way. 

Over the past few years, the entire networking industry has begun to transform as network demands rapidly increase. This is true for both the technology itself and the way in which carriers — like my employer Orange, as well as vendors and other service providers — adapt and evolve their approach to meeting these demands. As a result, we’re becoming more and more agile and adept in how we virtualize our evolving network to keep up with growing demands and a shifting ecosystem.

At Orange, we are laser-focused on investing in future technologies and plan to spend over $16 billion between 2015 and 2018 on new networks (including 4G, 4G+, and fixed fiber). A key component of these investments — along with access network investments — is advancement in software-defined networking (SDN) and network functions virtualization (NFV) technologies as a way to create new revenue streams, improve agility, and reduce costs via a program we call On-Demand Networks.

Read more at The New Stack

DevOps, Docker, and Empathy

Just because we’re using containers doesn’t mean that we “do DevOps.” Docker is not some kind of fairy dust that you can sprinkle around your code and applications to deploy faster. It is only a tool, albeit a very powerful one. And like every tool, it can be misused. Guess what happens when we misuse a power tool? Power fuck-ups. Let’s talk about it.

I’m writing this because I have seen a few people expressing very deep frustrations about Docker, and I would like to extend a hand to show them that instead of being a giant pain in the neck, Docker can help them to work better, and (if that’s their goal) be an advantage rather than a burden in their journey (or their “digital transformation” if we want to speak fancy.)

Docker: hurting or helping the DevOps cause?

I recently attended a talk where the speaker tried to make the point that Docker was anti-devops, for a number of reasons (that I will list below.) However, each of these reasons was (in my opinion) not exactly a problem with Docker, but rather in the way that it was used (or sometimes, abused). Furthermore, all these reasons were, in fact, not specific to Docker, but generic to cloud deployment, immutable infrastructure, and other things that are generally touted as good things in the DevOps movement, along with cultural choices like cross-team collaboration. The speaker confirmed this when I asked at the end of the talk, “did you identify any issue that was specific to Docker and containers and not to cloud in general?” — there was none.

What are these “Docker problems”? Let’s look at a few of them.

Read more at JPetazzo


Top 15 Resources for Learning JavaScript

HTML, cascading stylesheets (CSS), and JavaScript have experienced massive growth and evolution over the past two decades, which should come as no surprise given the ever-expanding role of the internet in our lives. JavaScript development has come a long way since the mid-1990s and IBM’s famous commercial depicting businesses’ early recognition of the internet’s significance. That commercial forever changed the role of the web developer. Before the business invasion, web developers were more artistic, but the influence of business and industry changed all of that.

More than two decades have passed since the first web pages using JavaScript were developed, and things have improved immensely. Today, IDEs are well structured to validate your code, and self-contained environments help with testing and debugging web frontend logic. Now, learning JavaScript goes well beyond simply studying the language’s syntax.

Read more at OpenSource.com

Running a Successful Open Source Project

Managing an open source project isn’t as easy as it sounds. A successful open source project is more than just making the source code available. In this article, Wayne Beaton and Gunnar Wagenknecht explain how you can make your open source project a runaway success.

Running an open source project is easy. All you have to do is make your source code available and you’re open source, right? Well, maybe. Ultimately, whether or not an open source project is successful depends on your definition of success. Regardless of your definition, creating an open source project can be a lot of work. If you have goals regarding adoption, for example, then you need to be prepared to invest. While open source software is “free as in beer”, it’s not really free: time and energy are valuable resources and these valuable resources need to be invested in the project.

Read more at Jaxenter