Blockchain is widely regarded as one of the hottest and most intriguing technologies on the market today. Much like the rise of the internet, blockchain could disrupt multiple industries, including financial services. This Thursday, October 19 at Sibos in Toronto, Hyperledger’s Security Maven Dave Huseby will moderate the panel “Does Blockchain technology alleviate security concerns or create new challenges?” During this session, experts will explore whether the shared nature of blockchain helps or hinders security.
Ahead of the panel, we caught up with Dave for a Q&A on some of the security questions surrounding blockchain. Let’s get to it!
What are the cybersecurity concerns that you are noticing today?
The greatest cybersecurity concerns I am noticing today are integrating with existing systems, managing cryptographic key material, and providing the network quality of service required to connect blockchain members. Any organization applying blockchain technology to an existing process almost certainly has existing systems that chaincode/smart contracts will have to interact with.
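To make that integration point concrete, here is a minimal chaincode sketch using the Hyperledger Fabric 1.x Go shim. The LegacyBridge type and its record function are hypothetical stand-ins for whatever an organization's existing back-office system would feed onto the ledger:

```go
package main

import (
	"fmt"

	"github.com/hyperledger/fabric/core/chaincode/shim"
	pb "github.com/hyperledger/fabric/protos/peer"
)

// LegacyBridge is a hypothetical chaincode that records identifiers
// issued by an existing back-office system on the ledger.
type LegacyBridge struct{}

func (t *LegacyBridge) Init(stub shim.ChaincodeStubInterface) pb.Response {
	return shim.Success(nil)
}

func (t *LegacyBridge) Invoke(stub shim.ChaincodeStubInterface) pb.Response {
	fn, args := stub.GetFunctionAndParameters()
	if fn == "record" && len(args) == 2 {
		// args[0]: record ID from the legacy system; args[1]: its payload.
		if err := stub.PutState(args[0], []byte(args[1])); err != nil {
			return shim.Error(err.Error())
		}
		return shim.Success(nil)
	}
	return shim.Error("unknown function")
}

func main() {
	if err := shim.Start(new(LegacyBridge)); err != nil {
		fmt.Printf("chaincode start error: %v\n", err)
	}
}
```

Even this toy shows where the integration friction lives: the data crossing PutState has to come from, and stay consistent with, systems that predate the blockchain.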
This article was sponsored by Intel and written by Linux.com.
The Open Source Summit Europe conference opens its doors on Oct. 23 in Prague this year. Ahead of the gathering of attendees and presenters representing all walks of Linux communities, Linux.com interviewed industry leaders on some of the top emerging trends and issues of the day. Among those is edge computing, on which Imad Sousou, vice president of the Software and Services Group and general manager of the Open Source Technology Center at Intel Corporation, sheds considerable light in this interview. He is also a keynote speaker, addressing this very topic at the event on Tuesday, Oct. 24.
The connected world holds a lot of promise—and an equal measure of complexity. That promise may seem like a long way from reality given the current mix of immature products and technologies. So how do we get there?
The industry is increasingly talking about edge computing as one way to fulfill this promise. We talked with Sousou about this new approach and how Intel is helping bring it to market.
Linux.com: Intel is talking more and more about the “edge to cloud” computing landscape. What does this mean and why is it important?
Imad Sousou: Until now, Intel has talked about devices, and we’ve talked about cloud. I think it’s important that we start to talk about them together. In the next few years, the industry can expect billions of devices to be connected to each other and to the cloud, generating massive amounts of data and putting a strain on bandwidth, no matter how much optimization work we do. These bandwidth constraints create a need for devices at the edge of the network to do processing and computation—we’re talking potentially about everything from lightbulbs and appliances to manufacturing lines, medical equipment and cars. These devices will soon need to create, transmit, store, process and act upon data in real time, locally. This “edge computing” pushes intelligence to the edge of the network, making the promise of smart cities, intelligent factories, and connected hospitals possible.
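As a toy illustration of that local processing, the Go sketch below has a hypothetical edge node aggregate a batch of simulated sensor readings and send only a compact summary upstream; the threshold and sample counts are invented:

```go
package main

import (
	"fmt"
	"math/rand"
)

// summary is the compact record an edge node might send upstream
// instead of every raw sample.
type summary struct {
	n        int     // samples processed locally
	mean     float64 // average reading
	overTemp int     // samples above a hypothetical alert threshold
}

func main() {
	const threshold = 80.0 // invented alert threshold
	var s summary
	total := 0.0

	// Simulate a batch of raw sensor readings handled at the edge.
	for i := 0; i < 1000; i++ {
		reading := 60 + rand.Float64()*30
		total += reading
		s.n++
		if reading > threshold {
			s.overTemp++
		}
	}
	s.mean = total / float64(s.n)

	// Only this summary, not 1,000 raw samples, crosses the network.
	fmt.Printf("sending summary upstream: %+v\n", s)
}
```

The bandwidth math is the point: one small struct crosses the network where a thousand raw readings would have.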
Linux.com: As edge computing emerges, what challenges does Intel anticipate?
Sousou: With compute-intensive work moving closer to the edge, devices will need to process and act intelligently on massive amounts of data in real time. Computing performance is essential to ensuring the data and system integrity, reliability, and responsiveness needed to make this smart, connected world a reality. We are also seeing that as more devices connect to each other and to the network, security becomes a bigger concern.
Linux.com: Given these challenges, how is Intel helping to address them?
Sousou: Today, we are looking at approaches, technologies and best practices developed for—and proven in—the cloud, and figuring out ways to use them at the edge. Containers and orchestration are great examples of this. With edge devices specifically, responsiveness and security are key. Container technologies, such as Intel® Clear Containers, matched with hardware-based security offered by Intel® architecture, can help meet the speed and security requirements of both data centers and edge devices.
In a world of self-driving cars, drones, and industrial robotics, intelligence, real-time processing and quick decision-making remain critical. Secure, lightweight, open orchestration solutions like Intel® Cloud Integrated Advanced Orchestrator (Ciao) can coordinate deployment of containers, virtual machines and Kubernetes-based clusters across multiple nodes with speed, scalability and flexibility.
Linux.com: We’ve talked about computing across this landscape. Where does open source fit in?
Sousou: Open source really makes the promise of a smart, connected world possible. Not only has open source proven to be a viable development model, it is driving much of today’s innovation. Just as the Internet would not have been possible without the access, scale and affordability that open source provides, the world of connected devices is difficult to imagine without open source.
For example, consider communication between connected edge devices. These devices need to recognize each other, and they need a common way of exchanging data quickly and securely. Intel, along with industry-leading companies including Cisco, LG, Microsoft and Samsung, is driving the Open Connectivity Foundation efforts to help standardize how these devices will interact. This open source implementation and certification program can allow devices to communicate regardless of form factor, operating system, service provider or ecosystem.
Linux.com: Is there anything else that you would like to share with readers?
Sousou: I’ve spent most of my career in open source. I still get really excited by the innovations coming out of our community. The great thing about open source is how everyone can learn from each other, evolve and grow. I am confident that together we’ll meet the challenges in connecting the 50 billion devices we expect to be online by 2020, and creating powerful edge solutions.
For those at the Open Source Summit Europe in Prague on Tuesday, Oct. 24, I hope you’ll attend my keynote to learn more about the future of edge computing and where Intel is investing. You can also stop by the Intel booth to see in person what we’re doing to help enable smarter, connected edge devices today.
There’s been a lot of adoption of Kubernetes in the last few years, and as of Oct. 17 the open-source container orchestration technology has one more supporter. Docker Inc. announced at its DockerCon EU conference that it is expanding its Docker platform to support Kubernetes.
Docker had been directly competing against Kubernetes with its Swarm container orchestration system since 2015. The plan now is to provide a seamless platform that supports a heterogeneous deployment that can include both Swarm and Kubernetes clusters.
“Docker adapts to you because it’s open,” Docker founder Solomon Hykes said during his keynote address at DockerCon.
Last year, the Kubernetes project introduced its Container Runtime Interface (CRI) — a plugin interface that gives kubelet (a cluster node agent used to create pods and start containers) the ability to use different OCI-compliant container runtimes, without needing to recompile Kubernetes. Building on that work, the CRI-O project (originally known as OCID) is ready to provide a lightweight runtime for Kubernetes.
So what does this really mean?
CRI-O allows you to run containers directly from Kubernetes – without any unnecessary code or tooling. As long as the container is OCI-compliant, CRI-O can run it, cutting out extraneous tooling and allowing containers to do what they do best: fuel your next-generation cloud-native applications.
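To give a feel for the boundary CRI draws, the following Go sketch paraphrases a few of the calls kubelet makes through that interface. The real definitions are gRPC/protobuf messages in the Kubernetes tree; these trimmed signatures and config types are illustrative only:

```go
package main

// RuntimeService paraphrases part of the CRI contract: any runtime
// (Docker via a shim, CRI-O, and so on) that answers these calls can
// run pods for kubelet. The real interface is generated from protobuf
// and carries far richer request/response types than shown here.
type RuntimeService interface {
	// RunPodSandbox sets up the pod's shared environment
	// (network namespace, cgroups) and returns its ID.
	RunPodSandbox(config PodSandboxConfig) (string, error)
	// CreateContainer creates a container inside a sandbox
	// from an OCI-compliant image.
	CreateContainer(sandboxID string, config ContainerConfig) (string, error)
	// StartContainer starts a previously created container.
	StartContainer(containerID string) error
}

// Placeholder types standing in for the protobuf messages.
type PodSandboxConfig struct{ Name string }
type ContainerConfig struct{ Image string }

func main() {} // the interface shape, not execution, is the point here
```

Because CRI-O implements this contract directly on top of OCI runtimes, kubelet neither knows nor cares which runtime sits on the other side.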
Announcing the fifth release candidate for the Linux kernel version 4.14, Linus Torvalds has revealed that fuzzing is producing a steady stream of security fixes.
Fuzzing involves stress testing a system by generating random inputs or code to induce errors, which in turn may help identify potential security flaws. Fuzzing is helping software developers catch bugs before shipping software to users.
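In spirit, a fuzzer can be as simple as the Go sketch below: hammer a target with random bytes and flag any input that makes it crash. The parse function is a deliberately buggy stand-in invented for this example; real kernel fuzzers target interfaces such as system calls rather than a toy parser:

```go
package main

import (
	"fmt"
	"math/rand"
)

// parse stands in for the code under test. Its planted bug: it reads
// past the buffer whenever the first byte is 0xFF and the input is
// shorter than 10 bytes.
func parse(input []byte) byte {
	if len(input) > 0 && input[0] == 0xFF {
		return input[9] // panics (index out of range) on short inputs
	}
	return 0
}

// crashed runs parse on one input and reports whether it panicked.
func crashed(buf []byte) (c bool) {
	defer func() {
		if recover() != nil {
			c = true
		}
	}()
	parse(buf)
	return false
}

func main() {
	for i := 0; i < 100000; i++ {
		buf := make([]byte, rand.Intn(16))
		rand.Read(buf) // random input: the essence of fuzzing
		if crashed(buf) {
			fmt.Printf("crash on iteration %d, input %x\n", i, buf)
			return
		}
	}
	fmt.Println("no crashes found")
}
```

Kernel fuzzers add sophistication on top, such as coverage guidance and syscall grammars, but the core loop is exactly this: random input in, crash report out.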
As Torvalds points out, Linux kernel developers have been using fuzzing programs from the beginning, with tools such as “crashme,” which was released in 1991 and, nearly 20 years later, was used by Google security researcher Tavis Ormandy to test how well shielded a host is when untrusted data is being processed in a virtual machine.
Ken Parmelee, who leads the API gateway effort for IBM as well as Big Blue’s open source projects, has a few ideas about open source methods for “attacking” the API and how to create microservices and make them scale.
“Microservices and APIs are products, and we need to be thinking about them that way,” Parmelee says. “As you start to put them up, people rely on them as part of their business. That’s a key aspect of what you’re doing in this space.”
Anyone can try out these serverless APIs in just 30 seconds at https://console.bluemix.net/openwhisk/. “This sounds very gimmicky, but it is that easy to do… We’re combining the work we’ve done with Cloud Foundry and releasing it in Bluemix under OpenWhisk to provide security and scalability.”
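For readers who would rather script that try-it-in-30-seconds flow than click through the console, a blocking OpenWhisk invocation is one authenticated REST call. The Go sketch below assumes the Bluemix OpenWhisk endpoint of the time, and the namespace, action name, and credentials are placeholders to replace with your own:

```go
package main

import (
	"fmt"
	"io/ioutil"
	"net/http"
	"strings"
)

func main() {
	// Placeholders: substitute your own namespace, action name, and
	// the user:key pair reported by `wsk property get --auth`.
	url := "https://openwhisk.ng.bluemix.net/api/v1/namespaces/_/actions/hello?blocking=true&result=true"
	body := strings.NewReader(`{"name": "world"}`)

	req, err := http.NewRequest("POST", url, body)
	if err != nil {
		panic(err)
	}
	req.SetBasicAuth("USER_UUID", "API_KEY") // placeholder credentials
	req.Header.Set("Content-Type", "application/json")

	resp, err := http.DefaultClient.Do(req)
	if err != nil {
		panic(err)
	}
	defer resp.Body.Close()

	out, _ := ioutil.ReadAll(resp.Body)
	fmt.Printf("%s: %s\n", resp.Status, out)
}
```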
Researchers have disclosed a serious weakness in the WPA2 protocol that allows attackers within range of a vulnerable device or access point to intercept passwords, e-mails, and other data presumed to be encrypted, and in some cases to inject ransomware or other malicious content into a website a client is visiting.
The proof-of-concept exploit is called KRACK, short for Key Reinstallation Attacks. The research has been a closely guarded secret for weeks ahead of a coordinated disclosure scheduled for 8am Monday, East Coast time. A website disclosing the vulnerability said it affects the core WPA2 protocol itself and is effective against devices running Android, Linux, and OpenBSD, and to a lesser extent macOS and Windows, as well as MediaTek, Linksys, and other types of devices. The site warned that attackers can exploit the flaw to decrypt a wealth of sensitive data that’s normally encrypted by the nearly ubiquitous Wi-Fi encryption protocol.
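KRACK’s key-reinstallation trick ultimately forces a nonce, and therefore a keystream, to be reused. The Go sketch below is not the attack itself; it only demonstrates, with random bytes standing in for CCMP output, why keystream reuse is fatal for any stream-style cipher: XORing two ciphertexts encrypted under the same keystream cancels the keystream entirely:

```go
package main

import (
	"crypto/rand"
	"fmt"
)

// xor combines two equal-length byte slices bytewise.
func xor(a, b []byte) []byte {
	out := make([]byte, len(a))
	for i := range a {
		out[i] = a[i] ^ b[i]
	}
	return out
}

func main() {
	p1 := []byte("transfer $100 to alice") // 22 bytes
	p2 := []byte("my wifi password: hunt") // 22 bytes

	// One keystream reused for both messages, which is the situation
	// a key reinstallation creates. Random bytes stand in for CCMP.
	ks := make([]byte, len(p1))
	rand.Read(ks)

	c1 := xor(p1, ks)
	c2 := xor(p2, ks)

	// An eavesdropper computes c1 XOR c2 = p1 XOR p2: the keystream
	// cancels, so knowing or guessing one plaintext reveals the other.
	leaked := xor(c1, c2)
	fmt.Printf("p1 XOR p2:    %x\n", leaked)
	fmt.Printf("recovered p2: %s\n", xor(leaked, p1))
}
```

No key is ever recovered; confidentiality collapses from nonce reuse alone, which is why traffic can be decrypted without breaking the WPA2 ciphers themselves.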
Going to Open Source Summit EU in Prague? While you’re there, be sure to stop by The Linux Foundation training booth for fun giveaways and a chance to win one of three Raspberry Pi kits.
Giveaways include The Linux Foundation branded webcam covers, The Linux Foundation projects’ stickers, Tux stickers, Linux.com stickers, as well as free ebooks: The SysAdmin’s Essential Guide to Linux Workstation Security, Practical GPL Compliance, and A Guide to Understanding OPNFV & NFV.
You can also enter the raffle for a chance to win a Raspberry Pi kit. There will be three raffle winners; names will be drawn and prizes will be mailed on Nov. 2.
DevOps is among the skills employers are most eager to find, sought by 57 percent of respondents in the 2017 Open Source Jobs Report from Dice and The Linux Foundation. Specifically, firms are looking for developers (73 percent) and DevOps engineers (60 percent).
This comes as no surprise, given that DevOps professionals bring a blend of development and operations skills, enabling organizations to create a more efficient and collaborative working environment. Unlike the system administrator role from which DevOps evolved, DevOps demands greater flexibility. Consequently, DevOps professionals often wear different hats, tasked with multiple responsibilities that include designing and maintaining systems as well as making software development more efficient.
In addition to being versed in the latest enterprise technologies, DevOps professionals “also have the soft skills necessary to operate collaboratively across any given organization,’’ according to Dice’s annual Salary Survey for 2017. “They can finesse their way through every stage of the software-development lifecycle, all the way through implementation.”
The DevOps role requires not only writing an application, but also understanding how the code operates in production and maintaining it, being mindful of things like performance and stability. It emphasizes the ability to both communicate and collaborate.
“A working DevOps pipeline is a thing of beauty to behold, and helps bring the vision of truly automated end-to-end application automation to life,’’ writes Bernard Golden in enterprise.nxt.
Overall, respondents say the use of open source technologies is becoming increasingly important to business strategy (42 percent), with 58 percent of hiring managers stating they will hire more open source professionals in the next several months, the Open Source Jobs Report finds.
Yet, 89 percent of hiring managers say it is difficult to find the right mix of experience and skills, a similar finding to last year’s 87 percent, according to the report. Meanwhile, 86 percent of open source professionals believe that knowing open source has advanced their career, and 52 percent say it would be easy to find another job. Only 27 percent report they have not received a recruiting call in the past six months.
Emphasis on the cloud and open source
Among the other notable findings in the Open Source Jobs Report is that Linux increasingly runs underneath work involving cloud and DevOps. Chef, Puppet, and Ansible are the most popular DevOps tools; all were created as open source, with Windows support added later.
As more and more workloads and applications move into the cloud, demand is growing for skills in cloud administration, DevOps and continuous integration/continuous delivery. This is also fueling greater interest in training and certifications related to open source projects and tools that power the cloud, the report notes.
Further illustrating the predominance of Linux in the cloud is the fact that the system underpins Google’s and Amazon’s public clouds. And, Microsoft has said that about 30 percent of Azure instances also run on Linux, with the percentage rising to as much as 50 percent on new workloads. The software giant continues to add to its open source footprint with Linux container and Kubernetes tools.
Agile development and DevOps are commonly associated with cloud computing because both require immediate infrastructure availability, says Golden.
DevOps salaries paying off
DevOps positions command some of the highest paying salaries in tech, the 2017 Dice survey states. The starting salary for professionals with experience working with the Ansible platform is $121,382, according to Dice. The starting salary for Puppet knowledge is $112,883 and for Chef, $112,523.
The highest-paying DevOps skills are generally focused on automation and configuration management, the Dice salary survey finds. This makes sense, the survey points out, given that platforms such as Ansible automate tasks like software provisioning, ensuring that DevOps professionals can perform their job regardless of the size of the organization.
Looking ahead
The good news is demand for DevOps engineers in North America is expected to continue to be high next year, according to the 2018 Robert Half Salary Guide for Technology Professionals. This is especially true if candidates have solid communication and interpersonal skills, the firm notes.
The report finds the hot vertical industries for next year are healthcare, financial services, and manufacturing.
Download the full 2017 Open Source Jobs Report now.
In my experience, metrics serve three main functions: to increase awareness, to lead change, and to motivate.
Awareness helps you understand where you are in relation to specific policies and goals. For example, if you don’t know how many project contributions were made by under-represented minorities, you cannot determine whether workplace policies that aim to create a more inclusive and diverse work environment are successful.
Leading change focuses on determining a path. If a particular policy is implemented, for example, metrics will indicate whether KPIs increase or decrease.
Motivational actions help communities attract developers and help members achieve goals. For example, many communities reward developers who detect bugs in beta products. This benefits the community in two ways: The bugs are fixed, and looking for bugs becomes a priority for community members.
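As a small worked example of the awareness function, here is a Go sketch that computes each contributor group’s share of total contributions; the group labels and counts are invented for illustration:

```go
package main

import "fmt"

// contribution pairs a contributor-group label with a commit count.
// Both the labels and the numbers here are invented for illustration.
type contribution struct {
	group string
	count int
}

func main() {
	log := []contribution{
		{"group A", 420},
		{"group B", 135},
		{"group C", 45},
	}

	total := 0
	for _, c := range log {
		total += c.count
	}

	// Awareness in practice: without these percentages you cannot
	// tell whether an inclusion policy is moving the numbers at all.
	for _, c := range log {
		fmt.Printf("%-8s %5.1f%% of contributions\n",
			c.group, 100*float64(c.count)/float64(total))
	}
}
```

Tracked release over release, the same percentages also serve the leading-change function: they show whether a given policy actually moved the KPI.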