
At the Crossroads of Open Source and Open Standards

This piece is the first in a series from speakers and sponsors at the Linux Foundation’s Node+JS Interactive (formerly JS Interactive) conference, taking place October 10-12, 2018 at the Vancouver Convention Centre in Vancouver, Canada. The program will cover a broad spectrum of the JavaScript ecosystem, including Node.js, frameworks, best practices, and stories from successful end users.

A new crop of high-value open source software projects stands ready to make a big impact in enterprise production, but structural issues like governance, IPR, and long-term maintenance plague OSS communities at every turn. Meanwhile, facing significant pressures from open source software and the industry groups that support them, standards development organizations are fighting harder than ever to retain members and publish innovative standards. What can these two vastly different philosophies learn from each other, and can they do it in time to ensure they remain relevant for the next 10 years?

Read more at The New Stack

Blockchain Training Takes Off

At major business schools ranging from Berkeley to Wharton, students are flocking to classes on blockchain and cryptocurrency. As CNBC recently reported: “According to a new survey of 675 U.S. undergraduate students by cryptocurrency exchange Coinbase and Qriously, 9 percent of students have already taken a class related to blockchain or cryptocurrency and 26 percent want to take one.”

College course offerings include “Blockchain, Cryptocurrency, and Distributed Ledger Technology” taught by Kevin Werbach and engineering professor David Crosbie at the University of Pennsylvania; and “Blockchain and CryptoEconomics,” taught by computer science professor Dawn Song at the University of California at Berkeley.

Meanwhile, job postings related to blockchain and Hyperledger are taking off, and knowledge in these areas is translating into opportunity. Careers website Glassdoor lists thousands of job posts related to blockchain.

Effectively, blockchain is becoming part of the required lingua franca for those entering the world of business, among others. Outside of the big business schools, there are many learning resources worth knowing about, including these courses offered by The Linux Foundation:

Hyperledger Fabric Fundamentals (LFD271)

Teaches the fundamental concepts of blockchain and distributed ledger technologies.

Blockchain for Business – An Introduction to Hyperledger Technologies (LFS171)

A primer to blockchain and distributed ledger technologies. Learn how to start building blockchain applications with Hyperledger frameworks.

“In the span of only a year or two, blockchain has gone from something seen only as related to cryptocurrencies to a necessity for businesses across a wide variety of industries,” said The Linux Foundation’s Clyde Seepersad, General Manager, Training & Certification, in introducing the course Blockchain: Understanding its Uses and Implications. “Providing a free introductory course designed not only for technical staff but business professionals will help improve understanding of this important technology, while offering a certificate program through edX will enable professionals from all over the world to clearly demonstrate their expertise.”

Aside from full courses, webinars focusing on blockchain technology offer chances to see how individual technologies work and how industry segments are being influenced by blockchain. On Wednesday, September 26, at 9 a.m. Pacific, you can tune into “A Hitchhiker’s Guide to Deploying Hyperledger Fabric on Kubernetes,” a free webinar presented by Alejandro (Sasha) Vicente Grabovetsky and Nicola Paoli of AID:Tech. It’s ideal for DevOps workers and others interested in the increasingly popular Hyperledger Fabric platform.

Conferences also provide good learning opportunities. The Open FinTech Forum in New York City, coming up October 10 and 11, will provide a great opportunity to hear about the latest distributed ledger deployments, use cases, trends, and predictions of blockchain adoption. Panel discussions are scheduled to cover:

  • Distributed Ledger Technology Deployments & Use Cases in Financial Services

  • Enterprise Blockchain Adoption – Trends and Predictions

  • Blockchain Based Compliance Management Systems

Taking advantage of these opportunities to learn about blockchain makes more sense than ever.

 

Why the Future of Data Storage is (Still) Magnetic Tape

Studies show [PDF] that the amount of data being recorded is increasing at 30 to 40 percent per year. At the same time, the capacity of modern hard drives, which are used to store most of this, is increasing at less than half that rate. Fortunately, much of this information doesn’t need to be accessed instantly. And for such things, magnetic tape is the perfect solution. …

Indeed, much of the world’s data is still kept on tape, including data for basic science, such as particle physics and radio astronomy, human heritage and national archives, major motion pictures, banking, insurance, oil exploration, and more. There is even a cadre of people (including me, trained in materials science, engineering, or physics) whose job it is to keep improving tape storage.

Tape has survived for as long as it has for one fundamental reason: It’s cheap. And it’s getting cheaper all the time. But will that always be the case?

You might expect that if the ability to cram ever more data onto magnetic disks is diminishing, so too must this be true for tape, which uses the same basic technology but is even older. The surprising reality is that for tape, this scaling up in capacity is showing no signs of slowing. Indeed, it should continue for many more years at its historical rate of about 33 percent per year, meaning that you can expect a doubling in capacity roughly every two to three years. Think of it as a Moore’s Law for magnetic tape.
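The doubling claim follows directly from compound growth: if capacity grows by a fixed fraction each year, the doubling time is ln(2)/ln(1 + rate). A minimal sketch, using the ~33 percent annual growth rate cited above (the rate is the only input; everything else is arithmetic):

```python
import math

annual_growth = 0.33  # historical tape capacity growth rate, ~33% per year

# Compound growth: (1 + r)^t = 2  =>  t = ln(2) / ln(1 + r)
doubling_time = math.log(2) / math.log(1 + annual_growth)

print(f"Doubling time at 33%/yr: {doubling_time:.2f} years")  # ~2.43 years
```

At 33 percent per year the doubling time works out to roughly 2.4 years, consistent with the "every two to three years" figure in the article.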

Read more at IEEE Spectrum

Linux on Windows 10: Running Ubuntu VMs Just Got a Lot Easier, Says Microsoft

Ubuntu maintainer Canonical and Microsoft have teamed up to release an optimized Ubuntu Desktop image that’s available through Microsoft’s Hyper-V gallery.

The Ubuntu Desktop image should deliver a better experience when running it as a guest on a Windows 10 Pro host, according to Canonical. The optimized version is the Ubuntu Desktop 18.04.1 LTS release, also known as Bionic Beaver.

Microsoft’s work with Canonical was prompted by its users who wanted a “first-class experience” on Linux virtual machines (VMs) as well as Windows VMs. To achieve this goal, Microsoft worked with the developers of XRDP, an open-source remote-desktop protocol (RDP) for Linux based on Microsoft’s RDP for Windows.

Read more at ZDNet

Learn more about Linux on Windows here.

A Deep Dive Into Data Lakes

In the age of Big Data, we’ve had to come up with new terms to describe large-scale data storage. We have databases, data warehouses and now data lakes.

While they all contain data, these terms describe different ways of storing and using that data. Before we discuss data lakes and why they are important, let’s examine how they differ from databases and data warehouses.

Let’s start here: A data warehouse is not a database. Although you could argue that they’re both relational data systems, they serve different purposes. Data warehousing allows you to pull data together from a number of different sources for analysis and reporting. Data warehouses store vast amounts of historical data for complex queries across all data types being pulled together.

Data lakes are centralized storage and data repositories that allow you to work with a variety of different types of data. The cool thing here is that you don’t need to structure the data and it can be imported “as-is.” This allows you to work with raw data and run analytics, data visualization, big data processing, machine learning tools, AI, and much more. This level of data agility can actually give you some pretty cool competitive advantages.
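The "as-is" import described above is often called schema-on-read: no structure is imposed when data lands in the lake, and shape is applied only at analysis time. A minimal sketch of the idea, using hypothetical event records (the field names and values are invented for illustration):

```python
import json

# Hypothetical raw records landed "as-is" in a data lake.
# Note the records do not share one schema -- that's the point.
raw_records = [
    '{"user": "alice", "event": "login", "ts": "2018-09-01T10:00:00Z"}',
    '{"user": "bob", "event": "purchase", "amount": 42.50}',
]

# Structure is applied only when the data is read for analysis:
events = [json.loads(r) for r in raw_records]
purchases = [e for e in events if e.get("event") == "purchase"]
total = sum(e["amount"] for e in purchases)

print(total)  # 42.5
```

Contrast this with a warehouse, where both records would have to be transformed into one agreed table schema before loading.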

 

Read more at Datacenter Frontier

The (Awesome) Economics of Open Source

By lowering barriers to innovation, open source is superior to proprietary solutions for enabling continued positive economic growth. …

Successful open source software companies “discover” markets where transaction costs far outweigh all other costs, outcompete the proprietary alternatives for all the good reasons that even the economic nay-sayers already concede (e.g., open source is simply a better development model to create and maintain higher-quality, more rapidly innovative software than the finite limits of proprietary software), and then—and this is the important bit—help clients achieve strategic objectives using open source as a platform for their own innovation. With open source, better/faster/cheaper by itself is available for the low, low price of zero dollars.

As an open source company, we don’t cry about that. Instead, we look at how open source might create a new inflection point that fundamentally changes the economics of existing markets or how it might create entirely new and more valuable markets.

Read more at OpenSource.com 

How IBM Is Using Open Source for a Greater Good

Dr. Angel Diaz is the face of open source at IBM as Vice President of Developer Technology, Open Source & Advocacy. At the recent Open Source Summit in Vancouver, we spoke with Diaz to talk about the importance of open source at IBM and how it’s changing the world around us.

LF: What’s the importance of open source in the modern economy?

Angel Diaz: We are living in a technology-fueled business renaissance — cloud, data, artificial intelligence, and the redefinition of the transaction. There is constant democratization of technology. This democratization allows us as computer scientists to innovate higher orders of the stack. You don’t have to worry about compute, storage and network; you get that in the cloud for example, but what has been driving that democratization? Open source.

Open source has been the fuel, the innovation engine, the skills engine, the level playing field that allows us as a society to build more, to build faster and move forward and the rate and pace of that is increasing.

What’s really nice about that is we are doing it in a controlled way with open governance, leveraging all the work that we do in consortia such as the Linux Foundation.

Read more at The Linux Foundation

TNS Context: The CNCF Open Source Survey and the Ballerina Programming Language

Today on The New Stack Context podcast, we talk with Chris Aniszczyk, co-founder of the TODO Group and Chief Technology Officer of the Cloud Native Computing Foundation (CNCF) about the results of our recent open source program management survey. 

This week, we released the results of a survey on how companies are managing open source software and the benefits and challenges of formal policies and programs. 

We surveyed more than 700 respondents, most of them developers, and found that:

  • More than half of respondents (53 percent) across all industries say their organization has an open source software program or has plans to establish one.
  • But having a formal program is a best practice found primarily among large tech companies.
  • And the number of programs is growing: we expect the number of large companies with open source programs to triple by 2020.

Read more at The New Stack

Quantum Computing and Cryptography

Quantum computing is a new way of computing — one that could allow humankind to perform computations that are simply impossible using today’s computing technologies. It allows for very fast searching, something that would break some of the encryption algorithms we use today. And it allows us to easily factor large numbers, something that would break the RSA cryptosystem for any key length.

This is why cryptographers are hard at work designing and analyzing “quantum-resistant” public-key algorithms. Currently, quantum computing is too nascent for cryptographers to be sure of what is secure and what isn’t. But even assuming aliens have developed the technology to its full potential, quantum computing doesn’t spell the end of the world for cryptography. Symmetric cryptography is easy to make quantum-resistant, and we’re working on quantum-resistant public-key algorithms. If public-key cryptography ends up being a temporary anomaly based on our mathematical knowledge and computational ability, we’ll still survive. And if some inconceivable alien technology can break all of cryptography, we still can have secrecy based on information theory — albeit with significant loss of capability.

At its core, cryptography relies on the mathematical quirk that some things are easier to do than to undo. Just as it’s easier to smash a plate than to glue all the pieces back together, it’s much easier to multiply two prime numbers together to obtain one large number than it is to factor that large number back into two prime numbers. Asymmetries of this kind — one-way functions and trap-door one-way functions — underlie all of cryptography.
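The multiply-versus-factor asymmetry can be made concrete with a toy example (deliberately tiny; real RSA moduli are ~2048 bits, far beyond any brute-force search):

```python
# Toy illustration of a one-way function -- NOT real cryptography.
# Forward direction: multiplying two primes is a single operation.
p, q = 104723, 104729   # two small primes for illustration
n = p * q               # "easy" to compute

def factor(n):
    """Naive trial division. Cost grows with sqrt(n), so this brute-force
    search becomes utterly infeasible at cryptographic key sizes."""
    d = 2
    while d * d <= n:
        if n % d == 0:
            return d, n // d
        d += 1
    return None  # n is prime

# Reverse direction: recovering p and q requires ~100,000 trial divisions
# even for this tiny n.
print(factor(n))  # (104723, 104729)
```

Quantum computing threatens exactly this asymmetry: Shor’s algorithm factors efficiently, which is why the factoring-based RSA trap door fails against it while symmetric ciphers largely do not.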

Read more at Schneier on Security

Linux For Beginners: What’s A Desktop Environment?

As I continue my journey into the world of Linux, I’ve realized that one of its most distinct advantages over Windows and macOS can also be one of its most confusing hurdles for beginners. Choice has a tendency to be overwhelming, and Linux is all about choice. Let’s say you’ve set your sights on using Ubuntu. That’s a safe decision, but there are eight official “flavors” of Ubuntu that all look and behave differently. For the most part, that comes down to which desktop environment each distribution is using… Read more at Forbes.