GPUs such as those from Intel and Vivante support storing the contents of graphical buffers in different formats. Support for describing these formats using modifiers has now been added to Android and Mesa, enabling Android to run free of tiling artifacts on the iMX6 platform.
Written by Robert Foss, Software Engineer at Collabora.
With modifier support added to Mesa and gbm_gralloc, it is now possible to boot Android on iMX6 platforms without any proprietary blobs. This makes the iMX6 one of the very few embedded SoCs that can run entirely blob-free.
Not only is that a great win for Open Source in general, but it also makes the iMX6 more attractive as a platform. A further positive point is that this lays the groundwork for the iMX8 platform, making it much easier to support.
What are modifiers used for?
Modifiers are used to represent different properties of buffers. These properties can cover a range of different information about a buffer, such as compression and tiling.
In the case of the iMX6 and the Vivante GPU it is equipped with, the modifiers relate to tiling: buffers can be tiled in different ways (Tiled, Super Tiled, etc.) or not at all (Linear). Before buffers are sent out to a display, the associated tiling information needs to be made available, so that the image actually being sent out is not tiled.
This of course raises the question “Why use tiling at all?”, to which the short answer is power efficiency, which is very desirable in the embedded as well as the mobile space.
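For the curious, the kernel expresses these modifiers as 64-bit values: the top 8 bits identify the vendor and the remaining 56 bits carry a vendor-specific code. The following is a minimal Python sketch of that encoding; the vendor ID and Vivante modifier values shown here are taken from the kernel's drm_fourcc.h header, which remains the authoritative source.

```python
# DRM format modifiers are 64-bit values: the high 8 bits hold a
# vendor ID and the low 56 bits a vendor-specific code
# (see include/uapi/drm/drm_fourcc.h in the Linux kernel).
DRM_FORMAT_MOD_VENDOR_NONE = 0x00
DRM_FORMAT_MOD_VENDOR_VIVANTE = 0x06

def fourcc_mod_code(vendor: int, val: int) -> int:
    """Pack a vendor ID and a vendor-specific value into a modifier."""
    return (vendor << 56) | (val & 0x00FFFFFFFFFFFFFF)

# Linear means no tiling at all: scanlines are stored one after another.
DRM_FORMAT_MOD_LINEAR = fourcc_mod_code(DRM_FORMAT_MOD_VENDOR_NONE, 0)

# Vivante tiling layouts used on the iMX6.
DRM_FORMAT_MOD_VIVANTE_TILED = fourcc_mod_code(DRM_FORMAT_MOD_VENDOR_VIVANTE, 1)
DRM_FORMAT_MOD_VIVANTE_SUPER_TILED = fourcc_mod_code(DRM_FORMAT_MOD_VENDOR_VIVANTE, 2)

def vendor_of(modifier: int) -> int:
    """Extract the vendor ID from a modifier."""
    return modifier >> 56
```

Because the vendor ID travels with the modifier, a display controller can tell at a glance whether a buffer handed to it is linear or uses a vendor-specific tiling layout that must be resolved before scanout.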
While the list of enterprise companies that have adopted DevOps successfully continues to grow, many remain convinced that the benefits of continuous delivery are out of reach for larger, more complex organisations.
“The prevailing notion is that DevOps is for startups and the Googles, Amazons and Facebooks of this world, and not for large, complex companies that have been around for decades or even centuries – but that is really not the case,” said Gene Kim, co-author of The DevOps Handbook.
Kim pointed to the high number of enterprises operating in highly regulated environments that have successfully managed to embrace DevOps, including public sector organisations and financial services firms.
While the options for Linux computers from commercial vendors are still needles in the proverbial haystack of OEM Windows equipment out there, there are more and more options available to a consumer who wants a good, solid device that’s ready-to-use with no messing around.
Still, there are more Linux OEM computers than I could look at for one article—and the options tend to be different in Europe than they are in the United States, with providers like Entroware that don’t ship to the latter at all.
In this article, I look at offerings from three of the most well-known Linux OEMs on the western side of the pond: ZaReason, System76, and Dell.
Not many remember it, because the technology industry tends to focus on its future at the expense of its past, but in the beginning software was free, in both senses of the word: it was available at no cost, and the source typically came without restrictions. One of the earliest user groups, SHARE, founded in 1955, maintained a library of users’ patches, fixes and additions to the source code of IBM mainframes, like a proto-GitHub. The modifications SHARE maintained were extensive enough, in fact, that in 1959 SHARE released its own operating system – what we would today refer to as a distribution – the SHARE Operating System (SOS), for IBM 709 hardware.
IBM made the software available at no cost and in source code form because, for the company at that time, the software was not the product; the hardware was. It wasn’t until June 1969 that IBM announced, possibly in response to an antitrust suit filed by the United States Justice Department in January of that year, that it would “unbundle” its software and hardware.
Today, if you’re building a new product or service, open source software is likely playing a role. But many entrepreneurs and product managers still struggle with how to build a successful business purely on open source.
The big secret of a successful open source business is that “it’s about way more than the code,” says John Mark Walker, a well-known voice in the open source world with extensive expertise in open source product, community, and ecosystem creation at Red Hat and Dell EMC. “In order to build a certified, predictable, manageable product that ‘just works,’ it requires a lot more effort than just writing good code.”
It requires a solid understanding of open source business models and the expertise and management skills to take advantage of developing your products in an open source way.
Building a Business on Open Source
In a new ebook, Building a Business on Open Source, The Linux Foundation has partnered with Walker to distill what it takes to create and manage a product or service built on open source software. It starts with an overview of the various business models, then covers the business value of the open source platform itself, and describes how to create a successful open source product and manage the software supply chain behind it.
“If you’re developing software in an open source way, you have options that proprietary developers don’t have,” Walker writes. “You can deliver better software more efficiently that is more responsive to customer needs — if you do it well and apply best practices.”
The Value of the Open Source Platform
As open source has become more prevalent, it has changed the way products are developed. Walker describes the unique challenges and questions raised by adopting an open source approach, including questions of sustainability, accountability, and monetization.
Walker admits that Red Hat remains the only company that has been successful with a pure open source business model (without being acquired). Many companies are still pursuing a similar model selling open source software, but other models around open source exist, including the venture-capitalist’s favorite open core model, a services and support model, and a hybrid model that mixes open source code with proprietary components.
In discussing the difference between open core and hybrid business models, Walker says his biggest problem with them is that they both assume there is no intrinsic value in the platform itself.
“I am not discounting the added value of proprietary software on top of open source platforms; I am suggesting that the open source platforms themselves are inherently valuable and can be sold as products in their own right, if done correctly,” Walker states.
“If you begin with the premise that open source platforms have great value, and you sell that value in the form of a certified software product, that’s just a starting point. The key is that you’re selling a certified version of an open source platform and from there, it’s up to you how to structure your product approach,” he continues.
What’s emerging now is a new “open platform model,” in which the open source platform itself is sold in the form of a certified product. It may include proprietary add-ons, but derives most of its value from the platform.
A Messy Business
Creating a business purely around an open source platform requires new thinking, and a new process. It’s difficult to turn the code that’s available to everyone for free into a product that just works and can be used at scale.
“Creating a product is a messy, messy business. There are multiple layers of QA, QE, and UX/UI design that, after years of effort, may result in something that somewhat resembles a usable product,” writes Walker.
Walker explains the distinction between an open source project and a product that’s based on that project. He points out that “creating, marketing and selling a product is no different in the open source space from any other endeavor.”
He details the process of making a product out of an open source project; it’s not nearly as easy as packaging the code up and charging for it.
Mastering the Supply Chain
Part two of the ebook covers more advanced topics, including the management of open source software supply chains, which offers some unique challenges.
“A well-managed supply chain is crucial to business success. How do you determine ideal pricing, build relationships with the right manufacturers, and maximize the efficiency of your supply chain so you’re able to produce more products cheaply and sell more of them?” asks Walker.
“One potential conclusion is that to be successful at open source products, you must master the ability to influence and manage the sundry supply chains that ultimately come together in the product creation process,” he says.
In the final chapter, Walker takes a deep dive into the importance of being an influencer of the supply chain. He shares best practices for evaluating supply chain components and gives examples of companies like Red Hat, whose “upstream first” policy plays a big role in making it an influencer of the supply chain.
The crux is, “To get the most benefit from the open source software supply chain, you must be the open source software supply chain.”
Conclusion
It might sound easy to take some free source code, package it up, and create a product out of it. In reality, it’s a very challenging job. But if you do it right, an open source approach offers immense benefits that are unmatched in the closed source world.
That’s exactly what this book is all about: doing it right. The methodologies and processes detailed by Walker will help companies, managers, and developers adopt best practices to create valuable open source products as open source business models shift yet again.
Containers are becoming the de facto approach for deploying applications, because they are easy to use and cost-effective. With containers, you can significantly cut down the time to go to market if the entire team responsible for the application lifecycle is involved — whether they are developers, Quality Assurance engineers, or Ops engineers.
The new Containers for Developers and Quality Assurance (LFS254) self-paced course from The Linux Foundation is designed for developers and Quality Assurance engineers who are interested in learning the workflow of an application with Docker. In this self-paced course, we will quickly review some Docker basics, including installation, and then, with the help of a sample application, we will walk through the lifecycle of that application with Docker.
The online course is presented almost entirely in video, and this preview samples some of the topics covered in the course.
In the course, we focus on creating an end-to-end workflow for our application — from development to production. We’ll use Docker as our primary container environment and Jenkins as our primary CI/CD tool. All of the Docker hosts used in this course will be deployed on the cloud (DigitalOcean).
Install Docker
You’ll need to have Docker installed in order to work along with the course materials. All of Docker’s free products come under the Docker Community Edition. They’re offered in two variants: edge and stable. All of the enterprise and production-ready products come under the Docker Enterprise Edition umbrella.
You can download all of the Docker products from the Docker Store. For this course, we will be using the Community Edition, so click on “GET DOCKER CE” to proceed. If you select “Linux” in the “Operating Systems” section, you’ll see that Docker is available for all the major Linux distributions, such as CentOS, Ubuntu, and Fedora. It’s also available for Mac and Windows.
This preview series is intended to give you a sample of the course format and quality of the content, which is prepared and presented by Neependra Khare (@neependra), Founder and Principal Consultant at CloudYuga, Docker Captain, and author of the Docker Cookbook.
There are many benefits of agile software development, including the ability to accelerate growth, foster developer autonomy, and respond to changing customer needs faster, all while creating a company culture that embraces innovation. But, while we’re still bickering over what is precisely agile and what precisely isn’t, some feel left behind. From middle management to whole project management offices, there are many struggling to find their place in an agile transformation.
But there is an argument for the role the project management office (PMO) can play in a company gone agile, according to scrum master Dean Latchana, who gave a talk on this subject to a skeptical audience recently at the AgiNext Conference in London.
Internet of Things (IoT) platforms are becoming a hub for connecting devices, sensors, networks, and services, as well as providing a range of organizations with crucial data so they can drive more cash flow and stay ahead of the competition. In 2016, IoT platforms generated about $2 billion in revenue, a figure expected to grow to $83.4 billion by 2025, according to a study by MachNation.
“When it comes to picking an IoT platform, it’s like the Wild West,” said Strategy Analytics Analyst Chris Ambrosio. “It all depends on what the customer needs, and each platform can be applied to different use cases.”
The IoT platform space is highly fragmented. In fact, some estimates indicate there are more than 350 different dedicated IoT platforms across different industry verticals. SDxCentral has narrowed its list to these 10 companies and platforms that we think are making the most inroads in IoT right now.
It’s one thing to create a great piece of software. It’s quite another to have it make a mark on the entire industry. These are the companies and organizations whose work has had a significant impact on what others build, how they build it, and ultimately, who uses it.
The Open Source Survey is an open data project by GitHub and collaborators from academia, industry, and the broader open source community.
With over 50 questions, the 2017 survey covers a wide range of topics. Below, we highlight some of the most actionable and important insights about the community.
The data described below covers only the random sample sourced from open source repositories on GitHub.com. Percentages are rounded and may not always sum to 100.
Documentation is highly valued, frequently overlooked, and a means for establishing inclusive and accessible communities.
Negative interactions are infrequent but highly visible, with consequences for project activity.
Open source is used by the whole world, but its contributors don’t yet reflect its broad audience.
Using and contributing to open source often happens on the job.
Open source is the default when choosing software.