ODPi Won’t Fork Hadoop, Pledges Support for Apache Software Foundation with New Gold Sponsorship


The folks at the Open Data Platform Initiative (ODPi) have heard the concerns and criticisms of the Hadoop community, and today John Mertic, the standards organization’s Director of Program Management, took the stage at Apache Big Data in Vancouver to clear the air.

Contrary to the Hadoop community’s concerns, ODPi does not want to take over the development of Hadoop, nor does it want to fork Hadoop, Mertic said.

The ODPi wants the big data projects based at the Apache Software Foundation, including Hadoop, to continue to innovate, try new things and splash around in the code base’s pool – digitally speaking, of course.

What ODPi intends to do is keep the companies looking to use Hadoop in production, downstream, from getting wet while the ASF is making waves.

ASF for innovation, ODPi for standards

ODPi was formed last year by dozens of leading tech companies, including Hortonworks, IBM, and Pivotal, as a collaborative project at The Linux Foundation to develop a common reference platform called ODPi Core. The nonprofit has been developing standards around a handful of Apache projects; its Runtime Spec and test suite cover Hadoop, including HDFS, YARN, and MapReduce. The Operations Spec, due out this summer, will focus on Ambari.

To show the organization’s commitment to the ASF’s mission, Mertic announced that ODPi is now a Gold Sponsor of the open source foundation.

“We want the ASF to be the home of innovation,” Mertic said. “What we want to bring to the table is those use cases from industry; [to show] how are people using Hadoop.”

Mertic said that end users and independent software vendors (ISVs) have been frustrated by inconsistencies across the various Hadoop distributions, which has had a dampening effect on investment. He said that while there are 15 generally accepted common components in a Hadoop distribution, the versions of those components often differ from distro to distro.

“Both our organizations are laser-focused – we want a stronger Hadoop ecosystem,” Mertic said of the ASF and ODPi. “But let’s be honest, it hasn’t been easy.

“The Apache project really focuses on those raw components, the peanuts and the chocolate, if you will. But if you’re an end user, you’re looking for that Reese’s Peanut Butter Cup.”

By developing a layer of standards for Hadoop distributions, ODPi aims to give end users and ISVs a consistent level of performance and progression, spurring adoption and investment.

“From the ODPi perspective, we find this to be the place where we need to provide a baseline group of standards so people know what to expect,” Mertic said.

Mertic gave the following quote by Robert W. Lane, CEO of Deere & Company, as the perfect encapsulation of the ODPi’s mission:

“A sustainable environment requires increased productivity; productivity comes about by innovation; innovation is the result of investment; and investment is only possible when a reasonable return is expected. The efficient use of money is more assured when there are known standards in which to operate.”

https://www.youtube.com/watch?v=XUl7vlVwNaI&list=PLGeM09tlguZQ3ouijqG4r1YIIZYxCKsLp
