ODPi Webinar on DataOps at Scale: Taking Apache Hadoop Enterprise-Wide


2016 was a pivotal year for Apache Hadoop, a year in which enterprises across a variety of industries moved the technology out of proofs of concept (POCs) and the lab and into production. Look no further than AtScale’s latest Big Data Maturity survey, in which 73 percent of respondents report running Hadoop in production.

ODPi recently ran a series of its own Twitter polls and found an even split: 41 percent of respondents said they do not use Hadoop in production, while 41 percent said they do. This split may be due in part to the fact that the concept of “production” Hadoop can be misleading. For instance, pilot deployments and enterprise-wide deployments are both considered “production,” but they are vastly different in terms of DataOps, as Table 1 below illustrates.


Table 1: DataOps Considerations from Lab to Enterprise-wide Production.

As businesses move Apache Hadoop and Big Data out of POCs and into enterprise-wide production, hybrid deployments are the norm and several important considerations must be addressed.

Dive into this topic further on June 28th in a free webinar hosted by John Mertic, Director of ODPi at The Linux Foundation, featuring Tamara Dull, Director of Emerging Technologies at SAS Institute.

The webinar will discuss ODPi’s recent 2017 Preview: The Year of Enterprise-wide Production Hadoop and explore DataOps at Scale, covering the considerations businesses must address as they move Apache Hadoop and Big Data out of POCs and into enterprise-wide production and hybrid deployments.

Register for the webinar here.

As a sneak peek to the webinar, we sat down with Mertic to learn a little more about production Hadoop needs.

Why is it that the deployment and management techniques that work in limited production may not scale when you go enterprise-wide?

IT policies kick in as you move from Mode 2 IT, which tends to focus on fast-moving, experimental projects such as Hadoop deployments, to Mode 1 IT, which controls stable, enterprise-wide deployments of software. Mode 1 IT has to consider not only enterprise security and access requirements, but also the data regulations that impact how a tool is used. On top of that, cost and efficiency come into play, as Mode 1 IT is cost conscious.
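To make that concrete, here is a minimal sketch, in Python, of the kind of automated compliance check Mode 1 IT might run before signing off on a cluster: verifying that core-site.xml enforces Kerberos authentication and service-level authorization. The two property names and expected values are standard Hadoop settings; the config file path is an assumption that varies by distribution.

```python
# Minimal sketch: audit a cluster's core-site.xml against an enterprise
# security baseline before it is allowed into enterprise-wide production.
import xml.etree.ElementTree as ET

CORE_SITE = "/etc/hadoop/conf/core-site.xml"  # assumed location; varies by distro

# Standard Hadoop properties and the values a secured cluster should carry.
REQUIRED = {
    "hadoop.security.authentication": "kerberos",
    "hadoop.security.authorization": "true",
}

def audit_core_site(path=CORE_SITE):
    """Return a list of findings for settings that miss the baseline."""
    tree = ET.parse(path)
    # core-site.xml is a flat <configuration> of <property> name/value pairs.
    actual = {
        prop.findtext("name"): prop.findtext("value")
        for prop in tree.getroot().iter("property")
    }
    return [
        "%s: expected %r, found %r" % (name, expected, actual.get(name))
        for name, expected in REQUIRED.items()
        if actual.get(name) != expected
    ]

if __name__ == "__main__":
    for finding in audit_core_site():
        print("FAIL:", finding)
```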

What are some of the step-change DataOps requirements that come when you take Hadoop into enterprise-wide production? 

Integrating into Mode 1 IT’s existing toolset is the biggest requirement. Mode 1 IT doesn’t want to manage tools it’s not familiar with, nor tools it doesn’t feel it can integrate with the management systems the enterprise is already using. The more uniformly Hadoop fits into existing DevOps patterns, the more successful it will be.
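One concrete integration point, offered as a sketch rather than a prescription: Hadoop daemons expose their internal metrics as JSON over an HTTP /jmx endpoint, so an enterprise’s existing monitoring stack can scrape a cluster without Hadoop-specific agents. The hostname below is hypothetical, and 50070 is the Hadoop 2.x default NameNode web port; adjust both to your deployment.

```python
# Minimal sketch: pull standard FSNamesystem counters from the NameNode's
# /jmx endpoint so they can feed whatever alerting stack the enterprise runs.
import json
from urllib.request import urlopen

NAMENODE_JMX = "http://namenode.example.com:50070/jmx"  # assumed host/port

def fetch_fsnamesystem_metrics(url=NAMENODE_JMX):
    """Fetch the FSNamesystem MBean, which carries capacity/health counters."""
    query = url + "?qry=Hadoop:service=NameNode,name=FSNamesystem"
    with urlopen(query) as resp:
        beans = json.load(resp)["beans"]
    return beans[0] if beans else {}

if __name__ == "__main__":
    metrics = fetch_fsnamesystem_metrics()
    # CapacityUsed/CapacityTotal and MissingBlocks are standard counters
    # on this bean.
    used = metrics.get("CapacityUsed", 0)
    total = metrics.get("CapacityTotal", 1)
    print("HDFS capacity used: {:.1%}".format(used / total))
    print("Missing blocks:", metrics.get("MissingBlocks", "n/a"))
```

The same pattern applies to the ResourceManager and DataNodes, which expose /jmx on their own web ports, so Hadoop health can flow into the dashboards Mode 1 IT already watches.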

Register for the webinar now.