Hadoop may be synonymous with big data, and it may be free to access and work with, but engineering teams globally will admit that behind every Hadoop undertaking lies a major technical delivery project.
Failures are so commonplace that even the experts don’t have great expectations for 2017: at the recent Gartner Data & Analytics Summit in Sydney, research director Nick Heudecker claimed that 70% of Hadoop deployments in 2017 will fail to deliver either their estimated cost savings or their predicted revenue.
It shouldn’t come as a surprise. Hadoop was designed for big data storage, not as a big data application in its own right. Hadoop and Spark are incredible enabling technologies.
Read more at Information Age