
The biggest challenge with big data projects at this point isn’t mastering the technology. Rather, the difficulty lies in figuring out the use cases that will result in customers deploying big data technologies in production environments.

By now, most IT organizations have at least experimented with big data technologies, and a large number have already deployed platforms such as Hadoop within the confines of a pilot project, according to Vikram Duvvoori, chief technologist and corporate vice president for HCL Technologies, a global systems integrator.

“When it comes to big data, most of our clients are getting ready to move beyond the foundational stuff,” Duvvoori said. “Now it’s becoming more of an architectural conversation about how to make technologies, such as Hadoop, coexist in an existing IT environment.”

As such, Duvvoori said, most IT organizations are looking for solution provider partners that both understand the business cases surrounding a big data project and know how to actually integrate those technologies into a production environment.

What’s critical to note, Duvvoori said, is that most IT organizations are not looking to replace their existing data warehouse technologies with Hadoop as much as they are looking to extend the value of those investments by being able to analyze a much larger pool of data.

In fact, a recent survey of 315 IT and analytics professionals conducted by Dimensional Insight on behalf of Snowflake Computing, a provider of a data warehouse delivered as a cloud service, finds that 96 percent of respondents said they will not be replacing their data warehouse with Hadoop. The study also finds a critical shortage of Hadoop expertise: only 12 percent said they have easy access to it.

Most customers are still trying to figure out exactly where Hadoop fits inside their organizations, said Jon Bock, vice president of product and marketing for Snowflake. “There will be a lot of integration opportunities,” he said. “People want to be able to interact with big data in near real time.”

Tim Hall, vice president for product management for Hortonworks, one of the primary distributors of Hadoop, explained that, on the whole, those integration opportunities will multiply in the coming years as IT organizations move Hadoop into production environments.

“Organizations are now moving out of the early phase of adoption,” Hall said. “Organizations now want to get value out of Hadoop as quickly as possible.”

When it comes to deploying big data applications in production environments, the emphasis quickly shifts to operational issues, said Dave McCrory, CTO for Basho Technologies, a provider of the Riak distributed database. “Most IT organizations are looking for operational simplicity,” he said. “It really comes down to availability and scalability.”

Big Data Application Opportunities

As critical as the IT infrastructure might be, solution providers across the channel would be well-advised not to take their eyes off the big data application opportunity either.

Providers of business intelligence (BI) applications, such as Looker, are seeking partners that can establish relationships with a new breed of data analysts and line-of-business executives who are often savvier about, for example, application programming interfaces (APIs).
