Hortonworks Aims To Make Hadoop Easier To Use - InformationWeek




With the aim of simplifying Hadoop use, Hortonworks rolled out three new products to coincide with Strata+Hadoop World in New York City.


Hortonworks has been hard at work trying to make Hadoop easier to use. To that end, the company made three product announcements this week to coincide with Strata+Hadoop World in New York City.

First is its Big Data Scorecard, which helps users generate a customized "whitepaper" outlining how they can best use Hadoop in the enterprise. The document is generated by answering 20 questions on a Hortonworks website. Hortonworks then uses that document as the first step in working with a client on the path to Hadoop.

The second product announcement is Hortonworks DataFlow (HDF). Powered by Apache NiFi, HDF will allow enterprise users to automate and secure any data flow.

Apache NiFi began life as "Niagara Files," an in-house application at the National Security Agency that has been recently declassified and made available for commercial use. Hortonworks DataFlow will provide the platform for further applications that can manage, analyze, and handle large data flows in real time.
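The core idea behind a dataflow tool like NiFi is that records move through a chain of connected processors, with each step recorded as provenance so the flow can be audited. The sketch below is not NiFi's actual API — it is a minimal conceptual illustration of that flow-based model, with invented processor names:

```python
# Conceptual sketch of flow-based data routing, the model a tool like
# NiFi implements. This is NOT NiFi's API -- just an illustration of
# records moving through connected processors with tracked provenance.

from dataclasses import dataclass, field


@dataclass
class FlowFile:
    """A unit of data moving through the flow, with recorded lineage."""
    content: str
    provenance: list = field(default_factory=list)


def processor(name):
    """Wrap a transform so each step appends itself to the lineage."""
    def decorate(fn):
        def step(ff: FlowFile) -> FlowFile:
            ff = fn(ff)
            ff.provenance.append(name)
            return ff
        return step
    return decorate


@processor("ExtractText")
def extract(ff):
    ff.content = ff.content.strip()
    return ff


@processor("RouteOnContent")
def route(ff):
    ff.content = ff.content.upper()
    return ff


# The "flow" is just an ordered chain of processors.
flow = [extract, route]

record = FlowFile("  sensor reading 42  ")
for step in flow:
    record = step(record)

print(record.content)     # the transformed data
print(record.provenance)  # which processors touched it, in order
```

Real NiFi adds what this sketch omits — back pressure, prioritized queues, and a browser-based canvas for wiring processors together — but the per-record provenance trail shown here is the feature that makes large flows auditable.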


Finally, Hortonworks is teaming up with ManTech International and B23 to further the development of OpenSOC, an open source security analytics platform designed to work with Hortonworks DataFlow (HDF) and Hortonworks' other main product, the Hortonworks Data Platform (HDP).

B23 is in the business of building Big Data analytics solutions. ManTech blends cyber-security, cloud, and Hadoop expertise with OpenSOC, and is also a Hortonworks systems integration partner and reseller.

Tying all this together is a broader goal: making it easier to run and manage the Hadoop platform, according to Tim Hall, VP of product management at Hortonworks. The focus, he said, is on ease of use, simplification, and making sure Hadoop is enterprise-ready.

(Image: tiero/iStockphoto)


"Hadoop is reaching a level of maturity," he said. More companies want to use Hadoop for mission-critical workloads, he noted, and they want to move their Hadoop implementations to the cloud. That necessitates some changes to the way Hadoop is handled.

Customers Want an On-Ramp

One piece of customer feedback is a desire to get away from the command line interface in Hadoop. Longtime users still love it, Hall said, but "Nobody else does."

According to him, customers want to use a browser interface for Hadoop, or they want to handle Hadoop in the cloud. Either way, the user will not be using a command line interface. "We continue to drive in that direction."

What customers want is an "on" button, he explained. However, it should do more than turn Hadoop on. It should also activate all the associated applications and utilities that work with Hadoop. Right now, there are single "on buttons" that activate what Hall called "primitives" -- a very specific process to suit a user's need. For example, one might be set up for a Hive-SQL builder, or be a standardized ETL that moves data between two apps. All functions would be tied to a single user interface. For the user, the operative phrase is "just make it work," he explained. "We just keep going until it is done."

The goal is to attack such challenges from a workflow perspective. According to Hall, if customers want mission-critical work to run on Hadoop, then there has to be some way to manage security, home directories, and views -- in short, all the IT work that's needed to onboard a new tenant and regulate its access to data.
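To make the onboarding work concrete: provisioning a new tenant on a Hadoop cluster typically means creating a home directory in HDFS, setting ownership and permissions, and capping storage. The sketch below builds those standard `hdfs` shell commands for a hypothetical tenant; the tenant name, group, and quota are illustrative assumptions, not details from the article:

```python
# Hypothetical sketch of the "home directories and security" work Hall
# describes, expressed as the HDFS shell commands an admin (or an
# automated on-ramp) would issue. Names and quota are illustrative.

def onboarding_commands(tenant, group, quota="1t"):
    """Build the HDFS commands that provision a new tenant's home dir."""
    home = f"/user/{tenant}"
    return [
        f"hdfs dfs -mkdir -p {home}",                    # home directory
        f"hdfs dfs -chown {tenant}:{group} {home}",      # hand over ownership
        f"hdfs dfs -chmod 750 {home}",                   # restrict access
        f"hdfs dfsadmin -setSpaceQuota {quota} {home}",  # cap storage use
    ]


for cmd in onboarding_commands("alice", "analysts"):
    print(cmd)
```

An automated "on-ramp" would run steps like these (plus security-policy and view configuration) behind a single button, rather than leaving each command to the administrator.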

Which is pretty much what an on-ramp is.

William Terdoslavich is an experienced writer with a working understanding of business, information technology, airlines, politics, government, and history, having worked at Mobile Computing & Communications, Computer Reseller News, Tour and Travel News, and Computer Systems ...

Charlie Babcock, User Rank: Author
10/1/2015 | 3:36:34 PM
Third parties help basic open source systems
It's no secret that raw Hadoop is hard to use. Cloudera, Hortonworks, and MapR are all trying to bring ease-of-use features to the beast to make it more useful in enterprise analytics.