Another peanut spilled...?

Blog Post created by Michael Hay on Feb 12, 2015

It is all about Hitachi Data Systems, Hortonworks, and the mythical Hadoop elephants, so we're spilling peanuts, not beans -- and yes, I know the peanut is technically a legume, like the bean.

This is another step in our ongoing journey toward Social Innovation, the combination of the Internet of Things that Matter and Advanced Analytics.  Truthfully, we have been experimenting with Hadoop in Japan with universities and other semi-governmental organizations.  There we used stock-standard Hadoop, QlikTech, and Hitachi's systems-management stack, JP1, and called the result Kantan (easy) Hadoop.  During that in-market education we learned the following: yes, it is possible to roll your own Hadoop distribution, but why do that when it isn't your core competency?  So we set about finding a stellar partner, because we knew we needed a well-known brand, strong technology, Open Source, and a cultural match.  Well, we played the field, and our friends over at Hortonworks came out miles ahead of the pack in all areas!

While I cannot spill our whole bag of peanuts in one shot, I can build on the news we just shared about Pentaho joining our family.  In my last post, Extending the Family to Inspire Social Innovation, I related how users could start from a VSP (or HUS-VM) and grow their infrastructure and skills toward a private analytics cloud.  A key part of that is running Hadoop (HDP) in a VM, something we're seeing more and more often in our customer base.  Whether it is an all-VM approach, logical partitioning, or the emerging container phenomenon led by Docker, we're finding that people increasingly want faster time to operation and are giving up the religion of absolute performance.  (I mentioned this in the previous post.)  The goal is to seize an emerging trend or market opportunity by quickly building an application and testing it out.  Hadoop falls into this mix as well: quick experimentation -- often in test and development environments -- seems to matter more today than massive scale.  As a taste of how lightweight such an environment can be, see the sketch just below.
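
What follows is a minimal sketch, not anything we ship: it uses Hadoop's MiniDFSCluster, which lives in Hadoop's test artifacts (the hadoop-minicluster dependency coordinates are an assumption that varies by version), to stand up a single-node, in-process HDFS in seconds.  It's obviously not a production deployment, but it makes the test-and-development point tangible:

```java
import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.FSDataOutputStream;
import org.apache.hadoop.fs.FileSystem;
import org.apache.hadoop.fs.Path;
import org.apache.hadoop.hdfs.MiniDFSCluster;

public class QuickHadoopSandbox {
  public static void main(String[] args) throws Exception {
    // Stand up a single-node, in-process HDFS -- seconds, not hours.
    Configuration conf = new Configuration();
    MiniDFSCluster cluster = new MiniDFSCluster.Builder(conf)
        .numDataNodes(1)
        .build();
    try {
      FileSystem fs = cluster.getFileSystem();

      // Write a small file and check for it, to prove the sandbox is live.
      Path p = new Path("/tmp/hello.txt");  // illustrative path
      try (FSDataOutputStream out = fs.create(p)) {
        out.writeUTF("hello, test/dev Hadoop");
      }
      System.out.println("Sandbox file exists: " + fs.exists(p));
    } finally {
      cluster.shutdown(); // Tear everything down when the experiment ends.
    }
  }
}
```

A VM or container image of full HDP buys you the same fast time to operation for the whole stack; the sandbox above just shows how small the unit of experimentation can get.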

All of this leads to the joint development goals we have with Hortonworks: making Hadoop more enterprise-ready, with a focus on high-speed deployment.  In essence, we'd like to see VM infrastructures enable minute-scale deployments of Hadoop, so you can quickly move on to connecting Pentaho, authoring your MapReduce code, or submitting your SQL-like query to the infrastructure -- less time in setup and make-ready, more time in production.  (Note: Ken Wood has done a great job of describing how Hortonworks HDP can be used with Pentaho in ADDIO-Amigo.)
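
To make the "authoring your MapReduce code" step concrete, here is the canonical word-count job written against the stock Hadoop MapReduce API.  Nothing in it is HDP- or Hitachi-specific, and the input and output paths are whatever you pass on the command line:

```java
import java.io.IOException;
import java.util.StringTokenizer;

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.Path;
import org.apache.hadoop.io.IntWritable;
import org.apache.hadoop.io.Text;
import org.apache.hadoop.mapreduce.Job;
import org.apache.hadoop.mapreduce.Mapper;
import org.apache.hadoop.mapreduce.Reducer;
import org.apache.hadoop.mapreduce.lib.input.FileInputFormat;
import org.apache.hadoop.mapreduce.lib.output.FileOutputFormat;

public class WordCount {

  // Emit (word, 1) for every token in the input split.
  public static class TokenizerMapper
      extends Mapper<Object, Text, Text, IntWritable> {
    private final static IntWritable ONE = new IntWritable(1);
    private final Text word = new Text();

    @Override
    public void map(Object key, Text value, Context context)
        throws IOException, InterruptedException {
      StringTokenizer itr = new StringTokenizer(value.toString());
      while (itr.hasMoreTokens()) {
        word.set(itr.nextToken());
        context.write(word, ONE);
      }
    }
  }

  // Sum the counts for each word; cheap enough to reuse as a combiner.
  public static class IntSumReducer
      extends Reducer<Text, IntWritable, Text, IntWritable> {
    private final IntWritable result = new IntWritable();

    @Override
    public void reduce(Text key, Iterable<IntWritable> values, Context context)
        throws IOException, InterruptedException {
      int sum = 0;
      for (IntWritable val : values) {
        sum += val.get();
      }
      result.set(sum);
      context.write(key, result);
    }
  }

  public static void main(String[] args) throws Exception {
    Job job = Job.getInstance(new Configuration(), "word count");
    job.setJarByClass(WordCount.class);
    job.setMapperClass(TokenizerMapper.class);
    job.setCombinerClass(IntSumReducer.class);
    job.setReducerClass(IntSumReducer.class);
    job.setOutputKeyClass(Text.class);
    job.setOutputValueClass(IntWritable.class);
    FileInputFormat.addInputPath(job, new Path(args[0]));   // input directory
    FileOutputFormat.setOutputPath(job, new Path(args[1])); // output directory
    System.exit(job.waitForCompletion(true) ? 0 : 1);
  }
}
```

The "SQL-like query" path over the same data would typically be a Hive statement along the lines of SELECT word, COUNT(*) FROM docs GROUP BY word (the table name here is hypothetical); the point is that once deployment takes minutes, both paths are minutes from first result.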

Outcomes