Hitachi Solution for Databases – Cost Effective and Optimized for Oracle Enterprise Data Warehouse

By Shashikant Gaikwad posted 11-26-2019 11:29

Introduction: In this blog, I'll discuss how customers can leverage Hitachi Solution for Databases, optimized for Oracle Enterprise Data Warehouse, to make their Enterprise Data Warehouse (EDW) efficient and cost effective. I will also describe how this prepares the platform and infrastructure for business intelligence workloads.

Figure 1 shows “Typical EDW Problems”                                                       

Figure 1

Enterprise Data Warehouses that coexist with big data tend to face increasing processing demands for structured and unstructured data, along with higher storage capacities and costs. Aggregating ever-increasing volumes of data creates very large databases that demand more and more computing resources, degrading performance, raising infrastructure costs, and significantly slowing business operations. Big data platforms such as Hitachi Solution for Big Data (which supports MongoDB and Hadoop, among others) and Hitachi Content Platform are widely accepted platforms for big data environments.

In a typical EDW environment, data volumes grow quickly while extract-transform-load (ETL) processes consume vast amounts of CPU cycles on the Oracle database nodes. Backup and archiving processes therefore take longer to complete, which inherently affects the performance of user processes, among other things. Critical user queries may not complete within an acceptable amount of time. Upgrading the database infrastructure is a costly way to deal with these challenges, as the total cost of the infrastructure, including database licenses, increases dramatically.

Solving these EDW problems is challenging. Deciding which data is "cold," and where to put it, is one of the biggest questions customers face, particularly with the data growth rates we've all grown accustomed to. It is also essential to optimize CPU utilization, which directly affects CPU-based licensing costs. An EDW delivers its best results when working with manageable workloads. So how do you optimize an EDW?
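One common way to select cold data is by age: rows that have not been accessed within a retention window become offload candidates. The following minimal Python sketch illustrates that idea only; it is not the Pentaho toolkit, and the row schema (`order_id`, `last_accessed`) is hypothetical.

```python
from datetime import datetime, timedelta

def split_hot_cold(rows, now, retention_days=365):
    """Partition rows into hot and cold sets by a last-accessed timestamp.

    rows: iterable of dicts with a 'last_accessed' datetime key (hypothetical schema).
    Returns (hot, cold) lists; cold rows are candidates for offload.
    """
    cutoff = now - timedelta(days=retention_days)
    hot, cold = [], []
    for row in rows:
        (cold if row["last_accessed"] < cutoff else hot).append(row)
    return hot, cold

# Example: one recently touched order and one untouched for over two years.
now = datetime(2019, 11, 26)
rows = [
    {"order_id": 1, "last_accessed": datetime(2019, 11, 1)},
    {"order_id": 2, "last_accessed": datetime(2017, 6, 1)},
]
hot, cold = split_hot_cold(rows, now)
```

In a real EDW, the same cutoff logic would typically be expressed as a predicate on a date or partition-key column so that entire partitions can be offloaded at once.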

Hitachi Solution for Optimizing EDW for Oracle Databases

With Hitachi Vantara, customers can automatically offload cold data from the EDW to a lower-cost platform by leveraging the capabilities of Pentaho Data Integration (PDI). PDI provides functionality that enables DBAs and other data administrators to work with many data sources, including Hitachi Content Platform (HCP), MongoDB, Hadoop HDFS, and RDBMS products such as Oracle. With Pentaho, the process of offloading data from an Oracle EDW to an HCP, MongoDB, or Hadoop environment is transparent and seamless. Hitachi Vantara's professional services team also provides a toolkit to quickly configure your offload, eliminating potential user errors and the time-consuming task of building complex offload configurations for large environments.
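The actual offload is built as PDI transformations; the Python sketch below only illustrates the data flow, with in-memory structures standing in for the Oracle source and for JSON documents as they might be written to HCP or MongoDB. All table and field names are hypothetical.

```python
import json

def offload_cold_rows(source_table, cold_ids):
    """Simulate an offload step: serialize cold rows as JSON documents
    (as they might be stored as HCP objects or MongoDB documents) and
    remove them from the source.

    source_table: list of dicts standing in for Oracle rows (hypothetical schema).
    cold_ids: set of primary keys previously classified as cold.
    Returns (remaining_rows, offloaded_documents).
    """
    offloaded = [json.dumps(r, sort_keys=True)
                 for r in source_table if r["order_id"] in cold_ids]
    remaining = [r for r in source_table if r["order_id"] not in cold_ids]
    return remaining, offloaded

table = [{"order_id": 1, "amount": 10}, {"order_id": 2, "amount": 25}]
remaining, docs = offload_cold_rows(table, {2})
```

In the PDI-based solution, the equivalent steps read from an Oracle input, write to an HCP, MongoDB, or HDFS output, and purge the offloaded rows from the source once the write is confirmed.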

Below is the overall architecture design for this solution. We recommend running your Oracle Enterprise Data Warehouse on Hitachi UCP CI for Oracle; however, that is not a requirement for this offering. An existing Oracle system can also leverage the solution's benefits.

Figure 2 shows “Architecture Design”  

Figure 2                                                     

Our lab results have shown that you can quickly configure and use Pentaho Data Integration to move data from a large Oracle Enterprise Data Warehouse host to HCP, MongoDB, or Apache Hive on top of the Hadoop Distributed File System, relieving the workload on your Oracle host. This provides a cost-effective way to expand capacity and relieve server utilization pressure. The offloaded data remains valuable, retaining its potential to yield business insights for our customers.

For more details, you can access the reference architecture documents here:
Read this solution profile to learn how Hitachi Solution for Databases helps you build an optimized enterprise data warehouse (EDW).
You can also access a video demonstration covering the overall architecture, the software toolkit execution, and the key features for offloading data to HCP.
1 comment



11-26-2019 13:55

Great blog Shashikant! Let's hope customers & partners see the value & leverage our PDW Toolkit.