
Hitachi Data Connector SAP ERP & BW

What you should know about the Connector technology

With the Hitachi Data Connector for SAP ERP and Business Warehouse you can use data from all SAP modules (e.g. FI, CO, MM). The connector relies on RFC function modules (BAPIs) and direct SAP table access, so you can also query complex and nested SAP structures, such as cost centre groups, or only the structures and data that exist at runtime in the SAP system. The Hitachi Data Connector for SAP ERP and Business Warehouse from it-novum offers a cost-effective way to extract data from "expensive" SAP landscapes using offload scenarios; the data can then be processed further as needed.

You can access BW data in two ways: staging-area data can be extracted by reading SAP DataStore Objects (DSOs), or analytical workloads (queries) can be run against the data warehouse layer (SAP InfoCubes).
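The connector's internal function modules are not published, but the call shape resembles SAP's generic RFC_READ_TABLE interface: a table name, an optional field list, and filter lines that are evaluated server-side. A minimal sketch in plain Python (no SAP connection is made; the helper name and the COSP example are purely illustrative):

```python
def build_read_table_call(table, fields=(), where=(), delimiter="|"):
    """Assemble the importing/table parameters for an RFC_READ_TABLE-style
    call. FIELDS restricts the returned columns; OPTIONS carries filter
    lines that the SAP system applies before sending data back."""
    return {
        "QUERY_TABLE": table,
        "DELIMITER": delimiter,
        "FIELDS": [{"FIELDNAME": f} for f in fields],
        "OPTIONS": [{"TEXT": w} for w in where],
    }

# Example: CO totals records from table COSP, restricted to one fiscal year
params = build_read_table_call(
    "COSP",
    fields=["OBJNR", "GJAHR", "WTG001"],
    where=["GJAHR = '2023'"],
)
```

An RFC client library (e.g. SAP JCo from PDI's Java side) would then submit such a parameter set to the SAP system and receive the matching rows.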

Various application scenarios

SAP Data Integration


Challenges:
  • SAP data is needed for strategic and tactical decisions
  • SAP data has a proprietary format and structure
  • SAP data is transaction-oriented

Solution:
  • Preparation and enrichment of SAP data
  • Storage of historical data in a data warehouse / data lake
  • Automated, standardized processes

Benefits:
  • Better availability of data for decisions
  • Higher data quality
  • Fewer manual activities

SAP Data Blending


Challenges:
  • Data from SAP and non-SAP systems (e.g. NoSQL) must be merged
  • Data is stored in heterogeneous silos across the company
  • KPIs need context information

Solution:
  • Integration of the various data sources
  • Data preparation for descriptive and prescriptive analyses
  • Self-service analytics for end users

Benefits:
  • Better data utilization enables deeper insights
  • Competitive advantages through tailor-made provision of information

SAP Data Offload


Challenges:
  • Increasing data volumes lead to higher license costs
  • Data must be distributed according to volume and performance requirements

Solution:
  • Implementation of an architecture combining SAP HANA and Hadoop
  • Distribution of data storage and processing across the appropriate platform

Benefits:
  • Decreasing license costs ("Hadooponomics", source: Forrester): 1 HANA node ► 750k p.a. vs. 1 Hadoop node ► 2-3k p.a.
  • Faster time-to-value with Pentaho
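Taking the per-node figures quoted above at face value (HANA ≈ 750k p.a., Hadoop ≈ 2-3k p.a.), the arithmetic behind an offload scenario is simple; a back-of-the-envelope sketch in Python (the node counts are invented for illustration):

```python
HANA_NODE_COST = 750_000   # p.a., per-node figure quoted above (Forrester)
HADOOP_NODE_COST = 3_000   # p.a., upper end of the 2-3k range

def annual_cost(hana_nodes, hadoop_nodes):
    """Yearly platform cost for a mixed HANA/Hadoop architecture."""
    return hana_nodes * HANA_NODE_COST + hadoop_nodes * HADOOP_NODE_COST

# Keep 2 HANA nodes for hot data and offload cold data to 10 Hadoop
# nodes, instead of running 4 HANA nodes for everything:
before = annual_cost(4, 0)    # 3,000,000 p.a.
after = annual_cost(2, 10)    # 1,530,000 p.a.
savings = before - after      # 1,470,000 p.a.
```

Even with generous allowances for Hadoop operations staff and hardware, the gap between the two per-node figures is what makes offload scenarios attractive.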

Feature Overview

Supported PDI versions: 7.x, 8.x

General capabilities:
  • Connection method
  • Unlimited record length (via customization)
  • Search for table fields in the user interface (data dictionary)
  • Partial table load (filter)
  • Lookup / join functionality (via customization)
  • Data type support for negative packed, floating-point and raw hex fields
  • Chunk loading of big tables (via customization)
  • Calling other RFCs (including functions to read, look up and write back)

Step comparison ("./." = not available):

| Feature                        | SAP ERP Table Input | SAP BW/ERP RFC Executor | SAP BW DSO Input |
|--------------------------------|---------------------|-------------------------|------------------|
| Metadata Injection             | ✓                   | ✓                       | ✓                |
| Using variables                | ✓                   | ✓                       | ✓                |
| Filter functionality           | ✓                   | ✓                       | ✓                |
| Selection of fields            | ✓                   | ✓                       | ✓                |
| Mapping of SAP/Java data types | ✓                   | ✓                       | ✓                |
| SAP table read                 | ✓                   | ✓                       | ./.              |
| BAPI querying                  | ./.                 | ✓                       | ✓                |
| Custom SAP functionality       | ✓                   | ✓                       | ✓                |
| Server mode                    | ✓                   | ✓                       | ✓                |
| Parallel processing            | ✓                   | ✓                       | ✓                |
| Advanced DSO (SAP HANA)        | ./.                 | ./.                     | ✓                |
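Chunk loading of big tables (listed above as available via customization) is typically done by paging through the table, in the style of the ROWSKIPS/ROWCOUNT parameters that RFC_READ_TABLE exposes. How the connector implements this internally is not documented here, so the following is a schematic Python sketch with a stubbed fetch function standing in for the SAP call:

```python
def read_in_chunks(fetch, chunk_size=50_000):
    """Page through a large SAP table. `fetch(skip, count)` stands in for
    one RFC_READ_TABLE call with ROWSKIPS=skip and ROWCOUNT=count; it must
    return a list of rows (empty once the table is exhausted)."""
    skip = 0
    while True:
        rows = fetch(skip, chunk_size)
        if not rows:
            return
        yield from rows
        if len(rows) < chunk_size:
            return  # short page: table exhausted
        skip += chunk_size

# Stub standing in for the SAP side: a "table" of 120 rows
TABLE = list(range(120))
fetch = lambda skip, count: TABLE[skip:skip + count]
rows = list(read_in_chunks(fetch, chunk_size=50))
# fetched in three calls of 50, 50 and 20 rows
```

Keeping each chunk small bounds memory use on both the SAP application server and the PDI side, which is what makes full extracts of very large tables feasible.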

Evaluate and visualize SAP data at the click of a mouse

Questions and Answers about the SAP Connector

Does the Table Input Step also allow table contents to be filtered in the SAP ERP system?

Yes, it is possible to create filters in the Table Input Step. The filter is then sent with a table call and executed in the SAP system.
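Under the hood, a generic table-read RFC such as RFC_READ_TABLE accepts the filter as OPTIONS lines of at most 72 characters each, so longer conditions have to be split across lines at token boundaries. A sketch of that packing in plain Python (the function name is illustrative):

```python
def pack_filter(condition, width=72):
    """Split a WHERE-style condition into OPTIONS lines. RFC_READ_TABLE
    limits each TEXT line to 72 characters; splits must fall between
    tokens, never inside one."""
    lines, current = [], ""
    for token in condition.split():
        candidate = (current + " " + token).strip()
        if current and len(candidate) > width:
            lines.append(current)
            current = token
        else:
            current = candidate
    if current:
        lines.append(current)
    return [{"TEXT": line} for line in lines]

# Short conditions fit on a single OPTIONS line:
options = pack_filter(
    "BUKRS = '1000' AND GJAHR = '2023' AND BELNR >= '0100000000'"
)
```

Because the OPTIONS lines travel with the call, the SAP system applies the filter before any rows leave the application server, which is exactly what the Table Input Step relies on.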

Is it also possible to call SAP function modules that you have written yourself?

Yes, the RFC Executor Step offers an interface for calling any BAPI or custom function module from Pentaho. It is also possible to chain several function modules together.

Can data be loaded from the SAP BW system using the SAP Connector, such as Infocubes?

Yes, both the SAP Table Input Step and the RFC Executor Step are BW-capable. In addition, dedicated steps such as the DSO Input and InfoCube Input Step provide access to BW-specific data structures.

Can the SAP Connector also access large cluster tables, such as VBRP?

Yes, the SAP Table Input Step can be used to read cluster tables such as VBRP.

Can queries be created?

Existing BEx queries that are exposed as InfoProviders can be called with the RFC Executor Step by accessing the corresponding BAPI module. A dedicated step for this use case is planned on our roadmap.