
Hitachi Data Connector for SAP ERP & BW

Evaluate SAP data at the click of a mouse

Why the connector from it-novum

The Hitachi Data Connector for SAP ERP & Business Warehouse from it-novum is purpose-built for SAP and uses BAPIs, RFC, and web services. It handles all standard integration tasks without creating additional internal effort. Instead of programming individual point-to-point interfaces between non-SAP solutions and SAP, companies can use our tool for fast, code-free integration. They can then concentrate on their business processes instead of worrying about the technical aspects of integration.

Since Pentaho 8.2, the Hitachi Data Connector for SAP ERP & Business Warehouse has been the only available SAP connector for Pentaho.

The advantages of the Connector:

  • Flexible and easy-to-use analysis of SAP data
  • Read data from all SAP modules
  • Query complex and nested SAP structures (such as cost center groups, or structures/data that only exist at runtime in the SAP system)
  • Access BW data via SAP DataStore Objects
  • Support for Metadata Injection
  • Cost-saving offload scenarios
  • Data blending of SAP data
  • Analytical workloads on the data warehouse layer (SAP InfoCubes)
  • High-performance data transfer from SAP to Hadoop thanks to the integrated server mode

What you should know about the connector technology

Connection to SAP ERP

With the Hitachi Data Connector for SAP ERP and Business Warehouse you can use all data from all SAP modules (e.g. FI, CO, MM). The connector relies on RFC function modules (BAPIs) and SAP tables, so you can also query complex and nested SAP structures (such as cost center groups, or structures/data that only exist at runtime in the SAP system). The connector offers a cost-effective way to extract SAP data from "expensive" SAP landscapes using offload scenarios; the extracted data can then be processed further as needed.

Connection to SAP BW

You can access BW data in two different scenarios: staging-area data can be extracted via SAP DataStore Objects (DSOs), or analytical workloads (queries) can be run against the data warehouse layer (SAP InfoCubes).
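For a feel of what querying SAP over RFC looks like at the lowest level, here is a minimal SAP JCo sketch that reads a table through the standard RFC_READ_TABLE module. It illustrates the mechanics the connector wraps, not the connector's internal implementation; the destination name MYSAP and the table MARA are placeholders.

```java
import com.sap.conn.jco.JCoDestination;
import com.sap.conn.jco.JCoDestinationManager;
import com.sap.conn.jco.JCoException;
import com.sap.conn.jco.JCoFunction;
import com.sap.conn.jco.JCoTable;

public class ReadSapTable {
    public static void main(String[] args) throws JCoException {
        // "MYSAP" is a placeholder destination, configured in MYSAP.jcoDestination
        JCoDestination dest = JCoDestinationManager.getDestination("MYSAP");

        // RFC_READ_TABLE is the standard RFC-enabled module for generic table reads
        JCoFunction fn = dest.getRepository().getFunction("RFC_READ_TABLE");
        if (fn == null) {
            throw new IllegalStateException("RFC_READ_TABLE not found in repository");
        }
        fn.getImportParameterList().setValue("QUERY_TABLE", "MARA"); // material master
        fn.getImportParameterList().setValue("DELIMITER", "|");
        fn.execute(dest); // the query runs inside the SAP system

        // each result row comes back as one delimited string in the WA field
        JCoTable data = fn.getTableParameterList().getTable("DATA");
        for (int i = 0; i < data.getNumRows(); i++) {
            data.setRow(i);
            System.out.println(data.getString("WA"));
        }
    }
}
```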

Feature Overview

  • Supported PDI versions: 7.x, 8.x
  • Connection method: RFC via SAP JCo (see the connection sketch below)
  • Unlimited record length: via customization
  • Search for table fields in the user interface (data dictionary)
  • Partial table load (filter)
  • Lookup/join functionality: via customization
  • Data type support for negative packed, floating point and raw hex fields
  • Chunk loading of big tables: via customization
  • Call other RFCs (including functions to read, look up and write back)
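As a rough sketch of what "RFC via SAP JCo" means in practice: JCo 3.x reads connection settings from a <name>.jcoDestination properties file. The host, system number, client, and user below are invented placeholders.

```java
// Contents of MYSAP.jcoDestination (all values are placeholders):
//   jco.client.ashost=sap-host.example.com
//   jco.client.sysnr=00
//   jco.client.client=100
//   jco.client.user=RFC_USER
//   jco.client.passwd=********
//   jco.client.lang=EN

import com.sap.conn.jco.JCoDestination;
import com.sap.conn.jco.JCoDestinationManager;
import com.sap.conn.jco.JCoException;

public class PingSap {
    public static void main(String[] args) throws JCoException {
        JCoDestination dest = JCoDestinationManager.getDestination("MYSAP");
        dest.ping(); // throws JCoException if no RFC connection can be established
        System.out.println("RFC connection OK");
    }
}
```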
Step-specific features (✓ = supported, ./. = not available):

| Feature                        | SAP ERP Table Input | SAP BW/ERP RFC Executor | SAP BW DSO Input |
|--------------------------------|---------------------|-------------------------|------------------|
| Metadata Injection             | ✓                   | ✓                       | ✓                |
| Using variables                | ✓                   | ✓                       | ✓                |
| Filter functionality           | ✓                   | ✓                       | ✓                |
| Selection of fields            | ✓                   | ✓                       | ✓                |
| Mapping of SAP/Java data types | ✓                   | ✓                       | ✓                |
| SAP table read                 | ✓                   | ✓                       | ./.              |
| BAPI querying                  | ./.                 | ✓                       | ✓                |
| Custom SAP functionality       | ✓                   | ✓                       | ✓                |
| Multi-language                 | ✓                   | ✓                       | ✓                |
| Server mode                    | ✓                   | ✓                       | ✓                |
| Parallel processing            | ✓                   | ✓                       | ✓                |
| Advanced DSO (SAP HANA)        | ./.                 | ./.                     | ✓                |

Have a look at the live demo of the connector

Questions and Answers about the SAP Connector

Does the Table Input Step also allow table contents to be filtered in the SAP ERP system?

Yes, it is possible to define filters in the Table Input Step. The filter is then sent along with the table call and executed inside the SAP system.
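The Table Input Step configures this for you; purely for illustration, here is roughly what such server-side filtering looks like at the SAP JCo level, assuming a mechanism like RFC_READ_TABLE's OPTIONS table. The destination MYSAP, the table KNA1, and the condition are placeholders.

```java
import com.sap.conn.jco.*;

public class FilteredTableRead {
    public static void main(String[] args) throws JCoException {
        JCoDestination dest = JCoDestinationManager.getDestination("MYSAP"); // placeholder

        JCoFunction fn = dest.getRepository().getFunction("RFC_READ_TABLE");
        fn.getImportParameterList().setValue("QUERY_TABLE", "KNA1"); // customer master

        // OPTIONS rows are fragments of an ABAP WHERE clause; the filtering
        // happens inside SAP, so only matching rows leave the system
        JCoTable options = fn.getTableParameterList().getTable("OPTIONS");
        options.appendRow();
        options.setValue("TEXT", "LAND1 EQ 'DE'"); // placeholder condition

        fn.execute(dest);
        System.out.println(fn.getTableParameterList().getTable("DATA").getNumRows()
                + " rows matched");
    }
}
```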

Is it also possible to call SAP function modules that you have written yourself?

Yes, the RFC Executor Step offers the user an interface for calling any BAPI module from Pentaho. It is also possible to chain several function modules together.
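As a hedged illustration of calling a self-written, RFC-enabled function module over JCo: the module name Z_GET_OPEN_ORDERS and its parameters IV_PLANT/ET_ORDERS below are purely hypothetical, and chaining a second module would follow the same pattern with the first result as input.

```java
import com.sap.conn.jco.*;

public class CallCustomModule {
    public static void main(String[] args) throws JCoException {
        JCoDestination dest = JCoDestinationManager.getDestination("MYSAP"); // placeholder

        // Z_GET_OPEN_ORDERS is a hypothetical customer-written RFC module
        JCoFunction fn = dest.getRepository().getFunction("Z_GET_OPEN_ORDERS");
        if (fn == null) {
            throw new IllegalStateException("Function module not found or not RFC-enabled");
        }
        fn.getImportParameterList().setValue("IV_PLANT", "1000"); // hypothetical import
        fn.execute(dest);

        // a second module could now be fed with this result ("chaining")
        JCoTable orders = fn.getTableParameterList().getTable("ET_ORDERS"); // hypothetical
        System.out.println(orders.getNumRows() + " rows returned");
    }
}
```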

Can data, such as InfoCubes, be loaded from the SAP BW system using the SAP Connector?

Yes, both the SAP Table Input Step and the RFC Executor Step are BW-capable. In addition, there are dedicated steps such as the DSO Input and InfoCube Input Steps for accessing BW-specific data structures.

Can the SAP Connector also access large cluster tables, such as VBRP?

Yes, the SAP Table Input Step can be used to read cluster tables such as VBRP.
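Large tables such as VBRP are typically read in chunks rather than in one call. Below is a minimal sketch of that pattern using RFC_READ_TABLE's ROWSKIPS/ROWCOUNT paging; the chunk size is an arbitrary example. Note that RFC_READ_TABLE itself limits result rows to 512 bytes, which is one reason the feature overview lists unlimited record length "via customization".

```java
import com.sap.conn.jco.*;

public class ChunkedTableRead {
    public static void main(String[] args) throws JCoException {
        JCoDestination dest = JCoDestinationManager.getDestination("MYSAP"); // placeholder
        final int CHUNK = 50_000; // arbitrary example page size
        int offset = 0;

        while (true) {
            JCoFunction fn = dest.getRepository().getFunction("RFC_READ_TABLE");
            fn.getImportParameterList().setValue("QUERY_TABLE", "VBRP"); // billing items
            fn.getImportParameterList().setValue("ROWSKIPS", offset);    // paging offset
            fn.getImportParameterList().setValue("ROWCOUNT", CHUNK);     // page size
            fn.execute(dest);

            JCoTable data = fn.getTableParameterList().getTable("DATA");
            if (data.getNumRows() == 0) break; // no more rows
            // ... process the chunk here ...
            offset += data.getNumRows();
        }
    }
}
```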

Can queries be created?

Existing BEx queries that are exposed as InfoProviders can be called with the RFC Executor Step by accessing the corresponding BAPI module. A dedicated step for this use case is on our roadmap.