
Data Integration

Data integration is the basis for gaining added value from company data

Specifically, we define data integration as a technology that covers ETL (Extract, Transform, Load) and ELT (Extract, Load, Transform). As a data integration provider, we also take care of data quality, data governance (GDPR-compliant!), and master data management (MDM).
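The difference between the two approaches can be illustrated with a minimal sketch: in ETL the data is cleaned before loading, in ELT the raw data is loaded first and transformed inside the target system. All table names and sample rows below are invented for the example; an in-memory SQLite database stands in for the target system.

```python
import sqlite3

# Hypothetical raw source rows (the "Extract" step); fields are illustrative.
source_rows = [("anna", "2024-01-03", "42.50"), ("ben", "2024-01-04", "17.00")]

db = sqlite3.connect(":memory:")
db.execute("CREATE TABLE sales_etl (customer TEXT, day TEXT, amount REAL)")
db.execute("CREATE TABLE sales_raw (customer TEXT, day TEXT, amount TEXT)")

# ETL: transform first (normalize casing, cast types), then load.
transformed = [(c.title(), d, float(a)) for c, d, a in source_rows]
db.executemany("INSERT INTO sales_etl VALUES (?, ?, ?)", transformed)

# ELT: load the raw strings as-is, transform later inside the target system.
db.executemany("INSERT INTO sales_raw VALUES (?, ?, ?)", source_rows)
db.execute("""
    CREATE TABLE sales_elt AS
    SELECT upper(substr(customer, 1, 1)) || substr(customer, 2) AS customer,
           day,
           CAST(amount AS REAL) AS amount
    FROM sales_raw
""")

etl = db.execute("SELECT * FROM sales_etl ORDER BY day").fetchall()
elt = db.execute("SELECT * FROM sales_elt ORDER BY day").fetchall()
assert etl == elt  # both routes yield the same cleaned data set
```

Both routes end with identical cleaned rows; which one is preferable depends mainly on where the transformation compute should live.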

But data alone does not deliver insights. To gain insights, data needs context and data silos must be linked:

  • by connecting third-party systems, such as SAP
  • by processing unstructured data, e.g. logs, web data, sensor data, and much more

Data integration makes data valuable

Data processed through data integration is critical to applications such as business performance management, reporting, dashboards, scorecards, online analytical processing (OLAP), and advanced analytics. Similar to a value creation process in manufacturing, data integration collects raw material (data from source systems) and assembles it into a product (new data sets). The resulting BI and data lake / data warehouse insights can increase revenue, retain customers, improve operational efficiency, enable more accurate planning, make sales and marketing more effective, and lead to many other valuable business outcomes.
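The "assembly" step above can be sketched in a few lines: two source systems (here a hypothetical CRM and ERP, with all names and figures invented for the example) are joined and aggregated into a new, analysis-ready data set such as a dashboard would consume.

```python
# Source system 1: customer master data (e.g. from a CRM).
crm = {
    101: {"name": "Acme GmbH", "region": "South"},
    102: {"name": "Beta AG", "region": "North"},
}

# Source system 2: order transactions (e.g. from an ERP).
erp_orders = [
    {"customer_id": 101, "amount": 1200.0},
    {"customer_id": 102, "amount": 800.0},
    {"customer_id": 101, "amount": 300.0},
]

# Join and aggregate: revenue per region, a typical reporting data set.
revenue_by_region = {}
for order in erp_orders:
    region = crm[order["customer_id"]]["region"]
    revenue_by_region[region] = revenue_by_region.get(region, 0.0) + order["amount"]

print(revenue_by_region)  # {'South': 1500.0, 'North': 800.0}
```

Neither source system alone contains "revenue per region"; only the integrated data set does, which is the sense in which integration creates new value.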

Stefan Müller
Director Big Data Analytics

Are you planning a project or would you like more information about Data Integration?



Do some of these challenges sound familiar, or do you expect them to arise in the near future?

  1. connecting and processing data sources, in particular:
    • automating the processes
    • connecting SAP data at high speed and with large data volumes
    • connecting data in Hadoop, NoSQL, and HANA
    • connecting unstructured data
  2. merging multiple data sources:
    • e.g. RDBMSs, files, apps, and others
    • platforms: SAP, Salesforce, Marketo, and others
  3. data quality
  4. historicization of the data
  5. system performance:
    • e.g. loading data within a specific time window
  6. scaling with increasing data volumes
  7. converting or re-implementing real-time and streaming applications
  8. providing a suitable database for machine learning
  9. "muddy" data lakes
  10. data discovery (also in relation to "dark data")
  11. a lack of employees or know-how for data integration

Our problem-solving approach makes us special

As your data architects, we provide a solution to all these problems:

  • We integrate your data into an analytics solution, no matter where the data comes from, the format it is available in, or the speed at which it is delivered.
  • We optimize the performance of your BI / big data systems. For ETL routes, we reduce data loading times from weeks to just a few hours.
  • We have proven expertise in designing your entire process chain in a data-driven way and at enterprise level. This sets us apart from pure visualization tool providers such as Tableau.
  • We create individual, high-performance use case architectures based on best-practice system architectures.
  • We provide the resources for your project. A team of data analysts, data scientists, data engineers, developers, trainers, consultants, support staff, and data architects is ready to assist you.
  • We cover the entire data analytics lifecycle: consulting – implementation – training – support.
  • We save time for all involved stakeholders: IT during implementation and business users when accessing information, because our solutions
    • require little programming knowledge
    • require little hand-over documentation (a graphical tool is visually self-explanatory, in contrast to, e.g., the effort required with Java)
    • enable extremely fast onboarding of new employees
  • We do not settle for PoCs; we know how to successfully deliver real-life projects. For this we draw on trained experts who apply best-practice methods (for example CRISP-DM).

We offer investment security and independence

As software solutions around Hadoop and the cloud evolve rapidly, no company can predict or safely plan which technology will be state of the art for them in a few years. To be future-proof, a decoupled software layer is needed that gives companies the flexibility to react to all kinds of situations, including hybrid architectures. We offer exactly such solutions. We ensure that your data remains fully accessible in the future and that you always have full control over it.

What sets us apart from the competition

We see ourselves as solution experts for data integration and analytics. Individual customer problems require tailor-made data analytics solutions. Our specialists have extensive experience in finding a solution for every data problem. We make sure that our solutions are maintainable and flexible. We achieve this by using open source software. We are one of the few German providers with decades of open source expertise in the enterprise sector.

Matt Casters
Senior Solutions Architect,
founder of Kettle

“The real strength and unique selling point of Kettle is its capability of running in the environment that the customer needs, not the other way around. This gives you a tremendous amount of freedom and power if you combine it with the ability to write plugins like it-novum has done on multiple occasions. Whatever architecture a customer needs for their needs, Kettle will fit into it.”

Pentaho from Hitachi Vantara is the leading software for data integration and big data analytics. Hitachi Vantara offers a comprehensive solution portfolio for Big Data, Internet of Things and Cloud. it-novum is Hitachi Vantara's Big Data Insights and IoT partner and the largest Pentaho implementation and training partner in EMEA.