22 June 2022

Kafka Part I: What makes up event-based communication

Data exchange between individual systems in a heterogeneous landscape is often difficult, time-consuming and error-prone. In times of Big Data and real-time communication, rigid interfaces are reaching their limits. Kafka promises a solution - an event streaming platform for real-time data exchange between different systems and applications.

Many companies do not map their processes in a single system but in a complex, heterogeneous system landscape. Legacy systems, often operated for decades for historical reasons, frequently carry critical processes such as supplier assessment, quality inspection or inventory planning. The company's individual tools and systems communicate with each other via interfaces: as soon as a process is completed, the data is transferred to the next system for further processing, which can take considerable time depending on the volume involved. Only once the data has been transferred can the downstream system continue its work. With growing data volumes and today's demand for real-time responses, this type of data transfer proves slow, inflexible and ultimately uncompetitive. The remedy is event-based data transfer, as implemented by the Apache Kafka platform.

Kafka is an event streaming platform that enables real-time communication between different systems and tools. Data is not pushed to the next system but is held ready for retrieval in a container on the Kafka platform, a so-called topic. The retrieving system takes from the topic only the data it needs for the process at hand, which makes the interface far more performant. Data is not fetched periodically but only when it is actually needed, and the trigger for such a retrieval is an event. Business occurrences such as an invoice receipt or an order receipt can serve as events. When an event occurs, it triggers the retrieval of the data provided in Kafka, so the consuming system always works with the latest, real-time information.
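To make the pattern concrete, the following Java sketch uses the official Apache Kafka client to publish an "invoice received" event to a topic. The broker address, topic name and payload are placeholders chosen purely for illustration, not part of any specific product setup.

import java.util.Properties;
import org.apache.kafka.clients.producer.KafkaProducer;
import org.apache.kafka.clients.producer.ProducerRecord;

public class InvoiceEventProducer {
    public static void main(String[] args) {
        // Connection settings; broker address and serializers are the
        // standard Kafka client properties. The host is a placeholder.
        Properties props = new Properties();
        props.put("bootstrap.servers", "kafka.example.com:9092");
        props.put("key.serializer",
                "org.apache.kafka.common.serialization.StringSerializer");
        props.put("value.serializer",
                "org.apache.kafka.common.serialization.StringSerializer");

        try (KafkaProducer<String, String> producer = new KafkaProducer<>(props)) {
            // Publish an "invoice received" event to a topic. Downstream
            // systems pull it whenever their process actually needs it,
            // instead of receiving a bulk transfer on a fixed schedule.
            ProducerRecord<String, String> event = new ProducerRecord<>(
                    "invoices", "INV-4711",
                    "{\"event\":\"invoice_received\",\"amount\":199.90}");
            producer.send(event);
        }
    }
}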
Another argument for switching to the Kafka logic is the independence of the individual systems from one another: if one system undergoes an update or complex development work, the remaining systems are unaffected and can continue to operate.

Kafka for the switch to S/4HANA

In the course of the changeover to S/4HANA, a Kafka connection becomes relevant for many companies. It would be wrong to assume that the introduction of S/4HANA eliminates all legacy systems and replaces them, without exception, with a central application such as S/4HANA itself. While it is advisable to check every process for whether it can be mapped in a central system, the historically grown legacy systems are often so well tailored to the company that mapping certain business areas in a new ERP system is simply unnecessary. To keep using the applications that have proven themselves while at the same time moving away from rigid structures and adopting modern technologies, the old interfaces are replaced by the Kafka platform.

Possible architecture with S/4HANA

The Kafka platform offers predefined interfaces that can be adapted to any system environment. SAP Cloud Integration - a product that connects the individual SAP systems with one another - already ships with the adapters required for receiving data from and sending data to Kafka. Before use, the adapters must be configured with regard to message protocols, error handling, authentication and the like. Some SAP cloud products, such as SAP Concur and SAP Qualtrics, already come with Kafka streaming built in.
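As an illustration of what such adapter configuration covers, the following sketch expresses typical connection and security settings as plain Apache Kafka client properties. The endpoint and credentials are assumptions; in SAP Cloud Integration, these options are maintained in the adapter's configuration screens rather than in code.

import java.util.Properties;

public class KafkaConnectionConfig {

    // Security-related settings a Kafka adapter typically needs, expressed
    // here as standard Apache Kafka client properties. Host, user and
    // password are placeholders for illustration only.
    public static Properties secureClientProperties() {
        Properties props = new Properties();
        props.put("bootstrap.servers", "kafka.example.com:9093"); // broker endpoint
        props.put("security.protocol", "SASL_SSL");               // encrypted transport
        props.put("sasl.mechanism", "PLAIN");                     // authentication mechanism
        props.put("sasl.jaas.config",
                "org.apache.kafka.common.security.plain.PlainLoginModule required "
                + "username=\"cpi-user\" password=\"changeme\";");
        props.put("retries", "5");                                // simple error handling:
        props.put("retry.backoff.ms", "1000");                    // retry failed sends
        return props;
    }
}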

For a Kafka integration into an on-premise SAP environment, there are no standard adapters from Apache or SAP, but third-party software or in-house developments can be used. Third-party options include ASAPIO as a cloud integrator for SAP ERP and S/4HANA, Advantco building on standard SAP PI/PO tools, Workato with OData technology, INIT Software with its own ODP connector, and KaTE, where the connection to Kafka runs via SAP PO. For connections from a BI system, SAP OpenHub can serve as an adapter.

Various tools are available for custom developments, such as the SAP Cloud SDK, which allows applications to be developed in Java or JavaScript that communicate with SAP solutions and services. SAP Cloud Platform Enterprise Messaging provides an asynchronous interface and can likewise be used for Kafka integration. Another technical infrastructure for such purposes is SAP Operational Data Provisioning: a form of change data capture with out-of-the-box support for various SAP and non-SAP products.
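As a sketch of what such an in-house development might look like, the following Java consumer uses the standard Apache Kafka client to poll a topic and hand each event on to downstream business logic. The topic name, consumer group and processing step are hypothetical.

import java.time.Duration;
import java.util.List;
import java.util.Properties;
import org.apache.kafka.clients.consumer.ConsumerRecord;
import org.apache.kafka.clients.consumer.ConsumerRecords;
import org.apache.kafka.clients.consumer.KafkaConsumer;

public class InvoiceEventConsumer {
    public static void main(String[] args) {
        Properties props = new Properties();
        props.put("bootstrap.servers", "kafka.example.com:9092");
        props.put("group.id", "s4hana-invoice-processor"); // hypothetical consumer group
        props.put("key.deserializer",
                "org.apache.kafka.common.serialization.StringDeserializer");
        props.put("value.deserializer",
                "org.apache.kafka.common.serialization.StringDeserializer");

        try (KafkaConsumer<String, String> consumer = new KafkaConsumer<>(props)) {
            consumer.subscribe(List.of("invoices"));
            while (true) {
                // Pull only the events this process needs, when it needs them.
                ConsumerRecords<String, String> records =
                        consumer.poll(Duration.ofSeconds(1));
                for (ConsumerRecord<String, String> record : records) {
                    System.out.printf("Processing %s: %s%n",
                            record.key(), record.value());
                    // ... hand the payload to the downstream business logic ...
                }
            }
        }
    }
}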

Summary

The Kafka platform can be used both on-premise and in the cloud. It allows data to be retrieved in exactly the form and to the extent the consumer requires. The high autonomy of the individual systems ensures greater independence and avoids waiting times in the event of system failures. Both source and target systems can be developed more flexibly without negatively impacting other systems.

In the course of the changeover to S/4HANA, a Kafka connection is becoming relevant for many companies. To keep using the applications that have proven themselves while moving away from rigid structures and adopting modern technologies, the old interfaces are being replaced by the Kafka platform.

Lukas Kovacovic, Managing Consultant, CONSILIO GmbH

Further information:

Learn about the new technologies on our solutions page

Move smart with CONSILIO