Kong introduced Kong Event Gateway, a new tool for managing real-time data streams powered by Apache Kafka. According to the company, customers can now use Konnect to manage both their APIs and their Kafka-powered data streams. That removes the need to maintain two separate sets of management tools, which can ease day-to-day maintenance tasks.

Kafka organizes data into streams called topics. A topic connects to an application, detects when the application generates a new record and collects that record. Other workloads can subscribe to the topic to receive the records it collects.

Kong Event Gateway acts as an intermediary between an application and the Kafka data streams to which it subscribes. Because data passes through the gateway before reaching the application, the tool can regulate how workloads access that data. Using Kong Event Gateway, a company can require that applications authenticate before accessing a Kafka data stream. The tool also encrypts the records sent over the stream to prevent unauthorized access. According to Kong, it doubles as an observability tool that enables administrators to monitor how workloads interact with the information transmitted by Kafka.

Kafka transmits data using a custom network protocol. According to Kong, Kong Event Gateway allows applications to access data via standard HTTPS APIs instead of the custom protocol. That eases development by sparing software teams from having to familiarize themselves with Kafka's streaming mechanism.

Kong Event Gateway also allows multiple workloads to share the same data stream without the need for copies, and administrators can create separate data access permissions for each workload. Another feature, Virtual Clusters, allows multiple software teams to share the same Kafka cluster without gaining access to one another's data.
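The topic model described above can be illustrated with a toy in-memory sketch: a topic collects records as an application produces them, and every subscribed workload receives those records. This is a conceptual illustration only, not the Kafka client API or Kong's implementation.

```python
class Topic:
    """Toy stand-in for a Kafka topic: a log of records plus subscribers."""

    def __init__(self, name):
        self.name = name
        self.records = []        # the log of collected records
        self.subscribers = []    # callbacks registered by workloads

    def publish(self, record):
        """Collect a new record and fan it out to every subscriber."""
        self.records.append(record)
        for deliver in self.subscribers:
            deliver(record)

    def subscribe(self, deliver):
        """Register a workload; it will receive all future records."""
        self.subscribers.append(deliver)


# An application produces records; a downstream workload subscribes.
orders = Topic("orders")
received = []
orders.subscribe(received.append)
orders.publish({"order_id": 1, "amount": 42.0})
orders.publish({"order_id": 2, "amount": 7.5})
```

After the two `publish` calls, the subscribing workload has received both records, mirroring how a Kafka consumer sees everything appended to the topic it follows.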
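The shared-stream feature can likewise be sketched in miniature: one stream of records held once, with the gateway checking each workload's permissions before handing records out. The ACL structure and names below are illustrative assumptions, not Kong's actual configuration model.

```python
class GatewaySketch:
    """Hypothetical gateway: shared streams gated by per-workload permissions."""

    def __init__(self, acl):
        self.acl = acl       # workload name -> set of topics it may read
        self.streams = {}    # topic name -> single shared list of records

    def publish(self, topic, record):
        """Append a record to the one shared copy of the stream."""
        self.streams.setdefault(topic, []).append(record)

    def read(self, workload, topic):
        """Return the shared records, or refuse if this workload
        lacks permission for the topic."""
        if topic not in self.acl.get(workload, set()):
            raise PermissionError(f"{workload} may not read {topic}")
        return self.streams.get(topic, [])   # same list, no copy made


gw = GatewaySketch(acl={
    "billing": {"payments"},
    "analytics": {"payments", "clicks"},
})
gw.publish("payments", {"id": 1})
```

Here both `billing` and `analytics` read the identical `payments` stream without a copy, while `billing` is denied access to `clicks`, which is the kind of per-workload isolation the article attributes to the gateway.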