Xapix processes streams of data (Kafka event streams) in real time. Kafka event streams consume data from sources such as IoT sensors, which generate traffic either on a regular schedule or as irregular events.
For example, you can use Xapix to transform data from these sources and send it to a webhook service for consumption by applications, or to a different Kafka topic.
A Kafka event stream is a continuously updated data set "of unknown or unlimited size". It is a reliable, published stream of data records (or messages).
A Topic is the key abstraction of a Kafka event stream.
Kafka runs as a cluster on one or more servers. In a Kafka cluster, streams of records (data) are stored in categories called Topics. Each record in a Topic contains a key, a value, and a timestamp. Each Kafka event stream in Xapix contains a single Topic.
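The record structure described above can be sketched in Python. This is a simplified illustration, not the Xapix or Kafka client API; the topic name and sensor payload are hypothetical.

```python
from dataclasses import dataclass
import time

@dataclass
class Record:
    """A simplified Kafka record: every record carries a key, a value, and a timestamp."""
    key: str            # used for partitioning; records with the same key land on the same partition
    value: dict         # the payload
    timestamp_ms: int   # assigned by the producer or the broker

# A hypothetical sensor reading destined for a topic such as "iot-sensor-readings".
record = Record(
    key="sensor-42",
    value={"temperature_c": 21.5},
    timestamp_ms=int(time.time() * 1000),
)
print(record.key)  # → sensor-42
```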
Kafka event streams have one or many Event Producers and one or many Event Consumers.
Event Producers publish data events (messages) to Topics. Event Consumers read data published to Topics. Before a Consumer can read published data, it must subscribe to a Topic.
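The publish/subscribe relationship above can be illustrated with a small in-memory stand-in for a broker. This is a conceptual sketch only (a real cluster involves partitions, offsets, and network clients); the class and topic names are invented for the example.

```python
from collections import defaultdict

class MiniBroker:
    """In-memory stand-in for a Kafka cluster, for illustration only."""
    def __init__(self):
        self.topics = defaultdict(list)        # topic name -> list of records
        self.subscriptions = defaultdict(set)  # topic name -> subscribed consumer names

    def publish(self, topic, record):
        """A Producer appends a record to a Topic."""
        self.topics[topic].append(record)

    def subscribe(self, consumer, topic):
        """A Consumer must subscribe before it can read a Topic."""
        self.subscriptions[topic].add(consumer)

    def read(self, consumer, topic):
        """Only subscribed Consumers may read the Topic's records."""
        if consumer not in self.subscriptions[topic]:
            raise PermissionError(f"{consumer} is not subscribed to {topic}")
        return list(self.topics[topic])

broker = MiniBroker()
broker.publish("iot-sensor-readings", {"sensor": "42", "temperature_c": 21.5})
broker.subscribe("dashboard", "iot-sensor-readings")
print(broker.read("dashboard", "iot-sensor-readings"))
```

An unsubscribed consumer calling `read` raises an error, mirroring the rule that a Consumer must be subscribed to a Topic before it can read published data.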
Each Consumer belongs to a Consumer Group, even if the group contains only one Consumer. The group receives records from the Topics to which its Consumers are subscribed, and each record is delivered to only one Consumer within the group.
In Xapix, you define a single Consumer Group with one Consumer.
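A single Consumer Group with one Consumer corresponds to a client configuration along these lines. The sketch below uses the standard `librdkafka`-style configuration keys; the broker address, group name, and topic are placeholders, and the actual connection lines are commented out because they require a running broker.

```python
# Hypothetical connection details; replace with your own broker and topic.
consumer_config = {
    "bootstrap.servers": "broker1:9092",   # Kafka cluster entry point
    "group.id": "xapix-pipeline-group",    # the single Consumer Group
    "auto.offset.reset": "earliest",       # where to start when no offset is stored
}

# With the confluent-kafka client, a single Consumer created from this
# config forms a group of one (requires a running broker, so commented out):
# from confluent_kafka import Consumer
# consumer = Consumer(consumer_config)
# consumer.subscribe(["iot-sensor-readings"])

print(consumer_config["group.id"])  # → xapix-pipeline-group
```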
In a Kafka event stream, a Sink exports data to an external system. In Xapix, however, the Sink terminates the pipeline.
In a Xapix pipeline, the Event unit connects to a Kafka event stream. The Topic data is orchestrated and transformed according to the design of the pipeline, and the transformed data is made available via webhooks.
The following screenshot shows a completed pipeline that uses a Kafka event stream. It contains the usual elements of a pipeline built for a REST endpoint, and you build it using the same techniques.