Kafka event streams

Learn how to create a new Kafka event stream for your project.

You onboard a Kafka event stream by defining the stream's topic, consumer group, and initial position, and by specifying the Kafka server. Afterwards, you create a pipeline using the relevant data sources.

To add a Kafka event stream, follow these steps.

Steps
  1. In a project, select Home > Kafka > Add new.

  2. In the New Kafka Stream dialog, enter the following information (the sketch after these steps shows how these fields map to standard Kafka consumer settings):

    • Topic: Name of category (feed) to which records are published.

    • Consumer Group: Label for a group of Consumers; each record published to the topic is delivered to exactly one Consumer within the Consumer Group. Because Xapix is the only Consumer in the Kafka event stream, it consumes all records under the named Consumer Group.

    • Initial Position: Position from which records are consumed. Resets the offset ID to either the earliest or the latest position within a partition's sequence of records.

    For Kafka Server, enter the following:

    • Name: Name of the server cluster on which Kafka is running.

    • Boot Servers: Comma-separated list of host:port addresses for the Kafka bootstrap servers.

  3. Click Save for the Kafka Server.

  4. Click Create Stream.
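
The fields in the New Kafka Stream dialog correspond to standard Kafka consumer settings. The following sketch, using the kafka-python client, shows how a hypothetical topic, consumer group, initial position, and boot servers would appear in a plain consumer; the topic name, group name, and broker addresses are illustrative placeholders, not values required by Xapix.

```python
# Illustrative only: how the New Kafka Stream fields map to a plain
# Kafka consumer. Topic, group, and broker addresses are placeholders.
from kafka import KafkaConsumer

consumer = KafkaConsumer(
    "orders",                      # Topic: the feed records are published to
    group_id="xapix-stream",       # Consumer Group
    auto_offset_reset="earliest",  # Initial Position: earliest or latest
    bootstrap_servers=["broker1:9092", "broker2:9092"],  # Boot Servers
)

for record in consumer:
    print(record.topic, record.partition, record.offset, record.value)
```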

Example

After you have created the Kafka event stream, a basic pipeline appears with an Event unit and a Sink unit. You can now build the pipeline just as you would for REST endpoints.
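
To confirm that events reach the new pipeline's Event unit, you can publish a test record to the topic. A minimal sketch with the kafka-python client, assuming a hypothetical broker address, topic name, and payload:

```python
# Illustrative only: publish a test record so an event reaches the new
# stream's Event unit. Broker address, topic, and payload are placeholders.
from kafka import KafkaProducer

producer = KafkaProducer(bootstrap_servers=["broker1:9092"])
producer.send("orders", b'{"order_id": 42}')
producer.flush()  # block until the record has been delivered to the broker
```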

Pipeline example
Kafka basic pipeline ready to be built