Integrating existing Apache Kafka topics

Self-managed installations of DataCater (Open-Core or Enterprise) allow you to integrate existing Apache Kafka topics. You can add a new stream and connect it to a Kafka broker and topic. Depending on whether you want to publish data to or consume data from the stream (or topic), you can set the corresponding properties of the official Apache Kafka consumer and producer APIs, for example:
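A minimal sketch of such properties, assuming a single broker reachable at a placeholder address; bootstrap.servers, group.id, auto.offset.reset, acks, and compression.type are standard Apache Kafka client properties:

    # Connection to the Kafka broker (placeholder address)
    bootstrap.servers=localhost:9092

    # Consumer-side properties, used when consuming data from the stream
    group.id=my-datacater-stream
    auto.offset.reset=earliest

    # Producer-side properties, used when publishing data to the stream
    acks=all
    compression.type=gzip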
In addition to the name of the topic and the connection properties, you must let DataCater know the format of the data in the topic. To this end, you can define deserializers and serializers for both the key and the value of the records. By default, DataCater expects data to be formatted in JSON.

Integrating topics holding JSON

DataCater provides serializers and deserializers for handling JSON data.
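As a sketch, a stream that reads and writes JSON-encoded keys and values could set the following properties. The io.datacater.core.serde class names are an assumption based on DataCater's open-core code base, so please verify them against your installation:

    # Assumed DataCater JSON serde classes -- verify against your release
    key.deserializer=io.datacater.core.serde.JsonDeserializer
    value.deserializer=io.datacater.core.serde.JsonDeserializer
    key.serializer=io.datacater.core.serde.JsonSerializer
    value.serializer=io.datacater.core.serde.JsonSerializer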

Integrating topics holding AVRO

DataCater provides serializers and deserializers for handling AVRO data.
Please note that you need to provide a schema when using AVRO. To this end, either point DataCater to a schema registry using the property schema.registry.url or provide an inline schema using key.deserializer.schema, value.deserializer.schema, key.serializer.schema, or value.serializer.schema.
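As a sketch, here are the two schema options for a stream that consumes AVRO-encoded values. The AvroDeserializer class name and the registry address are assumptions; schema.registry.url and value.deserializer.schema are the properties described above:

    # Assumed DataCater AVRO serde class -- verify against your release
    value.deserializer=io.datacater.core.serde.AvroDeserializer

    # Option 1: resolve the schema via a schema registry (placeholder URL)
    schema.registry.url=http://schema-registry:8081

    # Option 2: provide an inline AVRO schema instead
    value.deserializer.schema={"type":"record","name":"User","fields":[{"name":"name","type":"string"}]}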

Integrating topics holding Strings

DataCater can work with the (de)serializers provided by Apache Kafka for handling raw strings (a configuration sketch follows this list):
  • If you want to consume strings from a stream, please choose org.apache.kafka.common.serialization.StringDeserializer as the key.deserializer and/or value.deserializer
  • If you want to publish data to a stream as strings, please choose org.apache.kafka.common.serialization.StringSerializer as the key.serializer and/or value.serializer
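For instance, a stream that treats both the key and the value as plain strings could use the following properties; all four class names ship with the official Apache Kafka clients library:

    key.deserializer=org.apache.kafka.common.serialization.StringDeserializer
    value.deserializer=org.apache.kafka.common.serialization.StringDeserializer
    key.serializer=org.apache.kafka.common.serialization.StringSerializer
    value.serializer=org.apache.kafka.common.serialization.StringSerializer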