Guide: Introduction to Apache Kafka and Kafka Java API



Setting up Apache Kafka

Single Node setup

Download Kafka from an Apache mirror (this guide uses 0.9.0.0): http://www-eu.apache.org/dist/kafka/

wget http://www-us.apache.org/dist/kafka/0.9.0.0/kafka_2.11-0.9.0.0.tgz
tar -xf kafka_2.11-0.9.0.0.tgz
cd kafka_2.11-0.9.0.0
bin/zookeeper-server-start.sh config/zookeeper.properties &
bin/kafka-server-start.sh config/server.properties &

Using the Kafka CLI producer and consumer: creating a topic

bin/kafka-topics.sh --create --zookeeper localhost:2181 --replication-factor 1 --partitions 1 --topic test

Producing messages to the topic “test”

bin/kafka-console-producer.sh --broker-list localhost:9092 --topic test

Consuming the messages:

bin/kafka-console-consumer.sh --zookeeper localhost:2181 --topic test --from-beginning

Multi-broker Kafka Cluster

To create a multi-broker, multi-node Kafka cluster, give each node its own copy of server.properties with a unique broker.id (and, if several brokers run on the same host, distinct port and log.dirs settings), then start each broker with its own config file, as sketched below.
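
A rough sketch of the per-broker settings, assuming the 0.9 server.properties defaults and a placeholder ZooKeeper host:

# server.properties on broker 1
broker.id=1
port=9092
log.dirs=/tmp/kafka-logs-1
zookeeper.connect=zk-host:2181

# server.properties on broker 2
broker.id=2
port=9093
log.dirs=/tmp/kafka-logs-2
zookeeper.connect=zk-host:2181

Each broker is then started with its own file, e.g. bin/kafka-server-start.sh config/server-1.properties, and topics can use a replication factor up to the number of brokers.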

Kafka Java API

The Kafka producer API can be used to write a simple Java producer; see the KafkaProducer Javadoc: https://kafka.apache.org/090/javadoc/index.html?org/apache/kafka/clients/producer/KafkaProducer.html. A minimal sketch follows.
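
A minimal producer sketch against the 0.9 producer API; the broker address and topic name assume the single-node setup and the “test” topic created above:

import java.util.Properties;
import org.apache.kafka.clients.producer.KafkaProducer;
import org.apache.kafka.clients.producer.Producer;
import org.apache.kafka.clients.producer.ProducerRecord;

public class SimpleProducer {
    public static void main(String[] args) {
        Properties props = new Properties();
        // Broker address from the single-node setup above
        props.put("bootstrap.servers", "localhost:9092");
        // String serializers for both keys and values
        props.put("key.serializer", "org.apache.kafka.common.serialization.StringSerializer");
        props.put("value.serializer", "org.apache.kafka.common.serialization.StringSerializer");

        Producer<String, String> producer = new KafkaProducer<>(props);
        for (int i = 0; i < 10; i++) {
            // Send a keyed message to the "test" topic created earlier
            producer.send(new ProducerRecord<>("test", Integer.toString(i), "message-" + i));
        }
        producer.close();
    }
}

Run it with the Kafka clients jar on the classpath and read the messages back with the console consumer shown earlier.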

For more info on Kafka, visit: http://kafka.apache.org/090/documentation.html

For information on how Flink works with Kafka, visit: http://data-artisans.com/kafka-flink-a-practical-how-to/ and the Apache Kafka connector for Flink: https://ci.apache.org/projects/flink/flink-docs-release-1.0/apis/streaming/connectors/kafka.html

Kafka Flink example by dataArtisans: https://github.com/dataArtisans/kafka-example