Spring Boot and Apache Kafka: A Quick Start
In my previous blog, I covered the basics of Apache Kafka. The quickstart got Kafka running, but it left a problem: building producers and consumers from scratch with the raw client API is like cooking a meal by first planting the seeds. Spring Boot changes that. It gives you the tools to work with Kafka without drowning in boilerplate code.
This blog will be code-heavy. If you’re looking for explanations on why Kafka exists or what a topic is, go read the previous blog. Here, we are going straight to implementation.
Setting Up the Project
The first thing you need is a Spring Boot project with Kafka dependencies. Just make sure you have spring-kafka in your pom.xml or build.gradle.
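If you are on Maven, the entry is just this; no version tag is needed when you use the Spring Boot parent, since it manages the version for you:

<dependency>
    <groupId>org.springframework.kafka</groupId>
    <artifactId>spring-kafka</artifactId>
</dependency>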
Once that is done, the configuration begins. Spring Boot loves configuration files, and Kafka is no exception. Here is what goes into application.properties:
spring.application.name=springboot-kafka
spring.kafka.consumer.bootstrap-servers=localhost:9092
spring.kafka.consumer.group-id=Group1
spring.kafka.consumer.auto-offset-reset=earliest
spring.kafka.consumer.key-deserializer=org.apache.kafka.common.serialization.StringDeserializer
spring.kafka.consumer.value-deserializer=org.apache.kafka.common.serialization.StringDeserializer
spring.kafka.producer.bootstrap-servers=localhost:9092
spring.kafka.producer.key-serializer=org.apache.kafka.common.serialization.StringSerializer
spring.kafka.producer.value-serializer=org.apache.kafka.common.serialization.StringSerializer
This is straightforward. Both the consumer and the producer point to localhost:9092, where your Kafka broker is running. The consumer belongs to the group Group1, and auto-offset-reset=earliest tells a consumer group with no committed offsets to start from the beginning of the topic instead of only reading new messages. The serializers and deserializers are plain String types for now. If you want to send objects, you will need to configure JSON serializers, but we are keeping it simple here.
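For reference, a minimal sketch of the JSON variant, using the serializers that ship with spring-kafka; the trusted-packages value is an assumption based on this project's package name:

spring.kafka.producer.value-serializer=org.springframework.kafka.support.serializer.JsonSerializer
spring.kafka.consumer.value-deserializer=org.springframework.kafka.support.serializer.JsonDeserializer
spring.kafka.consumer.properties.spring.json.trusted.packages=com.kafkafirst.*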
Creating a Topic
In the previous blog, you would have created topics manually using Kafka’s command-line tools. Spring Boot lets you do it programmatically. Here’s the configuration class:
package com.kafkafirst.springboot_kafka.config;

import org.apache.kafka.clients.admin.NewTopic;
import org.springframework.context.annotation.Bean;
import org.springframework.context.annotation.Configuration;
import org.springframework.kafka.config.TopicBuilder;

@Configuration
public class KafkaTopicConfig {

    @Bean
    public NewTopic springKafkaTopic() {
        return TopicBuilder.name("springkafka")
                .partitions(10)
                .build();
    }
}
The @Configuration annotation tells Spring this is a configuration class, and @Bean makes springKafkaTopic() a Spring-managed bean. When the application starts, Spring's KafkaAdmin picks up every NewTopic bean and creates the topic springkafka with 10 partitions, if it does not already exist. You can skip the partition count to get the broker default, but control over partitioning matters when you scale, so I set it explicitly.
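TopicBuilder has a few more knobs if you need them. Here is a sketch, not part of this project: the topic name is hypothetical, and replicas(1) only makes sense on a single-broker dev setup:

@Bean
public NewTopic compactedTopic() {
    // replicas(1) is fine for a one-broker dev cluster; use more in production
    return TopicBuilder.name("springkafka-compacted")
            .partitions(10)
            .replicas(1)
            .compact() // sets cleanup.policy=compact on the topic
            .build();
}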
Building the Producer
The producer is the part that sends messages to Kafka. Here’s the implementation:
package com.kafkafirst.springboot_kafka.kafka;

import org.slf4j.Logger;
import org.slf4j.LoggerFactory;
import org.springframework.kafka.core.KafkaTemplate;
import org.springframework.stereotype.Service;

@Service
public class KafkaProducer {

    private static final Logger LOGGER = LoggerFactory.getLogger(KafkaProducer.class);

    private final KafkaTemplate<String, String> kafkaTemplate;

    public KafkaProducer(KafkaTemplate<String, String> kafkaTemplate) {
        this.kafkaTemplate = kafkaTemplate;
    }

    public void sendMessage(String message) {
        kafkaTemplate.send("springkafka", message);
        LOGGER.info("Message sent: {}", message);
    }
}
The KafkaTemplate is Spring's abstraction over Kafka's producer API. You do not instantiate it yourself; Spring builds it from the properties file and injects it through the constructor. The sendMessage() method takes a string and sends it to the springkafka topic. One caveat: send() is asynchronous, so the log line only confirms the message was handed off to the template, not that the broker accepted it. Simple and clean.
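If you want real delivery confirmation, attach a callback to the future that send() returns. A minimal sketch, assuming spring-kafka 3.x, where send() returns a CompletableFuture<SendResult<K, V>>:

public void sendMessage(String message) {
    kafkaTemplate.send("springkafka", message)
            .whenComplete((result, ex) -> {
                if (ex == null) {
                    // RecordMetadata carries the partition and offset the broker assigned
                    LOGGER.info("Delivered to partition {} at offset {}",
                            result.getRecordMetadata().partition(),
                            result.getRecordMetadata().offset());
                } else {
                    LOGGER.error("Send failed for message: {}", message, ex);
                }
            });
}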
Exposing the Producer via REST
You need a way to trigger the producer. A REST endpoint works. Here’s the controller:
package com.kafkafirst.springboot_kafka.controller;

import com.kafkafirst.springboot_kafka.kafka.KafkaProducer;
import org.springframework.http.ResponseEntity;
import org.springframework.web.bind.annotation.GetMapping;
import org.springframework.web.bind.annotation.RequestMapping;
import org.springframework.web.bind.annotation.RequestParam;
import org.springframework.web.bind.annotation.RestController;

@RestController
@RequestMapping("/api/springkafka")
public class MessageController {

    private final KafkaProducer kafkaProducer;

    public MessageController(KafkaProducer kafkaProducer) {
        this.kafkaProducer = kafkaProducer;
    }

    @GetMapping("/publish")
    public ResponseEntity<String> publish(@RequestParam("message") String message) {
        kafkaProducer.sendMessage(message);
        return ResponseEntity.ok("Message sent to the topic");
    }
}
The @RestController annotation makes this a REST controller, and @GetMapping maps the /publish path to the publish() method. Hit http://localhost:8080/api/springkafka/publish?message=HELLO and the message gets sent to Kafka; the response confirms the handoff. A GET with side effects is fine for a demo, though a POST would be the more correct verb in a real API.
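From a terminal, that's just:

curl "http://localhost:8080/api/springkafka/publish?message=HELLO"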
Building the Consumer
The consumer listens to the topic and processes messages. Here’s the code:
package com.kafkafirst.springboot_kafka.kafka;

import org.slf4j.Logger;
import org.slf4j.LoggerFactory;
import org.springframework.kafka.annotation.KafkaListener;
import org.springframework.stereotype.Service;

@Service
public class KafkaConsumer {

    private static final Logger LOGGER = LoggerFactory.getLogger(KafkaConsumer.class);

    @KafkaListener(topics = "springkafka", groupId = "Group1")
    public void consume(String message) {
        LOGGER.info("Message received: {}", message);
    }
}
The @KafkaListener annotation is the magic here. You specify the topic and the group ID, and Spring handles the rest: it creates the consumer, runs the poll loop, and invokes consume() for every message that arrives on springkafka. The logger prints it. That's it. You don't write loops or polling logic; Spring does it.
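If you need the record's metadata and not just its payload, the listener method can take the whole ConsumerRecord instead of a String. A minimal variant of the same listener:

import org.apache.kafka.clients.consumer.ConsumerRecord;

@KafkaListener(topics = "springkafka", groupId = "Group1")
public void consume(ConsumerRecord<String, String> record) {
    // partition and offset pin down exactly where this message lives in the topic
    LOGGER.info("Received '{}' from partition {} at offset {}",
            record.value(), record.partition(), record.offset());
}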
Running the Application
Start your Kafka broker if it is not already running. Then start the Spring Boot application. The topic gets created. The consumer starts listening. Hit the REST endpoint with a message, and you will see it logged by the consumer. The whole flow, producer to broker to consumer, works out of the box.
This is the power of Spring Boot. It doesn’t change how Kafka works. It just removes the friction. You focus on what messages to send and what to do with them. The rest is handled.