Real-time data and Reactive Programming

Part 1: That record is soooooo yesterday.

Photo by Aron Visuals on Unsplash

Having real-time data across your backend systems can be extremely valuable, depending on the use case. Whereas in the past you might have run nightly batch ETL jobs to keep different systems in sync, many teams today use messaging systems like Kafka and RabbitMQ to publish messages when data is updated, which other systems can then listen and react to.

Say you have a list of items for sale in an inventory system. If an item goes away, you wouldn’t want it to still show up as available in your Elasticsearch index until the next day or the next full refresh, would you?

I’m seeing both styles of data syncing on my latest team, along with some of the pros and cons of each. I’m learning how you can trigger batch jobs manually via Swagger UI REST calls and see them listed in a UI such as Apache Spark’s, and also how you can write very simple publishers/listeners to do the same thing via streaming, shown below.

Server Side Messaging

Here’s an example in Spring Boot that can be used to write messages to a Kafka topic and listen for them as well.

Producing Messages

Once you set up your configuration like so:

@Configuration
public class KafkaProducerConfig {

    @Bean
    public ProducerFactory<String, String> producerFactory() {
        return new DefaultKafkaProducerFactory<>(producerConfigs());
    }

    @Bean
    public Map<String, Object> producerConfigs() {
        Map<String, Object> props = new HashMap<>();
        props.put(ProducerConfig.BOOTSTRAP_SERVERS_CONFIG, "server-and-port-to-kafka-instance");
        props.put(ProducerConfig.KEY_SERIALIZER_CLASS_CONFIG, StringSerializer.class);
        props.put(ProducerConfig.VALUE_SERIALIZER_CLASS_CONFIG, StringSerializer.class);
        // See https://kafka.apache.org/documentation/#producerconfigs for more properties
        return props;
    }

    @Bean
    public KafkaTemplate<String, String> kafkaTemplate() {
        return new KafkaTemplate<String, String>(producerFactory());
    }
}

you can send a message by autowiring that config into a field

@Autowired
KafkaProducerConfig config;

and calling `send()` on its template:

ListenableFuture<SendResult<String, String>> future = config.kafkaTemplate().send("topic1", "whatever it is you want to send");
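Since `send()` is asynchronous, the returned `ListenableFuture` can be used to confirm delivery or log a failure instead of blocking on `future.get()`. A minimal sketch (the log messages are just illustrative):

```java
// Attach a callback so success/failure is handled asynchronously.
future.addCallback(new ListenableFutureCallback<SendResult<String, String>>() {
    @Override
    public void onSuccess(SendResult<String, String> result) {
        // The broker has acknowledged the record; metadata tells us where it landed.
        System.out.println("Sent to partition " + result.getRecordMetadata().partition()
                + " at offset " + result.getRecordMetadata().offset());
    }

    @Override
    public void onFailure(Throwable ex) {
        System.err.println("Failed to send message: " + ex.getMessage());
    }
});
```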

Consuming Messages

Alternatively, if you want to react to messages sent from other systems, you can do this:

@Component
class KafkaConsumerConfig {

    @Autowired
    MyService myService;

    @KafkaListener(id = "myId", topics = "topic1")
    public void listen(String in) {
        System.out.println(in);

        myService.doSomethingWith(in);
    }
}
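Note that `@KafkaListener` needs a listener container factory behind it. Spring Boot auto-configures one if you set the `spring.kafka` properties, but a manual configuration mirroring the producer side might look like this (the server address and group id are placeholders):

```java
@EnableKafka
@Configuration
public class KafkaConsumerFactoryConfig {

    @Bean
    public ConsumerFactory<String, String> consumerFactory() {
        Map<String, Object> props = new HashMap<>();
        props.put(ConsumerConfig.BOOTSTRAP_SERVERS_CONFIG, "server-and-port-to-kafka-instance");
        props.put(ConsumerConfig.GROUP_ID_CONFIG, "my-group");
        props.put(ConsumerConfig.KEY_DESERIALIZER_CLASS_CONFIG, StringDeserializer.class);
        props.put(ConsumerConfig.VALUE_DESERIALIZER_CLASS_CONFIG, StringDeserializer.class);
        return new DefaultKafkaConsumerFactory<>(props);
    }

    @Bean
    public ConcurrentKafkaListenerContainerFactory<String, String> kafkaListenerContainerFactory() {
        // This is the factory @KafkaListener methods use by default.
        ConcurrentKafkaListenerContainerFactory<String, String> factory =
                new ConcurrentKafkaListenerContainerFactory<>();
        factory.setConsumerFactory(consumerFactory());
        return factory;
    }
}
```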

Furthermore, if you want the real-time feel to propagate to the front end, the messages can then be sent on to your client as well! (see part 2 on Client Side Reactivity) 😄