Krakow, Poland, 31 May - 2 June 2023

Danica Fine
Confluent
Danica Fine is a Senior Developer Advocate at Confluent where she helps others get the most out of their event-driven pipelines. In her previous role as a software engineer on a streaming infrastructure team, she predominantly worked on Kafka Streams- and Kafka Connect-based projects. She can be found on Twitter, tweeting about tech, plants, and baking @TheDanicaFine.

Houseplants can be hard – in many cases, over- and under-watering can have the same symptoms. Take away the guesswork involved in caring for your houseplants while also gaining valuable experience in building a practical, event-driven pipeline in your own home! This talk explores the process of building a houseplant monitoring and alerting system using a Raspberry Pi and Apache Kafka. 

Moisture and temperature readings are captured from sensors in the soil and streamed into Kafka. From here, we’ll use stream processing to transform the data, create a summary view of the current state, and drive real-time push alerts through Telegram. In this session, I’ll talk about how I ingest the data, followed by the tools – including ksqlDB and Kafka Connect – that help transform the raw data into useful information, and finally I’ll show how to use Kafka Producers and Consumers to make the entire application more interactive.
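To make the ingestion step concrete, here is a minimal sketch of the kind of Java producer that could stream a soil reading into Kafka from the Raspberry Pi. The topic name, plant ID, JSON payload, and broker address are illustrative assumptions, not the talk’s actual setup:

    import org.apache.kafka.clients.producer.KafkaProducer;
    import org.apache.kafka.clients.producer.ProducerRecord;
    import java.util.Properties;

    public class MoistureReadingProducer {
        public static void main(String[] args) {
            Properties props = new Properties();
            props.put("bootstrap.servers", "localhost:9092");  // assumed broker address
            props.put("key.serializer", "org.apache.kafka.common.serialization.StringSerializer");
            props.put("value.serializer", "org.apache.kafka.common.serialization.StringSerializer");

            try (KafkaProducer<String, String> producer = new KafkaProducer<>(props)) {
                // Key by plant ID so each plant's readings stay in order on one partition.
                String plantId = "monstera-01";  // hypothetical plant ID
                String reading = "{\"plantId\":\"monstera-01\",\"moisture\":412,\"tempC\":21.5}";
                producer.send(new ProducerRecord<>("houseplant-readings", plantId, reading));
            }
        }
    }

Keying each record by plant ID keeps a given plant’s readings on a single partition, which makes a downstream summary view of the current state straightforward to build.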

By the end of the talk, you’ll have everything you need to start building practical streaming pipelines in your own home. Roll up your sleeves – let’s get our hands dirty!

A Kafka Client’s Request: There and Back Again
Conference (INTERMEDIATE level)
Room 4B

Do you know how your data moves into and out of your Apache Kafka® instance? From the programmer’s point of view, it’s relatively simple. But under the hood, writing to and reading from Kafka is a complex process with a fascinating life cycle that’s worth understanding.

When you call producer.send() or consumer.poll(), those calls are translated into low-level requests that are sent along to the brokers for processing. In this session, we’ll dive into the world of Kafka producers and consumers to follow a request from an initial call to send() or poll(), all the way to disk, and back to the client via the broker’s final response. Along the way, we’ll explore a number of client and broker configurations that affect how these requests are handled and discuss the metrics you can monitor to keep track of every stage of the request life cycle.
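As a rough illustration of where those low-level requests originate, the sketch below shows a consumer whose poll() calls drive Fetch requests to the broker. The broker address, group ID, topic name, and fetch settings are assumptions for this example rather than values from the talk:

    import org.apache.kafka.clients.consumer.ConsumerRecords;
    import org.apache.kafka.clients.consumer.KafkaConsumer;
    import java.time.Duration;
    import java.util.List;
    import java.util.Properties;

    public class RequestLifecycleDemo {
        public static void main(String[] args) {
            Properties props = new Properties();
            props.put("bootstrap.servers", "localhost:9092");  // assumed broker address
            props.put("group.id", "demo-group");                // hypothetical consumer group
            props.put("key.deserializer", "org.apache.kafka.common.serialization.StringDeserializer");
            props.put("value.deserializer", "org.apache.kafka.common.serialization.StringDeserializer");
            // Two of the client configurations that shape the underlying Fetch requests:
            props.put("fetch.min.bytes", "1024");    // broker waits for at least this much data...
            props.put("fetch.max.wait.ms", "500");   // ...or responds once this timeout elapses.

            try (KafkaConsumer<String, String> consumer = new KafkaConsumer<>(props)) {
                consumer.subscribe(List.of("some-topic"));  // hypothetical topic
                // Each poll() sends Fetch requests to the partition leaders and returns
                // whatever records come back in the brokers' responses.
                ConsumerRecords<String, String> records = consumer.poll(Duration.ofMillis(1000));
                records.forEach(r -> System.out.printf("%s -> %s%n", r.key(), r.value()));
            }
        }
    }

Settings such as fetch.min.bytes and fetch.max.wait.ms are examples of the configurations discussed in the session: they determine how long the broker holds a Fetch request before sending its response back to the client.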

By the end of this session, you’ll know the ins and outs of the read and write requests that your Kafka clients make, making your next debugging or performance analysis session a breeze. 

Venue address: ICE Krakow, ul. Marii Konopnickiej 17
Phone: +48 691 793 877
Email: info@devoxx.pl