
Interview with Kai Waehner

 
Content provided by Hubert Dulay. All podcast content, including episodes, graphics, and podcast descriptions, is uploaded and provided directly by Hubert Dulay or their podcast platform partner. If you believe someone is using your copyrighted work without your permission, you can follow the process outlined at https://ro.player.fm/legal.

In this episode, Ralph and I interview a former colleague of mine, Kai Waehner, who has extensive experience in the data streaming and real-time events space. Kai highlights his top five trends for data streaming with Kafka and Flink: data sharing, data contracts for governance, serverless stream processing, multi-cloud adoption, and the use of generative AI in real-time contexts. We discuss the role of generative AI in providing accurate answers and the importance of real-time data integration for contextual recommendations, using travel and flight cancellations as an example. We also look at how Flink, as a stream processor, helps ensure the accuracy and freshness of the data that feeds semantic search and generative AI applications.
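As a rough illustration of the Flink pattern Kai describes, the sketch below (PyFlink with Flink SQL) maintains an always-fresh, deduplicated view of flight-status events that a semantic search or generative AI service could read as context. The topic names, field names, and bootstrap server are hypothetical placeholders, not anything from the episode.

# A minimal sketch of the pattern discussed above: Flink keeps a "fresh" view of
# flight-status events that a downstream GenAI / semantic-search service can query.
# Topic names, field names, and the bootstrap server are hypothetical placeholders.
from pyflink.table import EnvironmentSettings, TableEnvironment

t_env = TableEnvironment.create(EnvironmentSettings.in_streaming_mode())

# Source: raw flight-status events arriving on a Kafka topic.
t_env.execute_sql("""
    CREATE TABLE flight_status (
        flight_id STRING,
        status STRING,          -- e.g. 'ON_TIME', 'DELAYED', 'CANCELLED'
        event_time TIMESTAMP(3),
        WATERMARK FOR event_time AS event_time - INTERVAL '5' SECOND
    ) WITH (
        'connector' = 'kafka',
        'topic' = 'flight_status',
        'properties.bootstrap.servers' = 'localhost:9092',
        'format' = 'json',
        'scan.startup.mode' = 'latest-offset'
    )
""")

# Sink: the latest status per flight, which a GenAI application can read as
# up-to-date context before generating, say, a rebooking recommendation.
t_env.execute_sql("""
    CREATE TABLE latest_flight_status (
        flight_id STRING,
        status STRING,
        event_time TIMESTAMP(3),
        PRIMARY KEY (flight_id) NOT ENFORCED
    ) WITH (
        'connector' = 'upsert-kafka',
        'topic' = 'latest_flight_status',
        'properties.bootstrap.servers' = 'localhost:9092',
        'key.format' = 'json',
        'value.format' = 'json'
    )
""")

# Continuously upsert the most recent event per flight (Flink SQL deduplication).
t_env.execute_sql("""
    INSERT INTO latest_flight_status
    SELECT flight_id, status, event_time
    FROM (
        SELECT *,
               ROW_NUMBER() OVER (PARTITION BY flight_id ORDER BY event_time DESC) AS rn
        FROM flight_status
    )
    WHERE rn = 1
""")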

We then turn to streaming databases and whether the market is ready to embrace them. We discuss the need for data contracts and data governance to understand how data flows through systems, as well as the data engineering team's responsibility for creating embeddings. We also cover integrating large language models with other applications using technologies like Kafka, with examples of how generative AI can be woven into existing business processes. The conversation touches on the lakehouse concept and the separation of compute and storage for real-time analytics. Kai also describes Confluent's approach to building Kafka in a cloud-native way and its focus on the streaming side, while emphasizing the need for stream processing solutions that are accessible to ordinary database users.
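In the same spirit, here is a minimal, hypothetical sketch of the Kafka-plus-LLM integration pattern mentioned above: a small Python service consumes business events, asks a language model for a recommendation, and publishes the result as a new event. The topic names, consumer group id, and generate_recommendation() are illustrative assumptions, and the LLM call itself is left as a placeholder.

# Sketch of an event-driven LLM integration over Kafka.
# Topics, group id, and generate_recommendation() are illustrative placeholders.
import json
from confluent_kafka import Consumer, Producer

consumer = Consumer({
    'bootstrap.servers': 'localhost:9092',
    'group.id': 'genai-recommender',
    'auto.offset.reset': 'earliest',
})
producer = Producer({'bootstrap.servers': 'localhost:9092'})
consumer.subscribe(['flight_cancellations'])


def generate_recommendation(event: dict) -> str:
    # Placeholder for a call to a language model (e.g. via an HTTP API),
    # passing the fresh event data as context so the answer reflects the
    # current state of the world.
    return f"Suggest rebooking options for flight {event['flight_id']}"


while True:
    msg = consumer.poll(1.0)
    if msg is None or msg.error():
        continue
    event = json.loads(msg.value())
    recommendation = generate_recommendation(event)
    # Emit the model's output as another event so downstream services
    # (notifications, agent desktops, etc.) can react to it.
    producer.produce(
        'rebooking_recommendations',
        key=event['flight_id'],
        value=json.dumps({'flight_id': event['flight_id'],
                          'recommendation': recommendation}),
    )
    producer.flush()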
