Traditional data platforms often create silos, suffer from high latency, and struggle to scale—making it difficult to deliver the real-time data pipelines modern applications and ML models demand.
In this on-demand workshop, you’ll see how a streaming lakehouse architecture—powered by Apache Kafka and Apache Iceberg—solves these challenges with a unified, scalable approach.
You’ll learn how to:
Watch this hands-on session, which includes a mini demo and a reference application, and walk away with the practical knowledge to start building your own streaming lakehouse (a minimal sketch of the core Kafka-to-Iceberg pattern follows below).
This session is ideal for data engineers, platform architects, and developers looking to cut latency, simplify pipelines, and deliver real-time, ML-ready data.
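To make the pattern concrete, here is a minimal sketch of the basic streaming-lakehouse hop the session covers: consuming events from Apache Kafka and appending them to an Apache Iceberg table in small batches. The broker address, topic name, catalog settings, table name, and batch size below are illustrative assumptions, not details from the workshop or its reference application.

```python
# Sketch: consume JSON events from Kafka, append them to an Iceberg table.
# All connection details and names are assumed for illustration only.
import json

import pyarrow as pa
from confluent_kafka import Consumer
from pyiceberg.catalog import load_catalog

consumer = Consumer({
    "bootstrap.servers": "localhost:9092",   # assumed local broker
    "group.id": "lakehouse-demo",            # assumed consumer group
    "auto.offset.reset": "earliest",
})
consumer.subscribe(["events"])               # assumed topic name

# Assumes an Iceberg REST catalog and an existing table with a matching schema.
catalog = load_catalog("default", uri="http://localhost:8181")
table = catalog.load_table("demo.events")

batch = []
try:
    while True:
        msg = consumer.poll(1.0)
        if msg is None or msg.error():
            continue
        batch.append(json.loads(msg.value()))
        if len(batch) >= 1000:               # flush in small batches
            table.append(pa.Table.from_pylist(batch))
            consumer.commit()                # commit offsets only after the write lands
            batch = []
finally:
    consumer.close()
```

Committing Kafka offsets only after the Iceberg append succeeds keeps the two systems loosely in step; a production pipeline would typically add schema management, error handling, and exactly-once or idempotent write semantics on top of this skeleton.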
Associate Solution Architect
I am a solutions architect at Aiven and have been developing software for five years with a focus on event streaming architecture. I am particularly interested in demystifying complex data infrastructure for small to medium-sized businesses.
Associate Solution Architect
I'm a Solution Architect at Aiven, specializing in designing and implementing robust data architectures. My hands-on experience allows me to build scalable solutions that are both practical and effective.