Latent Space: The AI Engineer Podcast

Hosted by Latent.Space

About

The podcast by and for AI Engineers! In 2025, over 10 million readers and listeners came to Latent Space to hear about news, papers, and interviews in Software 3.0. We cover Foundation Models changing every domain in Code Generation, Multimodality, AI Agents, GPU Infra, and more, directly from the founders, builders, and thinkers pushing the cutting edge. We strive to give you everything from the definitive take on the Current Thing to the first introduction to the tech you'll be using in the next 3 months. We break news and exclusive interviews from OpenAI, Anthropic, Gemini, Meta (Soumith Chintala), Sierra (Bret Taylor), tiny (George Hotz), Databricks/MosaicML (Jon Frankle), Modular (Chris Lattner), Answer.ai (Jeremy Howard), et al. Full show notes always at https://latent.space

Host

Latent.Space

Host of Latent Space: The AI Engineer Podcast

"The reality is that although the visuals do look fantastic, those visuals actually aren't accompanied by an understanding of the 3D world, understanding how objects can move, what the consequences of different actions are, and that's what's really needed for spatial intelligence."

— Chris Manning
APR 3, 2026 · Latent.Space

Marc Andreessen introspects on The Death of the Browser, Pi + OpenClaw, and Why "This Time Is Different"

STUDY HISTORY · BUILD NEURAL NETS · WATCH SCALING · DEPLOY AI

  • AI is an 80-year overnight success - current breakthroughs like ChatGPT and O1 are not sudden accidents but the culmination of a research wellspring dating back to the first neural network paper in 1943.

    "It's an overnight success because it's like, bam, you know, ChatGPT hits and then O1 hits... but they're drawing on an 80-year sort of wellspring backlog, you know, of ideas and thinking."

    — Marc Andreessen

  • The neural network debate is officially over - after 70 years of controversy, the industry has reached a technical consensus that the neural network is the definitive architecture for machine intelligence.

    "We now know the neural network is the correct architecture. And I will tell you, like, there was a 60-year run where that was like, you know, or even 70 years where that was controversial."

    — Marc Andreessen

  • Institutional caution created a massive capability overhang - major tech players like Google and OpenAI held back functional chatbots for years due to safety concerns before deployment finally hit a catalytic tipping point.

    "The real story is it was the AlexNet basically breakthrough in 2013. That was the real knee in the curve, and then it was obviously the transformer breakthrough in '17 and then everything that followed."

    — Marc Andreessen
APR 2, 2026 · Latent.Space

Moonlake: Causal World Models should be Multimodal, Interactive, and Efficient — with Chris Manning and Fan-yun Sun

BUILD WORLD MODELS · COLLECT ACTION DATA · MOVE BEYOND PIXELS · SCALE SPATIAL REASONING

  • Video generation lacks causal world understanding - current models like Sora produce impressive visuals but fail to grasp 3D physics, object permanence, or the consequences of specific actions over long time scales.

    "The reality is that although the visuals do look fantastic, those visuals actually aren't accompanied by an understanding of the 3D world, understanding how objects can move, what the consequences of different actions are, and that's what's really needed for spatial intelligence."

    — Chris Manning

  • Symbolic structure provides a massive efficiency shortcut - while raw pixel data is abundant, integrating a semantic abstraction layer allows models to learn world rules with up to five orders of magnitude less data than brute-force scaling.

    "If there are ways in which you can work with five orders of magnitude less data than people working purely from pixels, you're going to be able to make a lot more progress a lot more quickly. And that's the bet here."

    — Chris Manning

  • Interactive data is the critical bottleneck for robotics - standard observational video lacks the action-consequence loops necessary for embodied intelligence, creating a massive demand for simulated worlds where agents can learn through trial and error.

    "On our way to, let's call it, embodied general intelligence, models need to learn the consequences behind their actions, which means that they need interactive data. The demand for those types of data is growing exponentially."

    — Fan-yun Sun
