Demis Hassabis, DeepMind's chief executive, says current AI systems are still missing several core abilities needed for real reasoning. Despite all the recent progress, these models can't plan far into the future or think through problems the way top human scientists do.
The biggest gap is that AI doesn't have a proper understanding of how the world works. Sure, modern systems can handle massive amounts of text, images, and video, but they don't actually get physics, cause and effect, or how one action leads to another over time. That's why they can't make solid long-term plans or reason reliably about complicated real-world situations.
This explains why AI crushes narrowly defined tasks but falls apart when problems require thinking through multiple steps. Without an internal map of how things work, these systems can't predict what'll happen next or adjust their approach the way humans do when facing new or changing scenarios.
Hassabis's comments show why getting to true general intelligence is still a huge challenge. Moving forward isn't just about making models bigger; it'll take entirely new approaches that let AI reason about the world more deeply. As this technology keeps spreading into research, business, and everyday tools, understanding these limitations matters more than ever.
Sergey Diakov