Borui Cai and Yao Zhao from Deakin University in Australia have presented a concept that they believe will bridge the gap between modern chatbots and general-purpose AI. Their proposed “Intelligence Foundation Model” (IFM) shifts the focus of AI training from merely learning surface-level data patterns to mastering the universal mechanisms of intelligence itself. Using a biologically inspired “State Neural Network” architecture and a “Neuron Output Prediction” learning objective, the framework is designed to mimic the collective dynamics of biological brains and internalize how information is processed over time. This approach aims to overcome the reasoning limitations of current Large Language Models, offering a scalable path toward true Artificial General Intelligence (AGI) and, in theory, laying the groundwork for a future convergence of biological and digital minds.
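The paper itself defines the State Neural Network and the Neuron Output Prediction objective; the sketch below is only a loose illustration of the general idea described above, assuming a toy recurrent network whose neurons carry a persistent state and are trained to predict their own outputs at the next time step. The names ToyStateNetwork and neuron_output_prediction_loss, and every modeling choice here, are hypothetical and are not taken from the authors' implementation.

```python
# Illustrative sketch only: a toy recurrent "state" network with a
# self-supervised objective of predicting the neurons' own next-step outputs.
# This is an assumption-laden stand-in, not the paper's actual IFM architecture.
import torch
import torch.nn as nn

class ToyStateNetwork(nn.Module):
    """A minimal recurrent network whose neurons carry a persistent state."""
    def __init__(self, input_dim: int, state_dim: int):
        super().__init__()
        self.input_proj = nn.Linear(input_dim, state_dim)
        self.recurrent = nn.Linear(state_dim, state_dim)
        self.readout = nn.Linear(state_dim, state_dim)  # predicts next neuron outputs

    def forward(self, inputs: torch.Tensor):
        # inputs: (time, batch, input_dim)
        batch = inputs.shape[1]
        state = torch.zeros(batch, self.recurrent.in_features)
        predictions, states = [], []
        for x_t in inputs:
            # each neuron updates its state from the current input and its previous state
            state = torch.tanh(self.input_proj(x_t) + self.recurrent(state))
            states.append(state)
            # the training signal: predict what the neurons will output next
            predictions.append(self.readout(state))
        return torch.stack(predictions), torch.stack(states)

def neuron_output_prediction_loss(predictions, states):
    # compare the prediction made at time t with the actual neuron outputs at t+1
    return nn.functional.mse_loss(predictions[:-1], states[1:].detach())

if __name__ == "__main__":
    torch.manual_seed(0)
    model = ToyStateNetwork(input_dim=8, state_dim=32)
    optimizer = torch.optim.Adam(model.parameters(), lr=1e-3)
    data = torch.randn(20, 4, 8)  # synthetic (time, batch, input_dim) signals
    for step in range(100):
        predictions, states = model(data)
        loss = neuron_output_prediction_loss(predictions, states)
        optimizer.zero_grad()
        loss.backward()
        optimizer.step()
    print(f"final loss: {loss.item():.4f}")
```

The point of the toy objective is that the supervision comes from the network's own temporal dynamics rather than from labels or next-token targets, which is the spirit of the "internalize how information is processed over time" claim.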
The Intelligence Foundation Model represents a bold new proposal in the quest to build machines that can truly think. We currently live in an era dominated by Large Language Models like ChatGPT and Gemini. These systems are impressive feats of engineering that can write poetry, fix coding errors, and summarize history. Yet despite their fluency, they often lack the fundamental spark of what we consider true intelligence.
They are brilliant mimics that predict statistical patterns in text but do not actually understand the world or learn from it in real time. A new research paper suggests that to reach the next level, we need to stop modeling language and start modeling the brain itself.
Borui Cai and Yao Zhao have introduced a concept they believe will bridge the gap between today’s chatbots and Artificial General Intelligence. Published as a preprint on arXiv, their research argues that existing foundation models suffer from severe limitations because they specialize in specific domains such as vision or text. While a chatbot can tell you what a bicycle is, it does not understand the physics of riding one in the way a human does.