Yann LeCun, Meta's chief AI scientist, believes that developing Artificial General Intelligence is not a design or engineering problem but a scientific one. He argues that large language models (LLMs), such as those powering AI chatbots like ChatGPT and Gemini, will never achieve the same level of reasoning and planning as humans.
In a recent interview with the Financial Times, LeCun argued that large language models currently have only a very limited grasp of logic: they do not understand the physical world, lack persistent memory, cannot reason in any meaningful sense of the term, and cannot plan hierarchically.
LeCun considers large language models such as ChatGPT "intrinsically unsafe" because the accuracy of their responses depends entirely on the accuracy of the data they are trained on. In his view, this reliance on human-supplied data also limits how far the models can evolve.
He suggests that what may appear as reasoning is essentially the models exploiting the vast amount of knowledge accumulated from their training data. Despite these limitations, LeCun acknowledges the significant utility of large language models like ChatGPT and Gemini.
To pursue human-level intelligence, LeCun explained, Meta's Fundamental AI Research (FAIR) lab, a team of about 500 people, is developing a new AI system. The system aims to acquire common sense and learn about the world around us through an approach called "world modeling". The approach carries some risk for Meta, however, as investors typically expect quick returns on their AI investments. LeCun maintains that developing Artificial General Intelligence (AGI) is not just a matter of design and technology but a scientific endeavor. Mark Zuckerberg recently announced plans to invest more heavily in AI and to make Meta the top AI company globally; following the announcement, the social media company's market value dropped by around $200 billion.