The journey towards Artificial General Intelligence (AGI) is a complex and multidimensional one. Reflecting on ideas I first sketched out during a flight to the UK in 2000, I find myself contemplating the evolution of AI and its trajectory towards AGI. In this exploration, I simplify certain technical aspects for clarity.

Photo by Sam Barber on Unsplash

Data Storage: Mimicking the Human Brain

The first hurdle in AGI development, as I envisaged two decades ago, lies in creating a storage medium that behaves more like the human brain. Traditional database storage systems rely on allocated space with a reference point and an index. This method works well when storage is limited and retrieval is simple, but it falls short of the freeform access and learning required for AGI.

Transition to NoSQL and Vector Databases

In pursuit of this, the tech world moved towards NoSQL and key/value database storage, providing greater flexibility and scalability than relational databases. Yet the values these systems stored were still scalar in nature. Enter the era of GPT and the rise of vector databases, which resemble the sketches I drew back in 2000. These databases mirror the brain’s functionality more closely, offering a glimpse of a data storage medium suitable for AGI.
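
To make the distinction concrete, here is a minimal sketch in Python. The three-dimensional vectors are toy values invented purely for illustration, standing in for real learned embeddings: a key/value store only answers when the exact key is known, while a vector store answers by closeness in embedding space.

```python
import numpy as np

# A key/value store retrieves by exact key: ask for "dog" and you get
# the "dog" record, ask for anything else and you get nothing at all.
kv_store = {"dog": "a domesticated canine", "sun": "the star at the centre of our solar system"}
print(kv_store.get("puppy"))  # -> None: no exact key, no answer

# A vector store retrieves by closeness in an embedding space, so a
# related query still lands near the right record.
records = {
    "dog": np.array([0.9, 0.1, 0.0]),
    "sun": np.array([0.0, 0.2, 0.9]),
}

def nearest(query_vec):
    """Return the stored key whose vector has the highest cosine similarity."""
    def cosine(a, b):
        return float(a @ b / (np.linalg.norm(a) * np.linalg.norm(b)))
    return max(records, key=lambda k: cosine(records[k], query_vec))

puppy_vec = np.array([0.85, 0.15, 0.05])  # toy embedding for "puppy"
print(nearest(puppy_vec))  # -> "dog": similar meaning, no exact key needed
```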

Pathway Intensity & Weighting – The Brain’s Data Handling

The human brain continually forms new synapses and pathways, linking data elements with varying intensity levels. Through use and practice, these connections strengthen, linking with other memories and pathways. In a simplified view, neurons can be seen as data blobs connected by synapses, forming pathways – akin to knowledge.
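
As a toy illustration of that intensity idea, the sketch below treats each concept as a node and each pathway as a weighted edge that strengthens a little every time it is traversed. The learning rate and the example concepts are arbitrary assumptions, not a model of real neurons.

```python
from collections import defaultdict

class PathwayGraph:
    """Concepts as nodes, pathways as weighted edges that strengthen with use."""

    def __init__(self, learning_rate=0.1):
        self.weights = defaultdict(float)   # (concept_a, concept_b) -> intensity
        self.learning_rate = learning_rate

    def traverse(self, a, b):
        """Using a pathway nudges its intensity towards 1.0."""
        key = tuple(sorted((a, b)))
        self.weights[key] += self.learning_rate * (1.0 - self.weights[key])

    def intensity(self, a, b):
        return self.weights[tuple(sorted((a, b)))]

brain = PathwayGraph()
for _ in range(5):                      # practise the same association five times
    brain.traverse("smell of coffee", "morning routine")
brain.traverse("smell of coffee", "that cafe in Lisbon")   # a weaker, one-off link

print(round(brain.intensity("smell of coffee", "morning routine"), 2))   # ~0.41
print(round(brain.intensity("smell of coffee", "that cafe in Lisbon"), 2))  # 0.1
```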

GPT – The Pre-Trained Transformer

Large Language Models (LLMs) like GPT are essentially collections of pre-existing synapses, guiding context-driven navigation between data elements. They are, quite literally, pre-trained transformers – akin to a boot operating system, equipped with just enough knowledge to get started.
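
A heavily simplified caricature of that "pre-existing synapses" idea: imagine the pre-trained knowledge as a static table of association strengths, and generation as following the strongest connection from the current context. A real transformer computes attention over the whole context rather than looking up a fixed table, and the weights below are invented, but the intuition of pre-trained weights guiding navigation is the same.

```python
# Toy "pre-trained synapses": association strengths learned ahead of time.
# The numbers are made up; a real LLM learns billions of such weights.
synapses = {
    "coffee": {"morning": 0.8, "beans": 0.6, "sleep": 0.1},
    "morning": {"routine": 0.7, "coffee": 0.5, "news": 0.4},
    "routine": {"habit": 0.6, "exercise": 0.5},
}

def navigate(start, steps=3):
    """Follow the strongest pre-existing connection at each step."""
    path, current = [start], start
    for _ in range(steps):
        options = synapses.get(current)
        if not options:
            break
        current = max(options, key=options.get)
        path.append(current)
    return path

print(navigate("coffee"))  # ['coffee', 'morning', 'routine', 'habit']
```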

Vector Database – The Memory Storehouse

Vector databases function as a memory store, with the vectors themselves playing the role of relational synapses. In the AI context, such a store could encompass the entirety of human knowledge.
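
Here is a minimal sketch of such a memory store. The `MemoryStore` class and the facts it holds are my own illustration rather than any particular product's API, and the bag-of-words embedding is a toy stand-in for a real embedding model, chosen only so the example runs without external dependencies.

```python
import numpy as np

class MemoryStore:
    """Sketch of a vector database used as long-term memory."""

    def __init__(self, dim=64):
        self.dim = dim
        self.texts, self.vectors = [], []

    def embed(self, text):
        # Toy embedding: hash each word into one of `dim` buckets, then normalise.
        # A real vector database would use a learned embedding model instead.
        v = np.zeros(self.dim)
        for word in text.lower().split():
            v[hash(word) % self.dim] += 1.0
        norm = np.linalg.norm(v)
        return v / norm if norm else v

    def remember(self, text):
        self.texts.append(text)
        self.vectors.append(self.embed(text))

    def recall(self, query, top_k=1):
        # Retrieval is by similarity, not by exact key, so a question phrased
        # differently from the stored fact can still land on the right memory.
        q = self.embed(query)
        scores = [float(q @ v) for v in self.vectors]
        order = sorted(range(len(scores)), key=scores.__getitem__, reverse=True)
        return [self.texts[i] for i in order[:top_k]]

memory = MemoryStore()
for fact in ["water boils at 100C at sea level",
             "Paris is the capital of France",
             "octopuses have three hearts"]:
    memory.remember(fact)

print(memory.recall("at what temperature does water boil at sea level"))
# -> ['water boils at 100C at sea level']
```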

The Missing Link – Freeform Learning Component

What remains elusive in the quest for AGI is the freeform learning component – the critical-thinking aspect. It requires an engine that drives learning and the pursuit of knowledge, fostering a thirst for discovery while safeguarding against the risks that conformity and accountability are meant to mitigate.

AGI – The Convergence of Elements

Achieving AGI involves creating a system that (a rough sketch follows the list):

  • Has a Thirst for Knowledge and Discovery: A drive similar to human curiosity and exploration.

  • Utilizes LLM and Vector Storage: Harnessing existing knowledge and the ability to navigate it effectively.

  • Employs Critical Thinking and Learning: Exploring through trial and error, akin to the scientific method.
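
Putting those three elements together, here is a rough, hand-wavy sketch of what such an agent's loop might look like. The `llm` function is a hypothetical stand-in for a small pre-trained model, the "experiment" is a placeholder rather than a real test, and the memory object is the `MemoryStore` sketched earlier.

```python
import random

def llm(prompt):
    """Hypothetical stand-in for a small pre-trained language model."""
    return f"(model's hypothesis about: {prompt[:40]}...)"

class CuriousAgent:
    """Curiosity + a pre-trained LLM + vector memory, in one naive loop."""

    def __init__(self, memory):
        self.memory = memory                       # e.g. the MemoryStore above
        self.open_questions = ["why does ice float on water"]

    def step(self):
        # 1. Thirst for knowledge and discovery: pick an open question to pursue.
        question = self.open_questions.pop(0)

        # 2. LLM and vector storage: recall related memories, let the model reason.
        context = self.memory.recall(question, top_k=2)
        hypothesis = llm(f"Context: {context}\nQuestion: {question}")

        # 3. Critical thinking: test the hypothesis, keep what survives, and let
        #    the outcome spawn new questions (trial and error, scientific method).
        experiment_passed = random.random() > 0.5  # placeholder for a real test
        if experiment_passed:
            self.memory.remember(f"{question} -> {hypothesis}")
        else:
            self.open_questions.append(f"revisit: {question}")

agent = CuriousAgent(memory=MemoryStore())         # MemoryStore from the earlier sketch
agent.step()
```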

LLM as Pre-school Learning

LLMs represent the pre-school phase of learning – essential for navigating the world, communicating, and understanding social norms. This foundational knowledge doesn’t need to be as vast as current LLM trends suggest.

Vector Database as a Library

The vector database is akin to a library, a vast store of information accessible for reference and learning.

The Agent as a Learning Child

The agent in this AGI model is comparable to a child, equipped with basic knowledge and skills, and an inherent curiosity to learn and discover.

Beyond Giant LLMs

The expanding size of current LLMs is driven by consumer demand for fast, knowledgeable responses. However, the future likely holds smaller, more focused LLMs that serve as a foundation for accessing and processing data from a larger, more complex knowledge base. This shift will mark a significant step in the journey towards AGI, bridging the gap between where we are and where we need to be.