A roadmap for AI: Past, Present and Future (Part 2): Fixed vs Flexible, Memory Soup vs Hierarchy
In this session, we talk about the future of AI systems!
Recap of last session (past and present AI systems): Expert Knowledge Systems (rules elicited from experts), Supervised Learning (human-labelled data), Unsupervised/Self-Supervised Learning (learning without human labels), and Foundation Models (learning from broad data to set a baseline for performance).
Moving to the Future:
Memory: We will discuss how memory can be the next wave of improvement for AI systems (this also includes grounding Large Language Models with Knowledge Graphs).
Hierarchy or memory soup: We will talk about hierarchy and whether it is the essential ingredient for better representation and planning, or whether it could just be a "memory soup": a mix of abstractions, with pattern matching used to find the right one (see the retrieval sketch after this list). This bears some resemblance to the "Thousand Brains Theory of Intelligence" from Numenta.
Multi-agent: The last part covers agents, multiple systems in one agent, multiple agents in an ecosystem, and finally, multiple ecosystems.
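Below is a minimal Python sketch of the "memory soup" idea, written as my own illustration rather than code from the video: abstractions are stored as embedding vectors in a flat memory, and cosine-similarity pattern matching picks the best-fitting one, with no hierarchy imposed. The class name MemorySoup and the toy vectors are assumptions for illustration.

# Minimal sketch (illustrative assumptions): a flat store of abstraction
# embeddings, retrieved by cosine-similarity pattern matching.
import numpy as np

class MemorySoup:
    def __init__(self):
        self.keys = []    # embedding vectors for stored abstractions
        self.values = []  # the abstractions themselves (any payload)

    def add(self, embedding, abstraction):
        self.keys.append(np.asarray(embedding, dtype=float))
        self.values.append(abstraction)

    def retrieve(self, query, top_k=1):
        # Cosine similarity between the query and every stored key.
        q = np.asarray(query, dtype=float)
        keys = np.stack(self.keys)
        sims = keys @ q / (np.linalg.norm(keys, axis=1) * np.linalg.norm(q) + 1e-9)
        best = np.argsort(-sims)[:top_k]
        return [(self.values[i], float(sims[i])) for i in best]

# Usage: no hierarchy is imposed; the best-matching abstraction simply wins.
soup = MemorySoup()
soup.add([1.0, 0.0, 0.2], "grasp-object routine")
soup.add([0.1, 0.9, 0.0], "navigate-to-goal routine")
print(soup.retrieve([0.9, 0.1, 0.1]))  # grasp-object routine scores highest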
The future is unknown, but what is sure is that technological improvement will lead us to something which is far more advanced than the current state of the art.
My research seeks to build fast and adaptable agents, and I believe this will be key to the future of AI systems.
~~~~~~~~~~~~~~~~~~~
Part 1 here: https://www.youtube.com/watch?v=VP4DDdUsGws
References:
Fixed vs Flexible:
BERT: https://arxiv.org/abs/1810.04805
Memory:
Knowledge Graphs and LLMs: https://arxiv.org/abs/2306.08302
Learning, Fast and Slow (my work): https://www.youtube.com/watch?v=DSVFA7nmwHQ
Hierarchy in the brain: https://www.nature.com/articles/s41562-022-01516-2
Part-whole representations (Geoffrey Hinton): https://arxiv.org/abs/2102.12627
A path towards autonomous machine intelligence (Yann LeCun): https://openreview.net/pdf?id=BZ5a1r-kVsf
Thousand Brains Theory (Jeff Hawkins): https://www.numenta.com/technology/research/thousand-brains-theory/
Incomplete Hash Map (doing cosine similarity on a subset of the embedding vector; see the sketch after the references): https://www.youtube.com/watch?v=q9uMEAcB3lM
Shuchen's idea of chunking to form hierarchies: https://www.youtube.com/watch?v=Y-goByHfsoo
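As a rough illustration of the Incomplete Hash Map reference above (a hedged sketch under my own assumptions, not the referenced implementation): cosine similarity is computed only over a chosen subset of the embedding dimensions, so a partial cue can still retrieve a matching memory.

# Illustrative sketch: cosine similarity restricted to a subset of dimensions.
import numpy as np

def partial_cosine_similarity(a, b, dims):
    # Compare vectors a and b using only the selected dimensions.
    a_sub, b_sub = np.asarray(a, dtype=float)[dims], np.asarray(b, dtype=float)[dims]
    denom = np.linalg.norm(a_sub) * np.linalg.norm(b_sub) + 1e-9
    return float(a_sub @ b_sub / denom)

query = np.array([0.9, 0.1, 0.4, 0.0])   # partial cue
memory = np.array([1.0, 0.0, 0.5, 0.8])  # stored embedding
dims = [0, 2]                            # only attend to a subset of the vector
print(partial_cosine_similarity(query, memory, dims))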
~~~~~~~~~~~~~~~~~~~
0:00 Introduction
2:26 Merging Fixed + Flexible
10:49 Memory for Fast Learning
20:55 Chain of Memories - A New Kind of Knowledge Graph
50:22 Learning, Fast and Slow - Neural Network + Memory
53:27 Hierarchical Prediction
1:11:00 Memory Soup and Thousand Brains Theory
1:35:00 Teaser for Next Week - Multi-Agent Systems
1:35:20 Discussion
~~~~~~~~~~~~~~~~
AI and ML enthusiast. I like to think about the essence behind AI breakthroughs and explain it in a simple and relatable way. I am also an avid game creator.
Discord: https://discord.gg/bzp87AHJy5
LinkedIn: https://www.linkedin.com/in/chong-min-tan-94652288/
Online AI blog: https://delvingintotech.wordpress.com/
Twitter: https://twitter.com/johntanchongmin
Try out my games here: https://simmer.io/@chongmin