Beyond A*: Better Planning with Transformers via Search Dynamics Bootstrapping (Searchformer)

Subscribers: 284,000
Video Link: https://www.youtube.com/watch?v=PW4JiJ-WaY4
Duration: 44:04
Views: 33,866


Paper: https://arxiv.org/abs/2402.14083

Abstract:
While Transformers have enabled tremendous progress in various application settings, such architectures still lag behind traditional symbolic planners for solving complex decision making tasks. In this work, we demonstrate how to train Transformers to solve complex planning tasks and present Searchformer, a Transformer model that optimally solves previously unseen Sokoban puzzles 93.7% of the time, while using up to 26.8% fewer search steps than standard A∗ search. Searchformer is an encoder-decoder Transformer model trained to predict the search dynamics of A∗. This model is then fine-tuned via expert iterations to perform fewer search steps than A∗ search while still generating an optimal plan. In our training method, A∗'s search dynamics are expressed as a token sequence outlining when task states are added and removed into the search tree during symbolic planning. In our ablation studies on maze navigation, we find that Searchformer significantly outperforms baselines that predict the optimal plan directly with a 5-10× smaller model size and a 10× smaller training dataset. We also demonstrate how Searchformer scales to larger and more complex decision making tasks like Sokoban with improved percentage of solved tasks and shortened search dynamics.
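The core idea described in the abstract, serializing A∗'s search dynamics as a token sequence that records when states enter and leave the search tree and then training a Transformer on those traces, can be illustrated with a short sketch. The snippet below runs plain A* on a small grid maze and logs an execution trace alongside the final plan; the function name astar_with_trace and the create/close/plan token vocabulary are illustrative choices for this sketch, not the paper's exact trace format.

```python
import heapq

def astar_with_trace(start, goal, walls, width, height):
    """Plain A* on a 4-connected grid that also emits a token trace of its
    search dynamics: a 'create' token when a state enters the frontier and a
    'close' token when it is expanded. Token vocabulary is illustrative."""
    def h(p):  # Manhattan-distance heuristic to the goal
        return abs(p[0] - goal[0]) + abs(p[1] - goal[1])

    trace = ["create", *map(str, start), "0", str(h(start))]
    frontier = [(h(start), 0, start)]          # entries are (f = g + h, g, state)
    came_from, g_cost, closed = {}, {start: 0}, set()

    while frontier:
        _, g, node = heapq.heappop(frontier)
        if node in closed:
            continue                            # skip stale frontier entries
        closed.add(node)
        trace += ["close", *map(str, node), str(g), str(h(node))]
        if node == goal:
            break
        x, y = node
        for nxt in ((x + 1, y), (x - 1, y), (x, y + 1), (x, y - 1)):
            if nxt in walls or not (0 <= nxt[0] < width and 0 <= nxt[1] < height):
                continue
            if g + 1 < g_cost.get(nxt, float("inf")):
                g_cost[nxt] = g + 1
                came_from[nxt] = node
                heapq.heappush(frontier, (g + 1 + h(nxt), g + 1, nxt))
                trace += ["create", *map(str, nxt), str(g + 1), str(h(nxt))]

    # Append the optimal plan as 'plan' tokens (the "response" part of a training sequence).
    if goal in closed:
        plan, node = [], goal
        while node != start:
            plan.append(node)
            node = came_from[node]
        plan.append(start)
        for p in reversed(plan):
            trace += ["plan", *map(str, p)]
    return trace

# Example: a 3x3 maze with one wall; the printed sequence interleaves
# search dynamics (create/close) with the final plan.
print(" ".join(astar_with_trace((0, 0), (2, 2), {(1, 1)}, 3, 3)))
```

In the paper's setup, sequences like this (prompt: task description; response: search trace plus plan) are what the encoder-decoder Transformer is trained on, and expert iteration then filters for shorter traces that still end in an optimal plan.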

Authors: Lucas Lehnert, Sainbayar Sukhbaatar, Paul Mcvay, Michael Rabbat, Yuandong Tian

Links:
Homepage: https://ykilcher.com
Merch: https://ykilcher.com/merch
YouTube: https://www.youtube.com/c/yannickilcher
Twitter: https://twitter.com/ykilcher
Discord: https://ykilcher.com/discord
LinkedIn: https://www.linkedin.com/in/ykilcher

If you want to support me, the best thing to do is to share out the content :)

If you want to support me financially (completely optional and voluntary, but a lot of people have asked for this):
SubscribeStar: https://www.subscribestar.com/yannickilcher
Patreon: https://www.patreon.com/yannickilcher
Bitcoin (BTC): bc1q49lsw3q325tr58ygf8sudx2dqfguclvngvy2cq
Ethereum (ETH): 0x7ad3513E3B8f66799f507Aa7874b1B0eBC7F85e2
Litecoin (LTC): LQW2TRyKYetVC8WjFkhpPhtpbDM4Vw7r9m
Monero (XMR): 4ACL8AGrEo5hAir8A9CeVrW8pEauWvnp1WnSDZxW7tziCDLhZAGsgzhRQABDnFy8yuM9fWJDviJPHKRjV4FWt19CJZN9D4n




Other Videos By Yannic Kilcher


2024-05-01 ORPO: Monolithic Preference Optimization without Reference Model (Paper Explained)
2024-04-30 [ML News] Chips, Robots, and Models
2024-04-28 TransformerFAM: Feedback attention is working memory
2024-04-27 [ML News] Devin exposed | NeurIPS track for high school students
2024-04-24 Leave No Context Behind: Efficient Infinite Context Transformers with Infini-attention
2024-04-23 [ML News] Llama 3 changes the game
2024-04-17 Hugging Face got hacked
2024-04-15 [ML News] Microsoft to spend 100 BILLION DOLLARS on supercomputer (& more industry news)
2024-04-13 [ML News] Jamba, CMD-R+, and other new models (yes, I know this is like a week behind 🙃)
2024-04-08 Flow Matching for Generative Modeling (Paper Explained)
2024-04-06 Beyond A*: Better Planning with Transformers via Search Dynamics Bootstrapping (Searchformer)
2024-03-26 [ML News] Grok-1 open-sourced | Nvidia GTC | OpenAI leaks model names | AI Act
2024-03-17 [ML News] Devin AI Software Engineer | GPT-4.5-Turbo LEAKED | US Gov't Report: Total Extinction
2024-03-10 [ML News] Elon sues OpenAI | Mistral Large | More Gemini Drama
2024-03-07 On Claude 3
2024-03-05 No, Anthropic's Claude 3 is NOT sentient
2024-03-01 [ML News] Groq, Gemma, Sora, Gemini, and Air Canada's chatbot troubles
2024-02-22 Gemini has a Diversity Problem
2024-02-19 V-JEPA: Revisiting Feature Prediction for Learning Visual Representations from Video (Explained)
2024-02-18 What a day in AI! (Sora, Gemini 1.5, V-JEPA, and lots of news)
2024-02-04 Lumiere: A Space-Time Diffusion Model for Video Generation (Paper Explained)



Tags:
deep learning
machine learning
arxiv
explained
neural networks
ai
artificial intelligence
paper