- Turing Post
The Mysterious AI Reading List: Ilya Sutskever's Recommendations
A List Everyone Talks About, But No One Has Ever Seen
There's a mysterious list of research papers that Ilya Sutskever reportedly gave to John Carmack in 2020. While everyone talks about it, no one has ever seen it. Here’s the story, an update on it, and the purported list →
John Carmack, the renowned game developer, rocket engineer, and VR visionary, shared in an interview that he asked Ilya Sutskever, OpenAI co-founder and former Chief Scientist, for a reading list about AI. Ilya responded with a list of approximately 40 research papers, saying:
If you really learn all of these, you’ll know 90% of what matters today.
This elusive list became a topic of search and discussion, drawing 131 comments in an Ask HN thread. Demand grew to the point that Carmack posted on Twitter, expressing his hope that Ilya would make it public:
I rather expected @ilyasut to have made a public post by now after all the discussion of the AI reading list he gave me. A canonical list of references from a leading figure would be appreciated by many. I would be curious myself about what he would add from the last three years.
— John Carmack (@ID_AA_Carmack)
7:06 PM • Feb 6, 2023
We agree. However, Ilya has yet to publish such a list, leaving us to speculate. Recently, an OpenAI researcher reignited the conversation by claiming to have compiled this list, and the post went viral.
Here’s what was inside (grouped for your convenience):
Core Neural Network Innovations
Recurrent Neural Network Regularization - Applies dropout to LSTM networks in a way that reduces overfitting without disrupting the recurrent connections.
Pointer Networks - An attention-based architecture whose outputs are pointers to positions in the input sequence, suited to problems with discrete token outputs.
Deep Residual Learning for Image Recognition - Makes very deep networks trainable by learning residual functions over identity shortcut connections.
Identity Mappings in Deep Residual Networks - Refines residual networks by using identity mappings in the skip connections, easing signal propagation through the network.
Neural Turing Machines - Couples a neural network with external memory, allowing it to learn simple algorithmic tasks.
Attention Is All You Need - Introduces the Transformer, an architecture built entirely on attention mechanisms.
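To give a flavor of the last entry, here is a minimal sketch of scaled dot-product attention, the core operation behind the Transformer. This is an illustrative NumPy version, not the paper's full multi-head implementation; names and shapes are our own choices.

```python
import numpy as np

def scaled_dot_product_attention(Q, K, V):
    """Scaled dot-product attention, as described in 'Attention Is All You Need'.

    Q, K: (seq_len, d_k) query/key matrices; V: (seq_len, d_v) value matrix.
    Returns a (seq_len, d_v) matrix of attention-weighted values.
    """
    d_k = Q.shape[-1]
    # Similarity of every query to every key, scaled to keep gradients stable.
    scores = Q @ K.T / np.sqrt(d_k)
    # Softmax over the key dimension: each row becomes a probability distribution.
    weights = np.exp(scores - scores.max(axis=-1, keepdims=True))
    weights /= weights.sum(axis=-1, keepdims=True)
    # Each output is a weighted average of the value vectors.
    return weights @ V

# Tiny self-attention example: 3 tokens with 4-dimensional embeddings.
rng = np.random.default_rng(0)
x = rng.normal(size=(3, 4))
out = scaled_dot_product_attention(x, x, x)
print(out.shape)  # (3, 4)
```

In the full Transformer, Q, K, and V are learned linear projections of the input, and several such attention "heads" run in parallel, but the computation above is the heart of it.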