
FOD#128: Universe of Incredible Models
An insanely saturated week with models you actually need to know about

AI 101: What is Continual Learning?
Can models add new knowledge without wiping out what they already know? We look at why continual learning is becoming important right now and explore the new methods emerging for it, including Google’s Nested Learning and Meta’s Sparse Memory Fine-tuning
State of AI Coding: Context, Trust, and Subagents
Emerging playbook beyond IDE