Memory & Continual Learning
Long-context understanding, persistent memory, retrieval-augmented generation (RAG), and lifelong learning: giving AI systems the ability to remember and to keep learning over time.
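The pattern underlying most RAG-style persistent memory is a simple write/retrieve/augment loop: store past information, retrieve the entries most relevant to the current query, and prepend them to the model's prompt. The sketch below illustrates that loop; the MemoryStore class and the toy bag-of-words "embedding" are illustrative assumptions, not drawn from any paper listed here.

```python
# Minimal sketch of the write/retrieve/augment loop behind RAG-style
# persistent memory. Everything here (MemoryStore, the toy bag-of-words
# "embedding") is illustrative, not taken from any specific system.
import math
import re
from collections import Counter

def embed(text: str) -> Counter:
    # Toy bag-of-words vector; real systems use a learned text encoder.
    return Counter(re.findall(r"[a-z0-9]+", text.lower()))

def cosine(a: Counter, b: Counter) -> float:
    dot = sum(a[t] * b[t] for t in a)
    na = math.sqrt(sum(v * v for v in a.values()))
    nb = math.sqrt(sum(v * v for v in b.values()))
    return dot / (na * nb) if na and nb else 0.0

class MemoryStore:
    """Append-only memory persisted across turns, searched by similarity."""

    def __init__(self) -> None:
        self.items: list[tuple[Counter, str]] = []

    def write(self, text: str) -> None:
        # Persist a memory so it survives beyond the current context window.
        self.items.append((embed(text), text))

    def retrieve(self, query: str, k: int = 2) -> list[str]:
        # Rank stored memories by similarity to the query; return the top k.
        q = embed(query)
        ranked = sorted(self.items, key=lambda item: cosine(q, item[0]), reverse=True)
        return [text for _, text in ranked[:k]]

memory = MemoryStore()
memory.write("User prefers concise answers.")
memory.write("Project deadline is Friday.")

question = "When is the deadline?"
retrieved = memory.retrieve(question, k=1)
prompt = "Relevant memories:\n" + "\n".join(retrieved) + f"\n\nQuestion: {question}"
print(prompt)  # retrieved memories are prepended to the model prompt
```

Production systems replace the toy embedding with a neural encoder and an approximate nearest-neighbor index, and add policies for what to write, summarize, or forget, but the loop itself stays the same.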
Recent Papers
Improving Multi-step RAG with Hypergraph-based Memory for Long-Context Complex Relational Modeling
Chulun Zhou, Chunkang Zhang, Guoxin Yu +4 more
Fast-weight Product Key Memory
Tianyu Zhao, Llion Jones
ReFusion: A Diffusion Large Language Model with Parallel Autoregressive Decoding
Jia-Nan Li, Jian Guan, Wei Wu +1 more
Memory in the Age of AI Agents
Yuyang Hu, Shichun Liu, Yanwei Yue +2 more
QwenLong-L1.5: Post-Training Recipe for Long-Context Reasoning and Memory Management
Weizhou Shen, Ziyi Yang, Chenliang Li +2 more
Error-Free Linear Attention is a Free Lunch: Exact Solution from Continuous-Time Dynamics
Jingdi Lei, Di Zhang, Soujanya Poria
Confucius Code Agent: An Open-sourced AI Software Engineer at Industrial Scale
Zhaodong Wang, Zhenting Qi, Sherman Wong +11 more
AgentProg: Empowering Long-Horizon GUI Agents with Program-Guided Context Management
Shizuo Tian, Hao Wen, Yuxuan Chen +6 more
Recent Milestones
Gemini 1.5 Pro 2M Context
Google expands Gemini 1.5 Pro to a 2 million token context window.
Claude 3 200K Context
Anthropic releases the Claude 3 family with a 200K token context window and improved long-context performance.
Gemini 1.5 Pro 1M Context
Google releases Gemini 1.5 Pro with a 1 million token context window, roughly 5x the largest windows then commercially available.
GPT-4 Turbo 128K Context
OpenAI releases GPT-4 Turbo with a 128K token context window and improved retrieval over long documents.