Memory & Continual Learning
Long-context understanding, persistent memory, RAG systems, and lifelong learning. Giving AI the ability to remember and learn continuously.
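The retrieve-then-generate loop at the heart of RAG can be sketched in a few lines. This is a minimal illustration, not any specific system from the papers below: toy bag-of-words vectors stand in for a real embedding model, and all function names (`embed`, `retrieve`, `build_prompt`) are illustrative.

```python
# Minimal RAG sketch: embed documents, retrieve the most similar ones
# for a query, and prepend them to a prompt for a generator model.
# Toy bag-of-words term-frequency vectors stand in for real embeddings.
import math
from collections import Counter

def embed(text: str) -> Counter:
    """Toy embedding: a bag-of-words term-frequency vector."""
    return Counter(text.lower().split())

def cosine(a: Counter, b: Counter) -> float:
    """Cosine similarity between two sparse term-frequency vectors."""
    dot = sum(a[t] * b[t] for t in a)
    na = math.sqrt(sum(v * v for v in a.values()))
    nb = math.sqrt(sum(v * v for v in b.values()))
    return dot / (na * nb) if na and nb else 0.0

def retrieve(query: str, docs: list[str], k: int = 2) -> list[str]:
    """Return the top-k documents most similar to the query."""
    q = embed(query)
    ranked = sorted(docs, key=lambda d: cosine(q, embed(d)), reverse=True)
    return ranked[:k]

def build_prompt(query: str, docs: list[str]) -> str:
    """Prepend retrieved context to the query; a real system
    would send this augmented prompt to an LLM."""
    context = "\n".join(f"- {d}" for d in retrieve(query, docs))
    return f"Context:\n{context}\n\nQuestion: {query}"

docs = [
    "KV-cache compression reduces memory use in long-context inference.",
    "Rerankers reorder retrieved passages before generation.",
    "Continual learning updates a model without forgetting old tasks.",
]
print(build_prompt("How do rerankers help retrieval?", docs))
```

Production systems replace the bag-of-words step with a learned dense embedding model and an approximate nearest-neighbor index, but the control flow stays the same.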
Recent Papers
LogicPoison: Logical Attacks on Graph Retrieval-Augmented Generation
Yilin Xiao, Jin Chen, Qinggang Zhang +6 more
Optimizing RAG Rerankers with LLM Feedback via Reinforcement Learning
Yuhang Wu, Xiangqing Shen, Fanfan Wang +4 more
Neuro-RIT: Neuron-Guided Instruction Tuning for Robust Retrieval-Augmented Language Model
Jaemin Kim, Jae O Lee, Sumyeong Ahn +1 more
Diffusion Language Models Are Natively Length-Aware
Vittorio Rossi, Giacomo Cirò, Davide Beltrame +3 more
LIT-RAGBench: Benchmarking Generator Capabilities of Large Language Models in Retrieval-Augmented Generation
Koki Itai, Shunichi Hasegawa, Yuta Yamamoto +2 more
FlashPrefill: Instantaneous Pattern Discovery and Thresholding for Ultra-Fast Long-Context Prefilling
Qihang Fan, Huaibo Huang, Zhiying Wu +3 more
MemSkill: Learning and Evolving Memory Skills for Self-Evolving Agents
Haozhen Zhang, Quanyu Long, Jianzhu Bao +4 more
KV-CoRE: Benchmarking Data-Dependent Low-Rank Compressibility of KV-Caches in LLMs
Jian Chen, Zhuoran Wang, Jiayu Qin +6 more
Shared LoRA Subspaces for almost Strict Continual Learning
Prakhar Kaushik, Ankit Vaidya, Shravan Chaudhari +2 more
EuroLLM-22B: Technical Report
Miguel Moura Ramos, Duarte M. Alves, Hippolyte Gisserot-Boukhlef +15 more
Recent Milestones
Gemini 1.5 Pro 2M Context
Google expands Gemini 1.5 Pro to a 2 million token context window.
GPT-4 Turbo 128K Context
OpenAI expands GPT-4 Turbo to 128K tokens with improved retrieval over long documents.
Claude 3 200K Context
Anthropic releases Claude 3 with a 200K context window and improved long-context performance.
Gemini 1.5 Pro 1M Context Window
Google releases Gemini 1.5 Pro with a 1 million token context window, 10x the previous limit.