Nemori
by Nemori AI (independent)
System Card
Organization: Nemori AI (independent)
Released: 2025-08
Architecture: episodic-buffer / event-segmentation-aligned episodic + semantic memory
Details: A two-step alignment principle drawn from Event Segmentation Theory segments raw conversations into semantically meaningful episodes. A Predict-Calibrate principle enables proactive learning from prediction gaps. The system produces episodic and semantic memory with a unified search surface.
Parameters: —
Domain: agent-memory, episodic-session
Open Source: Yes
Paper: arXiv:2508.03341
Code: Repository
Tags: event-segmentation, cognitive-science, locomo, longmemeval
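The two principles named in the card can be sketched in miniature. The snippet below is a hedged illustration, not Nemori's actual implementation: it uses Jaccard token overlap as a cheap stand-in for semantic similarity, and the function names, thresholds, and `predict` callback are all assumptions introduced for this sketch.

```python
# Illustrative sketch only. (1) Segment a raw turn stream into episodes at
# semantic boundaries, echoing Event Segmentation Theory. (2) Turn large
# prediction gaps into stored semantic memory, echoing Predict-Calibrate.
# Similarity metric, thresholds, and names are assumptions, not the paper's.

def _tokens(text):
    return set(text.lower().split())

def _similarity(a, b):
    """Jaccard token overlap as a stand-in for semantic similarity."""
    ta, tb = _tokens(a), _tokens(b)
    return len(ta & tb) / len(ta | tb) if ta | tb else 0.0

def segment_episodes(turns, boundary_threshold=0.1):
    """Cut the turn stream where similarity to the running episode drops."""
    episodes, current = [], []
    for turn in turns:
        if current and _similarity(" ".join(current), turn) < boundary_threshold:
            episodes.append(current)  # semantic boundary: close the episode
            current = []
        current.append(turn)
    if current:
        episodes.append(current)
    return episodes

def predict_calibrate(episode, predict, gap_threshold=0.5):
    """Compare a prediction of the final turn against what actually happened;
    a large gap is treated as something worth writing to semantic memory."""
    context, actual = episode[:-1], episode[-1]
    gap = 1.0 - _similarity(predict(context), actual)
    if gap > gap_threshold:
        return {"context": context, "learned": actual, "surprise": gap}
    return None
```

With four toy turns (two about a hiking trip, two about a failed deploy), `segment_episodes` yields two episodes, and a deliberately bad `predict` callback makes `predict_calibrate` surface the final turn as a learned fact.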
Capability Profile
Benchmark Scores
6 of 14 benchmarks covered

- Long-Context Retrieval: 0/5 (no data: RULER, NIAH, LooGLE, LongBench, ∞Bench)
- Multi-Turn Recall: 2/2
- Cross-Session Memory: 1/1
- Multi-Hop QA: 2/3
- Agent Task Memory: 1/1
- Personalization: 0/1 (no data: PerLTQA)
- Factuality / Grounding: 0/1 (no data: RAGAS)
Sources:
- arXiv:2508.03341 results table: LLM-judge score with gpt-4.1-mini backbone; the gpt-4o-mini variant scored 74.4
- arXiv:2508.03341 results table: LongMemEval-S accuracy with gpt-4.1-mini; uses 95-96% less context than the full-context baseline
- Nemori paper (arXiv:2508.03341); evaluated on AgentBench Memory Track (Tsinghua KEG, 2308)
- Nemori paper (arXiv:2508.03341); evaluated on HotpotQA: A Dataset for Diverse, Explainable Multi-hop Question Answering (Stanford / CMU, 1809)
- Nemori paper (arXiv:2508.03341); evaluated on MemoryBank: Enhancing LLMs with Long-Term Memory (Sun Yat-sen University, 2305)
- Nemori paper (arXiv:2508.03341); evaluated on MultiHop-RAG: Benchmarking Retrieval-Augmented Generation for Multi-Hop Queries (HKUST, 2401)