Show HN: AI memory with biological decay (52% recall)
Source: Hacker News
Overview
Most RAG setups fail because they treat memory like a static filing cabinet. When every transient bug fix or abandoned rule is stored forever, the context window eventually chokes on noise, spiking token costs and degrading the agent’s reasoning.
Implementation
This implementation takes a biological approach, using the Ebbinghaus forgetting curve to manage context as a living substrate. Each memory is assigned a strength score; every recall reinforces it and flattens its decay curve (spaced repetition), while unused memories eventually fall below a threshold and are pruned.
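The decay-and-reinforce loop described above can be sketched roughly as follows. This is a minimal illustration, not the repository's actual code: the `Memory` class, the `stability` field, the reinforcement factor, and the prune threshold are all assumed names and values, following the textbook Ebbinghaus form R = exp(-t / S).

```python
import math
import time

PRUNE_THRESHOLD = 0.1  # illustrative cutoff: below this retention, forget the memory

class Memory:
    def __init__(self, content, stability=1.0):
        self.content = content
        self.stability = stability      # higher stability -> flatter decay curve
        self.last_recall = time.time()

    def retention(self, now=None):
        """Ebbinghaus curve: R = exp(-t / S), t = seconds since last recall."""
        now = time.time() if now is None else now
        elapsed = now - self.last_recall
        return math.exp(-elapsed / self.stability)

    def recall(self, now=None):
        """Each recall resets the clock and raises stability (spaced repetition)."""
        now = time.time() if now is None else now
        self.stability *= 1.5           # illustrative reinforcement factor
        self.last_recall = now
        return self.content

def prune(memories, now=None):
    """Keep only memories whose retention is still above the threshold."""
    return [m for m in memories if m.retention(now) >= PRUNE_THRESHOLD]
```

The key property is that frequently recalled memories accumulate stability and so decay ever more slowly, while a transient bug fix that is never recalled again drops below the threshold and is pruned.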
To solve the “logical neighbor” problem where semantic search misses relevant but non‑similar nodes, a graph layer is added over the vector store.
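One way the graph layer could work, sketched under assumptions (the toy in-memory vector store, edge list, and function names below are hypothetical, not taken from the repository): run a top-k cosine similarity search, then expand each hit one hop along explicit edges so that logically linked but semantically dissimilar memories are also returned.

```python
import math

def cosine(a, b):
    """Cosine similarity between two equal-length vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    na = math.sqrt(sum(x * x for x in a))
    nb = math.sqrt(sum(y * y for y in b))
    return dot / (na * nb)

def retrieve(query_vec, embeddings, edges, k=2):
    """Vector top-k, then a one-hop graph expansion of each hit.

    embeddings: {memory_id: vector}
    edges:      {memory_id: [logically linked memory_ids]}
    """
    ranked = sorted(embeddings,
                    key=lambda mid: cosine(query_vec, embeddings[mid]),
                    reverse=True)
    hits = ranked[:k]
    expanded = set(hits)
    for mid in hits:
        expanded.update(edges.get(mid, []))  # pull in "logical neighbors"
    return expanded
```

For example, a memory about a database migration might be explicitly linked to a config change it caused; even if the two embed far apart, the edge carries the config memory into the result set.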
Results
Benchmarked against the LoCoMo dataset, this approach reached 52% Recall@5, nearly double the accuracy of stateless vector stores, while cutting token waste by roughly 84%.
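For reference, one common definition of Recall@5 counts a query as a hit if any gold memory appears in the top 5 retrieved results. The benchmark's exact scoring may differ; this is only a sketch of the metric:

```python
def recall_at_k(retrieved, relevant, k=5):
    """Fraction of queries with at least one relevant item in the top-k.

    retrieved: list of ranked result lists, one per query
    relevant:  list of gold-answer lists, one per query
    """
    hits = sum(1 for ret, gold in zip(retrieved, relevant)
               if set(ret[:k]) & set(gold))
    return hits / len(relevant)
```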
Repository
- GitHub:
Discussion
- Comments:
- Points: 31
- Comments: 10