Resident Evil Star Develops AI Memory System with Claude, Achieves Perfect Score on LongMemEval Benchmark

According to 1M AI News, Hollywood actress Milla Jovovich (known for 'The Fifth Element' and the 'Resident Evil' series) has co-developed an open-source AI memory system called MemPalace with Ben Sigman, the Bitcoin entrepreneur and founder of the decentralized lending platform Libre. Released on GitHub under the MIT license, the project garnered 5,500 stars within three days. Sigman said the pair spent months building it with Anthropic's Claude, and Claude Opus 4.6 is listed as a co-author in the Git commit history.

MemPalace's core selling point is its benchmark performance. On LongMemEval, an industry-standard memory retrieval benchmark, it achieved a Recall@5 of 96.6% using pure local retrieval, without calling any external APIs; with optional reranking by the Haiku model enabled, it scored a perfect 500 out of 500 questions, which the project team claims is the highest score ever recorded on the benchmark by any product, free or paid. On ConvoMem it scored 92.9%, which the team says is more than double the score of the AI memory product Mem0, and on LoCoMo it achieved perfect scores across all multi-hop reasoning categories. The benchmark code is published in the repository for reproducibility.

Unlike common vector-database solutions, MemPalace organizes information using the 'memory palace' (method of loci), a mnemonic technique attributed to ancient Greek orators. The system structures a user's conversation history into a four-tier hierarchy: Wing (one per person or project) → Room (a specific topic) → Closet (compressed summaries) → Drawer (verbatim conversation records). Related rooms within the same wing are interconnected through 'Halls,' and different wings are cross-referenced via 'Tunnels.' The project team's testing showed that this structure alone improved retrieval accuracy by 34%.
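The four-tier hierarchy described above can be sketched in plain Python. The class and field names below are illustrative assumptions based on the article's description, not MemPalace's actual data model.

```python
from dataclasses import dataclass, field

@dataclass
class Drawer:
    """Verbatim conversation records (hypothetical sketch)."""
    transcript: list = field(default_factory=list)

@dataclass
class Closet:
    """A compressed summary, backed by a drawer of raw transcripts."""
    summary: str = ""
    drawer: Drawer = field(default_factory=Drawer)

@dataclass
class Room:
    """A specific topic within a wing."""
    topic: str = ""
    closets: list = field(default_factory=list)
    halls: list = field(default_factory=list)    # links to other rooms in the same wing
    tunnels: list = field(default_factory=list)  # cross-references into other wings

@dataclass
class Wing:
    """Top-level partition, one per person or project."""
    name: str = ""
    rooms: dict = field(default_factory=dict)

# Example: file a conversation about deployment under a project wing.
wing = Wing(name="project:mempalace")
room = Room(topic="deployment")
room.closets.append(Closet(summary="Discussed Docker setup",
                           drawer=Drawer(transcript=["User: how do I deploy?"])))
wing.rooms["deployment"] = room
```

Retrieval would walk this tree from wing to room to closet, only opening a drawer (the verbatim record) when the compressed summary is insufficient.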
The project also defines AAAK, a lossless compression dialect designed specifically for AI agents, which condenses a user's context from thousands of tokens down to about 120, a compression ratio of roughly 30x. AAAK is purely structured text: it needs no special decoder or fine-tuning and can be read directly by any large language model. The system also has built-in contradiction detection, catching inconsistencies in names, pronouns, ages, and other facts before output.

The entire system runs locally, relies on no cloud services, requires no API keys, and is free of charge. It integrates with tools such as Claude, ChatGPT, and Cursor via the MCP protocol (exposing 19 MCP tools) and can also generate context summaries from the command line for local models such as Llama and Mistral.

Jovovich's crossover into the tech world has surprised many. The repository is registered under her GitHub account, and 4 of its 7 commits are hers, including the initial commit containing all the core code. She posted an introductory video about the project on Instagram.
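The contradiction-detection idea can be illustrated with a minimal sketch: compare each newly extracted (entity, attribute, value) fact against what memory already holds and flag conflicts. Everything here, from the triple format to the function name, is a hypothetical illustration, not the project's actual implementation.

```python
def find_contradictions(facts):
    """Flag entities for which the same attribute has conflicting values.

    Each fact is an (entity, attribute, value) triple extracted from memory,
    e.g. ("Alice", "age", "34"). Hypothetical sketch, not MemPalace code.
    """
    seen = {}       # (entity, attribute) -> first value recorded
    conflicts = []
    for entity, attr, value in facts:
        key = (entity.lower(), attr.lower())
        if key in seen and seen[key] != value:
            conflicts.append(f"{entity}.{attr}: '{seen[key]}' vs '{value}'")
        else:
            seen.setdefault(key, value)
    return conflicts

facts = [
    ("Alice", "age", "34"),
    ("Alice", "pronouns", "she/her"),
    ("Alice", "age", "29"),  # conflicts with the earlier age
]
print(find_contradictions(facts))  # → ["Alice.age: '34' vs '29'"]
```

A real system would extract these triples from conversation text with an LLM or parser; the check itself stays this simple.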
