# Memori for JavaScript & TypeScript

memori-js is an active memory layer that lives inside your application, automatically managing context for your AI agents.

ZERO-CONFIG · TYPE-SAFE · LOCAL-FIRST
```shell
npm install memori-js
```

An interactive side-by-side retrieval demo compares the two approaches:

**Keyword / Basic SQL (no index):** matches the stopword "what" in an unrelated record (#8492) at 1.5%, then falls over entirely ("System Error: Index out of sync", 0.5% match).

**Vector Similarity (cosine, HNSW / IVFFlat via sqlite-vec):** ranks results by semantic relevance:

| Result | Relevance |
| --- | --- |
| The speed of light is approximately 299,792 km/s. | 0.252 |
| Photosynthesis converts light into chemical energy. | -0.071 |
| The Great Wall of China is visible from space (mostly a myth). | -0.095 |
| The Earth is the third planet from the Sun. | -0.100 |
| DNA carries genetic instructions for life. | -0.110 |
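The relevance scores above come from cosine similarity between embedding vectors. As an illustration only (this is not part of the memori-js API, which delegates scoring to sqlite-vec), a minimal implementation looks like:

```typescript
// Cosine similarity: dot(a, b) / (|a| * |b|).
// Returns 1 for vectors pointing the same way, 0 for orthogonal vectors.
function cosineSimilarity(a: number[], b: number[]): number {
  let dot = 0;
  let normA = 0;
  let normB = 0;
  for (let i = 0; i < a.length; i++) {
    dot += a[i] * b[i];
    normA += a[i] * a[i];
    normB += b[i] * b[i];
  }
  return dot / (Math.sqrt(normA) * Math.sqrt(normB));
}

console.log(cosineSimilarity([1, 0], [2, 0])); // → 1 (same direction)
console.log(cosineSimilarity([1, 0], [0, 3])); // → 0 (orthogonal)
```

Because the score depends only on direction, not magnitude, it ranks "how related" two embeddings are regardless of vector length, which is why unrelated facts land near or below zero in the demo.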
Stop building complex RAG pipelines. Let Memori handle context injection automatically.
Explore the primitives that power stateful agents.
Memori lives inside your app, actively managing context for your agents.
```typescript
// Auto-injects relevant context
memori.llm.register(openai);
await openai.chat.completions.create({...})
```

Works seamlessly with OpenAI, Google GenAI, and Anthropic. One line to register, zero config to maintain.
```typescript
import OpenAI from "openai";
import { Memori } from "memori-js";

const client = new OpenAI({ apiKey: process.env.OPENAI_API_KEY });
const memori = new Memori({ googleApiKey: process.env.GOOGLE_API_KEY });

// Register for auto-augmentation
memori.llm.register(client, "openai");

// Now, every call is memory-augmented!
const response = await client.chat.completions.create({
  model: "gpt-4",
  messages: [{ role: "user", content: "What is my favorite color?" }],
});
```

> "Memory should be invisible."
Most "Memory" libraries are just complex wrappers around vector stores. Memori-JS takes a different approach: as a developer, you shouldn't care how the relevant context is found, only that your agent has it.
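To make the "invisible" idea concrete, here is a sketch of how a memory layer can wrap a client's `create()` call so context is injected before the request goes out. The names `retrieveContext` and `fakeClient` are hypothetical stand-ins for illustration; this is not the actual memori-js implementation.

```typescript
type Message = { role: string; content: string };

// Hypothetical memory lookup; a real layer would query an embedding index.
function retrieveContext(messages: Message[]): string {
  return "User's favorite color is blue."; // stand-in for a real retrieval hit
}

// Minimal stand-in for an LLM client; it just echoes the messages it receives.
const fakeClient = {
  chat: {
    completions: {
      create: async (opts: { messages: Message[] }) => ({
        echoed: opts.messages,
      }),
    },
  },
};

// Wrap create() so every call gets retrieved memory prepended as a system message.
function register(client: typeof fakeClient): void {
  const original = client.chat.completions.create;
  client.chat.completions.create = async (opts) => {
    const context = retrieveContext(opts.messages);
    return original({
      messages: [{ role: "system", content: context }, ...opts.messages],
    });
  };
}

register(fakeClient);
fakeClient.chat.completions
  .create({ messages: [{ role: "user", content: "What is my favorite color?" }] })
  .then((res) => console.log(res.echoed.length)); // → 2 (system + user)
```

The caller's code never changes: it still calls `create()` with one user message, but the wrapped client quietly ships two. That monkey-patching pattern is what lets a single `register()` call augment every subsequent request with zero configuration.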