
v1.1.1 is live

Give your AI
Unforgettable Memory.

The missing SQL-native layer for intelligent agents. No vector DB complexity. Just pure recall.

Start Building
TERMINAL
Input
import { Memori } from "memori-js";

// 1. Initialize
const memori = new Memori();

// 2. Add Memory
await memori.store({
  content: "User prefers dark mode",
  tags: ["ui"]
});

// 3. Recall
const context = await memori.retrieve(
  "What do they like?"
);
Output

The Pain of Vector DBs

Building memory for AI agents usually requires gluing together multiple complex systems:

Set up a vector DB
Chunk/embed manually
Query the DB
Inject context
Call the LLM
Save conversation
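The six manual steps above can be sketched end-to-end. Everything here (the toy `embed` hash, the in-memory store, the prompt format) is a stand-in for illustration, not a real embedding model or vector DB:

```typescript
// Hypothetical sketch of the manual RAG pipeline described above.
type Doc = { text: string; vector: number[] };

// 1-2. "Embed" by hashing words into a tiny bag-of-words vector
// (a stand-in for a real embedding model).
function embed(text: string): number[] {
  const v = new Array(8).fill(0);
  for (const w of text.toLowerCase().split(/\W+/)) {
    if (w) v[[...w].reduce((h, c) => h + c.charCodeAt(0), 0) % 8] += 1;
  }
  return v;
}

const cosine = (a: number[], b: number[]) => {
  const dot = a.reduce((s, x, i) => s + x * b[i], 0);
  const norm = (x: number[]) => Math.sqrt(x.reduce((s, y) => s + y * y, 0)) || 1;
  return dot / (norm(a) * norm(b));
};

// 3. Query the store for the chunk closest to the question.
function query(store: Doc[], q: string): string {
  const qv = embed(q);
  return store.reduce(
    (best, d) => (cosine(d.vector, qv) > cosine(embed(best), qv) ? d.text : best),
    store[0].text
  );
}

// 4-5. Inject the retrieved context into the prompt before calling the LLM.
const store: Doc[] = [
  { text: "User prefers dark mode", vector: embed("User prefers dark mode") },
];
const context = query(store, "dark mode preference");
const prompt = `Context: ${context}\n\nQuestion: what theme does the user like?`;
// 6. ...and after the LLM responds, the conversation must be saved back manually.
```

Every one of these steps is code you have to write, test, and operate yourself.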

Memori.ts abstracts the entire RAG pipeline into a single line of code.

// 1. Register middleware
memori.llm.register(client);
// 2. Call LLM as normal
await client.chat.completions.create(...)
Memory is auto-injected.
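One way such middleware could work under the hood, as a hedged sketch — the client shape and the `recall` helper are assumptions for illustration, not Memori.ts internals:

```typescript
// Sketch: wrap a chat function so every call gets memory prepended.
type Message = { role: string; content: string };
type ChatFn = (opts: { messages: Message[] }) => Promise<string>;

// Stand-in for memory retrieval (not the real Memori.ts API).
const recall = async (_query: string) => "User prefers dark mode";

function withMemory(chat: ChatFn): ChatFn {
  return async (opts) => {
    // Use the latest user message as the retrieval query.
    const lastUser = opts.messages.filter((m) => m.role === "user").at(-1);
    const context = await recall(lastUser?.content ?? "");
    // Auto-inject the recalled context as a system message.
    return chat({
      messages: [
        { role: "system", content: `Relevant memory: ${context}` },
        ...opts.messages,
      ],
    });
  };
}

// Fake client that reports how many messages it received.
const rawChat: ChatFn = async ({ messages }) => `got ${messages.length} messages`;
const chat = withMemory(rawChat);
```

Calling `chat({ messages: [{ role: "user", content: "theme?" }] })` now delivers two messages to the underlying client: the injected memory plus the original user message.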

Why settle for
complexity?

Most memory solutions require complex infrastructure, vector databases, and cloud dependencies. Memori.ts is just a standard library.

SPECIFICATION

Memori.ts vs. vector DBs, compared across: Infrastructure, Setup Time, Query Language, Developer Experience, Data Privacy, Latency, LLM Support, and Cost Model.
01

SQL Native

Query your agent's memory using standard SQL. No learning curve, just powerful relational queries.

02

Zero Config

Start coding instantly. No Docker, no API keys, no complex vector DB infrastructure to manage.

03

Universal Support

Works with OpenAI, Anthropic, Gemini, and local LLMs via a simple, unified adapter interface.
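A unified adapter interface could look like the sketch below. The `LLMAdapter` shape is a guess for illustration, not the library's actual interface; real adapters would wrap the OpenAI or Anthropic SDKs:

```typescript
// Hypothetical adapter interface: one surface for every provider.
interface LLMAdapter {
  name: string;
  complete(prompt: string): Promise<string>;
}

// Two toy adapters with identical surfaces.
const openaiAdapter: LLMAdapter = {
  name: "openai",
  complete: async (p) => `[openai] ${p}`,
};
const localAdapter: LLMAdapter = {
  name: "local",
  complete: async (p) => `[local] ${p}`,
};

// Caller code is identical regardless of provider.
async function ask(adapter: LLMAdapter, prompt: string): Promise<string> {
  return adapter.complete(prompt);
}
```

Swapping providers means swapping one object; the calling code never changes.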

04

Local-First

Data lives on your device by default for max privacy. Sync to the cloud only when you need it.

Architecture

The proactive memory loop.

[Diagram: the proactive memory loop — the app layer (on-device) retrieves from the Memori core, which injects context into the LLM (model layer).]
Deep Dive into Core Concepts
Learn how Memori orchestrates context.