LLM Providers

Memori works by "patching" your LLM client: once registered, it injects relevant context into outgoing requests automatically. Major providers are supported out of the box.

OpenAI

Full Support

Supports automated context injection for both standard completions and structured outputs.

```typescript
import { Memori } from "memori-js";
import OpenAI from "openai";

const client = new OpenAI();
const memori = new Memori();

// Register middleware
memori.llm.register(client);
```
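
Once registered, the patched client needs no further changes at the call site. A minimal sketch of a standard completion, assuming Memori leaves the usual chat.completions.create signature intact and injects context behind the scenes:

```typescript
// Standard OpenAI call; Memori is assumed to inject retrieved context
// into the request transparently after registration.
const completion = await client.chat.completions.create({
  model: "gpt-4o",
  messages: [{ role: "user", content: "What did we decide about the launch date?" }],
});

console.log(completion.choices[0].message.content);
```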

Anthropic

Full Support

Works seamlessly with Claude 3 and newer models.

```typescript
import { Memori } from "memori-js";
import Anthropic from "@anthropic-ai/sdk";

const client = new Anthropic();
const memori = new Memori();

// Register middleware
memori.llm.register(client);
```
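
The same pattern applies here. A minimal sketch using the standard messages.create call, assuming the registered client injects context transparently:

```typescript
// Standard Anthropic call; Memori is assumed to add retrieved context
// to the request after registration.
const message = await client.messages.create({
  model: "claude-3-5-sonnet-latest",
  max_tokens: 1024,
  messages: [{ role: "user", content: "Summarize my open action items." }],
});

console.log(message.content);
```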

Custom / Vercel AI SDK

Beta

For other providers or the Vercel AI SDK, you can manually inject context using the retrieve API.

```typescript
import { generateText } from "ai";

// Manual context injection: retrieve relevant memories for the query,
// then pass them to the model as a system message.
const context = await memori.retrieve(userQuery);

const response = await generateText({
  model: customModel, // any Vercel AI SDK model instance
  messages: [
    { role: "system", content: `Context: ${context}` },
    { role: "user", content: userQuery },
  ],
});
```
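
If several call sites need this pattern, it can be wrapped in a small helper. A sketch under the same assumptions; askWithMemory is an illustrative name, not part of Memori:

```typescript
import { generateText, type LanguageModel } from "ai";
import { Memori } from "memori-js";

const memori = new Memori();

// Hypothetical helper: retrieves context for a query, then forwards it
// to any Vercel AI SDK model as a system message.
async function askWithMemory(model: LanguageModel, userQuery: string): Promise<string> {
  const context = await memori.retrieve(userQuery);
  const { text } = await generateText({
    model,
    messages: [
      { role: "system", content: `Context: ${context}` },
      { role: "user", content: userQuery },
    ],
  });
  return text;
}
```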