# LLM Providers
Memori works by "patching" your LLM client so that relevant memories are injected into each request. Major providers are supported out of the box.
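Concretely, "patching" means wrapping the client's completion method so retrieved context is prepended before the request reaches the provider. The sketch below illustrates the idea with a hypothetical `patchClient` helper and `Retriever` callback; Memori's actual internals may differ.

```typescript
// Conceptual sketch only: a hypothetical wrapper showing what "patching"
// an OpenAI client could look like. Not Memori's real implementation.
import OpenAI from "openai";

type Retriever = (query: string) => Promise<string>;

function patchClient(client: OpenAI, retrieve: Retriever): void {
  const original = client.chat.completions.create.bind(client.chat.completions);

  // Replace the create method with a version that injects retrieved context.
  client.chat.completions.create = (async (params: any, options?: any) => {
    const lastUser = [...params.messages].reverse().find((m: any) => m.role === "user");
    const context = lastUser ? await retrieve(String(lastUser.content)) : "";

    return original(
      {
        ...params,
        messages: [{ role: "system", content: `Context: ${context}` }, ...params.messages],
      },
      options
    );
  }) as typeof client.chat.completions.create;
}
```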
## OpenAI
**Full Support**: Supports automated context injection for both standard completions and structured outputs.
```typescript
import { Memori } from "memori-js";
import OpenAI from "openai";

const client = new OpenAI();
const memori = new Memori();

// Register middleware
memori.llm.register(client);
```
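Once registered, the patched client is used exactly like the stock SDK; context injection happens transparently on each call. A minimal usage sketch (the model name and prompt are illustrative):

```typescript
// The patched client keeps the normal OpenAI API surface.
// Relevant memories are injected into the request automatically.
const completion = await client.chat.completions.create({
  model: "gpt-4o-mini",
  messages: [{ role: "user", content: "What did we decide about the deploy schedule?" }],
});

console.log(completion.choices[0].message.content);
```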
## Anthropic

**Full Support**: Works seamlessly with Claude 3 and newer models.
```typescript
import { Memori } from "memori-js";
import Anthropic from "@anthropic-ai/sdk";

const client = new Anthropic();
const memori = new Memori();

// Register middleware
memori.llm.register(client);
```
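As with OpenAI, the registered client keeps the standard Anthropic SDK surface. A minimal sketch (model name and prompt are illustrative):

```typescript
// Calls go through the stock SDK API; memories are injected transparently.
const message = await client.messages.create({
  model: "claude-3-5-sonnet-latest",
  max_tokens: 1024,
  messages: [{ role: "user", content: "Summarize what we discussed yesterday." }],
});

console.log(message.content);
```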
## Custom / Vercel AI SDK

**Beta**: For other providers or the Vercel AI SDK, you can manually inject context using the `retrieve` API.
```typescript
import { generateText } from "ai";

// Manual context injection: fetch relevant memories, then prepend them
// as a system message before calling the model.
const context = await memori.retrieve(userQuery);

const response = await generateText({
  model: customModel,
  messages: [
    { role: "system", content: `Context: ${context}` },
    { role: "user", content: userQuery },
  ],
});
```
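The same pattern extends to streaming with the AI SDK's `streamText`; a sketch assuming the same `memori`, `customModel`, and `userQuery` as above:

```typescript
import { streamText } from "ai";

// Manual injection works identically for streaming responses.
const context = await memori.retrieve(userQuery);

const result = streamText({
  model: customModel,
  messages: [
    { role: "system", content: `Context: ${context}` },
    { role: "user", content: userQuery },
  ],
});

// Print tokens as they arrive.
for await (const chunk of result.textStream) {
  process.stdout.write(chunk);
}
```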