The Jean Memory Node.js SDK is a headless library for integrating our Context API into your backend services.
Installation
```bash
npm install @jeanmemory/node
```
Usage: Context-Aware API Route
Create a secure bridge between your frontend and your LLM.
1. Extract User Token: pull the userToken from the request body. This proves the user's identity (it is issued and attached by the React SDK).
2. Fetch Context: call jean.getContext() to retrieve relevant memories for the user's current message.
3. Stream Response: inject the context into your prompt and stream the LLM response back to the client.
Next.js Example (App Router)
```typescript
import { JeanClient } from '@jeanmemory/node';
import OpenAI from 'openai';
import { OpenAIStream, StreamingTextResponse } from 'ai';

const jean = new JeanClient({ apiKey: process.env.JEAN_API_KEY });
const openai = new OpenAI({ apiKey: process.env.OPENAI_API_KEY });

export const runtime = 'edge';

export async function POST(req: Request) {
  // 1. Get the chat history and the user's identity token
  const { messages, userToken } = await req.json();
  const lastMessage = messages[messages.length - 1].content;

  // 2. Fetch relevant memories for this user and message
  const context = await jean.getContext({
    user_token: userToken,
    message: lastMessage,
    speed: "balanced",
  });

  // 3. Build the prompt with the retrieved context
  const prompt = `
Context: ${context.text}
Question: ${lastMessage}
`;

  // 4. Stream the LLM response back to the client
  const response = await openai.chat.completions.create({
    model: 'gpt-4-turbo',
    stream: true,
    messages: [{ role: "user", content: prompt }],
  });

  const stream = OpenAIStream(response);
  return new StreamingTextResponse(stream);
}
```
Configuration
```typescript
// speed: "fast" | "balanced" | "comprehensive"
await jean.getContext({
  user_token: token,
  message: msg,
  speed: "fast",
});
```
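One possible pattern (a sketch, not part of the SDK) is to pick the speed tier per request rather than hard-coding it. The tier names ("fast", "balanced", "comprehensive") come from the SDK; the length threshold and the deepSearch flag below are illustrative assumptions:

```typescript
// Hypothetical helper: choose a speed tier for jean.getContext().
// The tier names match the SDK docs; the heuristics are assumptions.
type Speed = "fast" | "balanced" | "comprehensive";

function chooseSpeed(message: string, deepSearch = false): Speed {
  if (deepSearch) return "comprehensive"; // caller explicitly wants exhaustive recall
  if (message.length < 40) return "fast"; // short follow-ups rarely need deep context
  return "balanced";
}

console.log(chooseSpeed("ok, thanks")); // short message, low latency
console.log(chooseSpeed("Summarize everything we discussed about the Q3 roadmap and deadlines."));
console.log(chooseSpeed("anything", true));
```

The returned value can be passed straight through as the speed option in the getContext() call above.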
For advanced deterministic control, access the tools namespace directly.
```typescript
// Add a memory
await jean.tools.add_memory({
  user_token: token,
  content: "Project deadline is Friday",
});

// Search memories
const results = await jean.tools.search_memory({
  user_token: token,
  query: "deadlines",
});
```
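Search results can then be folded into a prompt yourself. A minimal sketch, assuming search_memory resolves to an array of objects with a content field (an assumption about the response shape; verify against the actual SDK return type):

```typescript
// Assumed result shape -- check the real search_memory return type.
interface MemoryResult {
  content: string;
}

// Pure helper: turn search results into a numbered context block
// suitable for pasting into a system prompt.
function formatContext(results: MemoryResult[]): string {
  if (results.length === 0) return "No relevant memories found.";
  return results.map((r, i) => `${i + 1}. ${r.content}`).join("\n");
}

console.log(formatContext([{ content: "Project deadline is Friday" }]));
// → "1. Project deadline is Friday"
```

Because the helper is pure, it is easy to unit-test independently of any network call to the Context API.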