Documentation Index
Fetch the complete documentation index at: https://docs.jeanmemory.com/llms.txt
Use this file to discover all available pages before exploring further.
Choose your path: add a complete UI component to your frontend, or add a powerful context layer to your backend.
Drop-in UI Component
Headless Backend
The fastest way to get a full-featured chatbot running in your app.

// 1. Install the React SDK
// npm install @jeanmemory/react
// 2. Add the provider and complete chat component
import { JeanProvider, JeanChatComplete } from '@jeanmemory/react';
function MyPage() {
  return (
    <JeanProvider apiKey="jean_sk_your_key">
      <JeanChatComplete />
    </JeanProvider>
  );
}
For developers who want to power their existing AI agents with our headless SDK.

# 1. Install the Python SDK
# pip install jeanmemory openai
# 2. Get context before calling your LLM
import os
from jeanmemory import JeanMemoryClient
from openai import OpenAI
jean = JeanMemoryClient(api_key=os.environ["JEAN_API_KEY"])
openai = OpenAI(api_key=os.environ["OPENAI_API_KEY"])
context = jean.get_context(
    user_token="USER_TOKEN_FROM_FRONTEND",
    message="What was our last conversation about?",
).text
prompt = f"Context: {context}\n\nUser question: What was our last conversation about?"
# 3. Use the context in your LLM call
completion = openai.chat.completions.create(
    model="gpt-4-turbo",
    messages=[{"role": "user", "content": prompt}],
)
Full-Stack Integration: the user signs in with the React SDK, and the same user token then works across all SDKs. The frontend handles auth; the backend gets context.
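That token hand-off can be sketched as a minimal backend handler. Everything here is illustrative: StubJeanClient is a stand-in for JeanMemoryClient so the shape of the flow is visible without network calls, and only get_context's signature (user_token, message, returning an object with a .text field) comes from the snippet above:

```python
from types import SimpleNamespace


class StubJeanClient:
    # Hypothetical stand-in for JeanMemoryClient, mimicking only the
    # get_context(user_token=..., message=...) -> object-with-.text shape.
    def get_context(self, user_token: str, message: str):
        return SimpleNamespace(text=f"context for {user_token}: {message}")


def handle_chat(jean, user_token: str, message: str) -> str:
    # The token minted when the user signed in via the React SDK is passed
    # through to the backend unchanged: frontend handles auth, backend
    # exchanges the token for that user's context.
    return jean.get_context(user_token=user_token, message=message).text


print(handle_chat(StubJeanClient(), "tok_123", "hello"))
```

In production, handle_chat would receive the user token from your request (e.g. an Authorization header) and pass a real JeanMemoryClient instance instead of the stub.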