The Jean Memory Python SDK provides a simple, headless interface to our powerful Context API.

Installation

pip install jeanmemory

The Golden Path

How to add memory to an AI agent in five steps (the complete example below walks through each one):

1. Initialize Clients: Create instances of JeanMemoryClient and your LLM client (e.g., OpenAI).
2. Get User Token: Retrieve the user_token from your frontend (passed via OAuth) to identify the user.
3. Get Context: Call jean.get_context() with the user's message. Jean Memory synthesizes the relevant background information.
4. Engineer Prompt: Inject the retrieved context before the user's question in the prompt you send to the model.
5. Call LLM: Send the context-rich prompt to your model for a personalized answer.

Complete Example

import os
from openai import OpenAI
from jeanmemory import JeanMemoryClient

# 1. Initialize
jean = JeanMemoryClient(api_key=os.environ["JEAN_API_KEY"])
openai = OpenAI(api_key=os.environ["OPENAI_API_KEY"])

# 2. Get User Token (from frontend request)
# get_token_from_request() is a placeholder: supply the user_token your
# frontend obtained via OAuth, however your web framework exposes it.
user_token = get_token_from_request()

# 3. Get Context
user_msg = "What were the key takeaways from my last meeting?"
context = jean.get_context(
    user_token=user_token,
    message=user_msg
).text

# 4. Engineer Prompt
prompt = f"""
Context:
{context}

User Question: {user_msg}
"""

# 5. Call LLM
completion = openai.chat.completions.create(
    model="gpt-4-turbo",
    messages=[{"role": "user", "content": prompt}]
)
print(completion.choices[0].message.content)

Configuration

Control speed, tools, and formatting.
# Options: "fast", "balanced" (default), "comprehensive"
context = jean.get_context(
    user_token=token,
    message=msg,
    speed="fast" 
)
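
As a usage sketch, the helper below (hypothetical, not part of the SDK) picks a speed mode per request, reusing the jean client and user token from the earlier example: "fast" for low-latency chat turns, "comprehensive" for deeper research-style queries.

# Hypothetical helper: choose a speed mode per request type.
def get_context_text(jean, user_token, message, deep=False):
    speed = "comprehensive" if deep else "fast"
    return jean.get_context(
        user_token=user_token,
        message=message,
        speed=speed,
    ).text

# Low-latency chat turn
reply_context = get_context_text(jean, token, "Summarize my last meeting")

# Slower, deeper retrieval for a report-style query
report_context = get_context_text(jean, token, "What do I know about this project?", deep=True)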

Headless Authentication

For backend-only apps (no frontend user).
# Option 1: Test Mode (Development)
# Passing None creates an automatic test user
jean.get_context(user_token=None, message="Hello")

# Option 2: Manual OAuth (Production)
url = jean.get_auth_url(callback_url="...")
# ... user visits URL; your callback_url receives an authorization code ...
token = jean.exchange_code_for_token(code)
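
Putting Option 2 together, here is a minimal end-to-end sketch of a backend-only flow. It assumes the get_auth_url and exchange_code_for_token calls shown above; the callback URL and the manual code entry are placeholders for your own OAuth handling.

import os
from jeanmemory import JeanMemoryClient

jean = JeanMemoryClient(api_key=os.environ["JEAN_API_KEY"])

# Placeholder callback URL: use the redirect endpoint registered for your app.
auth_url = jean.get_auth_url(callback_url="https://yourapp.example.com/oauth/callback")
print("Ask the user to authorize at:", auth_url)

# After authorization, your callback receives a one-time code.
# For a quick backend-only test, paste it in by hand.
code = input("Authorization code: ").strip()
user_token = jean.exchange_code_for_token(code)

context = jean.get_context(
    user_token=user_token,
    message="What am I working on this week?",
).text
print(context)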