
AutoGen / AG2 Quickstart

Persist AutoGen agent conversations across sessions with durable memory.

1. Install & Configure

Install the SDK alongside AutoGen and set your API keys.

pip install agenticmemory-sdk autogen-agentchat
export AGENTICMEMORY_API_KEY="amk_your_key_here"
export OPENAI_API_KEY="sk-..."
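Both keys must be set before anything in step 2 will run. A small preflight check can fail fast with a clear message instead of erroring mid-conversation (a sketch; the `missing_keys` helper is illustrative, not part of the SDK):

```python
import os

# The two variables exported above
REQUIRED_KEYS = ("AGENTICMEMORY_API_KEY", "OPENAI_API_KEY")

def missing_keys(env=None):
    """Return the names of required keys that are unset or empty."""
    env = os.environ if env is None else env
    return [k for k in REQUIRED_KEYS if not env.get(k)]

# Example: raise early with an actionable message
# if missing_keys():
#     raise SystemExit(f"Set these first: {', '.join(missing_keys())}")
```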

2. Build Agents with Persistent Memory

Use AgenticMemory to persist conversations and context across AutoGen sessions.

import os

from agenticmemoryai import AgenticMemory
import autogen

memory = AgenticMemory()

# Create a space for this agent group
space = memory.create_space("AutoGen Team", "autogen-team")
space_id = space["space"]["_id"]

# Bootstrap prior context from previous sessions
ctx = memory.bootstrap(space_id, messages=20, entries=5)
prior_messages = ctx.get("messages", [])
prior_summary = "\n".join(
    e["content"] for e in ctx.get("entries", [])
)

# Configure LLM
llm_config = {"model": "gpt-4o", "api_key": os.environ["OPENAI_API_KEY"]}

# Create agents with prior knowledge
assistant = autogen.AssistantAgent(
    name="assistant",
    system_message=f"You are a helpful assistant.\nPrior knowledge:\n{prior_summary}",
    llm_config=llm_config
)

user_proxy = autogen.UserProxyAgent(
    name="user",
    human_input_mode="NEVER",
    max_consecutive_auto_reply=3
)

# Register a hook to store each outgoing message
def persist_message(sender, message, recipient, silent):
    role = "user" if sender.name == "user" else "assistant"
    content = message if isinstance(message, str) else message.get("content", "")
    if content:
        memory.store(space_id, role=role, content=content)
    return message  # the hook must return the (possibly modified) message

assistant.register_hook(hookpoint="process_message_before_send",
                        hook=persist_message)

# Run the conversation
user_proxy.initiate_chat(assistant, message="Summarize our project status.")

# After the chat, store a summary as a long-term entry
memory.add_entry(space_id, type="summary", title="Session summary",
                 content=assistant.last_message()["content"],
                 tags=["autogen", "session"])

# Search across all memory
results = memory.search(space_id, "project status", limit=5, scope="all")
print("Found", len(results["results"]), "relevant memories")
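The persist_message hook has to normalize AutoGen payloads, which arrive either as plain strings or as dicts whose "content" field may be missing or None (e.g. pure tool-call messages). That normalization is easy to pull out and unit-test on its own (a sketch; `extract_content` is illustrative, not an SDK function):

```python
def extract_content(message):
    """Normalize an AutoGen message payload to plain text.

    Returns "" for payloads with no usable text, so callers can
    skip storing empty memories.
    """
    if isinstance(message, str):
        return message
    if isinstance(message, dict):
        return message.get("content") or ""
    return ""
```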

3. Expected Output

assistant (to user):
Based on our previous sessions, here's the project status:
- Auth module: completed in Sprint 4
- API integration: 80% done
- Testing: in progress
Found 4 relevant memories

Free tier: 1 space, 1K messages/month. Upgrade to Pro ($24.99/mo) for semantic search, entries, and entities.
