
Vercel AI SDK Quickstart

Add persistent memory to your Vercel AI chatbot using tool calling.

1. Install & Configure

Install the SDK alongside the Vercel AI SDK and set your API keys.

npm install agenticmemoryai ai @ai-sdk/openai
export AGENTICMEMORY_API_KEY="amk_your_key_here"
export OPENAI_API_KEY="sk-..."
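Before wiring up the route, it can help to fail fast when a key is missing. A minimal pre-flight sketch (the `missingKeys` helper is hypothetical, not part of the SDK):

```javascript
// Report which required environment variables are not set,
// so misconfiguration surfaces at startup rather than mid-request.
function missingKeys(env, required = ['AGENTICMEMORY_API_KEY', 'OPENAI_API_KEY']) {
  return required.filter((key) => !env[key]);
}

// Example: warn once at startup
const missing = missingKeys(process.env);
if (missing.length > 0) {
  console.warn(`Missing environment variables: ${missing.join(', ')}`);
}
```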

2. Build a Chat Route with Memory

Create a Next.js App Router route handler that persists conversation history and exposes memory tools to the model.

// app/api/chat/route.js
import { AgenticMemory } from 'agenticmemoryai';
import { streamText, tool } from 'ai';
import { openai } from '@ai-sdk/openai';
import { z } from 'zod';

const memory = new AgenticMemory();
const SPACE_ID = process.env.MEMORY_SPACE_ID;

// App Router route handlers must be named ESM exports, not module.exports
export async function POST(req) {
  const { messages } = await req.json();
  const lastMessage = messages[messages.length - 1];

  // Store the user message
  await memory.store(SPACE_ID, { role: 'user', content: lastMessage.content });

  // Bootstrap prior context
  const ctx = await memory.bootstrap(SPACE_ID, {
    messages: 20, entries: 5, semantic: true, query: lastMessage.content
  });

  // Build system prompt with memory context
  const memoryContext = ctx.entries?.map(e => e.content).join('\n') || '';

  const result = streamText({
    model: openai('gpt-4o'),
    system: `You are a helpful assistant with persistent memory.\n\nMemory:\n${memoryContext}`,
    messages,
    maxSteps: 3, // let the model answer after a tool call instead of stopping at the tool result
    tools: {
      search_memory: tool({
        description: 'Search persistent memory for relevant information',
        parameters: z.object({ query: z.string() }),
        async execute({ query }) {
          const results = await memory.search(SPACE_ID, query, { limit: 5, scope: 'all' });
          return results.results.map(r => r.content).join('\n');
        }
      }),
      remember: tool({
        description: 'Store an important fact to long-term memory',
        parameters: z.object({ content: z.string(), tags: z.array(z.string()).optional() }),
        async execute({ content, tags }) {
          await memory.addEntry(SPACE_ID, {
            type: 'fact', title: content.slice(0, 50), content, tags: tags || []
          });
          return 'Saved to memory.';
        }
      })
    },
    async onFinish({ text }) {
      // Persist the assistant response
      await memory.store(SPACE_ID, { role: 'assistant', content: text });
    }
  });

  return result.toDataStreamResponse();
}
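The route folds the bootstrapped entries into the system prompt with a plain join. Pulled out as a pure helper (the `buildSystemPrompt` name is ours; the logic mirrors the route above), the behavior is easy to unit-test, including the empty-memory fallback:

```javascript
// Mirrors the prompt construction in the route: concatenate entry
// contents, falling back to an empty memory section when none exist.
function buildSystemPrompt(entries) {
  const memoryContext = (entries ?? []).map((e) => e.content).join('\n');
  return `You are a helpful assistant with persistent memory.\n\nMemory:\n${memoryContext}`;
}
```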

3. Expected Output

Your chatbot now remembers conversations across sessions. Users can ask "What did we discuss last time?" and the agent will search its memory.

// Client-side messages array
[
  { "role": "user", "content": "What did we discuss yesterday?" },
  { "role": "assistant", "content": "Yesterday we discussed the project deadline (June 15) and the auth module status." }
]
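On the client, the route expects a JSON body of the form `{ messages: [{ role, content }, ...] }`. A minimal sketch of that contract (the `buildChatRequest` helper is hypothetical; in a React app the AI SDK's `useChat` hook handles this for you):

```javascript
// Build the fetch options for the chat route: append the new user
// turn to the running history and serialize it as the request body.
function buildChatRequest(messages, userInput) {
  const next = [...messages, { role: 'user', content: userInput }];
  return {
    method: 'POST',
    headers: { 'Content-Type': 'application/json' },
    body: JSON.stringify({ messages: next })
  };
}

// Usage: fetch('/api/chat', buildChatRequest(history, 'What did we discuss yesterday?'))
```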

Free tier: 1 space, 1K messages/month. Upgrade to Pro ($24.99/mo) for semantic search, entries, and entities.
