@upstash/context7-tools-ai-sdk provides Vercel AI SDK-compatible tools and agents that give your AI applications access to up-to-date library documentation.
When building AI-powered applications with the Vercel AI SDK, your models often need accurate information about libraries and frameworks. Instead of relying on potentially outdated training data, Context7 tools let your AI fetch current documentation on demand, ensuring responses include correct API usage, current best practices, and working code examples.
The package gives you two ways to integrate:
  1. Individual tools (resolveLibraryId and queryDocs) that you add to your existing generateText or streamText calls
  2. A pre-built agent (Context7Agent) that handles the entire documentation lookup workflow automatically
Both approaches work with any AI provider supported by the Vercel AI SDK, including OpenAI, Anthropic, Google, and others.

Installation

npm install @upstash/context7-tools-ai-sdk

Prerequisites

You’ll need:
  1. A Context7 API key from the Context7 Dashboard
  2. An AI provider SDK (e.g., @ai-sdk/openai, @ai-sdk/anthropic)

Configuration

Set your Context7 API key as an environment variable:
CONTEXT7_API_KEY=ctx7sk-...
The tools and agents will automatically use this key.
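If you prefer to fail fast when the key is missing, a plain Node.js check at startup works; this is an optional sketch, not part of the package:
// Optional: verify the key is present before any tools are constructed.
if (!process.env.CONTEXT7_API_KEY) {
  throw new Error("CONTEXT7_API_KEY is not set. Create a key in the Context7 Dashboard.");
}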

Quick Start

Using Tools with generateText

The simplest way to add documentation lookup to your AI application:
import { resolveLibraryId, queryDocs } from "@upstash/context7-tools-ai-sdk";
import { generateText, stepCountIs } from "ai";
import { openai } from "@ai-sdk/openai";

const { text } = await generateText({
  model: openai("gpt-5.2"),
  prompt: "How do I create a server action in Next.js?",
  tools: {
    resolveLibraryId: resolveLibraryId(),
    queryDocs: queryDocs(),
  },
  stopWhen: stepCountIs(5),
});

console.log(text);

Using the Context7 Agent

For a more streamlined experience, use the pre-configured agent:
import { Context7Agent } from "@upstash/context7-tools-ai-sdk";
import { anthropic } from "@ai-sdk/anthropic";

const agent = new Context7Agent({
  model: anthropic("claude-sonnet-4-20250514"),
});

const { text } = await agent.generate({
  prompt: "How do I use React Server Components?",
});

console.log(text);

Using Tools with streamText

For streaming responses:
import { resolveLibraryId, queryDocs } from "@upstash/context7-tools-ai-sdk";
import { streamText, stepCountIs } from "ai";
import { openai } from "@ai-sdk/openai";

const { textStream } = streamText({
  model: openai("gpt-5.2"),
  prompt: "Explain how to use Tanstack Query for data fetching",
  tools: {
    resolveLibraryId: resolveLibraryId(),
    queryDocs: queryDocs(),
  },
  stopWhen: stepCountIs(5),
});

for await (const chunk of textStream) {
  process.stdout.write(chunk);
}
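
If you also need the complete response after streaming finishes, the streamText result exposes promises such as text and usage that resolve once the stream ends (this is standard Vercel AI SDK behavior, not specific to Context7); a sketch with the same setup:
import { resolveLibraryId, queryDocs } from "@upstash/context7-tools-ai-sdk";
import { streamText, stepCountIs } from "ai";
import { openai } from "@ai-sdk/openai";

const result = streamText({
  model: openai("gpt-5.2"),
  prompt: "Explain how to use TanStack Query for data fetching",
  tools: {
    resolveLibraryId: resolveLibraryId(),
    queryDocs: queryDocs(),
  },
  stopWhen: stepCountIs(5),
});

// Stream chunks to stdout as they arrive.
for await (const chunk of result.textStream) {
  process.stdout.write(chunk);
}

// `text` resolves with the full generated answer once streaming completes.
const fullText = await result.text;
console.log("\n\nFull answer length:", fullText.length, "characters");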

Explicit Configuration

You can also pass the API key directly if needed:
import { resolveLibraryId, queryDocs } from "@upstash/context7-tools-ai-sdk";

const tools = {
  resolveLibraryId: resolveLibraryId({ apiKey: "your-api-key" }),
  queryDocs: queryDocs({ apiKey: "your-api-key" }),
};
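
The configured tools plug into generateText or streamText exactly like the environment-based ones; a short sketch reusing the Quick Start setup (the prompt and model here are placeholders):
import { resolveLibraryId, queryDocs } from "@upstash/context7-tools-ai-sdk";
import { generateText, stepCountIs } from "ai";
import { openai } from "@ai-sdk/openai";

// Explicitly configured tools behave the same as the env-var-based ones.
const tools = {
  resolveLibraryId: resolveLibraryId({ apiKey: "your-api-key" }),
  queryDocs: queryDocs({ apiKey: "your-api-key" }),
};

const { text } = await generateText({
  model: openai("gpt-5.2"),
  prompt: "How do I create a server action in Next.js?",
  tools,
  stopWhen: stepCountIs(5),
});

console.log(text);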

How It Works

The tools follow a two-step workflow:
  1. resolveLibraryId - Searches Context7’s database to find the correct library ID for a given query (e.g., “react” → /reactjs/react.dev)
  2. queryDocs - Fetches documentation for the resolved library using the user’s query to retrieve relevant content
The AI model orchestrates these tools automatically based on the user’s prompt, fetching relevant documentation before generating a response.
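To watch this orchestration happen, you can log each step as the model runs; a minimal sketch using the Vercel AI SDK's onStepFinish callback with the Quick Start setup:
import { resolveLibraryId, queryDocs } from "@upstash/context7-tools-ai-sdk";
import { generateText, stepCountIs } from "ai";
import { openai } from "@ai-sdk/openai";

const { text } = await generateText({
  model: openai("gpt-5.2"),
  prompt: "How do I create a server action in Next.js?",
  tools: {
    resolveLibraryId: resolveLibraryId(),
    queryDocs: queryDocs(),
  },
  stopWhen: stepCountIs(5),
  // Logs each tool call, typically resolveLibraryId first, then queryDocs.
  onStepFinish: (step) => {
    for (const call of step.toolCalls) {
      console.log("tool call:", call.toolName);
    }
  },
});

console.log(text);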

Next Steps