Vercel AI SDK

The Vercel AI SDK connects to the gateway through two provider packages: @ai-sdk/openai-compatible for Chat Completions and Responses, and @ai-sdk/anthropic for the Messages API.

Before you begin

Complete the AI SDK Node.js quickstart first. It covers installation, environment variables, and the base project setup for the AI SDK.

Configure providers

OpenAI-compatible provider

Use @ai-sdk/openai-compatible to access any model through the gateway's Chat Completions endpoint.

src/gateway.ts
import { createOpenAICompatible } from "@ai-sdk/openai-compatible";

export const gateway = createOpenAICompatible({
  name: "mastra-gateway",
  baseURL: "https://gateway-api.mastra.ai/v1/chat",
  apiKey: "YOUR_API_KEY",
});
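Hardcoding the key works for a quick test, but in practice you would read it from the environment. A minimal sketch of a fail-fast lookup helper — the variable name MASTRA_GATEWAY_API_KEY is an assumption for illustration, not something the SDK or gateway defines:

```typescript
// Hypothetical helper: throw early if the key is missing rather than
// sending unauthenticated requests. MASTRA_GATEWAY_API_KEY is an
// assumed variable name, not defined by the SDK or the gateway.
export function requireEnv(name: string): string {
  const value = process.env[name];
  if (!value) {
    throw new Error(`Missing required environment variable: ${name}`);
  }
  return value;
}
```

You would then pass `apiKey: requireEnv("MASTRA_GATEWAY_API_KEY")` to the provider factory instead of a literal string.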

Anthropic provider

Use @ai-sdk/anthropic to access Claude models through the gateway's Messages endpoint. Add the Anthropic provider to the same src/gateway.ts file.

src/gateway.ts
import { createAnthropic } from "@ai-sdk/anthropic";

export const anthropicGateway = createAnthropic({
  baseURL: "https://gateway-api.mastra.ai/v1",
  apiKey: "YOUR_API_KEY",  // msk_ key
});

Generate text

Use generateText for non-streaming completions.

src/generate.ts
import { generateText } from "ai";
import { gateway } from "./gateway";

const result = await generateText({
  model: gateway.chatModel("google/gemini-2.5-flash"),
  messages: [
    { role: "user", content: "What is 2+2? Reply with just the number." },
  ],
  maxOutputTokens: 20,
});

console.log(result.text);
// "4"
console.log(result.usage.inputTokens);  // > 0
console.log(result.usage.outputTokens); // > 0

With a system message

Pass a system string to define the model's behavior.

const result = await generateText({
  model: gateway.chatModel("google/gemini-2.5-flash"),
  system: "You are a calculator. Only respond with numbers, no words.",
  messages: [
    { role: "user", content: "What is 10 * 5?" },
  ],
  maxOutputTokens: 100,
});

console.log(result.text);
// "50"

With the Anthropic provider

Use the Anthropic provider to route requests through the gateway's Messages endpoint.

import { generateText } from "ai";
import { anthropicGateway } from "./gateway";

const result = await generateText({
  model: anthropicGateway("anthropic/claude-haiku-4.5"),
  messages: [
    { role: "user", content: "What is 2+2? Reply with just the number." },
  ],
  maxOutputTokens: 20,
});

console.log(result.text);
// "4"

Stream text

Use streamText to receive chunks incrementally.

src/stream.ts
import { streamText } from "ai";
import { gateway } from "./gateway";

const result = streamText({
  model: gateway.chatModel("google/gemini-2.5-flash"),
  messages: [
    { role: "user", content: "Count from 1 to 5, separated by commas." },
  ],
  maxOutputTokens: 50,
});

for await (const chunk of result.textStream) {
  process.stdout.write(chunk);
}
// "1, 2, 3, 4, 5"

const usage = await result.usage;
console.log(usage.inputTokens);  // > 0
console.log(usage.outputTokens); // > 0
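The consumption loop above works on any AsyncIterable<string>, so it can be exercised without a network call. A minimal sketch, with a mock async generator standing in for result.textStream:

```typescript
// Collect chunks from an async-iterable text stream into one string.
// This mirrors the for-await loop above; the mock generator stands in
// for result.textStream so the sketch runs offline.
async function collectText(stream: AsyncIterable<string>): Promise<string> {
  let text = "";
  for await (const chunk of stream) {
    text += chunk;
  }
  return text;
}

// Mock stream for illustration only — a real run iterates result.textStream.
async function* mockStream(): AsyncGenerator<string> {
  yield "1, ";
  yield "2, ";
  yield "3";
}

console.log(await collectText(mockStream())); // "1, 2, 3"
```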

Tool calling

Define tools using the AI SDK tool helper with Zod schemas. The AI SDK auto-executes tool calls and sends results back to the model.

src/tools.ts
import { generateText, tool, stepCountIs } from "ai";
import { z } from "zod";
import { gateway } from "./gateway";

export const helloWorldTool = tool({
  description: "Returns a greeting for the given name",
  inputSchema: z.object({
    name: z.string().describe("The name to greet"),
  }),
  execute: async ({ name }) => ({ greeting: `Hello, ${name}!` }),
});

const result = await generateText({
  model: gateway.chatModel("google/gemini-2.5-flash"),
  messages: [
    { role: "user", content: "Use the helloWorld tool to greet Alice." },
  ],
  tools: { helloWorld: helloWorldTool },
  maxOutputTokens: 200,
  stopWhen: stepCountIs(2),
});

console.log(result.text);
// "Hello, Alice!"

// Inspect the tool call
console.log(result.steps[0].toolCalls[0].toolName); // "helloWorld"
console.log(result.steps[0].toolCalls[0].input);    // { name: "Alice" }
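Because execute is an ordinary async function, the tool's logic can be unit-tested without a model round-trip. A minimal sketch using a standalone copy of the handler body (copied here for illustration so the snippet runs without the ai and zod packages):

```typescript
// Standalone copy of the helloWorld execute handler, for illustration.
// The real handler lives inside the tool({ ... }) call in src/tools.ts.
async function helloWorldExecute({ name }: { name: string }) {
  return { greeting: `Hello, ${name}!` };
}

const greeted = await helloWorldExecute({ name: "Alice" });
console.log(greeted.greeting); // "Hello, Alice!"
```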

Streaming tool calls

Use streamText with tools to stream both text and tool call results.

import { streamText, stepCountIs } from "ai";
import { gateway } from "./gateway";
import { helloWorldTool } from "./tools";

const result = streamText({
  model: gateway.chatModel("google/gemini-2.5-flash"),
  messages: [
    { role: "user", content: "Use the helloWorld tool to greet Bob." },
  ],
  tools: { helloWorld: helloWorldTool },
  maxOutputTokens: 200,
  stopWhen: stepCountIs(2),
});

for await (const chunk of result.textStream) {
  process.stdout.write(chunk);
}

const steps = await result.steps;
console.log(steps[0].toolCalls[0].toolName); // "helloWorld"
console.log(steps[0].toolCalls[0].input);    // { name: "Bob" }

Tool calling with Anthropic

The same pattern works with the Anthropic provider:

import { generateText, stepCountIs } from "ai";
import { anthropicGateway } from "./gateway";
import { helloWorldTool } from "./tools";

const result = await generateText({
  model: anthropicGateway("anthropic/claude-haiku-4.5"),
  messages: [
    { role: "user", content: "Use the helloWorld tool to greet Alice." },
  ],
  tools: { helloWorld: helloWorldTool },
  maxOutputTokens: 200,
  stopWhen: stepCountIs(2),
});

console.log(result.text);
// "Hello, Alice!"

Next steps

  • Features: Observational memory, streaming, BYOK, and gateway tools
  • Models: Supported providers and model routing
  • API reference: Complete endpoint documentation