
Deep Agents

Deep Agents builds on LangGraph to add task planning, subagent delegation, and file system tools. Connect it to the gateway through @langchain/openai's ChatOpenAI class, and all LLM calls are proxied through the gateway with observational memory applied automatically.

Before you begin

Complete the Deep Agents JS quickstart first. It covers installation and the base project setup for Deep Agents.

Configure the model

Create a ChatOpenAI instance with your gateway API key and base URL.

src/agent.ts
import { ChatOpenAI } from "@langchain/openai";

export const model = new ChatOpenAI({
  model: "google/gemini-2.5-flash",
  temperature: 0,
  configuration: {
    apiKey: "YOUR_API_KEY",
    baseURL: "https://gateway-api.mastra.ai/v1",
  },
});
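
Hardcoding the key is fine for a quick test, but in practice you will likely load it from the environment. A minimal sketch, assuming a MASTRA_GATEWAY_API_KEY environment variable (the variable name is an illustrative assumption, not part of the gateway's API):

```typescript
// Sketch: read the gateway key from the environment instead of hardcoding it.
// MASTRA_GATEWAY_API_KEY is an assumed variable name -- use whatever your
// deployment provides.
const gatewayConfig = {
  apiKey: process.env.MASTRA_GATEWAY_API_KEY ?? "YOUR_API_KEY",
  baseURL: "https://gateway-api.mastra.ai/v1",
};

// Pass it to ChatOpenAI as the `configuration` option:
// new ChatOpenAI({ model: "google/gemini-2.5-flash", configuration: gatewayConfig });
```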

Create a deep agent

Use createDeepAgent with a model, tools, and an optional system prompt. Deep Agents adds built-in planning and file system tools automatically.

src/agent.ts
import { createDeepAgent } from "deepagents";
import { ChatOpenAI } from "@langchain/openai";
import { tool } from "@langchain/core/tools";
import { z } from "zod";

export const getWeather = tool(
  async ({ location }) => {
    return `The weather in ${location} is sunny, 72°F.`;
  },
  {
    name: "get_weather",
    description: "Get the current weather for a given location",
    schema: z.object({
      location: z.string().describe("The city to get weather for"),
    }),
  },
);

export const model = new ChatOpenAI({
  model: "google/gemini-2.5-flash",
  temperature: 0,
  configuration: {
    apiKey: "YOUR_API_KEY",
    baseURL: "https://gateway-api.mastra.ai/v1",
  },
});

export const agent = createDeepAgent({
  model,
  tools: [getWeather],
  system: "You are a helpful assistant. Use the weather tool when asked about weather.",
});

Run the agent

The agent returned by createDeepAgent is a compiled LangGraph graph. Invoke it the same way you invoke any LangGraph graph.

src/run.ts
import { agent } from "./agent";

const result = await agent.invoke({
  messages: [{ role: "user", content: "What is the weather in Tokyo?" }],
});

const lastMessage = result.messages[result.messages.length - 1];
console.log(lastMessage.content);
// "The weather in Tokyo is sunny, 72°F."
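
The last-message extraction above comes up often enough that a small helper can be worth it. A minimal sketch (the Msg type and lastContent name are illustrative, not part of the Deep Agents API; it also assumes plain string content, though LangGraph messages can carry structured content blocks):

```typescript
// Illustrative helper: return the text of the final message in an invoke result.
// Assumes string content; structured content blocks are not handled here.
type Msg = { content: string };

export function lastContent(result: { messages: Msg[] }): string {
  const last = result.messages[result.messages.length - 1];
  return last?.content ?? "";
}
```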

Memory with thread and resource IDs

Pass x-thread-id and x-resource-id as default headers to enable observational memory. The gateway stores observations per thread and injects them as context on subsequent requests.

src/agent.ts
import { createDeepAgent } from "deepagents";
import { ChatOpenAI } from "@langchain/openai";
import { tool } from "@langchain/core/tools";
import { z } from "zod";

export const getWeather = tool(
  async ({ location }) => {
    return `The weather in ${location} is sunny, 72°F.`;
  },
  {
    name: "get_weather",
    description: "Get the current weather for a given location",
    schema: z.object({
      location: z.string().describe("The city to get weather for"),
    }),
  },
);

export const model = new ChatOpenAI({
  model: "google/gemini-2.5-flash",
  temperature: 0,
  configuration: {
    apiKey: "YOUR_API_KEY",
    baseURL: "https://gateway-api.mastra.ai/v1",
    defaultHeaders: {
      "x-thread-id": "my-thread-1",
      "x-resource-id": "user-42",
    },
  },
});

export const agent = createDeepAgent({
  model,
  tools: [getWeather],
  system: "You are a helpful assistant.",
});

// First request: introduce yourself
await agent.invoke({
  messages: [{ role: "user", content: "My name is Alex and I prefer concise answers." }],
});

// Second request: the gateway remembers
const result = await agent.invoke({
  messages: [{ role: "user", content: "What is my name?" }],
});

console.log(result.messages[result.messages.length - 1].content);
// "Alex"
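
Because defaultHeaders are fixed when the model is constructed, one common pattern is to build a model (and agent) per conversation. A minimal sketch, where memoryHeaders is an illustrative helper rather than part of the gateway SDK:

```typescript
// Illustrative helper: build the gateway memory headers for one conversation.
export function memoryHeaders(threadId: string, resourceId: string) {
  return {
    "x-thread-id": threadId,
    "x-resource-id": resourceId,
  };
}

// Usage sketch: construct a fresh model per conversation thread.
// new ChatOpenAI({
//   model: "google/gemini-2.5-flash",
//   configuration: {
//     apiKey: "YOUR_API_KEY",
//     baseURL: "https://gateway-api.mastra.ai/v1",
//     defaultHeaders: memoryHeaders("thread-123", "user-42"),
//   },
// });
```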

Subagents

Deep Agents supports spawning specialized subagents for context isolation. Define subagents as objects with a name, description, and their own tools.

src/agent.ts
import { createDeepAgent } from "deepagents";
import { ChatOpenAI } from "@langchain/openai";
import { tool } from "@langchain/core/tools";
import { z } from "zod";

const getWeather = tool(
  async ({ location }) => {
    return `The weather in ${location} is sunny, 72°F.`;
  },
  {
    name: "get_weather",
    description: "Get the current weather for a given location",
    schema: z.object({
      location: z.string().describe("The city to get weather for"),
    }),
  },
);

const model = new ChatOpenAI({
  model: "google/gemini-2.5-flash",
  temperature: 0,
  configuration: {
    apiKey: "YOUR_API_KEY",
    baseURL: "https://gateway-api.mastra.ai/v1",
  },
});

export const agent = createDeepAgent({
  model,
  tools: [getWeather],
  system: "You are a helpful assistant. Delegate weather research to the weather subagent.",
  subagents: [
    {
      name: "weather_researcher",
      description: "Investigate weather-related questions and summarize findings.",
      system: "You help with weather requests. Use available tools and return concise notes.",
      tools: [getWeather],
    },
  ],
});

Streaming

Stream responses for incremental output.

src/stream.ts
import { agent } from "./agent";

const stream = await agent.stream(
  { messages: [{ role: "user", content: "What is the weather in Paris?" }] },
  { streamMode: "messages" },
);

for await (const [message, _metadata] of stream) {
  if (message.content) {
    process.stdout.write(String(message.content));
  }
}
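
If you also want the complete response after streaming it to stdout, you can accumulate chunks as they arrive. A minimal sketch (collectChunks is illustrative; it assumes string content and skips anything else, though LangGraph chunks may also carry structured content):

```typescript
// Illustrative helper: join streamed message chunks into one string.
// Assumes string content; non-string content is skipped.
export function collectChunks(chunks: Array<{ content?: unknown }>): string {
  return chunks
    .map((chunk) => (typeof chunk.content === "string" ? chunk.content : ""))
    .join("");
}
```

In the loop above, push each message into an array as it streams, then call collectChunks once the stream ends to get the full text.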

Next steps

  • Features: Observational memory, streaming, BYOK, and gateway tools
  • Models: Supported providers and model routing
  • API reference: Complete endpoint documentation