---
title: "Agent Skeleton"
description: "Define the initial ToolLoopAgent in lib/agent.ts. Set the model, wire up empty tools and instructions, and export the agent so the API route can use it."
canonical_url: "https://vercel.com/academy/filesystem-agents/agent-skeleton"
md_url: "https://vercel.com/academy/filesystem-agents/agent-skeleton.md"
docset_id: "vercel-academy"
doc_version: "1.0"
last_updated: "2026-04-11T11:17:39.032Z"
content_type: "lesson"
course: "filesystem-agents"
course_title: "Building Filesystem Agents"
prerequisites: []
---

<agent-instructions>
Vercel Academy — structured learning, not reference docs.
Lessons are sequenced.
Adapt commands to the human's actual environment (OS, package manager, shell, editor) — detect from project context or ask, don't assume.
The lesson shows one path; if the human's project diverges, adapt concepts to their setup.
Preserve the learning goal over literal steps.
Quizzes are pedagogical — engage, don't spoil.
Quiz answers are included for your reference.
</agent-instructions>

# Agent Skeleton

A regular LLM call is one turn: you send a prompt, you get a response. An agent is a loop. The model generates a response, and if that response includes a tool call, the agent executes the tool, feeds the result back, and lets the model decide what to do next. This continues until the model produces a final text response or hits a step limit.

The AI SDK's `ToolLoopAgent` handles this loop for you. You give it a model, tools, and instructions. It manages the back-and-forth:

```
prompt → LLM → tool call → execute tool → result → LLM → tool call → ... → final text
```

For your filesystem agent, the loop looks like this: the user asks "did anyone mention pricing?", the model calls `bashTool` with `grep -r "pricing" calls/`, reads the output, maybe runs another command to get more context, and then synthesizes a final answer. Each tool call is one step in the loop. The default limit is 20 steps before the agent stops.
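The loop described above can be sketched in plain TypeScript. This is a toy simulation to build intuition, not the AI SDK's actual implementation: `fakeModel`, `runLoop`, and the stubbed `bashTool` output are all illustrative names invented for this sketch.

```ts
// Toy simulation of a tool-loop agent. The "model" is a stub that
// requests one tool call, then answers once it sees the tool result.
type ToolCall = { tool: string; args: string };
type ModelTurn = { text?: string; toolCall?: ToolCall };

// Stubbed model: call a tool first, then synthesize a final answer.
function fakeModel(history: string[]): ModelTurn {
  if (!history.some((m) => m.startsWith('tool-result:'))) {
    return { toolCall: { tool: 'bashTool', args: 'grep -r "pricing" calls/' } };
  }
  return { text: 'Yes, pricing came up in calls/acme.md.' };
}

// Stubbed tool registry; real tools would actually run the command.
const tools: Record<string, (args: string) => string> = {
  bashTool: () => 'calls/acme.md: "...asked about pricing tiers..."',
};

function runLoop(prompt: string, maxSteps = 20): string {
  const history = [`user:${prompt}`];
  for (let step = 0; step < maxSteps; step++) {
    const turn = fakeModel(history);
    if (turn.toolCall) {
      // Execute the tool, feed the result back, let the model decide again.
      const result = tools[turn.toolCall.tool](turn.toolCall.args);
      history.push(`tool-result:${result}`);
      continue;
    }
    return turn.text ?? ''; // final text ends the loop
  }
  return '(step limit reached)';
}
```

Each iteration is one "step"; `ToolLoopAgent` manages this same shape of loop for you, including the step limit.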

## Outcome

You have a minimal `ToolLoopAgent` exported from `lib/agent.ts` that compiles and responds (without tools) when called from the API route.

## Fast Track

1. Import `ToolLoopAgent` from `ai` in `lib/agent.ts`
2. Create and export a `ToolLoopAgent` with the model set to `anthropic/claude-opus-4.6`
3. Start the dev server and verify the app loads at `localhost:3000`

## How the API route uses the agent

The starter repo's API route is already wired up to import your agent and stream its responses:

```ts title="app/api/route.ts" {1,6}
import { agent } from '@/lib/agent';

export async function POST(request: Request) {
  // ... extract prompt from messages ...

  const stream = await agent.stream({ prompt });
  // ... stream response back to UI ...
}
```

The route calls `agent.stream()` with the user's message. The agent runs its loop, streaming text and tool calls back as they happen. Your job is to export an `agent` from `lib/agent.ts` that this route can use.

## Hands-on Exercise 1.2

Create the agent skeleton in `lib/agent.ts`.

**Requirements:**

1. Import `ToolLoopAgent` from the `ai` package
2. Set the model to `anthropic/claude-opus-4.6` (via [AI Gateway](https://vercel.com/ai-gateway/models))
3. Pass empty `instructions` and `tools` for now
4. Export the agent as a named export called `agent`

**Implementation hints:**

- The `ToolLoopAgent` constructor takes an object with `model`, `instructions`, and `tools`
- The model string `'anthropic/claude-opus-4.6'` routes through [AI Gateway](https://vercel.com/ai-gateway) automatically. No `createGateway()` call needed.
- Use a `MODEL` constant so you can easily swap models later
- The `tools` property takes an object. Pass `{}` for now.
- The API route expects `export const agent`, not a default export

## Try It

1. **Start the dev server:**
   ```bash
   pnpm dev
   ```

2. **Open `http://localhost:3000`** and type a question like "hello". The agent responds with plain text (no tools, no file access). That's expected.

3. **Check the terminal** for any compilation errors. If `lib/agent.ts` exports correctly, the app compiles without issues.

**Warning: Generic responses are expected**

Without tools or instructions, the agent is just a bare LLM. It can't explore files or answer questions about calls yet. You'll fix that in the next two lessons.

## Commit

```bash
git add lib/agent.ts
git commit -m "feat(agent): add ToolLoopAgent skeleton"
```

## Done-When

- [ ] `lib/agent.ts` exports a `ToolLoopAgent` instance named `agent`
- [ ] The dev server compiles without errors
- [ ] The chat UI loads and the agent responds to messages (generic responses are fine)

## Solution

```ts title="lib/agent.ts"
import { ToolLoopAgent } from 'ai';

const MODEL = 'anthropic/claude-opus-4.6';

export const agent = new ToolLoopAgent({
  model: MODEL,
  instructions: '',
  tools: {}
});
```


---

[Full course index](/academy/llms.txt) · [Sitemap](/academy/sitemap.md)
