Providers & Models

VoltAgent is built directly on top of the Vercel AI SDK. You can either:

  • Pass a LanguageModel from an ai-sdk provider package, or
  • Use a model string like openai/gpt-4o-mini and let VoltAgent resolve it with the built-in model router.

Both approaches are fully compatible with ai-sdk streaming, tool calling, and structured outputs. For the router, VoltAgent ships with a registry snapshot generated from models.dev.

Model Strings (Model Router)

Model strings remove the need to import provider packages in your app:

import { Agent } from "@voltagent/core";

const agent = new Agent({
  name: "my-agent",
  instructions: "You are a helpful assistant",
  model: "openai/gpt-4o-mini",
});

Other examples:

const claudeAgent = new Agent({
  name: "claude-agent",
  instructions: "Answer with concise reasoning",
  model: "anthropic/claude-3-5-sonnet",
});

const geminiAgent = new Agent({
  name: "gemini-agent",
  instructions: "Respond in Turkish",
  model: "google/gemini-2.0-flash",
});

If you need provider-specific configuration or want to use the ai-sdk APIs directly, pass a LanguageModel instead. See Model Router & Registry for how strings are resolved and how env vars are mapped. For the full provider directory, see Models.
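For example, a minimal sketch of provider-specific configuration using createOpenAI from @ai-sdk/openai (the proxy base URL below is a placeholder for illustration, not something VoltAgent requires):

import { Agent } from "@voltagent/core";
import { createOpenAI } from "@ai-sdk/openai";

// Configure the provider explicitly instead of relying on defaults.
// The base URL here is a hypothetical proxy endpoint.
const openai = createOpenAI({
  apiKey: process.env.OPENAI_API_KEY,
  baseURL: "https://my-proxy.example.com/v1",
});

const agent = new Agent({
  name: "configured-agent",
  instructions: "You are a helpful assistant",
  model: openai("gpt-4o-mini"),
});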

Installation

Install the AI SDK base package (required):

npm install ai

If you plan to import ai-sdk providers directly (for embeddings or provider-specific helpers like openai.embedding(...)), install those packages too. If you only use model strings, you can skip them:

# For example, to use OpenAI:
npm install @ai-sdk/openai

# Or Anthropic:
npm install @ai-sdk/anthropic

# Or Google:
npm install @ai-sdk/google

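As a rough sketch of the embeddings case mentioned above, using the ai-sdk embed helper with a directly imported provider (the embedding model id is just an example):

import { embed } from "ai";
import { openai } from "@ai-sdk/openai";

// Requires @ai-sdk/openai to be installed and OPENAI_API_KEY to be set.
const { embedding } = await embed({
  model: openai.embedding("text-embedding-3-small"),
  value: "VoltAgent builds on the Vercel AI SDK",
});

console.log(embedding.length);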

Usage Examples

Model Strings

import { Agent } from "@voltagent/core";

const agent = new Agent({
  name: "my-agent",
  instructions: "You are a helpful assistant",
  model: "openai/gpt-4o-mini",
});

Direct ai-sdk Provider

import { Agent } from "@voltagent/core";
import { openai } from "@ai-sdk/openai";

const agent = new Agent({
  name: "my-agent",
  instructions: "You are a helpful assistant",
  model: openai("gpt-4o-mini"),
});
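Once an agent is constructed (with either approach), you call it the same way. A hedged sketch, assuming generateText and streamText helpers on Agent and a textStream async iterable; check the Agent API reference for the exact shapes:

// Single response
const result = await agent.generateText("What is the capital of France?");
console.log(result.text);

// Streaming response
const stream = await agent.streamText("Tell me a short story");
for await (const chunk of stream.textStream) {
  process.stdout.write(chunk);
}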

Available Providers

The lists below describe ai-sdk provider packages. Model strings are resolved through VoltAgent's registry and map to these providers under the hood.

First-Party AI SDK Providers

These providers are maintained by Vercel and offer the highest level of support and integration:

Foundation Models

| Provider | Package | Documentation | Key Models |
| --- | --- | --- | --- |
| xAI Grok | @ai-sdk/xai | Docs | grok-4, grok-3, grok-2-vision |
| OpenAI | @ai-sdk/openai | Docs | gpt-4.1, gpt-4o, o3, o1 |
| Anthropic | @ai-sdk/anthropic | Docs | claude-opus-4, claude-sonnet-4, claude-3.5 |
| Google Generative AI | @ai-sdk/google | Docs | gemini-2.0-flash, gemini-1.5-pro |
| Google Vertex | @ai-sdk/google-vertex | Docs | gemini models, claude models via Vertex |
| Mistral | @ai-sdk/mistral | Docs | mistral-large, pixtral-large, mistral-medium |

Cloud Platforms

| Provider | Package | Documentation | Description |
| --- | --- | --- | --- |
| Amazon Bedrock | @ai-sdk/amazon-bedrock | Docs | Access to various models via AWS |
| Azure OpenAI | @ai-sdk/azure | Docs | OpenAI models via Azure |
| Vercel | @ai-sdk/vercel | Docs | v0 model for code generation |

Specialized Providers

| Provider | Package | Documentation | Specialization |
| --- | --- | --- | --- |
| Groq | @ai-sdk/groq | Docs | Ultra-fast inference |
| Together.ai | @ai-sdk/togetherai | Docs | Open-source models |
| Cohere | @ai-sdk/cohere | Docs | Enterprise search & generation |
| Fireworks | @ai-sdk/fireworks | Docs | Fast open-source models |
| DeepInfra | @ai-sdk/deepinfra | Docs | Affordable inference |
| DeepSeek | @ai-sdk/deepseek | Docs | DeepSeek models including reasoner |
| Cerebras | @ai-sdk/cerebras | Docs | Fast Llama models |
| Perplexity | @ai-sdk/perplexity | Docs | Search-enhanced responses |

Audio & Speech Providers

| Provider | Package | Documentation | Specialization |
| --- | --- | --- | --- |
| ElevenLabs | @ai-sdk/elevenlabs | Docs | Text-to-speech |
| LMNT | @ai-sdk/lmnt | Docs | Voice synthesis |
| Hume | @ai-sdk/hume | Docs | Emotional intelligence |
| Rev.ai | @ai-sdk/revai | Docs | Speech recognition |
| Deepgram | @ai-sdk/deepgram | Docs | Speech-to-text |
| Gladia | @ai-sdk/gladia | Docs | Audio intelligence |
| AssemblyAI | @ai-sdk/assemblyai | Docs | Speech recognition & understanding |

Community Providers

These providers are created and maintained by the open-source community:

| Provider | Package | Documentation | Description |
| --- | --- | --- | --- |
| Ollama | ollama-ai-provider | Docs | Local model execution |
| FriendliAI | @friendliai/ai-provider | Docs | Optimized inference |
| Portkey | @portkey-ai/vercel-provider | Docs | LLM gateway & observability |
| Cloudflare Workers AI | workers-ai-provider | Docs | Edge AI inference |
| OpenRouter | @openrouter/ai-sdk-provider | Docs | Unified API for multiple providers |
| Requesty | @requesty/ai-sdk | Docs | Request management |
| Crosshatch | @crosshatch/ai-provider | Docs | Specialized models |
| Mixedbread | mixedbread-ai-provider | Docs | Embedding models |
| Voyage AI | voyage-ai-provider | Docs | Embedding models |
| Mem0 | @mem0/vercel-ai-provider | Docs | Memory-enhanced AI |
| Letta | @letta-ai/vercel-ai-sdk-provider | Docs | Stateful agents |
| Spark | spark-ai-provider | Docs | Chinese language models |
| AnthropicVertex | anthropic-vertex-ai | Docs | Claude via Vertex AI |
| LangDB | @langdb/vercel-provider | Docs | Database-aware AI |
| Dify | dify-ai-provider | Docs | LLMOps platform |
| Sarvam | sarvam-ai-provider | Docs | Indian language models |
| Claude Code | ai-sdk-provider-claude-code | Docs | Code-optimized Claude |
| Built-in AI | built-in-ai | Docs | Browser-native AI |
| Gemini CLI | ai-sdk-provider-gemini-cli | Docs | CLI-based Gemini |
| A2A | a2a-ai-provider | Docs | Specialized models |
| SAP-AI | @mymediset/sap-ai-provider | Docs | SAP AI Core integration |

OpenAI-Compatible Providers

For providers that follow the OpenAI API specification:

| Provider | Documentation | Description |
| --- | --- | --- |
| LM Studio | Docs | Local model execution with GUI |
| Baseten | Docs | Model deployment platform |
| Any OpenAI-compatible API | Docs | Custom endpoints |
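If you want to wire up one of these endpoints yourself, a minimal sketch using the @ai-sdk/openai-compatible package (a separate install; the local LM Studio URL and model id below are placeholders):

import { Agent } from "@voltagent/core";
import { createOpenAICompatible } from "@ai-sdk/openai-compatible";

// Point the provider at a local OpenAI-compatible server, e.g. LM Studio.
const lmstudio = createOpenAICompatible({
  name: "lmstudio",
  baseURL: "http://localhost:1234/v1",
});

const agent = new Agent({
  name: "local-agent",
  instructions: "You are a helpful assistant",
  model: lmstudio("llama-3.1-8b"),
});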

Model Capabilities

The AI providers support different language models with varying capabilities (image input, object generation, tool usage, tool streaming). The table below lists popular models for each provider; see the provider documentation for the per-model capability matrix:

| Provider | Models |
| --- | --- |
| xAI Grok | grok-4, grok-3, grok-3-fast, grok-3-mini, grok-3-mini-fast, grok-2-1212, grok-2-vision-1212, grok-beta, grok-vision-beta |
| Vercel | v0-1.0-md |
| OpenAI | gpt-4.1, gpt-4.1-mini, gpt-4.1-nano, gpt-4o, gpt-4o-mini, gpt-4, o3-mini, o3, o4-mini, o1, o1-mini, o1-preview |
| Anthropic | claude-opus-4-20250514, claude-sonnet-4-20250514, claude-3-7-sonnet-20250219, claude-3-5-sonnet-20241022, claude-3-5-sonnet-20240620, claude-3-5-haiku-20241022 |
| Mistral | pixtral-large-latest, mistral-large-latest, mistral-medium-latest, mistral-medium-2505, mistral-small-latest, pixtral-12b-2409 |
| Google Generative AI | gemini-2.0-flash-exp, gemini-1.5-flash, gemini-1.5-pro |
| Google Vertex | gemini-2.0-flash-exp, gemini-1.5-flash, gemini-1.5-pro |
| DeepSeek | deepseek-chat, deepseek-reasoner |
| Cerebras | llama3.1-8b, llama3.1-70b, llama3.3-70b |
| Groq | meta-llama/llama-4-scout-17b-16e-instruct, llama-3.3-70b-versatile, llama-3.1-8b-instant, mixtral-8x7b-32768, gemma2-9b-it |

Note: This table is not exhaustive. Additional models can be found in the provider documentation pages and on the provider websites.
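If you rely on tool calling, pick a model that supports it and attach tools to the agent. A minimal sketch, assuming @voltagent/core's createTool helper and a tools option on Agent (the weather tool itself is hypothetical):

import { Agent, createTool } from "@voltagent/core";
import { z } from "zod";

// Hypothetical tool definition for illustration only.
const getWeather = createTool({
  name: "get_weather",
  description: "Return the current weather for a city",
  parameters: z.object({ city: z.string() }),
  execute: async ({ city }) => ({ city, forecast: "sunny" }),
});

const agent = new Agent({
  name: "weather-agent",
  instructions: "Answer weather questions using the get_weather tool",
  model: "openai/gpt-4o-mini",
  tools: [getWeather],
});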

Migration from Deprecated Providers

If you're currently using VoltAgent's native providers (@voltagent/anthropic-ai, @voltagent/google-ai, @voltagent/groq-ai), we recommend migrating to model strings or the Vercel AI SDK providers:

Before (Deprecated):

import { Agent } from "@voltagent/core";
import { AnthropicProvider } from "@voltagent/anthropic-ai";

const provider = new AnthropicProvider({ apiKey: "..." });
const agent = new Agent({
  llm: provider,
  model: "claude-opus-4-1",
});

After:

import { Agent } from "@voltagent/core";

const agent = new Agent({
  instructions: "You are a helpful assistant",
  model: "anthropic/claude-3-5-sonnet",
});

Environment Variables

Most providers use environment variables for API keys:

# OpenAI
OPENAI_API_KEY=your-key

# Anthropic
ANTHROPIC_API_KEY=your-key

# Google
GOOGLE_GENERATIVE_AI_API_KEY=your-key

# Groq
GROQ_API_KEY=your-key

# And so on...

OpenAI-compatible providers may also need a base URL. You can set a provider-specific override like:

OPENROUTER_BASE_URL=https://openrouter.ai/api/v1
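If you would rather not depend on the default environment variable at all, most ai-sdk provider factories accept the key explicitly. A short sketch with @ai-sdk/anthropic (the custom variable name is illustrative):

import { createAnthropic } from "@ai-sdk/anthropic";

// Read the key from a custom variable instead of ANTHROPIC_API_KEY.
const anthropic = createAnthropic({
  apiKey: process.env.MY_ANTHROPIC_KEY,
});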

Next Steps

  1. Choose a provider based on your needs (performance, cost, capabilities)
  2. Install the corresponding package if you plan to import the provider
  3. Configure your API keys
  4. Start building with VoltAgent!

For detailed information about each provider, visit the Vercel AI SDK documentation.


Acknowledgments

The provider lists and model capabilities in this documentation are sourced from the Vercel AI SDK documentation.

A special thanks to the Vercel AI SDK maintainers and community for creating and maintaining this comprehensive ecosystem of AI providers. Their work enables developers to seamlessly integrate with 30+ AI providers through a unified, well-designed interface.

VoltAgent builds upon this excellent foundation to provide a complete framework for building AI agents and workflows.
