Recommendations

  • Best accuracy: o3
  • Fastest: Llama 4 on Groq
  • Balanced (fast, cheap, and clever): gemini-2.5-flash or gpt-4.1-mini

OpenAI example

The o3 model is recommended for best performance.

from browser_use import Agent, ChatOpenAI

# Initialize the model
llm = ChatOpenAI(
    model="o3",
)

# Create agent with the model
agent = Agent(
    task="...", # Your task here
    llm=llm
)
Required environment variables:
.env
OPENAI_API_KEY=
You can use any OpenAI-compatible model by passing its model name to the ChatOpenAI class together with a custom URL (or any other parameter that would go into the normal OpenAI API call), as sketched below.
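A minimal sketch of that route, assuming the client accepts a base_url parameter the same way the regular OpenAI client does; the endpoint and model name below are placeholders, not values from this guide:

from browser_use import Agent, ChatOpenAI

# Initialize the model against an OpenAI-compatible endpoint
# (base_url and model are placeholders for your own server and model)
llm = ChatOpenAI(
    model="your-model-name",
    base_url="https://your-openai-compatible-endpoint/v1",
)

# Create agent with the model
agent = Agent(
    task="...", # Your task here
    llm=llm
)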

Anthropic example

from browser_use import Agent, ChatAnthropic

# Initialize the model
llm = ChatAnthropic(
    model="claude-sonnet-4-0",
)

# Create agent with the model
agent = Agent(
    task="...", # Your task here
    llm=llm
)
Required environment variables:
.env
ANTHROPIC_API_KEY=

Azure OpenAI example

from browser_use import Agent, ChatAzureOpenAI

# Initialize the model
llm = ChatAzureOpenAI(
    model="o4-mini",
)

# Create agent with the model
agent = Agent(
    task="...", # Your task here
    llm=llm
)
Required environment variables:
.env
AZURE_OPENAI_ENDPOINT=https://your-endpoint.openai.azure.com/
AZURE_OPENAI_API_KEY=

Gemini example

[!IMPORTANT] As of 2025-05, the environment variable is GOOGLE_API_KEY; GEMINI_API_KEY is the old name and is deprecated.
from browser_use import Agent, ChatGoogle
from dotenv import load_dotenv

# Read GOOGLE_API_KEY into env
load_dotenv()

# Initialize the model
llm = ChatGoogle(model='gemini-2.5-flash')

# Create agent with the model
agent = Agent(
    task="Your task here",
    llm=llm
)
Required environment variables:
.env
GOOGLE_API_KEY=

AWS Bedrock example

AWS Bedrock provides access to multiple model providers through a single API. We support both a general AWS Bedrock client and provider-specific convenience classes.

General AWS Bedrock (supports all providers)

from browser_use import Agent, ChatAWSBedrock

# Works with any Bedrock model (Anthropic, Meta, AI21, etc.)
llm = ChatAWSBedrock(
    model="anthropic.claude-3-5-sonnet-20240620-v1:0",  # or any Bedrock model
    aws_region="us-east-1",
)

# Create agent with the model
agent = Agent(
    task="Your task here",
    llm=llm
)

Anthropic Claude via AWS Bedrock (convenience class)

from browser_use import Agent, ChatAnthropicBedrock

# Anthropic-specific class with Claude defaults
llm = ChatAnthropicBedrock(
    model="anthropic.claude-3-5-sonnet-20240620-v1:0",
    aws_region="us-east-1",
)

# Create agent with the model
agent = Agent(
    task="Your task here",
    llm=llm
)

AWS Authentication

Required environment variables:
.env
AWS_ACCESS_KEY_ID=
AWS_SECRET_ACCESS_KEY=
AWS_DEFAULT_REGION=us-east-1
You can also use AWS profiles or IAM roles instead of environment variables. The implementation supports:
  • Environment variables (AWS_ACCESS_KEY_ID, AWS_SECRET_ACCESS_KEY, AWS_DEFAULT_REGION)
  • AWS profiles and credential files
  • IAM roles (when running on EC2)
  • Session tokens for temporary credentials
  • AWS SSO authentication (aws_sso_auth=True); see the sketch below
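For example, a minimal sketch of relying on AWS SSO instead of static keys (the model and region are the same placeholder values used above, and an active SSO session, e.g. from aws sso login, is assumed):

from browser_use import Agent, ChatAWSBedrock

# Authenticate through AWS SSO instead of AWS_ACCESS_KEY_ID / AWS_SECRET_ACCESS_KEY
llm = ChatAWSBedrock(
    model="anthropic.claude-3-5-sonnet-20240620-v1:0",
    aws_region="us-east-1",
    aws_sso_auth=True,
)

# Create agent with the model
agent = Agent(
    task="Your task here",
    llm=llm
)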

Groq example

from browser_use import Agent, ChatGroq

llm = ChatGroq(model="meta-llama/llama-4-maverick-17b-128e-instruct")

agent = Agent(
    task="Your task here",
    llm=llm
)
Required environment variables:
.env
GROQ_API_KEY=

Ollama

from browser_use import Agent, ChatOllama

llm = ChatOllama(model="llama3.1:8b")
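
Ollama runs models locally, so no API key is required; this assumes the Ollama server is running and the model has been pulled (for example, ollama pull llama3.1:8b). The agent is then created the same way as with the other providers:

# Create agent with the model
agent = Agent(
    task="Your task here",
    llm=llm
)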

LangChain

An example of how to use LangChain with Browser Use is available.

Other models (DeepSeek, Novita, X, Qwen…)

We support any other model that can be called via an OpenAI-compatible API, and we are open to PRs for more providers. Examples available:
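For instance, a hedged sketch of calling DeepSeek through the OpenAI-compatible route described in the OpenAI section; the base_url, model name, api_key parameter, and DEEPSEEK_API_KEY variable are assumptions based on DeepSeek's public API conventions, not values from this guide, so verify them against the provider's documentation:

import os
from browser_use import Agent, ChatOpenAI

# DeepSeek exposes an OpenAI-compatible API, so ChatOpenAI can be pointed at it.
# The endpoint, model name, and key handling below are assumptions; check the
# provider's documentation for current values.
llm = ChatOpenAI(
    model="deepseek-chat",
    base_url="https://api.deepseek.com/v1",
    api_key=os.environ["DEEPSEEK_API_KEY"],
)

# Create agent with the model
agent = Agent(
    task="Your task here",
    llm=llm
)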