POST /api/v1/simple-search

import aiohttp
import asyncio

async def search():
    # Request body: the search query plus the optional tuning parameters
    # documented below (max_websites, depth).
    payload = {
        "query": "latest developments in artificial intelligence",
        "max_websites": 5,
        "depth": 2
    }

    # Replace YOUR_API_KEY with your Browser Use API key.
    headers = {
        "Authorization": "Bearer YOUR_API_KEY",
        "Content-Type": "application/json"
    }

    # POST the request and return the parsed JSON response.
    async with aiohttp.ClientSession() as session:
        async with session.post(
            "https://api.browser-use.com/api/v1/simple-search",
            json=payload,
            headers=headers
        ) as response:
            return await response.json()

result = asyncio.run(search())
print(result)

Overview

Search and extract content from multiple websites in real time. The endpoint fetches live data by actually visiting each site rather than returning cached results. 💡 Complete working example: simple_search.py
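
The endpoint is a plain HTTPS POST, so any HTTP client can call it. Below is a minimal synchronous sketch using the requests library, as an alternative to the aiohttp example above; the raise_for_status call assumes failures surface as standard HTTP status codes, which this page does not specify.

import requests

payload = {
    "query": "latest developments in artificial intelligence",
    "max_websites": 5,
    "depth": 2
}
headers = {
    "Authorization": "Bearer YOUR_API_KEY",  # placeholder, use your real key
    "Content-Type": "application/json"
}

# Same endpoint and body as the async example above, sent synchronously.
response = requests.post(
    "https://api.browser-use.com/api/v1/simple-search",
    json=payload,
    headers=headers,
)
response.raise_for_status()  # assumption: failures surface as HTTP error codes
print(response.json())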

Request

query (string, required)
The search query to process.

max_websites (integer, default: 5)
Maximum number of websites to process from the search results (1-10).

depth (integer, default: 2)
How deep to navigate within each website (2-5). Higher depth means more thorough exploration through multiple page clicks.
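
As a quick illustration of these ranges, the hypothetical helper below (build_payload is not part of the API or any SDK) checks the documented limits client-side before a request is sent; the server's own validation behavior is not described on this page.

def build_payload(query: str, max_websites: int = 5, depth: int = 2) -> dict:
    # Enforce the documented ranges locally: 1-10 websites, depth 2-5.
    if not 1 <= max_websites <= 10:
        raise ValueError("max_websites must be between 1 and 10")
    if not 2 <= depth <= 5:
        raise ValueError("depth must be between 2 and 5")
    return {"query": query, "max_websites": max_websites, "depth": depth}

payload = build_payload("latest developments in artificial intelligence",
                        max_websites=3, depth=3)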

Response

results (array)
Array of results from processed websites.
{
  "results": [
    {
      "url": "https://example1.com",
      "content": "Relevant content extracted from the first website..."
    },
    {
      "url": "https://example2.com", 
      "content": "Relevant content extracted from the second website..."
    }
  ]
}
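
Each entry in results pairs the URL that was visited with the content extracted from it. Continuing from the async example above (result holds the parsed response), the results can be consumed like this:

for item in result["results"]:
    print(item["url"])
    print(item["content"][:200])  # preview the first 200 characters of extracted text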

Pricing

Cost per request: 1 cent × depth × max_websites

Examples:
  • depth=2, max_websites=5 = 10 cents per request (default values)
  • depth=2, max_websites=3 = 6 cents per request
  • depth=3, max_websites=2 = 6 cents per request
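
The same formula expressed in code, as a small illustrative helper (estimate_cost_cents is not part of any SDK) that reproduces the examples above:

def estimate_cost_cents(depth: int = 2, max_websites: int = 5) -> int:
    # Cost per request: 1 cent per website at each depth level.
    return 1 * depth * max_websites

print(estimate_cost_cents())                         # 10 cents (default values)
print(estimate_cost_cents(depth=2, max_websites=3))  # 6 cents
print(estimate_cost_cents(depth=3, max_websites=2))  # 6 cents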