Search API
Simple Search
Search Google and extract relevant content from multiple top results
POST /api/v1/simple-search
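The request body takes the three fields shown in the example request on this page (`query`, `max_websites`, `depth`). A minimal helper for assembling it is sketched below — the field names and defaults come from that example; the validation limits are assumptions, as any server-side constraints are not documented here:

```python
# Sketch: building the simple-search request body.
# Field names ("query", "max_websites", "depth") and defaults are taken
# from the example request on this page; the validation rules below are
# assumptions, not documented server-side limits.

def build_payload(query: str, max_websites: int = 5, depth: int = 2) -> dict:
    """Assemble the JSON body for POST /api/v1/simple-search."""
    if not query.strip():
        raise ValueError("query must be a non-empty string")
    if max_websites < 1 or depth < 1:
        raise ValueError("max_websites and depth must be positive")
    return {"query": query, "max_websites": max_websites, "depth": depth}

payload = build_payload("latest developments in artificial intelligence")
print(payload)
```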
```python
import aiohttp
import asyncio

async def search():
    payload = {
        "query": "latest developments in artificial intelligence",
        "max_websites": 5,
        "depth": 2
    }
    headers = {
        "Authorization": "Bearer YOUR_API_KEY",
        "Content-Type": "application/json"
    }
    async with aiohttp.ClientSession() as session:
        async with session.post(
            "https://api.browser-use.com/api/v1/simple-search",
            json=payload,
            headers=headers
        ) as response:
            return await response.json()

result = asyncio.run(search())
print(result)
```
```json
{
  "results": [
    {
      "url": "https://example1.com",
      "content": "Relevant content extracted from the first website..."
    },
    {
      "url": "https://example2.com",
      "content": "Relevant content extracted from the second website..."
    }
  ]
}
```
Search and extract content from multiple websites in real-time. The API gets live data by actually visiting the sites, rather than returning cached results.

💡 Complete working example: simple_search.py
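Once the response comes back, each entry in `results` pairs a source URL with the content extracted from it. A small sketch of consuming that structure — the field names (`results`, `url`, `content`) match the example response on this page; adjust if your response schema differs:

```python
# Sketch: consuming the "results" list from a simple-search response.
# Field names ("results", "url", "content") match the example response
# shown on this page; they are assumptions if your schema differs.

def summarize_results(response: dict, preview_chars: int = 60) -> list[str]:
    """Return one 'url: content-preview' line per extracted website."""
    lines = []
    for item in response.get("results", []):
        preview = item["content"][:preview_chars]
        lines.append(f"{item['url']}: {preview}")
    return lines

sample = {
    "results": [
        {"url": "https://example1.com", "content": "Relevant content..."},
        {"url": "https://example2.com", "content": "More content..."},
    ]
}

for line in summarize_results(sample):
    print(line)
```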