RobotScraping.com API
Production-grade extraction endpoints for synchronous and asynchronous scraping.
Quickstart
Base URL: https://api.robotscraping.com
Authenticate with the x-api-key header.
cURL

```shell
curl -X POST https://api.robotscraping.com/extract \
  -H "content-type: application/json" \
  -H "x-api-key: YOUR_KEY" \
  -d '{"url":"https://example.com","fields":["title","price"]}'
```

Python
```python
import requests

response = requests.post(
    "https://api.robotscraping.com/extract",
    headers={
        "Content-Type": "application/json",
        "x-api-key": "YOUR_KEY"
    },
    json={
        "url": "https://example.com",
        "fields": ["title", "price"]
    }
)
print(response.json())
```

Node.js
```javascript
const response = await fetch('https://api.robotscraping.com/extract', {
  method: 'POST',
  headers: {
    'Content-Type': 'application/json',
    'x-api-key': 'YOUR_KEY'
  },
  body: JSON.stringify({
    url: 'https://example.com',
    fields: ['title', 'price']
  })
});
const data = await response.json();
console.log(data);
```

POST /extract
Render a page, distill content, and extract structured JSON using an LLM.
Example request body:

```json
{
  "url": "https://example.com/product/123",
  "fields": ["product_name", "price"],
  "instructions": "Prefer visible price",
  "options": {
    "screenshot": false,
    "storeContent": true,
    "waitUntil": "domcontentloaded",
    "timeoutMs": 15000,
    "proxy": {
      "type": "proxy_grid",
      "country": "us"
    },
    "headers": {
      "User-Agent": "Mozilla/5.0...",
      "Accept-Language": "en-US"
    }
  }
}
```

Available proxy types: browser, proxy_grid, residential, datacenter.
Async mode
Set "async": true in the request body to enqueue the extraction as a job instead of waiting for the result, then poll the Jobs endpoints for status.
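A minimal stdlib-only sketch of submitting an async job. The "async" flag and the /extract payload match the examples above; the shape of the response (a job id) is an assumption.

```python
import json
import urllib.request

BASE_URL = "https://api.robotscraping.com"

def build_async_payload(url, fields):
    # Identical to a synchronous /extract payload, with async: true added.
    return {"url": url, "fields": fields, "async": True}

def submit_async_extract(api_key, url, fields):
    # POST /extract with async: true; the server enqueues a job instead of
    # blocking until extraction finishes.
    req = urllib.request.Request(
        BASE_URL + "/extract",
        data=json.dumps(build_async_payload(url, fields)).encode(),
        headers={"Content-Type": "application/json", "x-api-key": api_key},
        method="POST",
    )
    with urllib.request.urlopen(req) as resp:
        return json.loads(resp.read())  # assumed to contain the job id
```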
POST /batch
Process multiple URLs in a single request. Each URL creates an async job.
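A hedged sketch of a /batch call. The "urls" and "fields" keys in the request body, and the response listing the created job ids, are assumptions not confirmed by the docs above.

```python
import json
import urllib.request

BASE_URL = "https://api.robotscraping.com"

def build_batch_payload(urls, fields):
    # Hypothetical shape: one shared field list applied to every URL.
    return {"urls": urls, "fields": fields}

def submit_batch(api_key, urls, fields):
    # POST /batch; each URL in the list becomes its own async job.
    req = urllib.request.Request(
        BASE_URL + "/batch",
        data=json.dumps(build_batch_payload(urls, fields)).encode(),
        headers={"Content-Type": "application/json", "x-api-key": api_key},
        method="POST",
    )
    with urllib.request.urlopen(req) as resp:
        return json.loads(resp.read())  # assumed to list the created job ids
```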
Jobs
- GET /jobs — list recent jobs
- GET /jobs/:id — fetch job status
- GET /jobs/:id/result — download stored JSON
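The endpoints above can be combined into a simple polling loop. The paths come from the list above; the "status" field and its "completed"/"failed" values are assumptions.

```python
import json
import time
import urllib.request

BASE_URL = "https://api.robotscraping.com"

def job_status_path(job_id):
    return f"/jobs/{job_id}"

def job_result_path(job_id):
    return f"/jobs/{job_id}/result"

def get_json(api_key, path):
    # GET an endpoint with the x-api-key header and decode the JSON reply.
    req = urllib.request.Request(BASE_URL + path, headers={"x-api-key": api_key})
    with urllib.request.urlopen(req) as resp:
        return json.loads(resp.read())

def wait_for_result(api_key, job_id, poll_seconds=5, max_polls=60):
    # Poll GET /jobs/:id until the job leaves the queue, then download the
    # stored JSON from GET /jobs/:id/result.
    for _ in range(max_polls):
        job = get_json(api_key, job_status_path(job_id))
        if job.get("status") == "completed":  # status values are assumptions
            return get_json(api_key, job_result_path(job_id))
        if job.get("status") == "failed":
            raise RuntimeError(f"job {job_id} failed")
        time.sleep(poll_seconds)
    raise TimeoutError(f"job {job_id} did not finish in time")
```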
Schedules
- POST /schedules — create a cron schedule
- GET /schedules — list schedules for the API key
- PATCH /schedules/:id — pause/resume or update cron/webhook
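A sketch of creating a schedule via POST /schedules. The payload shape (a "cron" expression, the extract parameters, and an optional "webhook" URL) is an assumption inferred from the endpoint descriptions above.

```python
import json
import urllib.request

BASE_URL = "https://api.robotscraping.com"

def build_schedule_payload(cron, url, fields, webhook=None):
    # Hypothetical shape: a cron expression plus the extraction to run;
    # webhook, if given, would receive each run's result.
    payload = {"cron": cron, "url": url, "fields": fields}
    if webhook:
        payload["webhook"] = webhook
    return payload

def create_schedule(api_key, payload):
    # POST /schedules with the standard JSON + x-api-key headers.
    req = urllib.request.Request(
        BASE_URL + "/schedules",
        data=json.dumps(payload).encode(),
        headers={"Content-Type": "application/json", "x-api-key": api_key},
        method="POST",
    )
    with urllib.request.urlopen(req) as resp:
        return json.loads(resp.read())
```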
Usage
Use GET /usage to pull a summary, recent logs, and a daily series for the API key. Use GET /usage/export to download CSV exports.
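For example, the CSV export can be saved to disk with a plain authenticated GET (endpoint from the text above; no query parameters are assumed):

```python
import urllib.request

BASE_URL = "https://api.robotscraping.com"

def export_usage_csv(api_key, dest_path):
    # GET /usage/export and write the CSV body to a local file.
    req = urllib.request.Request(
        BASE_URL + "/usage/export", headers={"x-api-key": api_key}
    )
    with urllib.request.urlopen(req) as resp, open(dest_path, "wb") as out:
        out.write(resp.read())
```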
Webhooks
Webhook deliveries are signed with HMAC-SHA256. Verify each delivery by recomputing the signature over the raw request body and comparing it to the X-Robot-Signature header.
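A verification sketch using Python's hmac module. The header name and HMAC-SHA256 come from the text above; the hex encoding of the signature and the absence of any prefix in the header value are assumptions.

```python
import hashlib
import hmac

def verify_signature(secret, raw_body, signature_header):
    # Recompute HMAC-SHA256 over the raw request body (bytes, before any
    # JSON parsing) and compare it, in constant time, to the value of the
    # X-Robot-Signature header. Hex encoding is an assumption.
    expected = hmac.new(secret.encode(), raw_body, hashlib.sha256).hexdigest()
    return hmac.compare_digest(expected, signature_header)
```

Always compare with hmac.compare_digest rather than ==, so signature checks do not leak timing information.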
Error codes
- bad_request — invalid payload or missing fields/schema
- missing | invalid | inactive — API key errors
- insufficient_credits — out of credits
- blocked — target site blocked rendering
- server_error — extraction failed
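The codes above can drive a retry decision. This sketch assumes an "error" field in the response body carrying one of the listed codes; treating blocked and server_error as transient is a judgment call, not documented behavior.

```python
# Codes assumed worth retrying: the target may unblock, or the server recover.
RETRYABLE = {"blocked", "server_error"}

def classify_error(body):
    # Map a decoded response body to an action for the caller.
    code = body.get("error")  # the "error" field name is an assumption
    if code is None:
        return "ok"
    return "retry" if code in RETRYABLE else "fail"
```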