
CLI Reference

The Scraper.bot CLI lets you manage flows, trigger runs, and extract data directly from your terminal.

Installation

npm / Homebrew

```bash
npm install -g @scraper-bot/cli
# or
brew install scraper-bot
```

Authentication

```bash
scraper auth login
# Opens browser for OAuth login

scraper auth token scr_live_your_key_here
# Set API key directly
```

Commands Reference

| Command | Description |
| --- | --- |
| `scraper flows list` | List all flows |
| `scraper flows create --url URL --mode extract` | Create a flow |
| `scraper flows run FLOW_ID` | Trigger a flow run |
| `scraper flows export FLOW_ID > flow.json` | Export a flow as JSON |
| `scraper flows import flow.json` | Import a flow from JSON |
| `scraper runs list --flow FLOW_ID` | List runs for a flow |
| `scraper runs logs RUN_ID` | Stream run logs in real time |
| `scraper runs watch RUN_ID` | Watch a run with live progress |
| `scraper extract --url URL --query "product names and prices"` | One-shot extraction |
| `scraper templates list` | Browse templates |
| `scraper templates use TEMPLATE_ID` | Create a flow from a template |
| `scraper config set KEY VALUE` | Set a configuration value |
| `scraper status` | Check platform status |
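
The `export` and `import` commands pair naturally for backups. Below is a minimal sketch of exporting every flow to disk; it assumes (this shape is not guaranteed by the reference above) that `scraper flows list --format csv --quiet` prints one flow per line with the flow ID in the first column, so verify against your own output first:

```shell
# Sketch: export every flow to flow-backups/<id>.json.
# ASSUMPTION: `scraper flows list --format csv --quiet` emits one flow per
# line with the flow ID in the first column -- check your output shape.
backup_flows() {
  mkdir -p flow-backups
  scraper flows list --format csv --quiet |
    cut -d, -f1 |
    while read -r id; do
      scraper flows export "$id" > "flow-backups/$id.json"
    done
}
```

Run `backup_flows` in a shell where the CLI is already authenticated.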

Example Workflows

Quick Extraction

```bash
scraper extract --url "https://news.ycombinator.com" \
  --query "top 10 stories with title, points, and URL" \
  --format json
```
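
`--format json` output drops straight into standard tooling. The exact schema depends on your query, so the payload below is a hypothetical example used only to illustrate a post-processing step:

```shell
# Hypothetical sample of what `--format json` output might look like;
# the real field names depend on your query.
cat > stories.json <<'EOF'
[
  {"title": "Show HN: Example", "points": 128, "url": "https://example.com/a"},
  {"title": "Ask HN: Example", "points": 95, "url": "https://example.com/b"}
]
EOF

# Rank stories by points, highest first
python3 - <<'PY' > ranked.txt
import json

with open("stories.json") as f:
    stories = json.load(f)

for story in sorted(stories, key=lambda s: s["points"], reverse=True):
    print(story["points"], story["title"])
PY
cat ranked.txt
```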
Create and Schedule a Monitor

The `--schedule` flag takes a standard cron expression; `"0 */6 * * *"` runs the flow every six hours, on the hour.

```bash
scraper flows create \
  --name "Price Watch" \
  --url "https://amazon.com/dp/B09V3KXJPB" \
  --mode monitor \
  --schedule "0 */6 * * *"
```
Watch a Run in Real-time

```bash
scraper runs watch run_abc123
```

Global Options

| Flag | Description |
| --- | --- |
| `--format json\|table\|csv` | Output format (default: `table`) |
| `--quiet` | Suppress non-essential output |
| `--verbose` | Show debug information |
| `--api-key KEY` | Override the API key for this command |
| `--profile NAME` | Use a named profile |

Configuration

Configuration is stored in ~/.scraper-bot/config.json. Use named profiles for multiple environments:

```bash
scraper config set api-key scr_live_... --profile production
scraper config set api-key scr_test_... --profile staging

# Use a profile
scraper flows list --profile staging
```
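
The schema of `config.json` is not specified above; as a rough sketch, a profile-aware layout might look like the following (the field names here are assumptions for illustration, not documented behavior):

```json
{
  "default_profile": "production",
  "profiles": {
    "production": { "api-key": "scr_live_..." },
    "staging": { "api-key": "scr_test_..." }
  }
}
```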