Wraith Browser

Automation

Parallel browsing, playbook templates, deduplication tracking, workflows, and DAG orchestration

Automation tools handle multi-page and multi-step operations. Use swarm tools for local parallel browsing, playbooks for repeatable automation templates, deduplication tracking to avoid processing the same URL twice, workflows for recording and replaying browser sessions, and DAGs for complex task orchestration with dependencies.

Parallel Browsing

Local parallelism tools that visit multiple URLs concurrently using separate browser instances on your machine.

  • swarm_fan_out -- Visit multiple URLs in parallel and collect results. Key parameters: urls (required), max_concurrent
  • swarm_collect -- Collect results from a parallel browsing operation. Key parameters: (none)

Playbook Templates

Pre-authored automation templates for common multi-step flows. Playbooks are defined in YAML and support variables, conditionals, loops, and error handling.

  • swarm_run_playbook -- Execute a YAML playbook or built-in template. Key parameters: playbook (required), variables, url
  • swarm_list_playbooks -- List all available playbook templates. Key parameters: (none)
  • swarm_playbook_status -- Check progress of a running playbook. Key parameters: (none)

Built-in templates:

  • greenhouse-apply -- Greenhouse application flow
  • ashby-apply -- Ashby application flow
  • lever-apply -- Lever application flow
  • indeed-apply -- Indeed Easy Apply flow

Playbook step types include: Navigate, Click, Fill, Select, Wait, Extract, Verify, Screenshot, CustomDropdown, UploadFile, SubmitForm, EvalJs, Conditional (if_url_contains, if_variable), and Repeat (for-each loops). Each step can define error handling: Abort, Skip, Retry (with count/delay), or Screenshot.
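As a sketch of how these pieces might fit together in a playbook file (the YAML key names below are assumptions for illustration, not the exact schema -- only the step types and error-handling modes come from the list above):

```yaml
# Illustrative playbook sketch; key names are assumed, not taken from the schema.
name: example-apply
steps:
  - type: Navigate
    url: "{{ job_url }}"           # variable substitution
  - type: Fill
    selector: "#first_name"
    value: "{{ first_name }}"
    on_error: Retry                # retry with a count and delay
    retry_count: 2
    retry_delay_ms: 1000
  - type: Conditional
    if_url_contains: "/confirmation"
    then:
      - type: Screenshot
  - type: Verify
    selector: ".success-banner"
    on_error: Abort
```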

Deduplication & Verification

SQLite-backed tracking to avoid processing the same URL twice. Records submissions with metadata and verifies success.

  • swarm_dedup_check -- Check if a URL has already been processed. Key parameters: url (required)
  • swarm_dedup_record -- Record that a submission was completed. Key parameters: url (required), company, title, platform
  • swarm_dedup_stats -- Show aggregate deduplication statistics. Key parameters: (none)
  • swarm_verify_submission -- Check page for success/error indicators. Key parameters: (none)
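A typical check-then-record flow, sketched with illustrative values for the URL and metadata fields:

```json
{
  "tool": "swarm_dedup_check",
  "arguments": {
    "url": "https://example.com/jobs/12345"
  }
}
```

If the URL is new, process it, then record the submission:

```json
{
  "tool": "swarm_dedup_record",
  "arguments": {
    "url": "https://example.com/jobs/12345",
    "company": "Example Corp",
    "title": "Rust Developer",
    "platform": "greenhouse"
  }
}
```

swarm_verify_submission can be called first to confirm the page shows a success indicator before recording.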

Workflow Recording & Replay

Record multi-step browser sessions and replay them with variable substitution.

  • workflow_start_recording -- Begin recording a replayable workflow. Key parameters: name (required)
  • workflow_stop_recording -- Stop recording and save the workflow. Key parameters: description (required)
  • workflow_replay -- Replay a workflow with variable substitution. Key parameters: name (required), variables
  • workflow_list -- List all saved workflows. Key parameters: (none)

DAG Orchestration

Declarative task graphs with dependency management for complex parallel execution.

  • dag_create -- Create a new task DAG. Key parameters: name (required)
  • dag_add_task -- Add a task node. Key parameters: task_id (required), description (required), action_type (required), target
  • dag_add_dependency -- Add a dependency edge between tasks. Key parameters: task_id (required), depends_on (required)
  • dag_ready -- Get tasks ready to execute (all dependencies met). Key parameters: (none)
  • dag_complete -- Mark a task as completed. Key parameters: task_id (required), result (required)
  • dag_progress -- Show DAG completion progress. Key parameters: (none)
  • dag_visualize -- Generate a Mermaid diagram of the DAG. Key parameters: (none)
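As an illustrative sequence (the task IDs, action types, and URL are hypothetical, not part of the tool reference), a two-task graph could be built like this:

```json
{
  "tool": "dag_create",
  "arguments": { "name": "scrape-then-summarize" }
}
```

Add the two task nodes:

```json
{
  "tool": "dag_add_task",
  "arguments": {
    "task_id": "fetch",
    "description": "Fetch the listings page",
    "action_type": "navigate",
    "target": "https://example.com/listings"
  }
}
```

```json
{
  "tool": "dag_add_task",
  "arguments": {
    "task_id": "summarize",
    "description": "Summarize the fetched listings",
    "action_type": "extract"
  }
}
```

Then make summarize depend on fetch:

```json
{
  "tool": "dag_add_dependency",
  "arguments": { "task_id": "summarize", "depends_on": "fetch" }
}
```

From there, call dag_ready to see which tasks are runnable, dag_complete as each task finishes, and dag_progress or dag_visualize to inspect the graph.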

Examples

Fan out across multiple pages

```json
{
  "tool": "swarm_fan_out",
  "arguments": {
    "urls": [
      "https://example.com/page-1",
      "https://example.com/page-2",
      "https://example.com/page-3",
      "https://example.com/page-4"
    ],
    "max_concurrent": 4
  }
}
```

This visits all URLs in parallel using separate local browser instances; use swarm_collect to gather the results. The max_concurrent parameter controls how many pages are fetched simultaneously.
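Once the fan-out finishes, the collection call takes no arguments:

```json
{
  "tool": "swarm_collect",
  "arguments": {}
}
```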

Run a playbook template

```json
{
  "tool": "swarm_run_playbook",
  "arguments": {
    "playbook": "greenhouse-apply",
    "variables": {
      "resume_path": "/home/user/resume.pdf",
      "first_name": "Jane",
      "last_name": "Doe",
      "email": "jane@example.com"
    },
    "url": "https://boards.greenhouse.io/company/jobs/12345"
  }
}
```

Executes the automation template with variable substitution. Check progress with swarm_playbook_status.
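The status check takes no arguments:

```json
{
  "tool": "swarm_playbook_status",
  "arguments": {}
}
```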

Record and replay a workflow

```json
{
  "tool": "workflow_start_recording",
  "arguments": {
    "name": "search-and-extract"
  }
}
```

All subsequent browse_* tool calls are captured as workflow steps. Stop recording with workflow_stop_recording (supplying the required description), then replay later with different variables:

```json
{
  "tool": "workflow_replay",
  "arguments": {
    "name": "search-and-extract",
    "variables": {
      "query": "Rust Developer",
      "location": "Remote"
    }
  }
}
```
