
Data Export API

Export Search Data Anywhere

Bulk exports, scheduled deliveries, and webhook notifications, with output as CSV, JSON, or Parquet files or as direct BigQuery writes.

CSV/JSON: Export formats
Parquet: BigQuery native
Scheduled: Automated delivery
6+: Destinations

Core Endpoints

Export API Endpoints

Create, schedule, and manage bulk data exports across all DemandSphere data sets. Deliver to any destination.

POST

/v5/exports/create

Create a new export job. Specify the data set (SERP, LLM, logs), date range, filters, output format (CSV, JSON, Parquet), and destination. Returns a job ID for status polling.
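A minimal sketch of creating an export job from Python. The endpoint path comes from this page; the base URL, field names, and auth header are assumptions to illustrate the shape of the request, so check the API reference for exact names.

```python
import json
from urllib.request import Request, urlopen

API_BASE = "https://api.demandsphere.com"  # assumed base URL
API_KEY = "YOUR_API_KEY"

# Assumed request-body fields (dataset, date_range, filters, format,
# destination); the real schema may differ.
job_request = {
    "dataset": "serp",                  # serp, llm, or logs
    "date_range": {"start": "2025-01-01", "end": "2025-01-31"},
    "filters": {"country": "US"},
    "format": "parquet",                # csv, json, or parquet
    "destination": "bigquery",
}

def create_export(payload: dict) -> Request:
    """Build the POST request for /v5/exports/create."""
    return Request(
        f"{API_BASE}/v5/exports/create",
        data=json.dumps(payload).encode(),
        headers={
            "Authorization": f"Bearer {API_KEY}",
            "Content-Type": "application/json",
        },
        method="POST",
    )

req = create_export(job_request)
# job_id = json.load(urlopen(req))["job_id"]  # uncomment to send
```

The returned job ID is then used with the status and download endpoints below.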

GET

/v5/exports/{id}/status

Check the status of an export job. Returns progress percentage, estimated completion time, row count, and file size. Terminal states are completed, failed, or cancelled.
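The status endpoint is designed for polling until one of the terminal states is reached. A small polling helper might look like this; `fetch_status` is an injected function (for example, a wrapper around an HTTP GET of `/v5/exports/{id}/status`) so the loop itself stays transport-agnostic:

```python
import time

# Terminal states named in the status endpoint's documentation.
TERMINAL = {"completed", "failed", "cancelled"}

def poll_export(fetch_status, job_id, interval=5.0, max_polls=120):
    """Poll until the export job reaches a terminal state.

    fetch_status(job_id) should return the parsed JSON status body,
    assumed here to be a dict with at least a "status" key.
    """
    for _ in range(max_polls):
        body = fetch_status(job_id)
        if body["status"] in TERMINAL:
            return body
        time.sleep(interval)
    raise TimeoutError(f"export {job_id} did not finish in time")
```

In practice you would widen `interval` for large exports, since the response also includes a progress percentage and an estimated completion time.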

GET

/v5/exports/{id}/download

Download completed export files. Returns a signed URL valid for 24 hours. For large exports, files are split into chunks with a manifest file listing all parts.
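For chunked exports, the manifest lists every part; the exact manifest schema isn't documented here, so the shape below (a `parts` list with `index` and `url` fields) is an assumption. Reassembly is then a sort-and-concatenate:

```python
def reassemble(manifest, fetch):
    """Concatenate chunked export parts in manifest order.

    Assumed manifest shape: {"parts": [{"index": 0, "url": ...}, ...]}.
    fetch(url) returns the bytes of one part (e.g. via urllib and the
    signed URL, which is valid for 24 hours).
    """
    parts = sorted(manifest["parts"], key=lambda p: p["index"])
    return b"".join(fetch(p["url"]) for p in parts)
```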

POST

/v5/exports/schedule

Create recurring export schedules. Set frequency (daily, weekly, monthly), data filters, format, and destination. Scheduled exports run automatically and deliver to configured endpoints.
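A sketch of building the schedule request body. The frequency values come from this page; the field names and the `destination_id` reference are assumptions:

```python
VALID_FREQUENCIES = {"daily", "weekly", "monthly"}

def build_schedule(dataset, frequency, fmt, destination_id, filters=None):
    """Build an assumed request body for POST /v5/exports/schedule."""
    if frequency not in VALID_FREQUENCIES:
        raise ValueError(f"frequency must be one of {sorted(VALID_FREQUENCIES)}")
    return {
        "dataset": dataset,
        "frequency": frequency,
        "format": fmt,
        "destination_id": destination_id,
        "filters": filters or {},
    }

# A weekly CSV export of SERP data to a previously configured destination.
weekly_report = build_schedule("serp", "weekly", "csv", "dest_123")
```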

GET

/v5/exports/destinations

List and manage configured export destinations. Supported destinations include BigQuery, Amazon S3, Google Cloud Storage, Snowflake, SFTP servers, and webhook URLs.
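To illustrate what destination configuration might carry, here are sketched payloads for two destination types. The destination types and auth mechanisms (IAM role assumption, service accounts) come from this page; every field name is an assumption:

```python
# Assumed configuration shapes; consult the destinations API
# reference for the real field names.
s3_destination = {
    "type": "s3",
    "bucket": "my-export-bucket",
    "prefix": "demandsphere/serp/",            # custom prefix support
    "auth": {"role_arn": "arn:aws:iam::123456789012:role/ExportWriter"},
    "encryption": "aws:kms",                   # server-side encryption
}

bigquery_destination = {
    "type": "bigquery",
    "project": "my-gcp-project",
    "dataset": "search_exports",
    "auth": {"service_account_key": "..."},    # service account JSON
}
```

Credentials are validated when the destination is created, so misconfigured auth fails fast rather than at export time.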

POST

/v5/exports/webhooks

Register webhooks for export lifecycle events. Get notified when exports start, complete, or fail. Includes download URLs and row counts in the webhook payload.
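A receiving endpoint can dispatch on the event type and pull out the download URLs. The payload shape below is an assumption based on the fields this page says are included (lifecycle event, download URLs, row counts):

```python
import json

def handle_export_event(raw_body: bytes) -> list:
    """Parse an export lifecycle event and collect download URLs.

    Assumed payload: {"event": "export.completed",
                      "files": [{"url": ..., "rows": ...}, ...]}.
    """
    event = json.loads(raw_body)
    if event.get("event") != "export.completed":
        return []  # started / failed events carry no files
    return [f["url"] for f in event.get("files", [])]
```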


Destinations

Deliver Data Where You Need It

BigQuery

Direct writes to your BigQuery project. Schema-managed tables with automatic partitioning and deduplication.

Amazon S3

Deliver files to any S3 bucket. Supports IAM role assumption, custom prefixes, and server-side encryption.

Google Cloud Storage

Write directly to GCS buckets. Service account authentication with configurable path templates.

Snowflake

Load data into Snowflake tables via staging. Supports auto-schema detection and incremental loads.

SFTP

Deliver to any SFTP server with key-based or password authentication. Custom directory paths and file naming templates.

Webhook

Push export data to any HTTP endpoint. Configurable payload format, headers, and retry policies.


Capabilities

Export Any Data Set

Multiple Formats

Export in CSV for spreadsheets, JSON for applications, or Parquet for data warehouses. Each format is optimized for its use case with proper typing and compression.

Scheduled Delivery

Set up recurring exports on daily, weekly, or monthly cadences. Exports run automatically at configured times and deliver to your chosen destination without manual intervention.

All Data Sets

Export from any DemandSphere data set: SERP rankings, LLM mentions, citations, log analytics, crawl data, and keyword research. Apply filters, date ranges, and column selections.

Secure Delivery

All exports are encrypted in transit and at rest. Signed download URLs expire after 24 hours. Destination credentials are stored encrypted and never logged.


Use Cases

How Teams Use the Export API

Data Warehouse Integration

Schedule daily exports of SERP and LLM data directly to BigQuery or Snowflake. Join with your own business data for custom attribution models, revenue correlation, and executive dashboards.

Client Reporting

Agencies use scheduled exports to generate weekly client reports automatically. CSV exports feed into Google Sheets or Looker Studio templates. No manual data pulls required.

Machine Learning Pipelines

Export historical ranking and visibility data to train predictive models. Parquet format integrates directly with pandas, Spark, and other data science tools for feature engineering and analysis.

Backup and Compliance

Maintain your own copy of all collected data for compliance or backup purposes. Schedule monthly full exports to S3 or GCS with configurable retention policies.


FAQ

Common Questions

What data can I export?

You can export any data available in the DemandSphere platform: SERP rankings, keyword data, LLM mentions, citations, sentiment, log analytics, crawl data, and more. Each data set supports its own set of filters and column selections.

Is there a limit on export size?

There is no hard limit on export size. Large exports are automatically chunked into manageable files with a manifest for reassembly. For very large data sets, we recommend Parquet format with direct BigQuery or S3 delivery for best performance.

Can I schedule recurring exports?

Yes. Use the /v5/exports/schedule endpoint to create daily, weekly, or monthly export schedules. Each schedule specifies the data set, filters, format, and destination. Schedules can be paused, updated, or deleted at any time.

How do I configure destinations?

Destinations are configured through the platform dashboard or via the /v5/exports/destinations endpoint. Each destination type has its own authentication requirements: IAM roles for AWS, service accounts for GCP, key pairs for SFTP, and so on. Credentials are validated on creation.

Get started

See it with your own data.

30-minute demo. We'll run it on your domain, no prep required.

Get a demo
View pricing