
LLM API

LLM Visibility Data via API

Access mention, citation, and sentiment data across all tracked AI platforms programmatically.

10+ AI platforms
REST JSON endpoints
Real-time data freshness
Webhook push notifications

Core Endpoints

LLM Visibility Endpoints

Query mention, citation, sentiment, and response data across every tracked AI platform through a consistent REST interface.

GET /v5/llm/mentions

Retrieve brand mentions across ChatGPT, Perplexity, Gemini, and Copilot. Filter by platform, date range, prompt category, or sentiment score. Returns mention text, source URL, and confidence rating.
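A filtered mentions query can be sketched as a URL-building helper. This is a minimal sketch, not official client code: the base host and the query-parameter names (`platform`, `from`, `to`, `category`, `sentiment_min`) are assumptions; check the endpoint reference for the exact names.

```python
from urllib.parse import urlencode

# Hypothetical API host — substitute the base URL from your account.
BASE = "https://api.demandsphere.com"

def mentions_url(platform=None, date_from=None, date_to=None,
                 category=None, sentiment_min=None):
    """Build a GET /v5/llm/mentions URL from optional filters.
    Parameter names here are illustrative assumptions."""
    filters = {
        "platform": platform,
        "from": date_from,
        "to": date_to,
        "category": category,
        "sentiment_min": sentiment_min,
    }
    # Keep only the filters the caller actually supplied.
    query = urlencode({k: v for k, v in filters.items() if v is not None})
    return f"{BASE}/v5/llm/mentions?{query}" if query else f"{BASE}/v5/llm/mentions"

url = mentions_url(platform="perplexity", date_from="2025-01-01",
                   sentiment_min=0.5)
```

Send the resulting URL with any HTTP client, adding your bearer token in the `Authorization` header.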

GET /v5/llm/citations

Track when AI platforms cite your URLs in responses. Includes citation position, surrounding context, the prompt that triggered it, and the referring AI platform.

GET /v5/llm/responses

Access full AI response data for tracked prompts. Includes the raw response text, structured entities, brand references, competitor mentions, and metadata for each platform.

GET /v5/llm/sentiment

Sentiment analysis for every brand mention. Returns polarity score, magnitude, category (positive, neutral, negative), and trend data over configurable time windows.
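The relationship between the polarity score and the three categories can be sketched as below. The 0.25 cutoff is an illustrative assumption, not the API's actual boundary; the endpoint computes the category server-side.

```python
def sentiment_category(polarity, threshold=0.25):
    """Map a polarity score in [-1, 1] to positive/neutral/negative.
    The threshold value is illustrative only."""
    if polarity >= threshold:
        return "positive"
    if polarity <= -threshold:
        return "negative"
    return "neutral"

def window_trend(scores):
    """Average polarity across one configurable time window of mentions."""
    return sum(scores) / len(scores) if scores else 0.0
```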

GET /v5/llm/brands

Manage tracked brands and competitors. List, add, or remove brands from monitoring. Returns current tracking status, mention counts, and share-of-voice summary per platform.

POST /v5/llm/webhooks

Register webhook endpoints for real-time push notifications. Get alerted when new mentions appear, sentiment shifts beyond thresholds, or citation patterns change.
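A registration request body might look like the sketch below. The event names and threshold field are hypothetical placeholders; the webhook documentation defines the real schema.

```python
import json

# Hypothetical registration payload for POST /v5/llm/webhooks.
# Event names and the threshold field are illustrative assumptions.
payload = {
    "url": "https://hooks.example.com/llm-visibility",
    "events": [
        "mention.created",        # new brand mention detected
        "sentiment.threshold",    # sentiment crossed a configured bound
        "citation.changed",       # citation pattern shifted
    ],
    "sentiment_threshold": -0.5,  # alert when polarity drops below this
}
body = json.dumps(payload)        # send as the POST request body
```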


Capabilities

Built for AI Visibility Workflows

Cross-Platform Tracking

Query mentions and citations across ChatGPT, Perplexity, Gemini, Copilot, and other AI platforms from a single endpoint.

Real-Time Data

Data is available in the API as soon as it is collected. No waiting for batch processing or overnight syncs.

Structured JSON

Every response returns clean, documented JSON with consistent schemas. Pagination, filtering, and sorting are built into every list endpoint.
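Cursor-based pagination over a list endpoint can be walked with a small generator. This is a sketch under assumptions: the `data`, `next`, and `cursor` field names are placeholders for whatever the documented schema uses, and the stub fetcher stands in for a real HTTP client.

```python
def iter_items(fetch, path, params=None):
    """Yield every item across a paginated list endpoint.
    `fetch(path, params)` is any callable returning a decoded JSON page;
    the data/next/cursor field names are assumptions about the schema."""
    params = dict(params or {})
    while True:
        page = fetch(path, params)
        yield from page["data"]
        cursor = page.get("next")
        if not cursor:
            break
        params["cursor"] = cursor  # request the next page

# Demo with a stubbed fetcher in place of a real HTTP client.
_pages = [
    {"data": [1, 2], "next": "abc"},
    {"data": [3], "next": None},
]
def _stub(path, params):
    return _pages[1] if params.get("cursor") == "abc" else _pages[0]

items = list(iter_items(_stub, "/v5/llm/mentions"))
```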

Webhook Alerts

Configure push notifications for mention spikes, sentiment changes, new citations, or competitive shifts. Deliver to Slack, email, or custom HTTP endpoints.


Use Cases

How Teams Use the LLM API

Competitive Intelligence Dashboards

Pull mention and sentiment data into internal dashboards to track how your brand compares to competitors across AI platforms. Combine with SERP data for a unified visibility picture.

Content Optimization Pipelines

Use citation data to identify which pages earn AI references and which do not. Feed results into content workflows to optimize pages for both traditional search and AI visibility.

Executive Reporting

Automate weekly or monthly AI visibility reports. Pull share-of-voice, mention trends, and sentiment summaries directly into slide decks or BI tools via the API.

Real-Time Alerts

Set up webhook-driven alerts for brand mention spikes, negative sentiment shifts, or competitor surges. Route alerts to Slack channels or incident management tools.


FAQ

Common Questions

Which AI platforms does the LLM API cover?

The LLM API returns data from ChatGPT, Perplexity, Gemini, Copilot, and other major AI platforms. Coverage expands as new platforms gain market share. All platforms are queryable from the same endpoints.

How fresh is the data?

Data is available in the API as soon as collection completes for each prompt cycle. For most prompts, data is refreshed within hours of collection. Webhooks can push notifications the moment new data lands.

Are there rate limits?

There are no artificial rate limits on the LLM API. All endpoints support pagination for large result sets. If you need bulk historical exports, use the Data Export API for more efficient delivery.

How is the API authenticated?

The LLM API uses OAuth 2.0 bearer tokens, consistent with all DemandSphere APIs. Generate tokens in the platform dashboard or via the authentication endpoint. Tokens can be scoped to specific data sets.
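The bearer-token scheme above follows the standard `Authorization` header format (RFC 6750). A minimal sketch, assuming you have already generated a token in the dashboard:

```python
def auth_headers(token):
    """Request headers for an OAuth 2.0 bearer token (RFC 6750 format).
    Token generation and scoping happen in the dashboard or via the
    authentication endpoint; this only formats the header."""
    return {
        "Authorization": f"Bearer {token}",
        "Accept": "application/json",
    }

headers = auth_headers("YOUR_TOKEN")
```

Attach these headers to every request against the `/v5/llm/` endpoints.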

Get started

See it with your own data.

30-minute demo. We'll run it on your domain; no prep required.

Get a demo · View pricing