LLM API
LLM Visibility Data via API
Access mention, citation, and sentiment data across all tracked AI platforms programmatically.
LLM Visibility Endpoints
Query mention, citation, sentiment, and response data across every tracked AI platform through a consistent REST interface.
/v5/llm/mentions
Retrieve brand mentions across ChatGPT, Perplexity, Gemini, and Copilot. Filter by platform, date range, prompt category, or sentiment score. Returns mention text, source URL, and confidence rating.
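As a quick sketch, a filtered mentions query might be assembled like this in Python. The base URL, query parameter names, and response handling below are illustrative assumptions, not documented values:

```python
import json
import urllib.parse
import urllib.request

BASE_URL = "https://api.example.com"  # hypothetical base URL; substitute your own


def build_mentions_url(base, platform=None, since=None, sentiment_min=None):
    """Assemble a filtered /v5/llm/mentions URL.

    The query parameter names here are illustrative assumptions.
    """
    filters = {
        "platform": platform,            # e.g. "chatgpt", "perplexity"
        "since": since,                  # ISO 8601 date, e.g. "2025-01-01"
        "sentiment_min": sentiment_min,  # lower bound on sentiment score
    }
    params = {k: v for k, v in filters.items() if v is not None}
    return f"{base}/v5/llm/mentions?{urllib.parse.urlencode(params)}"


def fetch_mentions(token, **filters):
    """Send the request with a bearer token and decode the JSON body."""
    url = build_mentions_url(BASE_URL, **filters)
    req = urllib.request.Request(url, headers={"Authorization": f"Bearer {token}"})
    with urllib.request.urlopen(req) as resp:
        return json.load(resp)
```

The same pattern applies to the other list endpoints; only the path and filter names change.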
/v5/llm/citations
Track when AI platforms cite your URLs in responses. Includes citation position, surrounding context, the prompt that triggered it, and the referring AI platform.
/v5/llm/responses
Access full AI response data for tracked prompts. Includes the raw response text, structured entities, brand references, competitor mentions, and metadata for each platform.
/v5/llm/sentiment
Sentiment analysis for every brand mention. Returns polarity score, magnitude, category (positive, neutral, negative), and trend data over configurable time windows.
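To show how those fields might be used locally, here is a small sketch that mirrors the polarity-to-category mapping and derives a simple trend from windowed scores. The threshold values and trend definition are assumptions for illustration; the API returns its own `category` and trend data:

```python
def categorize(polarity, neutral_band=0.1):
    """Map a polarity score to a category.

    The neutral band width is an illustrative assumption, not the API's rule.
    """
    if polarity > neutral_band:
        return "positive"
    if polarity < -neutral_band:
        return "negative"
    return "neutral"


def trend(scores):
    """Crude trend: mean of the later half minus mean of the earlier half."""
    half = len(scores) // 2
    earlier = sum(scores[:half]) / half
    later = sum(scores[-half:]) / half
    return later - earlier
```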
/v5/llm/brands
Manage tracked brands and competitors. List, add, or remove brands from monitoring. Returns current tracking status, mention counts, and share-of-voice summary per platform.
/v5/llm/webhooks
Register webhook endpoints for real-time push notifications. Get alerted when new mentions appear, sentiment shifts beyond thresholds, or citation patterns change.
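A minimal receiver for those push notifications could look like the sketch below, using only the Python standard library. The `X-Signature` header name and the HMAC-SHA256 signing scheme are assumptions for illustration; check your webhook configuration for the actual verification mechanism:

```python
import hashlib
import hmac
import json
from http.server import BaseHTTPRequestHandler, HTTPServer

SECRET = b"your-webhook-secret"  # hypothetical shared secret from webhook setup


def verify_signature(body: bytes, signature_hex: str, secret: bytes = SECRET) -> bool:
    """Check an HMAC-SHA256 signature over the raw request body.

    The signing scheme here is an assumption, not documented behavior.
    """
    expected = hmac.new(secret, body, hashlib.sha256).hexdigest()
    return hmac.compare_digest(expected, signature_hex)


class WebhookHandler(BaseHTTPRequestHandler):
    def do_POST(self):
        body = self.rfile.read(int(self.headers.get("Content-Length", 0)))
        sig = self.headers.get("X-Signature", "")  # hypothetical header name
        if not verify_signature(body, sig):
            self.send_response(401)
            self.end_headers()
            return
        event = json.loads(body)
        print("received event:", event.get("type"))  # assumed payload field
        self.send_response(200)
        self.end_headers()


# To run: HTTPServer(("", 8080), WebhookHandler).serve_forever()
```

Rejecting unverified payloads before parsing them keeps the endpoint safe to expose publicly.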
Built for AI Visibility Workflows
Cross-Platform Tracking
Query mentions and citations across ChatGPT, Perplexity, Gemini, Copilot, and other AI platforms from a single endpoint.
Real-Time Data
Data is available in the API as soon as it is collected. No waiting for batch processing or overnight syncs.
Structured JSON
Every response returns clean, documented JSON with consistent schemas. Pagination, filtering, and sorting are built into every list endpoint.
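Built-in pagination usually means walking a cursor until it runs out. The sketch below assumes a cursor-style page schema (`items` plus `next_cursor`); the actual field names may differ:

```python
def paginate(fetch_page, limit=100):
    """Yield every item across pages from a cursor-paginated list endpoint.

    `fetch_page(cursor, limit)` must return a dict shaped like
    {"items": [...], "next_cursor": "..." or None} -- an assumed schema.
    """
    cursor = None
    while True:
        page = fetch_page(cursor, limit)
        yield from page["items"]
        cursor = page.get("next_cursor")
        if not cursor:
            break
```

Because `fetch_page` is injected, the same loop works for mentions, citations, or responses.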
Webhook Alerts
Configure push notifications for mention spikes, sentiment changes, new citations, or competitive shifts. Deliver to Slack, email, or custom HTTP endpoints.
How Teams Use the LLM API
Competitive Intelligence Dashboards
Pull mention and sentiment data into internal dashboards to track how your brand compares to competitors across AI platforms. Combine with SERP data for a unified visibility picture.
Content Optimization Pipelines
Use citation data to identify which pages earn AI references and which do not. Feed results into content workflows to optimize pages for both traditional search and AI visibility.
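That cited-versus-uncited split can be computed locally once the citation records are pulled. This sketch assumes each record carries a `url` key, which is an illustrative shape rather than the documented schema:

```python
def citation_gaps(site_urls, citation_records):
    """Split a site's URLs into cited (with counts) and uncited.

    `citation_records` is assumed to be a list of dicts with a `url` key.
    """
    counts = {}
    for rec in citation_records:
        counts[rec["url"]] = counts.get(rec["url"], 0) + 1
    cited = {u: counts[u] for u in site_urls if u in counts}
    uncited = [u for u in site_urls if u not in counts]
    return cited, uncited
```

The `uncited` list is the optimization backlog: pages with search traffic but no AI references.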
Executive Reporting
Automate weekly or monthly AI visibility reports. Pull share-of-voice, mention trends, and sentiment summaries directly into slide decks or BI tools via the API.
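For a report, share-of-voice reduces to per-brand percentages of total mentions. A minimal sketch, assuming you have already aggregated mention counts per brand from the API:

```python
def share_of_voice(mention_counts):
    """Convert a {brand: mention_count} map to percentage share of voice."""
    total = sum(mention_counts.values())
    return {brand: round(100 * count / total, 1)
            for brand, count in mention_counts.items()}
```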
Real-Time Alerts
Set up webhook-driven alerts for brand mention spikes, negative sentiment shifts, or competitor surges. Route alerts to Slack channels or incident management tools.
Common Questions
Which AI platforms does the LLM API cover?
The LLM API returns data from ChatGPT, Perplexity, Gemini, Copilot, and other major AI platforms. Coverage expands as new platforms gain market share. All platforms are queryable from the same endpoints.
How fresh is the data?
Data is available in the API as soon as collection completes for each prompt cycle. For most prompts, this means data is refreshed within hours of collection. Webhooks can push notifications the moment new data lands.
Are there rate limits?
There are no artificial rate limits on the LLM API. All endpoints support pagination for large result sets. If you need bulk historical exports, use the Data Export API for more efficient delivery.
How is the API authenticated?
The LLM API uses OAuth 2.0 bearer tokens, consistent with all DemandSphere APIs. Generate tokens in the platform dashboard or via the authentication endpoint. Tokens can be scoped to specific data sets.
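A token fetch via the OAuth 2.0 client-credentials grant might be built as below. The `/oauth/token` path and the grant type are assumptions for illustration; use the authentication endpoint documented in your dashboard:

```python
import urllib.parse
import urllib.request


def build_token_request(base, client_id, client_secret):
    """Build an OAuth 2.0 client-credentials token request.

    The token endpoint path and grant type are illustrative assumptions.
    """
    data = urllib.parse.urlencode({
        "grant_type": "client_credentials",
        "client_id": client_id,
        "client_secret": client_secret,
    }).encode()
    return urllib.request.Request(
        f"{base}/oauth/token",
        data=data,
        headers={"Content-Type": "application/x-www-form-urlencoded"},
    )
```

The returned bearer token then goes in the `Authorization` header of every API call.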
See it with your own data.
30-minute demo. We'll run it on your domain. No prep required.