
Log Analytics API

Log Analytics Data via API

Stream or batch access to crawl data, bot activity, and log file analysis.

Streaming: real-time feed
24 bot types tracked
Batch: bulk export
Webhooks: alert triggers

Core Endpoints

Log Analytics Endpoints

Query crawl activity, bot behavior, status codes, and log file summaries through a consistent REST interface.

GET /v5/logs/crawls

Retrieve crawl activity by bot, date range, URL pattern, or status code. Returns hit counts, byte volumes, response times, and crawl budget consumption per bot per day.
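A minimal sketch of building a request against this endpoint from Python. The host and the query parameter names (`bot`, `from`, `to`, `status`) are illustrative assumptions, not confirmed by the API reference; check your account's documentation for the exact schema.

```python
from urllib.parse import urlencode
from urllib.request import Request

BASE_URL = "https://api.example.com"  # hypothetical host; use your account's API host


def crawls_request(token, bot=None, date_from=None, date_to=None, status=None):
    """Build a GET request for /v5/logs/crawls.

    Parameter names (bot, from, to, status) are assumptions made for this
    example, not the documented schema.
    """
    params = {k: v for k, v in {
        "bot": bot, "from": date_from, "to": date_to, "status": status,
    }.items() if v is not None}
    url = f"{BASE_URL}/v5/logs/crawls?{urlencode(params)}"
    return Request(url, headers={"Authorization": f"Bearer {token}"})


req = crawls_request("TOKEN", bot="googlebot",
                     date_from="2024-05-01", date_to="2024-05-31")
# Send with urllib.request.urlopen(req) and parse the JSON body.
```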

GET /v5/logs/bots

List all detected bot user agents with classification data. Includes Googlebot, Bingbot, GPTBot, ClaudeBot, and 20+ others. Filter by category: search engine, AI crawler, monitoring, or unknown.

GET /v5/logs/status-codes

Aggregate status code distribution across your site. Break down by 2xx, 3xx, 4xx, and 5xx families. Filter by URL path, bot type, or date range to isolate issues.
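The family bucketing the endpoint performs server-side can be sketched in a few lines; this reproduces the 2xx/3xx/4xx/5xx grouping locally on raw (url, status) pairs:

```python
from collections import Counter


def status_family(code):
    """Map an HTTP status code to its family bucket (2xx, 3xx, 4xx, 5xx)."""
    return f"{code // 100}xx"


def family_distribution(hits):
    """Aggregate raw (url, status) log hits into per-family counts,
    mirroring the roll-up /v5/logs/status-codes returns server-side."""
    return Counter(status_family(code) for _, code in hits)


sample = [("/a", 200), ("/a", 301), ("/b", 404), ("/c", 503), ("/c", 200)]
```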

GET /v5/logs/summaries

Daily or hourly roll-ups of crawl activity. Total hits, unique URLs crawled, average response time, error rates, and bandwidth consumption. Useful for dashboards and trend analysis.
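A sketch of deriving an error-rate trend from the roll-ups; the field names (`total_hits`, `errors`) are assumptions about the payload shape, not the documented response:

```python
def error_rate(summary):
    """Share of hits that were errors in one roll-up record.

    Field names (total_hits, errors) are assumptions made for this example.
    """
    if summary["total_hits"] == 0:
        return 0.0
    return summary["errors"] / summary["total_hits"]


day = {"date": "2024-05-01", "total_hits": 12500, "errors": 250}
```

Feeding a series of daily records through `error_rate` gives the trend line a dashboard would plot.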

GET /v5/logs/stream

Server-sent events stream for real-time log data. Subscribe to filtered event streams by bot type, status code, or URL pattern. Useful for live monitoring dashboards and alerting systems.
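Consuming a server-sent events stream means splitting the text feed into `data:` frames. This sketch assumes each event carries one `data:` line with a JSON body, which is the common SSE shape; the real event layout may differ:

```python
import json


def parse_sse(raw):
    """Extract JSON payloads from a server-sent-events text chunk.

    Assumes one `data:` line per event with a JSON body (an assumption
    about this feed, not a documented guarantee).
    """
    events = []
    for line in raw.splitlines():
        if line.startswith("data: "):
            events.append(json.loads(line[len("data: "):]))
    return events


chunk = ('data: {"bot": "googlebot", "status": 200}\n\n'
         'data: {"bot": "gptbot", "status": 404}\n\n')
```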

POST /v5/logs/webhooks

Register webhook endpoints for crawl anomaly alerts. Get notified when error rates spike, crawl budget drops, new bots appear, or specific URL patterns see unusual activity.
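A sketch of the JSON body such a registration call might take. The field names (`url`, `events`, `threshold`) and event identifiers are invented for illustration; consult the API reference for the actual schema:

```python
import json


def webhook_payload(url, events, threshold=None):
    """Request body for POST /v5/logs/webhooks.

    Field names and event identifiers are illustrative assumptions,
    not the documented schema.
    """
    body = {"url": url, "events": events}
    if threshold is not None:
        body["threshold"] = threshold
    return json.dumps(body)


payload = webhook_payload("https://ops.example.com/hooks/crawl",
                          ["error_rate_spike", "new_bot"], threshold=0.05)
```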


Capabilities

What You Can Build

Bot Classification

Every bot hit is classified by type: search engine crawler, AI training bot, monitoring service, SEO tool, or unknown. Build reports segmented by bot category.
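A category-segmented report reduces to a group-by over classified hits. Each hit here is assumed to look like `{"bot": ..., "category": ..., "count": n}`, an illustrative shape rather than the documented payload:

```python
from collections import defaultdict


def hits_by_category(hits):
    """Roll classified bot hits up into per-category totals.

    The hit shape ({"bot", "category", "count"}) is an assumption made
    for this example.
    """
    totals = defaultdict(int)
    for hit in hits:
        totals[hit["category"]] += hit["count"]
    return dict(totals)


sample_hits = [
    {"bot": "Googlebot", "category": "search engine", "count": 900},
    {"bot": "GPTBot", "category": "AI crawler", "count": 300},
    {"bot": "ClaudeBot", "category": "AI crawler", "count": 150},
]
```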

Streaming Access

The streaming endpoint delivers log events in real time via server-sent events. Build live dashboards that update as crawls happen, without polling.

Anomaly Alerts

Configure webhook alerts for crawl anomalies: sudden drops in Googlebot activity, spikes in 5xx errors, new unknown bots, or unusual crawl patterns on specific URL segments.
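When a webhook receiver handles these alerts, it should verify that payloads really came from the platform. This sketch assumes payloads are signed with HMAC-SHA256 and the hex digest arrives in a header; that is a common webhook pattern, not a documented guarantee of this API:

```python
import hashlib
import hmac


def verify_signature(secret, body, signature):
    """Check a webhook payload signature.

    Assumes HMAC-SHA256 over the raw body with a shared secret; verify
    the actual signing scheme against the platform's docs.
    """
    expected = hmac.new(secret.encode(), body, hashlib.sha256).hexdigest()
    return hmac.compare_digest(expected, signature)


secret = "whsec_demo"  # hypothetical shared secret
body = b'{"event": "error_rate_spike", "rate": 0.12}'
sig = hmac.new(secret.encode(), body, hashlib.sha256).hexdigest()
```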

Batch Export

For large-scale analysis, request batch exports of historical log data. Delivered as compressed CSV or JSON files, or written directly to BigQuery for warehouse integration.
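Reading a compressed CSV export is a few lines of standard library. The column names below are invented for the example; actual export columns will differ:

```python
import csv
import gzip
import io


def read_export(blob):
    """Parse a gzip-compressed CSV export into a list of dict rows.

    Column names in the sample are illustrative, not the real export schema.
    """
    with gzip.open(io.BytesIO(blob), mode="rt", newline="") as fh:
        return list(csv.DictReader(fh))


# Simulate a small export file in memory.
raw = "url,bot,hits\n/,googlebot,42\n/docs,gptbot,7\n"
blob = gzip.compress(raw.encode())
```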


Use Cases

How Teams Use the Log API

Crawl Budget Monitoring

Track how search engine crawlers allocate their budget across your site. Identify sections that are over-crawled or under-crawled, and correlate crawl patterns with indexation changes.

AI Bot Tracking

Monitor GPTBot, ClaudeBot, and other AI crawlers hitting your site. Understand which pages they access most, how frequently they visit, and whether your robots.txt directives are being respected.
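Checking whether observed AI-bot hits honor your robots.txt can be done with the standard library's robots parser: compare each logged (agent, path) hit against what the file actually permits.

```python
from urllib.robotparser import RobotFileParser


def allowed(robots_txt, agent, path):
    """Return True if robots.txt permits `agent` to fetch `path`.

    Logged hits to paths where this returns False indicate a bot
    ignoring your directives.
    """
    parser = RobotFileParser()
    parser.parse(robots_txt.splitlines())
    return parser.can_fetch(agent, path)


robots = ("User-agent: GPTBot\n"
          "Disallow: /private/\n"
          "\n"
          "User-agent: *\n"
          "Disallow:\n")
```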

Infrastructure Alerting

Use webhook alerts to detect server errors before they impact indexation. Get notified when 5xx rates exceed thresholds, when response times degrade, or when critical pages start returning errors.

Custom Reporting

Pull log data into your own BI tools for custom analysis. Combine crawl data with ranking and traffic metrics to build a complete picture of how crawl behavior impacts search performance.
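The core of that combined picture is a join on URL between crawl data and traffic metrics, the kind of merge a BI pipeline would run. Field names here are illustrative assumptions:

```python
def join_on_url(crawl_rows, traffic_rows):
    """Left-join crawl hits with traffic metrics on URL.

    Row shapes ({"url", "crawl_hits"} and {"url", "sessions"}) are
    assumptions made for this example.
    """
    traffic = {row["url"]: row for row in traffic_rows}
    return [
        {**row, "sessions": traffic.get(row["url"], {}).get("sessions", 0)}
        for row in crawl_rows
    ]


crawls = [{"url": "/", "crawl_hits": 120}, {"url": "/blog", "crawl_hits": 30}]
traffic = [{"url": "/", "sessions": 5000}]
```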


FAQ

Common Questions

How do I get my log data into Analytics AX?

Analytics AX supports several ingestion methods: direct log file upload, syslog forwarding, and CDN integrations with Cloudflare, Fastly, and AWS CloudFront. Once ingestion is configured, data appears in the API within minutes.

Which bots does the API classify?

The API classifies 24+ bot types out of the box, including Googlebot, Bingbot, GPTBot, ClaudeBot, PerplexityBot, AppleBot, and others. Unknown user agents are flagged for manual review. Custom bot rules can be added through the platform dashboard.

Can I get real-time access to log data?

Yes. The /v5/logs/stream endpoint provides a server-sent events stream with sub-second latency. Filter the stream by bot type, status code, or URL pattern to reduce noise. This is the same data feed that powers the live dashboard in the Analytics AX UI.

How long is log data retained?

Log data is retained for 13 months by default, with extended retention available on Enterprise plans. Historical data can be exported to BigQuery or S3 for long-term storage at any time via the Data Export API.

Get started

See it with your own data.

30-minute demo. We'll run it on your domain; no prep required.

Get a demo
View pricing