LLMPing: LLM API status and debugging toolkit

Provider status guide

Is OpenRouter down?

Check official OpenRouter status and troubleshoot common API errors to work out whether the fault lies in your code, key, quota, model name, network, or the provider itself.

Official provider status may not reflect your specific account, region, model, quota, or network conditions. Browser-side tests may be limited by CORS. For reliable monitoring, use server-side checks.

Common errors

What usually breaks

401

Invalid or missing API key

Authentication failed before the request reached the model.

402

Insufficient credits

The account or routed provider does not have enough credits to complete the request.

403

Permission denied

The key is recognized but not allowed to use this model, endpoint, region, or project.

429

Rate limit or quota exceeded

Your request rate, token throughput, or quota exceeded the provider's limit.

502

Upstream provider error

A gateway or routing provider could not get a valid response from the model backend.

503

Unavailable

The service or model is temporarily unavailable.
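The codes above split into two groups: errors where a blind retry will never help (401, 402, 403) and errors that are often transient (429, 502, 503). A minimal sketch of that grouping, following this guide's descriptions rather than any official OpenRouter contract:

```python
# Rough classification of the status codes discussed above.
# The cause/retryable mapping reflects this guide, not an official
# specification; adjust it to your provider's documented behavior.
ERROR_GUIDE = {
    401: ("invalid or missing API key", False),
    402: ("insufficient credits", False),
    403: ("permission denied for this model, endpoint, region, or project", False),
    429: ("rate limit or quota exceeded", True),   # retry with backoff
    502: ("upstream provider error", True),
    503: ("service or model temporarily unavailable", True),
}

def classify(status: int) -> dict:
    """Return the likely cause and whether backoff-and-retry makes sense."""
    cause, retryable = ERROR_GUIDE.get(status, ("unknown error", False))
    return {"status": status, "cause": cause, "retryable": retryable}
```

A 401 should send you to key configuration, not a retry loop; a 502 or 503 is usually worth retrying with backoff.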

Troubleshooting checklist

Before calling it an OpenRouter outage

  1. Check the official provider status page.
  2. Confirm your API key is valid and belongs to the right project.
  3. Confirm your account has credits, quota, and model access.
  4. Confirm the model name and endpoint path are correct.
  5. Retry with exponential backoff instead of immediate loops.
  6. Test with cURL outside your app.
  7. Try another model or provider if production is impacted.
  8. Enable fallback routing before the next incident.
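The exponential backoff in step 5 can be sketched as a delay schedule with full jitter; the base and cap values here are illustrative assumptions, not provider recommendations:

```python
import random

def backoff_delays(attempts: int, base: float = 0.5, cap: float = 30.0):
    """Yield one delay per retry: exponential growth, capped, with full jitter."""
    for attempt in range(attempts):
        ceiling = min(cap, base * (2 ** attempt))
        # Full jitter: sleep a random amount up to the exponential ceiling,
        # which spreads retries from many clients out after an incident.
        yield random.uniform(0, ceiling)
```

Each retry would then `time.sleep(delay)` before re-issuing the request, instead of hammering the API in an immediate loop.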

Decision tree

Official status is green, but my OpenRouter API requests still fail

  1. Run a minimal cURL command with the same key and model outside your app.
  2. If cURL fails with 401 or 403, check key scope, project, account access, and model permissions.
  3. If cURL returns 429, check quota, billing, token-per-minute limits, and concurrency.
  4. If only browser requests fail, treat it as CORS or preflight until cURL proves otherwise.
  5. If one model fails but another works, check model availability or fallback routing.
  6. If production fails but local works, compare environment variables, outbound network rules, and region.
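The minimal cURL test in step 1 can be reproduced in Python with no app code in the way. The endpoint path, header names, and model string below are assumptions based on OpenRouter's OpenAI-compatible API; check the official docs before relying on them:

```python
import json
import urllib.request

def build_chat_request(api_key: str, model: str, prompt: str) -> urllib.request.Request:
    """Build the same request a minimal cURL test would send."""
    url = "https://openrouter.ai/api/v1/chat/completions"  # assumed endpoint path
    body = json.dumps({
        "model": model,
        "messages": [{"role": "user", "content": prompt}],
    }).encode("utf-8")
    return urllib.request.Request(
        url,
        data=body,
        headers={
            "Authorization": f"Bearer {api_key}",  # a 401 here means the key, not your code
            "Content-Type": "application/json",
        },
        method="POST",
    )

# Sending it (urllib raises HTTPError for 4xx/5xx, exposing the status code):
# resp = urllib.request.urlopen(build_chat_request("sk-or-...", "<model>", "ping"))
```

If this bare request fails with the same status as your app, the problem is the key, quota, model, or provider, not your application code.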

FAQ

OpenRouter API status questions

Is OpenRouter down right now?

LLMPing links to the official OpenRouter status source and shows troubleshooting guidance. If live fetch is unavailable, use the official status page for the latest provider-owned incident data.

How do I check OpenRouter API status?

Start with the official status page, then run a minimal cURL request outside your app to separate provider issues from key, quota, model, network, or code issues.

Why is my OpenRouter API request failing?

Common causes include invalid API keys, missing model access, rate limits, quota, wrong endpoint path, CORS, provider incidents, or upstream model capacity.

What should I do if OpenRouter returns 429?

Check billing and quota, reduce concurrency, add exponential backoff with jitter, lower token usage, and route urgent production traffic to a fallback model or provider.
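The fallback advice above can be sketched as a small router that retries each model a bounded number of times, then moves to the next. The callables and the error type are placeholders for your own client, not an OpenRouter API:

```python
class RateLimited(Exception):
    """Placeholder for your client's 429 error."""

def call_with_fallback(callers, prompt, retries_per_model=2):
    """Try each (name, call) pair in order; fall back after repeated failures."""
    last_error = None
    for name, call in callers:
        for _ in range(retries_per_model):
            try:
                return name, call(prompt)
            except RateLimited as err:
                last_error = err  # a real router would back off here before retrying
    raise RuntimeError("all models failed") from last_error
```

A rate-limited primary model then degrades to a fallback instead of failing user requests outright.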

How can I monitor OpenRouter API outages?

The current MVP provides manual debugging tools. Join the early access list if you want server-side checks, Slack or email alerts, and historical latency reports.

Want alerts before users notice?

Join the early access list for server-side LLM API monitoring.

Phase 2 may add scheduled checks, email alerts, Slack alerts, and historical latency reports.

No backend yet. This records intent locally for validation.