LIVE
OPUS 4.7 · $15 / $75 per Mtok
SONNET 4.6 · $3 / $15 per Mtok
GPT-5.5 · $10 / $30 per Mtok
GEMINI 3.1 · $3.50 / $10.50 per Mtok
SWE-BENCH leader: Claude Opus 4.7 · 72.1%
MMLU-PRO leader: Opus 4.7 · 88.4
VALS FINANCE leader: Opus 4.7 · 64.4%
AFTA v1.0 whitepaper live at /whitepaper
All systems operational · AI providers monitored, polled every 2 minutes
Live status

Embed live AI status on your site

One line of HTML puts a live AI provider status board on your site: real-time operational state plus p95 latency for Claude, OpenAI, Gemini, Mistral, Cohere, and more, in a sci-fi Live Monitor console with a light-blue bridge accent. Free, no API key, no tracking. It is the same live data behind the TensorFeed status dashboard, surfaced for your pages.

Live preview (this is the real, deployed widget)

Loads live from /api/status/summary and /api/probe/latest. No API key, no tracking. Add ?poll=60 to slow the refresh.

Paste this anywhere in your HTML
<iframe
  src="https://tensorfeed.ai/widget/status"
  title="TensorFeed live monitor"
  width="100%"
  height="600"
  loading="lazy"
  style="border:0;max-width:720px"
></iframe>
Need a fully responsive embed? Use the aspect-ratio wrapper instead
Responsive aspect-ratio embed
<div style="position:relative;width:100%;max-width:720px;aspect-ratio:720/600">
  <iframe
    src="https://tensorfeed.ai/widget/status"
    title="TensorFeed live monitor"
    loading="lazy"
    style="position:absolute;inset:0;width:100%;height:100%;border:0"
  ></iframe>
</div>

What powers it

Live, honest data

Status from /api/status/summary, p95 latency over 24h from /api/probe/latest. No fabricated numbers.

Just the data?

The widget is a view on machine-first feeds. Hit the JSON directly or use the MCP server. See the developer docs.

Just a README badge?

Use the shields.io-style uptime badges: one line of markdown per provider for docs and READMEs.

Embed FAQ

Is the AI status widget free?

Yes. The widget is free to embed on any site with no API key, no signup, and no rate limit on the embed. It reads the same public endpoints the TensorFeed status dashboard uses. If you want the raw data programmatically, the developer API and MCP server are documented at tensorfeed.ai/developers.

How often does the widget update?

Operational status is refreshed roughly every two minutes from /api/status/summary. The latency figure is the p95 response time over the last 24 hours from /api/probe/latest. The widget re-polls every 120 seconds, so embedded copies stay current without hammering the edge.

Can I match the widget to my site theme?

The widget is a sci-fi Live Monitor console. The default accent is blue: a light-blue bridge spine against green status indicators, which preserves contrast and keeps the sci-fi array look. Set ?accent=auto to turn the whole accent green whenever every system is nominal (the alternative design), or ?accent=green to force green at all times. Status colors (green nominal, yellow degraded, orange downgraded, red critical, grey offline) stay constant across accents. Slow the refresh with ?poll=&lt;seconds&gt;. Use the controls on this page to preview and copy the matching snippet.
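Putting those parameters together: the snippet below is the same iframe embed shown earlier, with the documented ?accent and ?poll query parameters appended to the widget URL (a green accent, re-polling every 120 seconds).

```html
<!-- Same embed as above, forcing the green accent and a 120s poll -->
<iframe
  src="https://tensorfeed.ai/widget/status?accent=green&poll=120"
  title="TensorFeed live monitor"
  width="100%"
  height="600"
  loading="lazy"
  style="border:0;max-width:720px"
></iframe>
```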

Which AI providers does the widget cover?

Claude, OpenAI, Google Gemini, Mistral, Cohere, AWS Bedrock, Azure OpenAI, Hugging Face, Replicate, Groq, Perplexity, and GitHub Copilot. Operational status is shown for all of them; measured p95 latency is shown for the providers TensorFeed actively probes, and the others show their real status with no invented number.

I only want a small badge for my README. What should I use?

Use the shields.io-style SVG uptime badges at tensorfeed.ai/badges instead. Those are a single line of markdown per provider, ideal for a README or docs page. The widget on this page is the full visual board for a website or status section.

Can AI agents consume this status data directly?

Yes. The widget is a human-facing view of machine-first feeds. Agents can call /api/status/summary and /api/probe/latest directly, or use the TensorFeed MCP server. Everything is documented at tensorfeed.ai/developers.
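As a minimal in-page sketch, a plain &lt;script&gt; tag can read the same endpoint the widget polls. This only assumes the endpoint returns JSON; nothing is assumed about the response fields, so the payload is simply logged for inspection. See tensorfeed.ai/developers for the real schema and contract.

```html
<script>
  // Read the same public summary feed the widget polls.
  // The payload is logged as-is so you can inspect the
  // actual schema in the browser console.
  fetch("https://tensorfeed.ai/api/status/summary")
    .then((res) => res.json())
    .then((summary) => console.log(summary))
    .catch((err) => console.error("status fetch failed:", err));
</script>
```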

Build on the live AI ecosystem feed

Status is one feed. News, model pricing, benchmarks, CVE timelines, funding, and more ship as open API endpoints and MCP tools for agents.

Explore the API