
AI Story Studio

Turn a dataset, article URL, or research topic into an animated multi-scene data story with Vega-Lite charts, narration, cross-model verification, and a shareable link — the 3Blue1Brown / Vox / Bloomberg treatment for any numbers you throw at it.

Tags: AI Content, Data Viz, Storytelling, Presentation


Overview

AI Story Studio converts raw data into an animated, narrated, shareable story — the kind content creators use to make a trend land with a wide audience. Three input modes cover the common use cases: paste a dataset (CSV / TSV / JSON), paste an article URL (we extract the quantitative content), or type a topic for AI-powered research. The output is a multi-scene Vega-Lite story with narration beats, playable in-browser, screen-recordable for video work, and shareable as a public link.
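Every scene in the output renders a standard Vega-Lite spec. A minimal sketch of the kind of spec a single scene might carry (the dataset and field names here are illustrative, not the tool's actual schema):

```typescript
// A bare-bones Vega-Lite bar chart spec, the sort of thing one
// story scene would render. Values are made-up sample data.
const sceneSpec = {
  $schema: "https://vega.github.io/schema/vega-lite/v5.json",
  description: "Quarterly revenue, FY2024",
  data: {
    values: [
      { quarter: "Q1", revenue: 1.2 },
      { quarter: "Q2", revenue: 1.6 },
      { quarter: "Q3", revenue: 1.9 },
      { quarter: "Q4", revenue: 2.4 },
    ],
  },
  mark: "bar",
  encoding: {
    x: { field: "quarter", type: "ordinal" },
    y: { field: "revenue", type: "quantitative", title: "Revenue ($M)" },
  },
};

console.log(sceneSpec.data.values.length); // prints 4 (one bar per quarter)
```

Because each scene is plain declarative JSON, the composing model only has to emit data plus encodings; the renderer handles layout, axes, and scales.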

How It Works

  1. Choose an input mode. Paste Data is available in every tier; URL and AI Research unlock with Advanced.
  2. Provide the source. Paste CSV/TSV/JSON, drop in an article URL, or type a topic and pick a provider (Perplexity for search and synthesis in one call; Tavily for raw search, with our composer handling synthesis). An optional hint lets you steer the angle.
  3. We compose. A primary model (gpt-4o-mini in Basic; gpt-4.1 or claude-sonnet-4 in Advanced) identifies the most compelling insight, builds a narrative arc of 3–5 scenes (Basic) or 3–15 scenes (Advanced), and emits a Vega-Lite chart spec plus narration for each scene.
  4. Advanced: cross-model verification. A second model (opposite vendor from the primary — Claude audits GPT and vice versa) fact-checks every scene against the source. You see a transparent confidence score and a flag panel with type (unsupported claim / number mismatch / contradiction / missing source), the exact excerpt, the verifier's reasoning, and an optional suggested fix. Accept fixes to re-render, or ignore them — you stay in control.
  5. Play or share. Auto-play runs the full story with Framer Motion transitions between scenes (fade / slide / morph). Space pauses, arrows scrub. Share Link saves to Supabase and returns a public URL with a 12-char token — opens as an immersive viewer with source attribution.
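The pieces the pipeline above produces per story can be sketched as a small set of types. These shapes are illustrative (the product's internal API may differ), but they capture the flag panel fields and scene structure described in steps 3–5:

```typescript
// Hypothetical data shapes for a composed story; names are
// illustrative, not the product's actual schema.
type FlagType =
  | "unsupported_claim"
  | "number_mismatch"
  | "contradiction"
  | "missing_source";

interface VerificationFlag {
  type: FlagType;
  excerpt: string;        // exact excerpt the verifier objected to
  reasoning: string;      // the verifier's explanation
  suggestedFix?: string;  // optional; accepting it triggers a re-render
}

interface Scene {
  chartSpec: object;                       // Vega-Lite spec from the primary model
  narration: string;                       // narration beat for this scene
  transition: "fade" | "slide" | "morph";  // Framer Motion transition
  flags: VerificationFlag[];               // empty when the scene passed verification
}

interface Story {
  scenes: Scene[];     // 3–5 (Basic) or 3–15 (Advanced)
  confidence: number;  // cross-model confidence score, 0–1
  shareToken?: string; // 12-char token once saved to Supabase
}

// A scene that cleared cross-model verification carries no flags:
const cleanScene: Scene = {
  chartSpec: { mark: "line" },
  narration: "Adoption doubled between 2020 and 2022.",
  transition: "fade",
  flags: [],
};

console.log(cleanScene.flags.length); // prints 0
```

Keeping flags attached per scene is what lets the player show the flag panel inline: accepting a `suggestedFix` only re-renders that scene, not the whole story.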

What You Can Do With It

  • Turn your quarterly sales CSV into a 30-second narrated animation for the board deck.
  • Paste a news article about EV adoption and get a properly sourced visual breakdown, not a paraphrase.
  • Type "the shift to remote work, 2020–2025" and get a Perplexity-researched, citation-linked, cross-verified mini-doc.
  • Screen-record any story with QuickTime / OBS / Premiere to drop into YouTube or a pitch video.
  • Share the link in a Slack thread — viewers see the same interactive player, not a screenshot.

Pricing

  • Basic — 1 token. Paste data mode. 5 scenes max. gpt-4o-mini. Standard chart types (bar, line, pie, area). Single-pass composition.
  • Advanced — 4 tokens. All three input modes (paste, URL, AI research). 15 scenes max. Premium model. Cross-model verification with flag panel. Advanced chart types (faceted small multiples, scatter, heatmap, morph transitions). Custom branding on shared pages. GIF/WebM export.

Audiences

  • Business — executives, analysts, consultants who need to explain numbers to non-technical audiences.
  • Education — teachers and course builders who want data-driven explainers without custom D3 work.
  • Entertainment — content creators and journalists producing YouTube / TikTok / newsletter segments.

Roadmap

  • v1.1 — GIF and WebM encoding (currently wired but pending encoder polish).
  • v1.2 — Scene-level editing: adjust narration text, reorder scenes, pick alternate chart types.
  • v2.0 — Server-side MP4 render via Remotion + ElevenLabs narration.