Runway opened Characters on its developer platform with API access, custom voices, embedded knowledge, and a free starter allowance. Use it to build interactive hosts, guides, and assistants that can talk through tasks instead of relying on passive video.

Runway's launch post describes Characters as real-time intelligent avatars that can be embedded into apps, websites, products, and services through the API. The core creative hook is not just the avatar layer: developers can attach bespoke knowledge banks, custom voices, and instructions, then style the character across different visual looks.
The companion developer platform post says the product is available now and points builders to Runway's developer portal, with the first 30 minutes of conversation free. Runway CEO Cristóbal Valenzuela's team also framed the launch around voice-first task navigation rather than button-heavy interfaces, with a staff post calling out accessibility as a key use case.
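To make the "bespoke knowledge banks, custom voices, and instructions" combination concrete, here is a minimal sketch of assembling a character-creation payload. The endpoint path, field names, and payload shape are illustrative assumptions, not Runway's documented schema; only the dev.runwayml.com portal and the three configuration surfaces come from the launch posts.

```python
# Hypothetical sketch of configuring a Characters-style avatar.
# Field names and the endpoint are assumptions for illustration,
# not Runway's published API schema.
import json

API_BASE = "https://dev.runwayml.com"  # developer portal named in the launch post

def build_character_config(name, instructions, knowledge_docs, voice_id):
    """Assemble a creation payload combining the three surfaces the
    launch post describes: persona instructions, a knowledge bank,
    and a custom voice."""
    return {
        "name": name,
        "instructions": instructions,
        "knowledge": [{"title": t, "text": body} for t, body in knowledge_docs],
        "voice": voice_id,
    }

config = build_character_config(
    name="map-guide",
    instructions="Read the player's screen and guide them to objectives.",
    knowledge_docs=[("extraction-zones", "Zone A opens after the vault event.")],
    voice_id="custom-voice-01",
)
request_body = json.dumps(config)  # would be POSTed to API_BASE + "/characters"
```

The point of the sketch is the separation of concerns: the persona (instructions), the domain grounding (knowledge), and the audio identity (voice) are independent knobs, which is what lets one character be restyled across visual looks without retraining its knowledge.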
The clearest early pattern is domain-specific assistants. In one prototype, a creator loaded Characters with map knowledge from Bungie's Marathon, then had the avatar read the screen, guide players to objectives, and advise on what loot to extract.
A lighter demo used Characters to identify Japanese carts from a live camera view, turning object recognition into a spoken back-and-forth instead of a static label pass, as shown in the reposted clip. Together, those examples suggest the format works best when the avatar has narrow context and a live task to talk through, not just a generic chat persona.
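The "narrow context plus live task" pattern both demos share can be sketched as a scoping step before a session starts: filter the knowledge bank down to entries tagged for the task at hand, so the avatar answers from a small, relevant context rather than a generic persona. All names here are hypothetical; nothing below is a documented Runway call.

```python
# Illustrative sketch of scoping a knowledge bank to a live task.
# Data shapes and function names are assumptions, not a Runway API.
def scope_knowledge(knowledge_bank, task_tags):
    """Keep only entries whose tags overlap the current task, so the
    character talks through a narrow, concrete job."""
    return [doc for doc in knowledge_bank if doc["tags"] & task_tags]

bank = [
    {"title": "marathon-maps", "tags": {"navigation", "loot"}, "text": "..."},
    {"title": "company-faq", "tags": {"support"}, "text": "..."},
]

# A map-guide session only gets the navigation entry, not the FAQ.
session_docs = scope_knowledge(bank, task_tags={"navigation"})
```

This mirrors why the Marathon guide and the cart-identification demo work: each session is bound to one visible task, which keeps the spoken back-and-forth specific instead of drifting into open-ended chat.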
Release: Topview added Seedance 2.0 to Agent V2, pairing multi-scene generation with a storyboard timeline and Business Annual access billed as 365 days of unlimited generations. That moves longform video workflows toward editable sequences instead of stitched clips.
Workflow: Creators are moving from V8 calibration complaints to darker film-still scenes, fashion shots, and worldbuilding tests, with ECLIPTIC remakes showing stronger depth and lighting. Retest saved SREF recipes if you rely on V8 for cinematic ideation.
Workflow: A shared workflow converts GTA-style stills into photoreal images with Nano Banana 2, then animates them in LTX-2.3 Pro 4K using detailed material, skin, vehicle, and camera prompts. Try it for trailer-style previsualization if you want more control at lower cost.
Workflow: Shared Nano Banana 2 workflows now cover turnaround sheets, distinctive facial traits, and photoreal rerenders that keep the framing of a reference image. Use one prompt grammar for concept art, editorial portraits, and animation prep.
Using Runway Character's API, I built a character with the entire map knowledge base of Bungie's latest title @MarathonTheGame. It can read the screen, guide you to objectives, and help you decide which valuables to extract. Just a preview of how gaming will change with AI.
Using @runwayml's new real-time Characters to ID some Japanese carts I just picked up
Runway Characters are available now via our developer platform. Get started today with your first 30 minutes of conversation for free at dev.runwayml.com