Creator demos show Soul Cast generating cast candidates inside Higgsfield Cinema Studio, then placing those characters into scenes through Nano Banana references. Watch it if you want casting and shot planning in a more structured preproduction workflow.

The launch demo centers on a Cast tab inside Higgsfield Cinema Studio, where creators can build a character by selecting broad production attributes instead of prompting from scratch. The video walks through a female, age-30, 2020s, $250M-budget setup and lands on a named character page for Mariana Cruz, where the interface says exclusive rights can be secured.
ProperPrompter's follow-up post adds the practical next step: use the generated actor and an existing avatar as dual references in Nano Banana 2 to place both in the same restaurant scene. The thread also claims that lighter constraint-setting produced better results than over-specifying every field; the creator relied on randomize first, then made small edits.
ARQ's tool breakdown is useful because it treats character generation as one step in a larger pipeline. ShotDeck is used for frame study before prompting; Qwen 3 VL for shot-by-shot video analysis; Gemini 3.1 Pro and Claude Opus 4.6 for script and prompt development; Nano Banana Pro for locked references across 300-plus shots; Kling 3 Pro for motion from stills; and Reve for environment comps.
That pipeline mindset matters because ARQ's music video breakdown says one project generated 884 shots across three runs, with only 90 making the cut and each finalist getting a custom motion prompt. Soul Cast looks most relevant at that front end, where consistent characters and rights-managed casting matter before a team starts iterating across hundreds of downstream images and video shots.
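For a sense of scale, the figures in ARQ's breakdown imply a selection rate near 10%; a quick back-of-the-envelope sketch (variable names are illustrative, the numbers are from the post):

```python
# Funnel math from ARQ's music video breakdown:
# 884 generated shots, 3 pipeline runs, 90 finalists.
generated = 884
runs = 3
finalists = 90

keep_rate = finalists / generated      # fraction of shots that survive selection
shots_per_run = generated / runs       # average generation volume per run
motion_prompts = finalists             # each finalist gets a custom motion prompt

print(f"keep rate: {keep_rate:.1%}")            # ~10.2%
print(f"avg shots per run: {shots_per_run:.0f}")  # ~295
```

In other words, for every shot that ships, roughly nine are generated and discarded, and the hand-directed motion-prompt work only starts after that cut.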
A shared workflow converts GTA-style stills into photoreal images with Nano Banana 2, then animates them in LTX-2.3 Pro 4K using detailed material, skin, vehicle, and camera prompts. Try it for trailer-style previsualization if you want more control at lower cost.
Topview added Seedance 2.0 to Agent V2, pairing multi-scene generation with a storyboard timeline and Business Annual access billed as 365 days of unlimited generations. That moves longform video workflows toward editable sequences instead of stitched clips.
Creators are moving from V8 calibration complaints to darker film-still scenes, fashion shots, and worldbuilding tests, with ECLIPTIC remakes showing stronger depth and lighting. Retest saved SREF recipes if you rely on V8 for cinematic ideation.
Shared Nano Banana 2 workflows now cover turnaround sheets, distinctive facial traits, and photoreal rerenders that keep the framing of a reference image. Use one prompt grammar for concept art, editorial portraits, and animation prep.
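One way to read "one prompt grammar" is a single template with labeled fields that stays constant across use cases. A minimal sketch; the field names here are hypothetical, not Nano Banana 2's actual parameters or the creators' published grammar:

```python
# Hypothetical prompt-grammar helper. Field names (material, skin, camera)
# echo the categories mentioned in the shared workflows, but the exact
# structure is an assumption for illustration.
def build_prompt(subject, material, skin, camera, style="photoreal"):
    """Assemble one reusable prompt string from labeled fields, so the
    same grammar serves concept art, portraits, and animation prep."""
    parts = [
        subject,
        f"materials: {material}",
        f"skin: {skin}",
        f"camera: {camera}",
        f"style: {style}",
    ]
    return ", ".join(parts)

print(build_prompt(
    "woman at a rain-lit diner window",
    "worn chrome, wet asphalt",
    "natural texture, soft highlights",
    "35mm, shallow depth of field",
))
```

Keeping the fields fixed and only swapping values is what makes results comparable across a turnaround sheet, an editorial portrait, and an animation-prep frame.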
This is getting awkward for Hollywood 😬 You can just generate actors with exclusive rights to cast in your AI films. Here's how with Soul Cast from Higgsfield: #higgsfieldpartner
All the tools we use to make films for Top Tier Brands like: @tether @rumblevideo ShotDeck - 1M+ frames from real films. Search any shot. Study it before you prompt it. Qwen 3 VL - Feed it any MP4. It watches the full video, breaks down every shot. Your personal video
To get my new character in a scene with my avatar, I used them both as references in Nano Banana 2. Raw output:
We generated 884 shots across 3 pipeline runs for one music video. Only 90 made the final cut. Each final shot got a custom dynamic motion prompt. Camera movement, subject motion, ambient layers. All hand-directed. Tomorrow I'm giving away every single shot, prompt and entire […]
“Tethered Together Forever” - The Humans We made the first official song + music video for @tether Watch it.