A widely shared thread claims Higgsfield paid more than $1 million to license one creator's likeness for Soul ID and a full-length AI series. Track the business model, but verify contract terms and production claims independently before treating it as a template.

Hasantoxr says a creator named Adil did standard Higgsfield brand shoots and then licensed his likeness for more than $1 million. The same thread says Higgsfield used Soul ID to build a "perfect AI doppelganger," and that this rendered version, not a live performance, appeared in a full-length AI series.
That is a meaningful shift for creative labor if true: the asset being sold is not a day rate or a performance, but reusable identity rights. The key missing piece is verification. None of the evidence here includes a contract, rights scope, term length, exclusivity, or revenue-share details.
The thread frames the story less as celebrity-tech hype and more as a new production stack. Its most specific claim is that a four-person team made a cinematic episode in four days with zero spend on cameras, sets, or crew, while test audiences supposedly could not distinguish it from traditionally shot work.
For AI filmmakers, the adjacent claim is a distribution path: Hasantoxr says Higgsfield has paid $500K through its Action Contest, that audiences vote on pilots, and that top entries can be pushed into sponsored production and an Original Series slot. Taken together, the pitch is clear — likeness licensing plus fast synthetic production plus platform-backed distribution — but every major number here still comes from one thread and needs independent confirmation before it becomes a repeatable playbook.
A shared workflow converts GTA-style stills into photoreal images with Nano Banana 2, then animates them in LTX-2.3 Pro 4K using detailed material, skin, vehicle, and camera prompts. Try it for trailer-style previsualization if you want more control at lower cost.
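The thread describes the prompts only as layered "material, skin, vehicle, and camera" detail, without sharing an actual schema. A minimal sketch of that layered prompt grammar, assuming the field names and the helper below are hypothetical and not Nano Banana 2 or LTX-2.3 syntax:

```python
# Hypothetical sketch of the layered prompt grammar the workflow describes.
# Field names and the helper are assumptions, not tool-specific syntax.

def build_previz_prompt(subject, material, skin, vehicle, camera):
    """Assemble one prompt string from the four detail layers the workflow names."""
    layers = {
        "material": material,  # surface wear, fabric, reflectivity
        "skin": skin,          # pore detail, avoiding a plastic sheen
        "vehicle": vehicle,    # paint, glass tint, tire deformation
        "camera": camera,      # lens, angle, movement
    }
    details = ", ".join(f"{k}: {v}" for k, v in layers.items() if v)
    return f"{subject}. Photoreal. {details}"

prompt = build_previz_prompt(
    subject="Night street chase still converted from a GTA-style frame",
    material="rain-slicked asphalt, neon reflections",
    skin="natural pores, matte highlights",
    vehicle="matte black coupe with chipped paint",
    camera="35mm anamorphic, low angle, slow push-in",
)
```

Keeping the layers as named fields rather than one free-form string makes it easy to swap a single layer (say, the camera move) between the still-generation pass and the animation pass while holding everything else constant.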
Topview added Seedance 2.0 to Agent V2, pairing multi-scene generation with a storyboard timeline and Business Annual access billed as 365 days of unlimited generations. That moves longform video workflows toward editable sequences instead of stitched clips.
Creators are moving from V8 calibration complaints to darker film-still scenes, fashion shots, and worldbuilding tests, with ECLIPTIC remakes showing stronger depth and lighting. Retest saved SREF recipes if you rely on V8 for cinematic ideation.
Shared Nano Banana 2 workflows now cover turnaround sheets, distinctive facial traits, and photoreal rerenders that keep the framing of a reference image. Use one prompt grammar for concept art, editorial portraits, and animation prep.
Adil worked regular brand shoots for Higgsfield. The kind of gig that pays a few hundred bucks and gets forgotten in two weeks. Then they licensed his likeness. For over $1,000,000.
Higgsfield has already paid $500K through their Action Contest to independent AI filmmakers. Their platform reaches 4 billion impressions total. Audiences vote on which pilots get a full season. The best contest work gets sponsored production and a slot on the Original Series.
The production numbers are insane: → 4 people on the entire team → 4 days to produce a full cinematic episode → $0 spent on cameras, sets, or crew → Test audiences could NOT distinguish it from traditionally shot content. Industry analysts predicted this was 5-10 years out.