LangChain open-sourced Deep Agents v0.4.11, an MIT-licensed agent harness with planning, filesystem tools, shell access, sub-agents, and automatic summarization. Study it if you want a readable template for building Claude Code-style tools on your own model stack.

Deep Agents v0.4.11 is a batteries-included agent harness rather than a single demo script. LangChain's post describes the stack as including task planning, filesystem access for reading and writing code, sandboxed shell execution, delegated sub-agents, and automatic summarization when conversations run long. The repo card frames it as an opinionated agent you can run first and customize later.
The repository itself adds the practical angle: this is meant to be inspectable and modifiable, with the usual production-minded plumbing around agent workflows instead of a black-box interface.
For designers, filmmakers, and musicians who build their own utilities, Deep Agents is useful as a template for multimodal-adjacent production tooling even though the release itself is code-first. The combination of planning, file operations, shell access, and isolated sub-agents maps cleanly to workflows like asset generation pipelines, batch file cleanup, render orchestration, or prompt-to-output automation.
The more important detail in the thread is that the harness is model-agnostic. That makes it easier to study the workflow separately from any one frontier model and adapt the same structure to a custom stack, which is the real creative takeaway here.
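To make the model-agnostic point concrete, here is a minimal sketch of the harness shape described above: an explicit plan, file tools, shell execution, and sub-agent delegation behind one interface that accepts any model callable. The names and structure are purely illustrative assumptions for this sketch, not the actual Deep Agents API.

```python
import subprocess
from dataclasses import dataclass, field
from pathlib import Path
from typing import Callable

@dataclass
class Harness:
    """Model-agnostic loop: `model` is any callable mapping a prompt to text."""
    model: Callable[[str], str]
    plan: list[str] = field(default_factory=list)

    def add_tasks(self, *tasks: str) -> None:
        # planning: the agent keeps an explicit, inspectable todo list
        self.plan.extend(tasks)

    def read_file(self, path: str) -> str:
        # filesystem access for reading code
        return Path(path).read_text()

    def write_file(self, path: str, text: str) -> None:
        # filesystem access for writing code
        Path(path).write_text(text)

    def run_shell(self, cmd: list[str]) -> str:
        # shell execution; a real harness would sandbox this step
        return subprocess.run(cmd, capture_output=True, text=True).stdout

    def spawn_subagent(self) -> "Harness":
        # a sub-agent shares the model but starts with its own isolated plan
        return Harness(model=self.model)

# Plug in any model: a local LLM, an API client, or a stub for testing.
agent = Harness(model=lambda prompt: f"echo: {prompt}")
agent.add_tasks("inspect repo", "summarize findings")
print(agent.plan)            # ['inspect repo', 'summarize findings']
print(agent.model("hello"))  # echo: hello
```

Because the model is just a callable, the planning/files/shell/sub-agent structure can be studied and adapted to a custom stack without committing to any one frontier model, which is exactly the takeaway above.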
A shared workflow converts GTA-style stills into photoreal images with Nano Banana 2, then animates them in LTX-2.3 Pro 4K using detailed material, skin, vehicle, and camera prompts. Try it for trailer-style previsualization if you want more control at lower cost.
Release: Topview added Seedance 2.0 to Agent V2, pairing multi-scene generation with a storyboard timeline and Business Annual access billed as 365 days of unlimited generations. That moves long-form video workflows toward editable sequences instead of stitched clips.
Workflow: Creators are moving from V8 calibration complaints to darker film-still scenes, fashion shots, and worldbuilding tests, with ECLIPTIC remakes showing stronger depth and lighting. Retest saved SREF recipes if you rely on V8 for cinematic ideation.
Workflow: Shared Nano Banana 2 workflows now cover turnaround sheets, distinctive facial traits, and photoreal rerenders that keep the framing of a reference image. Use one prompt grammar for concept art, editorial portraits, and animation prep.
LangChain just open-sourced a replica of Claude Code. It’s an MIT-licensed framework called Deep Agents that recreates the core workflow behind coding agents like Claude Code in an open system developers can inspect and modify. I spent a bit of time looking through …
Deep Agents repo: github.com/langchain-ai/d…