CopilotKit shipped hooks that let agents inspect app state and call frontend actions, then paired them with Shadify for ShadCN-based UI composition. Together, the releases give embedded agents a cleaner path from chat to in-app behavior.

The release centers on two hooks: useAgentContext, so an agent can "see" UI state, and useFrontendTool, so it can "act" inside the app. CopilotKit's launch thread frames the release around a common limitation in agent SDKs: "Most Agents can only chat" and "can't read your UI or do anything in your app." The two new hooks split that problem in half. useAgentContext, documented in the hook reference, is presented as the visibility layer; useFrontendTool, documented in its reference, is the action layer.
That split matters for implementation because it gives developers a clean boundary between observation and invocation. Rather than forcing an agent to infer app state from text or bounce everything through backend tools, CopilotKit is explicitly exposing frontend context and frontend actions as separate primitives, with the docs post claiming both are "simple and ready in minutes." The attached UI-aware agent demo video shows the intended workflow, moving from code to a browser app where UI is assembled interactively.
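CopilotKit's actual hook signatures live in the linked references; as a rough conceptual model only (plain TypeScript, deliberately not the CopilotKit API), the observation/invocation split can be sketched as two registries: context entries the agent can read alongside its prompt, and named tools it can call back into the frontend. All names below are illustrative.

```typescript
// Conceptual sketch of the "see" / "act" split, not CopilotKit's real API.
type ContextEntry = { description: string; value: unknown };
type FrontendTool = (args: Record<string, unknown>) => unknown;

class AgentBridge {
  private context = new Map<string, ContextEntry>();
  private tools = new Map<string, FrontendTool>();

  // Analogue of useAgentContext: expose a piece of UI state for the agent to read.
  addContext(key: string, entry: ContextEntry): void {
    this.context.set(key, entry);
  }

  // Analogue of useFrontendTool: register an action the agent may invoke.
  addTool(name: string, handler: FrontendTool): void {
    this.tools.set(name, handler);
  }

  // What the agent "sees": serialized context sent along with the conversation.
  snapshot(): string {
    const obj: Record<string, ContextEntry> = {};
    for (const [k, v] of this.context) obj[k] = v;
    return JSON.stringify(obj);
  }

  // What the agent "does": a tool call routed back into the frontend.
  invoke(name: string, args: Record<string, unknown>): unknown {
    const tool = this.tools.get(name);
    if (!tool) throw new Error(`unknown tool: ${name}`);
    return tool(args);
  }
}

// Usage: a to-do list the agent can both inspect and mutate.
const bridge = new AgentBridge();
const todos: string[] = ["ship docs"];
bridge.addContext("todos", { description: "current to-do items", value: todos });
bridge.addTool("addTodo", ({ text }) => {
  todos.push(String(text));
  return todos.length;
});

bridge.invoke("addTodo", { text: "review PR" });
console.log(todos.length); // 2
```

The point of the model is the boundary: reading `snapshot()` never mutates the app, and `invoke()` is the only path from agent output back into frontend state, which is the separation the two hooks encode.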
CopilotKit is not shipping the hooks in isolation. The Shadify announcement describes "Generative UI built on ShadCN" where developers "describe a UI" and let a LangChain agent compose from ShadCN components. That turns the new hooks into part of a broader loop: the agent can inspect the current interface, call frontend-side capabilities, and then generate or update visible UI from an existing component system.
A supporting repost of Ata's demo shows the same core behavior, which suggests Shadify is the showcase implementation for these primitives rather than a separate product line. The extra context from Mike Ryan's post is useful because it describes the outcome more concretely: an agent can "stream back a user interface from your components." For engineers building in-app copilots, that is the technical shift here: CopilotKit is moving from chat orchestration toward UI-aware, component-level agent interactions inside the frontend.
Claude can now drive macOS apps, browser tabs, the keyboard, and the mouse from Claude Cowork and Claude Code, with permission prompts when it needs direct screen access. That makes legacy desktop workflows automatable, and Anthropic is pairing the push with more background-task support for longer agent loops.
[release] OpenClaw shipped version 2026.3.22 with ClawHub, OpenShell plus SSH sandboxes, side-question flows, and more search and model options, then followed with a 2026.3.23 patch. Teams get a broader plugin surface, but should patch quickly and review plugin trust boundaries as the ecosystem grows.
[release] Cursor shipped Instant Grep, a local regex index built from n-grams, inverted indexes, and Bloom filters that drops large-repo searches from seconds to milliseconds. Faster candidate retrieval shortens the coding-agent loop, especially when ripgrep-style scans become the bottleneck.
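Cursor hasn't published Instant Grep's implementation, so the following is only an illustrative sketch of the ingredients the announcement names: a trigram (3-gram) inverted index narrows the candidate file set, a per-file Bloom filter cheaply double-checks those candidates, and the exact scan runs last on the survivors. All class and method names here are invented for the sketch.

```typescript
// Illustrative sketch of trigram index + Bloom filter candidate pruning.
class BloomFilter {
  private bits: Uint8Array;
  constructor(private size: number, private hashes = 3) {
    this.bits = new Uint8Array(Math.ceil(size / 8));
  }
  // FNV-1a-style hash, varied by seed to simulate independent hash functions.
  private idx(s: string, seed: number): number {
    let h = 2166136261 ^ seed;
    for (let i = 0; i < s.length; i++) {
      h ^= s.charCodeAt(i);
      h = Math.imul(h, 16777619);
    }
    return (h >>> 0) % this.size;
  }
  add(s: string): void {
    for (let k = 0; k < this.hashes; k++) {
      const i = this.idx(s, k);
      this.bits[i >> 3] |= 1 << (i & 7);
    }
  }
  mightContain(s: string): boolean {
    for (let k = 0; k < this.hashes; k++) {
      const i = this.idx(s, k);
      if (!(this.bits[i >> 3] & (1 << (i & 7)))) return false;
    }
    return true; // may be a false positive, never a false negative
  }
}

function trigrams(text: string): Set<string> {
  const out = new Set<string>();
  for (let i = 0; i + 3 <= text.length; i++) out.add(text.slice(i, i + 3));
  return out;
}

class TrigramIndex {
  private postings = new Map<string, Set<string>>(); // trigram -> file names
  private blooms = new Map<string, BloomFilter>();   // file -> its trigram Bloom filter
  private files = new Map<string, string>();

  addFile(name: string, content: string): void {
    this.files.set(name, content);
    const bloom = new BloomFilter(4096);
    for (const g of trigrams(content)) {
      bloom.add(g);
      if (!this.postings.has(g)) this.postings.set(g, new Set());
      this.postings.get(g)!.add(name);
    }
    this.blooms.set(name, bloom);
  }

  // Literal search: intersect posting lists, Bloom-check, then verify exactly.
  search(literal: string): string[] {
    const grams = Array.from(trigrams(literal));
    if (grams.length === 0) return Array.from(this.files.keys()); // too short to prune
    let candidates = this.postings.get(grams[0]) ?? new Set<string>();
    for (const g of grams.slice(1)) {
      const next = this.postings.get(g) ?? new Set<string>();
      candidates = new Set(Array.from(candidates).filter((f) => next.has(f)));
    }
    return Array.from(candidates)
      .filter((f) => grams.every((g) => this.blooms.get(f)!.mightContain(g)))
      .filter((f) => this.files.get(f)!.includes(literal)); // exact scan last
  }
}

const index = new TrigramIndex();
index.addFile("a.ts", "function parseConfig() {}");
index.addFile("b.ts", "function render() {}");
console.log(index.search("parseConfig")); // ["a.ts"]
```

The expensive exact scan (the ripgrep-style step) only touches files that survive both cheap filters, which is the general shape behind a seconds-to-milliseconds drop on large repos; a real regex engine would also extract required literals from the pattern before consulting the index.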
[breaking] ChatGPT now saves uploaded and generated files into an account-level Library that can be reused across conversations from the web sidebar or recent-files picker. It removes repetitive re-uploading and makes past PDFs, spreadsheets, and images part of a persistent working context.
[breaking] Epoch AI says GPT-5.4 Pro elicited a publishable solution to one 2019 conjecture in its FrontierMath Open Problems set, with a formal writeup planned. Treat it as an early milestone worth reproducing, not blanket evidence that frontier models can already automate math research.
Most Agents can only chat 🥀 They can't read your UI or do anything in your app. useAgentContext + useFrontendTool fixes that. One lets your agent see. The other lets it act. Simple and ready in minutes 👇
✨Introducing Shadify: Generative UI built on ShadCN Describe a UI and allow your @LangChain agent to compose from @ShadCN on the fly, using AG-UI. Then export it as React code. It's open-source: github.com/tylerslaton/sh…
👀 useAgentContext: docs.copilotkit.ai/reference/v2/h… 🔨 useFrontendTool: docs.copilotkit.ai/reference/v2/h…