Imbue released Latchkey, a library that wraps ordinary curl calls so local agents can use SaaS and internal APIs while credentials stay on the developer machine. Try it where agents need many HTTP integrations but should not see raw secrets.

How it works: Latchkey is a local library that lets agents use services like Slack, GitHub, AWS, Linear, Notion, Stripe, and self-hosted HTTP tools through a single command path. Imbue's launch thread frames the key security boundary plainly: "credentials stay on your machine," not in logs or chat transcripts.
The mechanism is narrower and more practical than a new agent protocol. According to Imbue's implementation thread, agents prepend Latchkey to normal curl requests; Latchkey identifies the target service, injects the right credentials, and forwards the request without custom code or an intermediary. Imbue also says it works with Claude Code, OpenCode, and Codex, and points to its project page for setup details.
The implementation target is the messy part of agent automation: not code generation, but stitching together the APIs and operational systems around it. A practitioner reacting to the launch said they "pretty much refuse to use web dashboards" when agents can use CLIs instead, which captures the workflow Latchkey is trying to generalize from CLI-only tooling to arbitrary HTTP services.
That fits a broader engineering view that the hard part is the surrounding stack, not just writing code. In Karpathy's DevOps context framing, real autonomy means handling services, payments, auth, databases, security, and deployment without humans clicking through admin pages; Latchkey addresses one slice of that problem by making authenticated HTTP calls agent-accessible without handing over the secrets themselves.
Every launched Plus One, a hosted OpenClaw that lives in Slack, comes preloaded with internal skills, and works with a ChatGPT subscription or other API keys. It lowers the ops burden for deployed coworkers, so teams can test packaged agents before building their own stack.
breaking: Anthropic said free, Pro, and Max users will hit 5-hour Claude session limits faster on weekdays from 5am to 11am PT, while weekly caps stay the same. Shift long Claude Code jobs off-peak and watch prompt-cache misses.
release: OpenAI rolled out Codex plugins across the app, CLI, and IDE extensions, with app auth, reusable skills, and optional MCP servers. Teams should test plugin-backed workflows and permission models before broad rollout.
release: Cline launched Kanban, a local multi-agent board that runs Claude, Codex, and Cline CLI tasks in isolated worktrees with dependency chains and diffs. Teams can use it as a visual control layer for parallel coding agents on repo chores that split cleanly.
release: Mistral released open-weight Voxtral TTS with low-latency streaming, voice cloning, and cross-lingual adaptation, and vLLM Omni shipped day-0 support. Voice-agent teams should compare quality, latency, and serving cost against closed APIs.
How it works: Agents prepend Latchkey to ordinary curl calls. Latchkey detects the service, injects credentials, and the request goes through without custom code, embedded tokens, or an intermediary. Works with Claude Code, OpenCode, Codex. And did we mention it’s open-source?
We built a library so local AI agents can use any tool you have access to: Slack, GitHub, AWS, Linear, Notion, Stripe, self-hosted tools, anything with an HTTP endpoint. One command, any API, no custom connectors. We call it Latchkey 🔑
This is very smart. But I’ve been doing this informally for months to the greatest extent possible. I pretty much refuse to use web dashboards. The idea of trying to root around in endless menu systems on Cloudflare or Supabase is soul crushing to me when my agents can use CLIs.
When @karpathy built MenuGen (karpathy.bearblog.dev/vibe-coding-me…), he said: "Vibe coding menugen was exhilarating and fun escapade as a local demo, but a bit of a painful slog as a deployed, real app. Building a modern app is a bit like assembling IKEA furniture. There are all these services,