Hugging Face now serves Markdown when agents fetch Papers pages, and has published a skill for searching papers and their linked models, datasets, and Spaces. Research agents can cut token waste and retrieve paper context in a format that is easier to parse and ground.

When an agent such as Cursor or Claude Code requests a Hugging Face Papers page, Hugging Face now serves a Markdown version automatically. In the feature thread, the product claim is explicit: this is meant to cut token use and improve efficiency, while a matching repost describes the output as improving “content clarity” for agents as well.
The attached [img:1|Markdown paper view] shows the practical difference: the normal paper page is paired with a raw Markdown rendering that exposes the abstract and section structure directly. That matters for research and coding agents that would otherwise spend tokens pulling a full web page and stripping UI before they can extract the paper body.
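Stripping page chrome by hand is exactly the step a Markdown response removes: the agent gets headings and body text it can split directly. As a minimal sketch (the `extract_sections` helper is hypothetical, not part of any Hugging Face tooling), an agent-side parser could turn a Markdown paper body into a section map for grounding:

```python
import re

def extract_sections(markdown: str) -> dict[str, str]:
    """Split a Markdown paper body into {heading: body} for grounding.

    Any text before the first heading is stored under "_preamble".
    """
    sections: dict[str, str] = {}
    current = "_preamble"
    buf: list[str] = []
    for line in markdown.splitlines():
        m = re.match(r"#{1,6}\s+(.*)", line)
        if m:
            # Close out the previous section, start a new one.
            sections[current] = "\n".join(buf).strip()
            current, buf = m.group(1).strip(), []
        else:
            buf.append(line)
    sections[current] = "\n".join(buf).strip()
    return sections
```

With a Markdown response, the abstract is one dictionary lookup away; with a rendered HTML page, the same extraction would first require DOM parsing and boilerplate removal.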
The second change is a new paper-pages skill for agents. According to the launch thread, the skill lets an agent search papers by title, by author, or by semantic similarity, then read the paper content and follow related assets connected to that paper.
Those related assets include linked models, datasets, and Spaces on the Hub, which turns a paper page into a lightweight retrieval surface for implementation context rather than just a reading view. The same thread demonstrates that flow with MolmoPoint, linking out to the paper, a model page, and a demo Space. Hugging Face's company repost positions the update as part of “AI powered research,” while the skill repost says agents can use a SKILL.md entry to learn how to work with Papers pages.
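The launch thread does not spell out the endpoints the skill calls, but the Hub's public API patterns suggest what such a flow looks like. In the sketch below, the `/api/papers/search` endpoint and the `arxiv:` model filter are assumptions based on documented Hub conventions, not details taken from the announcement:

```python
from urllib.parse import quote_plus

BASE = "https://huggingface.co"

def paper_search_url(query: str) -> str:
    # Search Papers pages by keyword (assumed endpoint shape).
    return f"{BASE}/api/papers/search?q={quote_plus(query)}"

def linked_models_url(arxiv_id: str) -> str:
    # Hub models can be filtered by their arXiv tag, which is how
    # a paper page connects to its implementations.
    return f"{BASE}/api/models?filter=arxiv:{arxiv_id}"

def paper_page_url(arxiv_id: str) -> str:
    # The Papers page itself; per the announcement, agents fetching
    # this URL now receive a Markdown rendering.
    return f"{BASE}/papers/{arxiv_id}"
```

An agent following the skill would chain these: search for a paper, fetch its Markdown body, then list the models, datasets, and Spaces linked to it.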
ChatGPT now saves uploaded and generated files into an account-level Library that can be reused across conversations from the web sidebar or recent-files picker. It removes repetitive re-uploading and makes past PDFs, spreadsheets, and images part of a persistent working context.
Release: OpenClaw shipped version 2026.3.22 with ClawHub, OpenShell plus SSH sandboxes, side-question flows, and more search and model options, then followed with a 2026.3.23 patch. Teams get a broader plugin surface, but should patch quickly and review plugin trust boundaries as the ecosystem grows.
Release: Cursor shipped Instant Grep, a local regex index built from n-grams, inverted indexes, and Bloom filters that drops large-repo searches from seconds to milliseconds. Faster candidate retrieval shortens the coding-agent loop, especially when ripgrep-style scans become the bottleneck.
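Cursor has not published Instant Grep's internals beyond naming those components, but the candidate-pruning idea behind n-gram indexing is well known. A toy sketch (a generic trigram index, not Cursor's implementation) shows why it beats a full scan:

```python
from collections import defaultdict

def trigrams(text: str) -> set[str]:
    """All overlapping 3-character substrings of text."""
    return {text[i:i + 3] for i in range(len(text) - 2)}

class TrigramIndex:
    """Toy trigram index: maps each trigram to the docs containing it.

    A document can match a literal query only if it contains every
    trigram of that literal, so intersecting posting lists prunes
    most files before the expensive per-file scan runs.
    """

    def __init__(self) -> None:
        self.postings: dict[str, set[str]] = defaultdict(set)
        self.docs: dict[str, str] = {}

    def add(self, doc_id: str, text: str) -> None:
        self.docs[doc_id] = text
        for gram in trigrams(text):
            self.postings[gram].add(doc_id)

    def candidates(self, literal: str) -> set[str]:
        grams = trigrams(literal)
        if not grams:
            # Query too short to index; fall back to scanning everything.
            return set(self.docs)
        sets = [self.postings.get(g, set()) for g in grams]
        return set.intersection(*sets)

    def grep(self, literal: str) -> list[str]:
        # Verify candidates with a real substring check, since the
        # trigram filter can yield false positives.
        return sorted(d for d in self.candidates(literal)
                      if literal in self.docs[d])
```

Production systems extend this with inverted indexes over more n-gram sizes and Bloom filters for cheap membership tests; regex queries are handled by extracting required literals first, as in Russ Cox's codesearch design.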
Breaking: Epoch AI says GPT-5.4 Pro elicited a publishable solution to one 2019 conjecture in its FrontierMath Open Problems set, with a formal writeup planned. Treat it as an early milestone worth reproducing, not blanket evidence that frontier models can already automate math research.