
Show HN: Aura – Like robots.txt, but for AI actions https://ift.tt/RwcuHTg

August 6, 2025 at 06:37AM

Hi, I've been watching the rise of AI agents with a mix of excitement and dread. We're building incredible tools that can browse the web, but we're forcing them to navigate a world built for human eyes. They scrape screens and parse fragile DOMs, and we keep trying to tame them into acting like humans. I believe this is fundamentally wrong. The goal isn't to make AI operate at a human level, but to unlock its super-human potential.

The current path is dangerous. When agents from OpenAI, Google, and others start browsing at scale and speed, concepts like UI/UX will lose their meaning for them. The entire model of the web is threatened. Website owners are losing control over how their sites are used, and no one is offering a real solution. The W3C is thinking about it. I decided to build it.

That's why I created AURA (Agent-Usable Resource Assertion). It's an open protocol with a simple, powerful idea: let website owners declare what an AI can and cannot do. Instead of letting an agent guess, the site provides a simple aura.json manifest. This gives control back to the site owner. It's a shift from letting AIs scrape data to granting them capabilities: we get to define the rules of engagement. This expands what AIs can do, not by letting them run wild, but by giving them clear, structured paths to follow. (There's a rough sketch of what this could look like at the end of this post.)

A confession: I'm not a hardcore programmer; I consider myself more of a systems thinker. I actually used AI extensively to help me write the reference implementation for AURA. It felt fitting to use the tool to build its own guardrails. The core of the protocol, a reference server, and a client are all open source on GitHub.

You can see it work in 5 minutes:

1. Clone & install: git clone https://ift.tt/LrAeO6W && cd aura && pnpm install
2. Run the server: pnpm --filter aura-reference-server dev
3. Run the agent (in a new terminal): pnpm --filter aura-reference-client agent -- http://localhost:3000 "list all the blog posts"

You'll see the agent execute the task directly, with no scraping or DOM parsing involved. The GitHub repo is here: https://ift.tt/TXfJtYs

I don't know if AURA will become the standard, but I believe it's my duty to raise this issue and start the conversation. This is a foundational problem for the future of the web, and it needs to be a community effort. The project is MIT licensed. I'm here all day to answer questions and listen to your feedback, especially the critical kind. Let's discuss it.
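
To make the manifest idea concrete, here's a minimal, illustrative sketch of what an aura.json and a capability-aware client could look like. To be clear: the field names, the /.well-known/aura.json path, and the invoke helper below are placeholders I'm assuming for this example, not the actual AURA schema; the real protocol lives in the repo.

// aura-sketch.ts: illustrative only. Field names and paths are placeholders,
// not the real AURA schema; see the GitHub repo for the actual protocol.

// A hypothetical shape for an aura.json manifest: the site declares named
// capabilities instead of leaving the agent to scrape the DOM.
interface AuraCapability {
  id: string;               // e.g. "list_posts"
  description: string;      // natural-language hint for the agent
  method: "GET" | "POST";
  endpoint: string;         // structured endpoint the agent may call
}

interface AuraManifest {
  version: string;
  capabilities: AuraCapability[];
}

// Fetch the manifest from a conventional location (placeholder path).
async function loadManifest(origin: string): Promise<AuraManifest> {
  const res = await fetch(`${origin}/.well-known/aura.json`);
  if (!res.ok) throw new Error(`No AURA manifest at ${origin}`);
  return (await res.json()) as AuraManifest;
}

// Invoke a declared capability directly: no screen scraping, no DOM parsing.
async function invoke(origin: string, capabilityId: string): Promise<unknown> {
  const manifest = await loadManifest(origin);
  const cap = manifest.capabilities.find((c) => c.id === capabilityId);
  if (!cap) throw new Error(`Site does not grant capability "${capabilityId}"`);
  const res = await fetch(`${origin}${cap.endpoint}`, { method: cap.method });
  return res.json();
}

// Usage: an agent asked to "list all the blog posts" would look for a matching
// capability and call it, instead of crawling the HTML.
invoke("http://localhost:3000", "list_posts").then(console.log).catch(console.error);

Whatever the final schema ends up being, the point is the shift in control: the site enumerates what an agent may do, and the agent never has to guess.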
