The /ai Standard
The web was built for human browsers.
AI agents are a fundamentally different client.
Every time an AI agent visits your site, it processes 10,000–50,000 tokens — HTML, scripts, styles, ads — just to find the few hundred tokens it actually needs. /ai ends that.
robots.txt · don't go
sitemap.xml · go here
llms.txt · context
/ai · do this

Parsing the full page: 10k–50k tokens · varies by page complexity · mostly noise
Fetching /ai: ~800 tokens · pure signal · 0% noise
60× fewer tokens. Zero parsing. Instant, reliable access to exactly what the agent needs. Full story →
0 · Indexed services
1.0 · Spec version
10 min · To implement
Works on every website
Install the MCP server once. Your AI agent auto-discovers what any site can do — even if the site hasn't implemented /ai yet.
Direct check
Agent fetches site.com/ai directly. If the site implements the standard, done — ~800 tokens, 100% accurate.
Community registry
No /ai? Check if another agent already generated a spec. 100+ popular services are pre-indexed and growing.
Agent generates
Still nothing? Your agent analyzes the site and creates the spec. It's saved to the registry — every future agent benefits.
Your AI agent pays the generation cost once. The community shares the result forever. Learn more →
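Stitched together, the three steps above form a simple fallback chain. Here is a minimal sketch in Python; the helper names `fetch_spec`, `registry_lookup`, and `generate_spec` are illustrative placeholders, not part of the published spec:

```python
from typing import Callable, Optional

def discover(
    site: str,
    fetch_spec: Callable[[str], Optional[dict]],      # GET https://site/ai, None if absent
    registry_lookup: Callable[[str], Optional[dict]], # community registry, None on miss
    generate_spec: Callable[[str], dict],             # agent analyzes the site itself
) -> dict:
    # 1. Direct check: the site may serve /ai itself (~800 tokens, authoritative).
    spec = fetch_spec(f"https://{site}/ai")
    if spec is not None:
        return spec
    # 2. Community registry: another agent may already have generated a spec.
    spec = registry_lookup(site)
    if spec is not None:
        return spec
    # 3. Generate once; saving the result to the registry means the cost is
    #    paid a single time and every future agent benefits.
    return generate_spec(site)
```

Injecting the three lookups as callables keeps the cascade itself trivial to test and leaves transport details (HTTP client, registry API) to the agent.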
Start here
Choose what fits you best
I have a web service
Add /ai in 10 minutes →
Return structured JSON from /ai. Register. Get a badge. AI agents can find and call you instantly.
I'm building an AI agent
Install MCP & auto-discover →
Your agent checks /ai first, falls back to the registry, and auto-generates specs for unknown sites. Zero config.
I want to understand
Read why →
Why a new standard? How does it compare to MCP, OpenAPI, llms.txt? What problem does it actually solve?
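For the service-owner path, implementing /ai amounts to serving one static JSON document. A minimal sketch as a plain Python WSGI app — the spec is language-agnostic, so any stack that returns this JSON at /ai works; the fields mirror the minimal response shown below:

```python
import json

# The spec document itself: service identity plus a list of callable capabilities.
AI_SPEC = {
    "aiendpoint": "1.0",
    "service": {
        "name": "My Service",
        "description": "Concise description for AI agents",
        "category": ["productivity"],
    },
    "capabilities": [
        {
            "id": "search_items",
            "description": "Search items by keyword",
            "endpoint": "/api/search",
            "method": "GET",
            "params": {"q": "keyword (string, required)"},
            "returns": "items[] with id, name, price",
        }
    ],
}

def app(environ, start_response):
    # Serve the spec at /ai; every other path falls through to 404 in this sketch.
    if environ.get("PATH_INFO") == "/ai":
        body = json.dumps(AI_SPEC).encode("utf-8")
        start_response("200 OK", [("Content-Type", "application/json")])
        return [body]
    start_response("404 Not Found", [("Content-Type", "text/plain")])
    return [b"not found"]
```

In practice this is one route in whatever framework you already run; the document is static, so it can also be served as a plain file with the right content type.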
Minimal /ai response
{
"aiendpoint": "1.0",
"service": {
"name": "My Service",
"description": "Concise description for AI agents",
"category": ["productivity"]
},
"capabilities": [
{
"id": "search_items",
"description": "Search items by keyword",
"endpoint": "/api/search",
"method": "GET",
"params": { "q": "keyword (string, required)" },
"returns": "items[] with id, name, price"
}
]
}

Use with your AI agent
Auto-discover any website's capabilities from Claude, Cursor, or Claude Code.
Claude Desktop
{
"mcpServers": {
"aiendpoint": {
"command": "npx",
"args": ["-y",
"@aiendpoint/mcp-server"]
}
}
}

Cursor
{
"mcpServers": {
"aiendpoint": {
"command": "npx",
"args": ["-y",
"@aiendpoint/mcp-server"]
}
}
}

Claude Code
skills.sh ↗
npx skills add aiendpoint/platform --skill aiendpoint
Adds /ai to your own service from inside Claude Code.
Open standard · Apache 2.0
The web got better when everyone added robots.txt.
It's happening again.
Every service that implements /ai reduces wasted tokens across the entire AI ecosystem — not for one company, but for everyone building with AI. The spec is open. Implementation takes 10 minutes.