A new open-source project called RestaRules wants to do for AI booking agents what robots.txt did for web crawlers: give the people being acted upon a simple, self-hosted way to set the rules before anyone else does it for them. The proposal is straightforward — restaurants publish a JSON file at /.well-known/agent-venue-rules.json spelling out how any AI agent, whether a voice concierge, a booking assistant, or a search tool, is expected to behave before taking action on a customer's behalf.
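In practice, the first step for a compliant agent is nothing more than a well-known URL lookup before it ever dials a phone number or submits a form. A minimal sketch of that lookup (the venue domain here is invented; only the /.well-known/agent-venue-rules.json path comes from the proposal):

```python
# Build the rules-file URL an agent would fetch for a given venue origin,
# following the well-known-path convention the proposal describes.
WELL_KNOWN_PATH = "/.well-known/agent-venue-rules.json"

def rules_url(venue_origin: str) -> str:
    """Return the URL where a venue's agent rules are expected to live."""
    return venue_origin.rstrip("/") + WELL_KNOWN_PATH

# Hypothetical venue domain, for illustration only:
print(rules_url("https://bellanotte.example"))
# -> https://bellanotte.example/.well-known/agent-venue-rules.json
```

An agent that fetches this URL and gets a 404 simply has no venue-published rules to follow, which is exactly the situation robots.txt-style conventions are designed to make rare.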

The schema, built on JSON Schema Draft 2020-12 and validated at runtime using Ajv, covers the practical specifics of agent conduct: mandatory disclosure phrasing, permitted communication channels (phone, web, SMS, email, and app), rate limits, party size caps, deposit and cancellation terms, and anti-resale constraints. Its most deliberate design choice is the default_policy field, which requires every venue file to declare either deny_if_unspecified or allow_if_unspecified — forcing operators to take a position on what happens when a rule is simply absent, rather than leaving agents to guess. The project recommends deny_if_unspecified for most venues. A ten-step decision procedure tells agents exactly how to work through a venue's file before acting.
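The default_policy mechanic is easiest to see in a small sketch. The field names below are illustrative assumptions loosely modelled on the categories the schema covers, not the project's actual keys; only default_policy and its two values come from the proposal itself:

```python
import json

# A hypothetical venue-rules file with invented field names
# (disclosure, channels, limits) standing in for the real schema.
venue = json.loads("""
{
  "default_policy": "deny_if_unspecified",
  "disclosure": "This call is placed by an AI agent on behalf of a customer.",
  "channels": ["phone", "web"],
  "limits": {"max_party_size": 8, "requests_per_hour": 4}
}
""")

def channel_allowed(rules: dict, channel: str) -> bool:
    """If the venue lists permitted channels, only those pass.
    If the rule is simply absent, the declared default_policy decides."""
    channels = rules.get("channels")
    if channels is not None:
        return channel in channels
    return rules.get("default_policy") == "allow_if_unspecified"

print(channel_allowed(venue, "phone"))  # True: explicitly permitted
print(channel_allowed(venue, "sms"))    # False: not in the published list

# A venue that omits the channels rule entirely falls through to its default:
silent = {"default_policy": "deny_if_unspecified"}
print(channel_allowed(silent, "sms"))   # False under deny_if_unspecified
```

The point of the mandatory field is visible in the last case: with deny_if_unspecified, silence means "no", so an agent never has to guess what an absent rule implies.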

RestaRules deliberately stops short of handling reservations itself, deferring availability queries and payments to complementary protocols like AgenticBooking and UCP. That layered approach is sensible on paper, but it multiplies the adoption problem considerably. For the standard to be useful, restaurants need to publish RestaRules files and the agents querying them need to actually check: a chicken-and-egg dynamic familiar from every web standard that launched without tooling on both sides. At version 0.2, the project ships a reference CLI demo and example venue files, including one for the fictional Bella Notte Trattoria served via GitHub Pages. What it doesn't have yet is any evidence of real-world restaurant adoption or agent-side integration beyond its own demo. To its credit, the README doesn't claim otherwise, but the gap between a well-designed schema and industry uptake is where most open standards quietly expire.

The robots.txt comparison is the project's strongest argument, and it's an honest one. When web crawlers proliferated in the early 1990s, website operators had no mechanism to signal behavioural expectations until a simple, informal convention — never formally standardised, never enforced — gradually became the default assumption baked into every serious crawler. RestaRules is betting the hospitality industry is at a structurally similar inflection point as AI agents begin handling bookings at scale. The question isn't whether restaurants need a mechanism like this. It's whether a volunteer-maintained JSON schema can establish itself before a platform with a business model — or a regulator with a mandate — decides to answer the question instead.