Manifesto

The web wasn't built for AI

Every day, billions of AI agent requests hit websites designed exclusively for human eyeballs. The result? Wasted tokens, hallucinated facts, and an invisible tax on the entire AI ecosystem.

The Widening Gap

Optimized for humans. Hostile to machines.

Modern web applications ship hundreds of kilobytes to megabytes of HTML, CSS, and JavaScript per page. Within that payload, only a few kilobytes of actual business-critical text — prices, descriptions, documentation, policies — matter for AI consumption.

When AI crawlers ingest these pages, their token budgets are burned on layout divs, Tailwind utility classes, tracking scripts, and animation wrappers. The semantic signal drowns in visual noise.

95%+ of typical page bytes contribute nothing to AI understanding.
10-100x payload reduction is possible with clean Markdown extraction (sketched below).
3x increase in AI agent traffic to content sites in the past year.
$0.00 is what most sites invest in AI-readability optimization today.
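
What does that extraction look like in practice? A minimal sketch using the open-source turndown library; the HTML fragment and product details are illustrative, not a real page:

```typescript
// Convert a noisy, class-laden HTML fragment into clean Markdown.
// Uses the open-source "turndown" library (npm install turndown).
import TurndownService from "turndown";

const html = `
  <div class="flex flex-col gap-2 p-4 shadow-lg">
    <h2 class="text-xl font-bold tracking-tight">Acme Widget</h2>
    <p class="text-sm text-gray-500">In stock. Ships in 2 days.</p>
    <span class="text-lg">$49.00</span>
  </div>`;

const markdown = new TurndownService({ headingStyle: "atx" }).turndown(html);
console.log(markdown);
// Prints roughly:
//   ## Acme Widget
//
//   In stock. Ships in 2 days.
//
//   $49.00
```

The utility classes, wrappers, and layout scaffolding vanish; the prices and descriptions an agent actually needs survive.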

Ontology: the study of being, abstracted from appearance

Onto borrows its name from philosophy. Just as ontology studies the nature of existence independent of how things look, Onto separates what your website is from how it appears. The same content, faithfully served in two formats: rich HTML for human browsers, clean Markdown for AI agents.

“We don't create a shadow web for bots. We faithfully represent the same underlying content in multiple formats — HTML for humans, Markdown for agents — rather than cloaking or injecting deceptive content.”

— Onto Design Philosophy

The Business Risk

What happens without AI optimization

Hallucinated Facts

AI models fill in gaps with plausible-sounding fiction. Wrong prices, outdated features, fabricated policies — all attributed to your brand.

"Nike offers free returns on all items" → Actually only within 30 days, with exceptions

Invisible to AI Search

AI-powered search engines and shopping assistants can't extract your value proposition from React noise. Your competitors with cleaner markup win the citation.

Perplexity answers "best project management tool" → Cites competitor's clean docs, skips yours

Token Cost Explosion

Every API call that ingests your page burns 10-100x more tokens than necessary, so your customers pay more for AI integrations that understand your content less.

596KB HTML page → 148,000 tokens per query → $1.48/call instead of $0.01
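
Back-of-the-envelope: at roughly 4 bytes of text per token, a 596KB page lands near 148,000 tokens. A minimal sketch of the math, assuming an illustrative price of $10 per million input tokens rather than any particular vendor's rate:

```typescript
// Rough per-call cost of stuffing a page into an LLM context.
// $10 per million input tokens is an illustrative assumption.
const USD_PER_MILLION_TOKENS = 10;

function costPerCall(tokens: number): number {
  return (tokens / 1_000_000) * USD_PER_MILLION_TOKENS;
}

console.log(costPerCall(148_000).toFixed(2)); // "1.48" -- the 596KB HTML page
console.log(costPerCall(1_250).toFixed(2));   // "0.01" -- a ~5KB Markdown equivalent
```
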
Industry Validation

We're not alone in seeing this

Vercel: 99% reduction

Serving Markdown instead of HTML to AI agents reduces payload sizes by ~99%. Complex React pages drop from hundreds of KBs to a few KBs of pure Markdown.

Cloudflare: 10x token savings

Our 'Markdown for Agents' feature converts HTML to Markdown at the edge when requests carry Accept: text/markdown, positioning AI crawlers as first-class citizens.

Anthropic Research: higher hallucination rates

Token noise — the ratio of irrelevant boilerplate to semantic text — directly degrades a model's ability to extract accurate facts. The 'Lost in the Middle' problem.

The dual-representation web is coming

Large infrastructure providers expect a web where the same URL yields multiple formats, chosen via the Accept header and bot detection. Early adopters gain better AI-driven discovery, richer citations, and more accurate summaries.
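
The mechanics are plain HTTP. Here is a minimal sketch of that negotiation, assuming an Express-style server with hypothetical renderHtml and renderMarkdown helpers; it is illustrative, not Onto's implementation:

```typescript
// Serve the same URL as HTML or Markdown based on the Accept header.
import express from "express";

const app = express();

// Hypothetical renderers -- a real system would call your CMS or a
// converter, and could add User-Agent bot detection as a second signal.
const renderHtml = (path: string): string => `<html><body>${path}</body></html>`;
const renderMarkdown = (path: string): string => `# ${path}\n\nClean content.`;

app.get("*", (req, res) => {
  // req.accepts() picks the best match from the request's Accept header.
  const wantsMarkdown =
    req.accepts(["text/html", "text/markdown"]) === "text/markdown";

  if (wantsMarkdown) {
    res.type("text/markdown").send(renderMarkdown(req.path));
  } else {
    res.type("text/html").send(renderHtml(req.path));
  }
});

app.listen(3000);
```

An agent then asks for the lean view explicitly, e.g. curl -H "Accept: text/markdown" https://example.com/pricing, while browsers keep receiving full HTML from the same URL.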

Laggards risk being misrepresented — or worse, omitted entirely — in AI interfaces that are rapidly becoming the new search front-ends.

Don't wait for the shift

See how your site looks to AI agents right now. It takes 10 seconds.