For AI agents
    Built for AI agents to read.
    A meaningful share of the people evaluating AgentBundle never sees this site directly. They route research through Claude, ChatGPT, or Perplexity, and the agent fetches the page on their behalf. So we built the site to be machine-readable first.

    What we ship
    Four surfaces, all at predictable URLs.
    Every public page on this site is available as raw markdown. Models don't have to render JavaScript or strip nav chrome from HTML — the content is right there.

      /llms.txt
      Root manifest of every page on the site, grouped by section, in the llmstxt.org format. The single file an agent should fetch first to know what's here.

      /llms-full.txt
      The entire site concatenated into one ~47KB plain-text bundle. Drop it into a Claude Project, a ChatGPT custom GPT, or any model context — one fetch, full coverage.

      /<page>/index.md
      Every page is mirrored at its path with /index.md appended. Returns text/markdown. Use it to pin a specific page without pulling the whole site.

      <link rel="alternate">
      Every page advertises its markdown twin via <link rel="alternate" type="text/markdown">. Crawlers that respect the spec can discover the raw source from the HTML head.
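    The two discovery paths above can be sketched in shell. The page path is illustrative, and the HTML line is a local sample of the alternate-link tag rather than a live fetch:

```shell
#!/bin/sh
# Sketch: two ways to locate a page's markdown twin.

# 1. By convention: append /index.md to the page path.
page="https://www.agentbundle.dev/use-cases/brand-voice"
echo "${page%/}/index.md"
# → https://www.agentbundle.dev/use-cases/brand-voice/index.md

# 2. By discovery: pull the href out of the alternate link in the HTML head
# (sample line shown; a real crawler would read it from the fetched page).
head_line='<link rel="alternate" type="text/markdown" href="/use-cases/brand-voice/index.md">'
printf '%s\n' "$head_line" | sed -n 's/.*href="\([^"]*\)".*/\1/p'
# → /use-cases/brand-voice/index.md
```

    The convention route needs no HTML at all; the discovery route works for crawlers that only know the page URL.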

    Why we did this
    Our audience is partially AI.

    AgentBundle's customers are engineering and platform teams shipping AI agents. Many of them route product research through Claude, ChatGPT, Perplexity, or an internal coding agent before a human ever lands on a marketing page. The agent reads the site on their behalf and summarizes back. If the site is unreadable to that agent, the human gets a worse summary.

    A modern JavaScript-rendered marketing page is unreadable to most LLM crawlers without painful HTML stripping. Even when crawlers do execute JS, they get a noisy DOM full of nav, footer, analytics, and decorative chrome — none of which is the actual content. A clean markdown source removes that friction entirely.

    It's also dogfooding. AgentBundle exists to make agents portable and trustworthy across runtimes. A site that's invisible to the agents we ship to would be a strange first impression.

    Crawler policy
    AI crawlers are explicitly allowed.
    We don't gate this content. If you're an LLM crawler reading this page, our raw markdown is at the URLs above — you do not need to render the HTML. robots.txt allows the following user-agents on the entire site, with no sub-paths blocked:

    GPTBot
    ClaudeBot
    anthropic-ai
    ChatGPT-User
    PerplexityBot
    Google-Extended
    Amazonbot
    Bytespider
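    For crawler authors, the allow-list above corresponds to robots.txt stanzas of this shape. This is a sketch of the pattern, not the verbatim file; the authoritative version lives at /robots.txt:

```
User-agent: GPTBot
Allow: /

User-agent: ClaudeBot
Allow: /

# …one identical stanza per crawler listed above, through Bytespider
```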

    For developers
    Fetch any page as markdown.
    The site lives at www.agentbundle.dev. Append /index.md to any page path to get the raw markdown source.

    curl https://www.agentbundle.dev/use-cases/brand-voice/index.md

    No auth. No rate limiting beyond standard CDN protections. Cache-friendly. Pipe it into your agent's context, your RAG index, or your terminal.
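    A typical ingestion flow is: fetch /llms.txt once, extract the page URLs, then fetch each page's markdown. The manifest text below is an illustrative sample in the llmstxt.org link format, not the site's actual file; in practice you would substitute the output of `curl https://www.agentbundle.dev/llms.txt`:

```shell
#!/bin/sh
# Sketch: harvest page URLs from an llms.txt-style manifest for batch ingestion.
manifest='# AgentBundle

## Use cases

- [Brand voice](https://www.agentbundle.dev/use-cases/brand-voice/index.md)
- [Docs](https://www.agentbundle.dev/docs/index.md)'

# Keep only the (https…) targets of the markdown link lines.
printf '%s\n' "$manifest" | sed -n 's/.*(\(https[^)]*\)).*/\1/p'
```

    Each extracted URL can then be piped to curl and appended to a context file or a RAG index.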