Your website has a robots.txt for search engine crawlers and a sitemap.xml for indexing. But neither was designed for the AI agents that are rapidly becoming the primary way people find information online. LLMs.txt is a Markdown file that fills that gap — giving ChatGPT, Claude, Perplexity, and other AI platforms a structured summary of what your site is about and where to find the content that matters.
The protocol has been adopted by companies including Anthropic, Stripe, Cloudflare, Shopify, and Vercel. BuiltWith tracks over 844,000 implementations. But does your website actually need one? The answer depends on what you are trying to achieve and how much you understand about what llms.txt actually does — and does not — do.
Key Takeaways
- LLMs.txt is a Markdown file at your domain root that gives AI agents a curated table of contents for your website — proposed by Jeremy Howard in 2024, now adopted by 844,000+ sites.
- No major AI platform has confirmed using llms.txt as a ranking or citation signal — but Anthropic, Stripe, Cloudflare, and Shopify have all implemented it as a forward-looking standard.
- The file takes 30 minutes to create, costs nothing, and follows a simple format: H1 title, optional summary, and H2 sections with curated links to your most important pages.
- LLMs.txt complements — not replaces — structured data, content clarity, and technical accessibility, which remain the primary signals that determine whether AI platforms cite your brand.
- The companion file llms-full.txt provides detailed content for AI agents that need deeper context, while the base file stays concise for quick consumption.
What Is LLMs.txt?
LLMs.txt is a Markdown-formatted file placed at your domain root (yoursite.com/llms.txt) that provides AI agents with a curated, machine-readable summary of your website. The protocol was proposed by Jeremy Howard — creator of fast.ai — in September 2024 as a way to help language models access website information more efficiently.
The concept is straightforward. Large language models work with limited context windows. They cannot efficiently process an entire website. A sitemap gives them hundreds or thousands of URLs with no prioritisation. Robots.txt tells them what they cannot crawl, not what they should focus on. LLMs.txt fills the middle ground: a selective guide to your most important content, written in the format AI models understand best — Markdown.
The specification requires a specific structure:
- H1 title — your site or project name (required)
- Blockquote — a brief summary of what the site does (optional)
- Body content — additional context in any Markdown format except headings (optional)
- H2 sections — categorised lists of links to your key pages, each with a URL and optional description
There is also a companion file, llms-full.txt, which contains the same structure but with expanded content — full page text, detailed descriptions, and supplementary context that would make the base file too large for quick consumption.
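To make the required structure concrete, here is a short Python sketch that parses an llms.txt string into its title, summary, and H2 link sections. The parsing rules simply mirror the spec above; this is an illustration, not an official parser:

```python
import re

def parse_llms_txt(text: str) -> dict:
    """Parse an llms.txt document into title, summary, and H2 link sections.

    A minimal sketch of the structure described above, not an official library.
    """
    doc = {"title": None, "summary": None, "sections": {}}
    current = None
    for line in text.splitlines():
        line = line.rstrip()
        if line.startswith("# ") and doc["title"] is None:
            doc["title"] = line[2:].strip()  # required H1 title
        elif line.startswith("> "):
            # optional blockquote summary (may span several lines)
            part = line[2:].strip()
            doc["summary"] = part if doc["summary"] is None else doc["summary"] + " " + part
        elif line.startswith("## "):
            current = line[3:].strip()  # H2 section heading
            doc["sections"][current] = []
        elif current and line.startswith("- "):
            # link lines follow "- [Name](URL): optional description"
            m = re.match(r"- \[(.+?)\]\((\S+?)\)(?::\s*(.*))?", line)
            if m:
                name, url, desc = m.groups()
                doc["sections"][current].append(
                    {"name": name, "url": url, "desc": desc or ""}
                )
    return doc
```

Any AI agent or tool consuming the file would apply roughly this logic: take the H1 as the site identity, the blockquote as the elevator pitch, and each H2 section as a themed reading list.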
Does Your Website Actually Need One?
Here is where the conversation gets more nuanced than most guides suggest.
What the Adoption Data Says
Adoption is growing, but it is still early days. SE Ranking's analysis of 300,000 domains found a roughly 10% adoption rate. The sites implementing it skew heavily toward technology companies, developer documentation platforms, and SaaS products — sectors where AI-readiness is already a strategic priority.
What the Platform Support Data Says
This is the part most llms.txt guides gloss over. No major AI platform has publicly confirmed that llms.txt influences how they retrieve, rank, or cite content. Google has stated that AI Overviews continue to rely on traditional search signals. There is no published evidence that ChatGPT, Perplexity, or Claude use llms.txt as a primary retrieval input.
That does not mean the file is useless. It means the file is a forward-looking investment, not a proven ranking lever. The distinction matters because it determines how much time and effort you should dedicate to it relative to other AI optimisation priorities.
The Pragmatic View
LLMs.txt makes sense if you think about it as one layer of a broader AI visibility strategy — not the strategy itself. The signals that currently determine whether AI platforms cite your brand are structured data, content clarity, entity authority, and technical accessibility. Those signals are well-documented, measurable, and confirmed by how AI retrieval systems work.
Adding llms.txt on top of those foundations takes 30 minutes, costs nothing, and positions you for a future where AI agents may start consuming these files as a standard input — similar to how robots.txt went from a loose convention to a fundamental web protocol.
Bottom line: yes, your website should probably have an llms.txt file. But treat it as the finishing touch on a well-optimised site, not the starting point.
How to Create an LLMs.txt File: Step-by-Step
Step 1: Identify Your Key Pages
LLMs.txt is not a sitemap. You are not listing every URL — you are curating the 10–30 pages that best represent your business, expertise, and offerings. Think about it as answering: "If an AI agent could only read a handful of pages on my site, which ones would give it the best understanding of who we are and what we do?"
Typical selections include:
- Homepage — your core value proposition
- Product or service pages — what you sell and why it matters
- About page — who you are, your credentials, your history
- Key blog posts or guides — your highest-authority content
- Pricing page — if publicly available
- Contact or support — how to reach you
- Policy pages — terms, privacy, any industry-specific compliance
Step 2: Write the File
Create a file called llms.txt (plain text, Markdown format) and structure it like this:
```markdown
# Your Company Name

> Brief description of what your company does,
> who it serves, and what makes it different.

Additional context about your business, products,
or services. Keep this concise — a few sentences
that give an AI agent the essential background.

## Core Pages

- [Homepage](https://yoursite.com): Main landing page with value proposition and services overview
- [About Us](https://yoursite.com/about): Company background, team, mission
- [Pricing](https://yoursite.com/pricing): Service tiers and pricing details

## Products & Services

- [Product A](https://yoursite.com/products/a): Description of what Product A does
- [Product B](https://yoursite.com/products/b): Description of what Product B does

## Resources

- [Blog](https://yoursite.com/blog): Industry insights and guides
- [Getting Started Guide](https://yoursite.com/docs/getting-started): Step-by-step onboarding
- [API Documentation](https://yoursite.com/docs/api): Full API reference

## Legal

- [Terms of Service](https://yoursite.com/terms): Service agreement
- [Privacy Policy](https://yoursite.com/privacy): Data handling and privacy practices
```
Each link follows the format `[Page Name](URL): Brief description`. The description is optional but recommended — it helps AI agents understand what they will find on the page without having to fetch it.
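If you generate the file from a CMS or build script rather than writing it by hand, the format is easy to assemble programmatically. A small Python sketch (the function names and section data are illustrative, not part of any standard tooling):

```python
def link_line(name: str, url: str, desc: str = "") -> str:
    """Render one llms.txt link entry: "- [Name](URL): description"."""
    line = f"- [{name}]({url})"
    return f"{line}: {desc}" if desc else line

def build_llms_txt(title: str, summary: str, sections: dict) -> str:
    """Assemble a minimal llms.txt document from a title, a one-line summary,
    and a {section heading: [(name, url, desc), ...]} mapping. Illustrative only.
    """
    parts = [f"# {title}", "", f"> {summary}", ""]
    for heading, links in sections.items():
        parts.append(f"## {heading}")
        parts.extend(link_line(*entry) for entry in links)
        parts.append("")  # blank line between sections
    return "\n".join(parts).rstrip() + "\n"
```

Regenerating the file as part of your build or publish pipeline also solves the maintenance problem discussed in Step 5: the curated list stays in sync with the pages it describes.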
Step 3: Create llms-full.txt (Optional but Recommended)
The companion file llms-full.txt follows the same structure but includes expanded content. Where the base file links to your pricing page with a one-line description, the full file might include the actual pricing details inline. Where the base file links to a guide, the full file might include the full text.
This is particularly useful for AI agents with larger context windows that can consume more information in a single pass. If your site has complex products, technical documentation, or detailed service descriptions, an llms-full.txt file gives AI agents the depth they need to answer specific questions about your business.
Step 4: Deploy and Verify
Place the file so it is accessible at yoursite.com/llms.txt. Depending on your platform:
- Static sites (Next.js, Hugo, Gatsby): Serve it from the `public/` directory, or create a route handler that returns the Markdown content with a `text/plain` content type
- WordPress: Use a plugin like Yoast (which now supports llms.txt generation), or manually place the file in your root directory
- Shopify: Shopify has activated llms.txt support natively — check your storefront settings
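If your platform cannot be configured to serve the file with the right headers, a tiny route handler does the job. This WSGI sketch in Python is illustrative only — a Next.js route handler or any other framework route follows the same pattern: read the file, set the Content-Type to text/plain, return the bytes.

```python
def make_llms_app(content: bytes):
    """Return a minimal WSGI app serving the given bytes at /llms.txt.

    A sketch of the idea, not a production handler: the key detail is the
    text/plain Content-Type, so browsers and crawlers see raw Markdown.
    """
    def app(environ, start_response):
        if environ.get("PATH_INFO") == "/llms.txt":
            start_response("200 OK", [
                ("Content-Type", "text/plain; charset=utf-8"),
                ("Content-Length", str(len(content))),
            ])
            return [content]
        start_response("404 Not Found", [("Content-Type", "text/plain")])
        return [b"not found"]
    return app
```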
After deployment, verify:
- Visit `yoursite.com/llms.txt` in a browser — you should see raw Markdown, not an HTML page
- Check that all linked URLs return 200 status codes — broken links mean the AI gets incomplete context
- Confirm the file starts with an H1 heading (`# Title`)
- Keep the file under 50KB — concise is better than comprehensive
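The content checks in that list are simple enough to automate. A Python sketch that validates raw file content locally — the 50KB ceiling and H1 requirement come straight from the checklist, while the live 200-status check on each URL is left as a stub since it needs a network call:

```python
import re

MAX_BYTES = 50 * 1024  # keep the file under 50KB, per the checklist above

def validate_llms_txt(content: str) -> list[str]:
    """Return a list of problems found in raw llms.txt content (empty = OK)."""
    problems = []
    if not content.lstrip().startswith("# "):
        problems.append("file does not start with an H1 heading")
    if len(content.encode("utf-8")) > MAX_BYTES:
        problems.append("file exceeds 50KB")
    if "<html" in content.lower():
        problems.append("looks like an HTML page, not raw Markdown")
    return problems

def linked_urls(content: str) -> list[str]:
    """Extract linked URLs so each can be checked for a 200 status code
    (the HTTP request itself is left to the reader, e.g. via requests)."""
    return re.findall(r"\]\((https?://[^)\s]+)\)", content)
```

Running this against the fetched contents of yoursite.com/llms.txt catches the most common deployment mistakes: an HTML error page served in place of the file, or a missing H1.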
Step 5: Maintain It
LLMs.txt is not a set-and-forget file. When you add major new pages, launch new products, or restructure your site, update the file. An outdated llms.txt that points AI agents to deleted pages or deprecated products does more harm than having no file at all.
What LLMs.txt Does Not Replace
A common misconception is that llms.txt is a shortcut to AI visibility. It is not. The file tells AI agents where your content lives — it does not make that content worth citing.
The factors that actually determine whether AI platforms cite your brand are deeper:
- Structured data (JSON-LD) gives AI models a machine-readable identity for your business, products, and content — this is the single highest-impact technical signal for AI discoverability
- Content clarity determines whether AI agents can extract clean, citable statements from your pages — vague marketing copy gets skipped, specific factual claims get cited
- Entity authority builds over time through consistent information across the web — knowledge graphs, citations, brand mentions, and AI-readable trust signals
- Technical accessibility ensures AI crawlers can actually reach and parse your pages — JavaScript rendering, meta tags, and robots directives all play a role
LLMs.txt is useful as a navigation layer on top of these foundations. Without them, the file is a table of contents pointing to content that AI agents will not cite regardless.
Who Should Prioritise LLMs.txt Now?
Some sites benefit more from early implementation than others:
High priority: SaaS companies, developer tools, documentation-heavy products, API providers, and ecommerce stores with large catalogues. These sites have complex structures that benefit from curated AI navigation. If you run an ecommerce store specifically, our ecommerce llms.txt guide covers product-specific implementation patterns.
Medium priority: Professional services firms, agencies, B2B companies, and content publishers. These sites have moderate complexity and would benefit from helping AI agents understand their service offerings and expertise areas.
Lower priority (but still worth doing): Local businesses, personal sites, and small portfolios. The investment is minimal — but the return is also smaller because these sites are typically simple enough for AI agents to understand without a curated guide.
The Bottom Line
LLMs.txt is a 30-minute investment that positions your website for the direction AI search is heading. It will not replace the fundamentals — structured data, content clarity, and technical accessibility are still what determine whether AI platforms cite your brand. But as AI agents become the primary interface between users and the web, having a machine-readable table of contents for your site moves from "nice to have" to "expected standard."
Create the file. Keep it current. Then focus your real energy on making the content it points to worth citing.