The llms.txt file is a Markdown document that helps AI systems (ChatGPT, Claude, Perplexity) understand your website. Place it at yoursite.com/llms.txt so AI can quickly learn what your business does, which pages matter, and how to describe you accurately.
What is llms.txt?
The llms.txt file is a standardized plain text document that helps artificial intelligence systems understand your website's content, structure, and purpose. Think of it as a "cheat sheet" for AI, a concise summary that tells ChatGPT, Claude, and other large language models exactly what your business does and where to find your most important content.
The standard was proposed by Jeremy Howard, founder of fast.ai and Answer.AI, as a solution to a growing problem: AI systems struggle to parse complex websites. They get lost in navigation menus, miss important pages, and often misrepresent businesses in their responses.
The file in a nutshell
- Lives at: yourdomain.com/llms.txt
- Format: Plain text with Markdown syntax
- Contains: Business description + page URLs + descriptions
- Purpose: Help AI understand and accurately represent your site
Without an llms.txt file, AI systems must crawl your entire website, parse complex HTML, and guess what matters. This often leads to AI hallucinations, where the model invents information about your business that isn't true, or to your site being ignored entirely in favor of competitors who have made their content more accessible.
Live examples from real websites
Want to see what llms.txt files look like in practice? Here are real examples from companies that have already implemented the standard:
- Anthropic (https://docs.anthropic.com/llms.txt): Creator of Claude AI. Their llms.txt helps other AI systems understand their documentation.
- Cursor (https://cursor.com/llms.txt): The AI-first code editor. Uses llms.txt to document their features and pricing.
- Perplexity (https://docs.perplexity.ai/llms.txt): AI-powered search engine. Their docs include an llms.txt for API documentation.
- Stripe (https://docs.stripe.com/llms.txt): Payment infrastructure. Uses llms.txt for their extensive developer documentation.
Pro tip: Check any website's llms.txt by adding /llms.txt to the domain. Try it with your favorite sites!
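If you want to run that check from a script instead of a browser, here is a minimal Python sketch that requests /llms.txt from a domain and prints the first line when the file exists. The domain list is just the examples above plus a placeholder; it is not an official tool.

```python
import urllib.error
import urllib.request

def fetch_llms_txt(domain: str):
    """Request https://<domain>/llms.txt and return its text, or None if it isn't there."""
    url = f"https://{domain}/llms.txt"
    try:
        with urllib.request.urlopen(url, timeout=10) as resp:
            if resp.status == 200:
                return resp.read().decode("utf-8", errors="replace")
    except (urllib.error.URLError, TimeoutError):
        pass
    return None

for domain in ["docs.anthropic.com", "docs.perplexity.ai", "yoursite.com"]:
    text = fetch_llms_txt(domain)
    print(f"{domain}: {text.splitlines()[0] if text else 'no llms.txt found'}")
```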
File structure & syntax
An llms.txt file uses Markdown syntax, a simple formatting language that's readable by both humans and machines. Here's the complete structure:
# Your Website Name

> A comprehensive description of your business, products,
> or services. This should be 2-4 sentences that capture
> your core value proposition and target audience.

## Main Pages

- [Homepage](https://yoursite.com/): Brief description of what visitors find on your main page.
- [Products](https://yoursite.com/products): Overview of your product offerings with key features.
- [Pricing](https://yoursite.com/pricing): Pricing tiers and what's included in each plan.
- [Documentation](https://yoursite.com/docs): Technical guides and API reference.

## Resources

- [Blog](https://yoursite.com/blog): Industry insights, tutorials, and company updates.
- [Case Studies](https://yoursite.com/cases): Success stories from customers.

## Optional

- [Additional links that are less critical]
Key elements explained
- The # heading: your website or company name. Only one per file.
- The > blockquote: your business description. This is critical: it's the context AI uses to understand everything else.
- The ## section headings: organize your pages into logical groups like 'Products', 'Resources', 'Support'.
- The link entries: Markdown links with descriptions, written as - [Title](URL): Description.
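To see how these elements map to data a machine can actually use, here is a minimal, hypothetical Python parser for the structure shown above. Treat it only as an illustration of how the pieces fit together; the ecosystem has real parsing tools (see Tools & plugins below).

```python
import re

def parse_llms_txt(text: str) -> dict:
    """Split an llms.txt document into its title, summary, and link sections."""
    # The single # heading holds the site or company name.
    title_match = re.search(r"^# (.+)$", text, re.MULTILINE)
    # The > blockquote lines form the business description.
    summary = " ".join(
        line.lstrip("> ").strip()
        for line in text.splitlines()
        if line.startswith(">")
    )
    # Each ## heading starts a section; each "- [Title](URL): Description" line is a page entry.
    sections, current = {}, None
    for line in text.splitlines():
        if line.startswith("## "):
            current = line[3:].strip()
            sections[current] = []
        elif current and line.startswith("- ["):
            match = re.match(r"- \[(.+?)\]\((.+?)\)(?::\s*(.*))?", line)
            if match:
                sections[current].append(
                    {"title": match.group(1), "url": match.group(2), "description": match.group(3) or ""}
                )
    return {
        "title": title_match.group(1) if title_match else "",
        "summary": summary,
        "sections": sections,
    }
```

Run over the template above, this yields the site name, the description, and one list of page entries per section.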
Real-world example
Example: SaaS analytics company
# DataPulse Analytics

> DataPulse provides real-time business intelligence dashboards
> for e-commerce companies. We help online retailers track sales,
> inventory, and customer behavior with AI-powered insights.
> Founded in 2022, we serve 500+ businesses from startups to
> enterprise retailers.

## Product

- [Features](https://datapulse.io/features): Real-time dashboards, AI anomaly detection, custom reports, and 50+ integrations including Shopify, WooCommerce, and BigCommerce.
- [Pricing](https://datapulse.io/pricing): Three tiers - Starter ($29/mo for up to 10k orders), Growth ($99/mo for 100k orders), and Enterprise (custom pricing with dedicated support).
- [Integrations](https://datapulse.io/integrations): Native connections to major e-commerce platforms, payment processors, and marketing tools.

## Resources

- [Documentation](https://docs.datapulse.io/): API reference, setup guides, and webhook documentation for developers.
- [Blog](https://datapulse.io/blog): E-commerce analytics tips, industry benchmarks, and product updates.
- [Case Studies](https://datapulse.io/customers): How companies like Acme Store increased revenue 34% using our insights.

## Company

- [About](https://datapulse.io/about): Our story, team, and mission to democratize e-commerce analytics.
- [Careers](https://datapulse.io/careers): Open positions in engineering, sales, and customer success.
Why llms.txt matters for AI visibility
The way people find information is undergoing its biggest shift since Google. Instead of searching, clicking links, and reading, users now ask AI directly and expect complete answers without leaving the chat.
[Adoption statistics cited here: the projected share of US adults using AI for search by 2028, the growth in AI search usage since 2024, and the share of Gen Z who prefer AI over traditional search.]
This shift has created a new field, Generative Engine Optimization (GEO): optimizing your content so AI systems can accurately understand and cite your website. The llms.txt file is the foundation of AI SEO.
What happens without llms.txt
- AI hallucinations: AI invents features you don't have or prices that are wrong
- Missed opportunities: AI recommends competitors because it can't parse your site
- Outdated information: AI cites old content because it can't find what's current
What happens with llms.txt
- Accurate representation: AI describes your business exactly as you intend
- Higher visibility: AI can confidently recommend you for relevant queries
- Competitive edge: You're ahead of the 99% of sites without llms.txt
Who uses llms.txt?
Any website that wants AI to understand its content:
- Developer docs: Help AI answer questions about your APIs and libraries
- Company websites: Let AI explain your products and services accurately
- E-commerce: Guide AI through your catalog, policies, and support
- Education: Make courses and resources discoverable to AI
- Portfolios: Help AI describe your work and experience
- Government: Make policies and regulations easier for AI to explain
How AI systems use llms.txt
Large language models process text to understand meaning and context. When AI encounters your website, it needs to quickly determine what you do and which pages matter most. Here's the process:
1. AI requests your llms.txt: When processing queries about your industry, AI checks whether yourdomain.com/llms.txt exists.
2. Parses your business context: The description block tells AI what you do, who you serve, and your key differentiators.
3. Indexes page content: Each URL with its description gets added to AI's understanding of your site.
4. Cites you accurately: When users ask relevant questions, AI can confidently recommend and describe your offerings.
The llms.txt file doesn't guarantee AI will cite you. It ensures that when AI does talk about you, the information is accurate. It's the difference between AI saying "I think they might offer X" versus "They offer X, Y, and Z with pricing starting at $29/month."
How to create an llms.txt file
You have three options for creating your llms.txt file. Choose based on your technical comfort level and how much customization you want:
Method 1: Use our free LLM txt generator
The fastest way. Our free llms.txt generator automatically scans your sitemap, analyzes each page, and creates complete documentation with AI-optimized descriptions. Takes about 30 seconds.
1. Click 'Generate Your llms.txt' below
2. Enter your website URL
3. Wait ~30 seconds for AI analysis
4. Download your llms.txt file
5. Upload to your website root directory
100% free · No signup required
Method 2: Create manually
If you prefer full control, create the file yourself using any text editor:
1. Create a new file called llms.txt
2. Add your site name as a heading: # Your Site Name
3. Write a 2-4 sentence description using blockquote syntax: > Your description
4. List each important page with URL and description
5. Upload to your website's root directory
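If you'd rather script those steps, the sketch below shows one way to assemble the file in Python. The page list, names, and URLs are placeholders you would replace with your own.

```python
from collections import OrderedDict

# Placeholder pages: (section, title, url, description) -- replace with your own.
pages = [
    ("Main Pages", "Homepage", "https://yoursite.com/", "What visitors find on the main page."),
    ("Main Pages", "Pricing", "https://yoursite.com/pricing", "Plans and what each one includes."),
    ("Resources", "Blog", "https://yoursite.com/blog", "Tutorials and company updates."),
]

# Group the link entries under their ## section headings, preserving order.
sections = OrderedDict()
for section, title, url, description in pages:
    sections.setdefault(section, []).append(f"- [{title}]({url}): {description}")

# Assemble the file: one # heading, a > description block, then the sections.
lines = ["# Your Site Name", "", "> Your 2-4 sentence business description goes here.", ""]
for heading, entries in sections.items():
    lines += [f"## {heading}", *entries, ""]

with open("llms.txt", "w", encoding="utf-8") as f:
    f.write("\n".join(lines).rstrip() + "\n")
```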
Method 3: AI-assisted writing
Use ChatGPT or Claude to help write your descriptions. Here's a prompt you can use:
I need to create an llms.txt file for my website [YOUR SITE]. Here are my main pages:

- Homepage: [URL]
- Product/Service page: [URL]
- Pricing: [URL]
- Blog: [URL]

Please generate an llms.txt file with:

1. A compelling 3-4 sentence business description
2. Detailed descriptions for each page (2-3 sentences each)
3. Proper Markdown formatting

My business does: [BRIEF DESCRIPTION]
My target audience is: [TARGET AUDIENCE]
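If you prefer to run that prompt from code instead of pasting it into a chat window, here is a hedged sketch using the OpenAI Python SDK. The model name, the prompt.txt input file, and the output path are assumptions for illustration, not part of the prompt above.

```python
from openai import OpenAI  # pip install openai; expects OPENAI_API_KEY in the environment

# Assumption: you saved the filled-in prompt from above to prompt.txt.
with open("prompt.txt", encoding="utf-8") as f:
    prompt = f.read()

client = OpenAI()
response = client.chat.completions.create(
    model="gpt-4o-mini",  # assumption: any capable chat model works here
    messages=[{"role": "user", "content": prompt}],
)

# Write the draft to llms.txt and review it by hand before uploading.
with open("llms.txt", "w", encoding="utf-8") as f:
    f.write(response.choices[0].message.content)
```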
Platform-specific guides
Here's how to upload your llms.txt file on popular website platforms:
WordPress
1. Access via FTP or File Manager
2. Navigate to /public_html/
3. Upload the llms.txt file
4. Verify at yoursite.com/llms.txt
Shopify
1. Go to Online Store → Themes
2. Edit code → Assets folder
3. Add new file: llms.txt
4. Create a redirect to /llms.txt
Wix
1. Dashboard → Settings → SEO
2. Add the file to your site files
3. Or use a static page
4. Set the URL as /llms.txt
Squarespace
1. Create a new page
2. Add a Code Block with the content
3. Set the URL slug to /llms.txt
4. Hide it from navigation
Webflow
1. Project Settings → Hosting
2. Add a custom file to assets
3. Or create a static page
4. Publish and verify
Custom / Self-hosted
1. FTP to your server
2. Upload to the document root
3. Usually /var/www/html/
4. Set correct permissions
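Whichever platform you publish from, it's worth checking the result afterwards. This is a small, hypothetical verification script; replace the placeholder domain with your own.

```python
import urllib.request

def verify_llms_txt(domain: str) -> None:
    """Fetch https://<domain>/llms.txt and print a few basic health checks."""
    url = f"https://{domain}/llms.txt"
    with urllib.request.urlopen(url, timeout=10) as resp:
        body = resp.read().decode("utf-8", errors="replace")
        print(f"Status: {resp.status}")                                      # expect 200
        print(f"Content-Type: {resp.headers.get('Content-Type')}")           # expect a text/* type
        print(f"Starts with a # heading: {body.lstrip().startswith('# ')}")  # expect True
        print(f"Size: {len(body)} bytes")

verify_llms_txt("yoursite.com")  # placeholder domain
```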
llms.txt vs robots.txt
Both files help machines understand your website, but they serve completely different purposes. Here's the breakdown:
| | llms.txt | robots.txt |
|---|---|---|
| Purpose | Describe content for AI understanding | Control search engine crawling access |
| Target audience | AI systems (ChatGPT, Claude, etc.) | Search engine bots (Google, Bing) |
| Format | Markdown syntax | Plain text directives |
| Content | URLs + descriptions + business context | Allow/disallow rules for crawlers |
| Goal | Get featured accurately in AI responses | Control which pages appear in search |
You need both files. robots.txt tells search engines what to index, while llms.txt tells AI how to understand what's indexed. They work together, not as alternatives. For a deeper comparison, see our guide: llms.txt vs robots.txt vs sitemap.xml.
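To make the contrast concrete, a typical minimal robots.txt looks like the snippet below (the paths and sitemap URL are placeholders). It only grants or blocks crawler access and points to a sitemap, with none of the descriptive context an llms.txt carries.

```
User-agent: *
Allow: /
Disallow: /admin/

Sitemap: https://yoursite.com/sitemap.xml
```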
Best practices
Write detailed, specific descriptions
Don't just list page titles. Include 2-3 sentences per page explaining the content, audience, and value. More context = better AI understanding.
Include concrete details
Mention specific pricing, features, integrations, and numbers. "Starting at $29/month" is better than "affordable pricing."
Prioritize important pages first
List your most valuable content at the top. AI systems often weight earlier content as more important.
Keep it current
Update your llms.txt when you add pages, change pricing, or update products. Outdated info leads to AI mistakes.
Use natural language
Write descriptions as if explaining to a smart colleague. Avoid jargon or marketing fluff that could confuse AI.
Don't include sensitive pages
Skip admin areas, internal tools, or confidential content. Only include publicly accessible pages you want AI to know about.
Tools & plugins
Plugins and libraries for working with llms.txt:
- llms_txt2ctx (CLI / Python): CLI and Python module for parsing llms.txt files and generating LLM context
- vitepress-plugin-llms (VitePress): VitePress plugin that automatically generates LLM-friendly documentation
- docusaurus-plugin-llms (Docusaurus): Docusaurus plugin for generating LLM-friendly documentation
- Drupal LLM Support (Drupal 10.3+): A Drupal Recipe providing full support for the llms.txt proposal
- llms-txt-php (PHP): A PHP library for writing and reading llms.txt Markdown files
- VS Code PagePilot Extension (VS Code): VS Code Chat participant that automatically loads external context for enhanced responses