llms.txt: The New robots.txt for AI Agents (Spec + Implementation)
Source: DEV Community
In 2024, robots.txt told search engines how to crawl your site. In 2026, llms.txt tells AI agents what your product does.

## The problem

When an AI agent needs to recommend a tool, it has limited context. It might read your homepage, scan your GitHub README, or check package.json. None of these are optimized for machine consumption. llms.txt solves this with a standardized, machine-readable file.

## The spec

Place llms.txt at your site root (e.g., yoursite.com/llms.txt, or in /public/llms.txt for Next.js):

```
# Product Name
> One-line description.

Detailed description of what this product does, who it is for, and when to use it.

## Use this when you need to:
- Use case 1
- Use case 2
- Use case 3

## Tech Stack
- Technology 1
- Technology 2

## Quick Start
command to get started

## Links
- GitHub: url
- Docs: url
- Purchase: url
```

## Real example

Here is the llms.txt from LaunchKit, a SaaS starter kit:

```
# LaunchKit — AI-Native SaaS Starter Kit
> Production-ready Next.js 16 SaaS foundation with aut
```
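The spec template above is plain markdown, so an agent (or your own tooling) can split it into sections with a few lines of code. Here is a minimal sketch in Python; `parse_llms_txt` and its return shape are illustrative assumptions, not part of the llms.txt spec:

```python
def parse_llms_txt(text: str) -> dict:
    """Split an llms.txt document into title, one-line summary, and ## sections."""
    result = {"title": "", "summary": "", "sections": {}}
    current = None
    for line in text.splitlines():
        if line.startswith("# ") and not result["title"]:
            result["title"] = line[2:].strip()          # first "# " heading is the product name
        elif line.startswith("> ") and not result["summary"]:
            result["summary"] = line[2:].strip()        # first blockquote is the one-liner
        elif line.startswith("## "):
            current = line[3:].strip()                  # start a new named section
            result["sections"][current] = []
        elif current is not None and line.strip():
            result["sections"][current].append(line.strip())
    return result

sample = """# Product Name
> One-line description.

## Links
- GitHub: url
- Docs: url
"""

parsed = parse_llms_txt(sample)
print(parsed["title"])
print(parsed["sections"]["Links"])
```

The same parser works on any file following the template, since the spec only relies on standard markdown heading and blockquote markers.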