
How to Make Your Laravel App AI-Agent Friendly (The Complete 2026 Guide)

AI agents are browsing the web now. Here's how to make your Laravel app speak their language.

Your Laravel app is serving garbage to AI agents right now. Not because your code is bad. Because AI agents don't read HTML the way browsers do.

When ChatGPT Search, Claude, or Perplexity visits your site, they get a wall of navigation menus, cookie banners, JavaScript bundles, and footer links mixed in with the actual content. They burn through context window tokens parsing <div class="flex items-center justify-between"> just to find your page title. And if your content is rendered client-side with Vue or React? They might get nothing at all.

This is about to matter a lot. AI-driven search is projected to handle 10% of all queries by late 2026. That's real traffic you're either capturing or losing.

The good news? Making your Laravel app AI-agent friendly isn't hard. It takes four layers, and most of them you can set up in an afternoon.

The Four Layers of AI-Agent Readiness

Think of AI-agent optimization like regular SEO, but for a different kind of crawler. Search engines had robots.txt and sitemaps. AI agents need their own set of signals. Here's the stack:

  1. llms.txt - Tell agents what your site is about and where the good stuff lives
  2. Markdown responses - Serve clean, parseable content instead of messy HTML
  3. Structured data - Help agents understand the meaning behind your content
  4. Coding guidelines - Make sure AI coding agents write code that fits your project

Each layer solves a different problem. Let's build them one by one.

Layer 1: Create an llms.txt File

The llms.txt standard was proposed by Jeremy Howard of Answer.AI in late 2024. It's essentially a sitemap for AI agents: a markdown file at your site root that tells language models what your site is about and links to your most important content.

Is it officially supported by OpenAI, Google, or Anthropic? Not yet. But over 844,000 websites have already implemented it, including Anthropic themselves, Cloudflare, Vercel, and Stripe. Anthropic specifically requested llms.txt support for their docs. Google referenced it in their Agent2Agent (A2A) protocol. The signal is clear: this is heading toward adoption.

Here's how to set it up in Laravel:

// routes/web.php
use App\Http\Controllers\LlmsTxtController;

Route::get('/llms.txt', [LlmsTxtController::class, 'show']);

// app/Http/Controllers/LlmsTxtController.php
namespace App\Http\Controllers;

use App\Models\Post;
use App\Models\Tool;
use Illuminate\Support\Facades\Cache;

class LlmsTxtController extends Controller
{
    public function show()
    {
        // Cache for an hour so the file stays current without a query per request
        $content = Cache::remember('llms-txt', 3600, function () {
            return $this->generateContent();
        });

        return response($content, 200)
            ->header('Content-Type', 'text/markdown; charset=UTF-8');
    }

    private function generateContent(): string
    {
        $postCount = Post::published()->count();
        $toolCount = Tool::active()->count();

        return <<<MARKDOWN
        # Your App Name

        > Brief description of what your app does and who it's for.

        ## Key Pages
        - [Documentation](/docs): Complete API and usage docs
        - [Blog](/blog): {$postCount} technical articles
        - [Tools](/tools): {$toolCount} free developer tools

        ## Popular Content
        - [Getting Started Guide](/docs/getting-started): Setup and first steps
        - [API Reference](/docs/api): Full endpoint documentation
        MARKDOWN;
    }
}

The dynamic approach is key here. Hard-coding your llms.txt means it goes stale the moment you publish a new post. By pulling counts and top content from your database and caching for an hour, it stays current without hitting the database on every request.
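If you'd rather not wait out the hour after publishing, you can bust the cache whenever content changes. Here's a minimal sketch using a model observer, assuming the Post model from the controller above (you'd still need to register the observer, for example with Laravel's #[ObservedBy] attribute):

// app/Observers/PostObserver.php
namespace App\Observers;

use App\Models\Post;
use Illuminate\Support\Facades\Cache;

class PostObserver
{
    // saved() fires on both create and update
    public function saved(Post $post): void
    {
        Cache::forget('llms-txt');
    }

    public function deleted(Post $post): void
    {
        Cache::forget('llms-txt');
    }
}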

A few tips on what to include: focus on your highest-value pages, not every URL. Think documentation, key product pages, and your best content. AI agents have limited context windows, so a 200-line llms.txt defeats the purpose. Keep it concise.

One distinction worth spelling out: llms.txt is not robots.txt. Your robots.txt controls what crawlers can access. Your llms.txt tells AI agents what's worth reading. They're complementary, not competing.

Layer 2: Serve Markdown Responses with Spatie's New Package

This is the most impactful layer. Spatie just released laravel-markdown-response, a package that automatically serves markdown versions of your HTML pages to AI agents. It dropped on February 17, 2026, and it solves a problem that's been bugging developers for months.

The concept is simple. When a regular user visits your page, they get normal HTML. When an AI agent visits, they get clean markdown. Same content, different format.

composer require spatie/laravel-markdown-response

// routes/web.php
use Spatie\MarkdownResponse\Middleware\ProvideMarkdownResponse;

Route::middleware(ProvideMarkdownResponse::class)->group(function () {
    Route::get('/about', [PageController::class, 'show']);
    Route::get('/posts/{post}', [PostController::class, 'show']);
    Route::get('/docs/{slug}', [DocController::class, 'show']);
});

That's it. The middleware detects AI agents through three mechanisms: an Accept: text/markdown header, known bot user agents like GPTBot and ClaudeBot, or a .md URL suffix. So /about.md returns the markdown version of your about page, which is great for testing.
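If you want this covered by your test suite, a minimal Pest sketch can exercise two of the three mechanisms, assuming the /about route from above (the exact charset the package sends is an assumption, so the assertions only check for text/markdown):

// tests/Feature/MarkdownResponseTest.php
it('serves markdown via the .md suffix', function () {
    $response = $this->get('/about.md');

    $response->assertOk();
    expect($response->headers->get('Content-Type'))->toContain('text/markdown');
});

it('serves markdown when the Accept header asks for it', function () {
    $response = $this->get('/about', ['Accept' => 'text/markdown']);

    $response->assertOk();
    expect($response->headers->get('Content-Type'))->toContain('text/markdown');
});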

What makes this package smart is what it doesn't convert. JSON responses, redirects, error pages, and streamed responses pass through untouched. If you're using Inertia.js, only the initial full-page HTML load gets converted; the XHR requests work normally.

You can also apply it globally:

// bootstrap/app.php
->withMiddleware(function (Middleware $middleware) {
    $middleware->append(ProvideMarkdownResponse::class);
})

And exclude specific routes where markdown doesn't make sense:

use Spatie\MarkdownResponse\Attributes\DoNotProvideMarkdown;

class DashboardController
{
    #[DoNotProvideMarkdown]
    public function index()
    {
        return view('dashboard');
    }
}

The conversion is driver-based. The default uses league/html-to-markdown and works locally without external services. You can swap to Cloudflare Workers AI or markdown.new for better quality on complex pages. And converted responses are cached by default, so repeated requests don't re-run the conversion.
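If you're curious what the default driver produces, you can run league/html-to-markdown directly. This is just the underlying library, not the package's internal wiring:

use League\HTMLToMarkdown\HtmlConverter;

// strip_tags drops elements with no markdown equivalent instead of passing raw HTML through
$converter = new HtmlConverter(['strip_tags' => true]);

echo $converter->convert('<h1>Hello</h1><p>An <strong>example</strong> paragraph.</p>');
// # Hello
//
// An **example** paragraph.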

Why does this matter so much? Because AI agents waste massive amounts of tokens parsing HTML. Companies serving markdown have reported token reductions of up to 10x compared to HTML. That means faster, more accurate responses when someone asks an AI about your content.

You can preview how your pages render as markdown (hit any URL with the .md suffix) to get a sense of what agents will actually see.

Layer 3: Add Structured Data for AI Understanding

Markdown gives AI agents clean text. Structured data gives them meaning.

JSON-LD schema markup has been around forever for SEO, but it's becoming more important for AI agents too. When ChatGPT or Perplexity pulls information from your site, structured data helps them understand that this number is a price, that string is an author name, and this block is a FAQ.

Here's a practical Laravel approach using a Blade component:

// resources/views/components/json-ld.blade.php
@props(['schema'])

<script type="application/ld+json">
{!! json_encode($schema, JSON_UNESCAPED_SLASHES | JSON_PRETTY_PRINT) !!}
</script>

// In your blog post view
<x-json-ld :schema="[
    '@context' => 'https://schema.org',
    '@type' => 'TechArticle',
    'headline' => $post->title,
    'author' => [
        '@type' => 'Person',
        'name' => 'Your Name',
        'url' => 'https://yoursite.com/about',
    ],
    'datePublished' => $post->published_at->toIso8601String(),
    'dateModified' => $post->updated_at->toIso8601String(),
    'description' => $post->excerpt,
    'mainEntityOfPage' => url()->current(),
]" />

For SaaS apps, the schemas that matter most are SoftwareApplication, FAQPage, HowTo, and Organization. For blogs, focus on TechArticle or BlogPosting with proper author markup.
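For instance, a FAQPage block using the same component might look like this (the question and answer are placeholder content):

<x-json-ld :schema="[
    '@context' => 'https://schema.org',
    '@type' => 'FAQPage',
    'mainEntity' => [
        [
            '@type' => 'Question',
            'name' => 'Can I cancel my subscription anytime?',
            'acceptedAnswer' => [
                '@type' => 'Answer',
                'text' => 'Yes. You can cancel from the billing page and keep access until the period ends.',
            ],
        ],
    ],
]" />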

Don't overthink this. Start with the basics (article schema on blog posts, organization schema on your homepage) and expand from there. A JSON formatter is useful for validating your schema output during development.

One important note: Google's John Mueller has confirmed that traditional search crawlers don't treat llms.txt as special. But structured data? That's been confirmed to help both regular search and AI-powered search features. If you're going to invest time in only one layer, make it structured data and markdown responses.

Layer 4: Set Up AI Coding Guidelines with Laravel Boost

The first three layers are about making your app readable to AI agents that browse the web. This layer is different. It's about making your codebase understandable to AI coding agents like Claude Code, Cursor, and GitHub Copilot.

If you've been using Laravel Boost and MCP servers, you know how much context matters. Without project-specific guidelines, AI agents write generic code that might work but doesn't follow your conventions.

Laravel Boost solves this by loading curated PHP and Laravel coding guidelines directly into your AI agent's context. Combined with project-level AGENTS.md or CLAUDE.md files, your coding agent understands your specific style preferences, testing patterns, and architectural decisions.

# Install Laravel Boost (its installer can configure Claude Code, Cursor, and other agents)
composer require laravel/boost --dev
php artisan boost:install

Then create a project guidelines file:

# CLAUDE.md

## Architecture
- Use Action classes for business logic, not fat controllers
- All API responses go through JsonResource classes
- Queue anything that takes over 500ms

## Testing
- Every feature needs a Pest test
- Use factories, never manual model creation in tests
- Test the behavior, not the implementation

Freek Van der Herten from Spatie recently shared his Claude Code setup in his dotfiles repository, including custom skills for package scaffolding with Spatie's skeleton. The pattern is the same: give the AI context about how you work, and it produces code that actually fits your project.

This matters for developer-driven AI workflows where you're guiding the agent rather than letting it guess. The more context you provide upfront, the less time you spend fixing generic AI output.

Putting It All Together

Here's a practical implementation order for an existing Laravel app:

Hour 1: Install spatie/laravel-markdown-response and apply the middleware to your public routes. Test with the .md URL suffix to verify the output. This gives you the biggest immediate impact.

Hour 2: Create your llms.txt route with dynamic content generation. Cache it. Keep it under 50 lines.

Hour 3: Add JSON-LD structured data to your most important pages. Start with your homepage and your top 5 blog posts or product pages.

Hour 4: Set up Laravel Boost and create your CLAUDE.md or AGENTS.md file for AI coding agents.

That's four hours for a complete AI-agent readiness stack. Not bad.

What This Doesn't Do

Let's be honest about limitations. None of this guarantees your content will appear in AI-generated answers. Just like traditional SEO doesn't guarantee a #1 ranking, AEO (Answer Engine Optimization) improves your chances; it doesn't control outcomes.

The llms.txt standard is still in its early stages. Major AI providers haven't officially confirmed they read these files during inference. But the trend is clear. AI agents are consuming more web content every month, and the sites that make their content easy to parse will get cited more often.

The real-world impact shows up in two ways: better accuracy when AI tools do reference your content, and lower token costs for agents interacting with your site. Both matter if you're building anything that AI agents might discover.

FAQ

Do I need all four layers, or can I just pick one?

Start with markdown responses (Layer 2) and structured data (Layer 3). These have the most proven impact. Add llms.txt when you have time, and set up Boost if you're using AI coding tools. You don't need everything on day one.

Does llms.txt actually affect my Google rankings?

No. Google's John Mueller confirmed that llms.txt doesn't affect search rankings. It's aimed at AI agents, not traditional search crawlers. Think of it as a separate channel entirely.

Will Spatie's markdown-response package slow down my app?

No. Conversions are cached by default, so only the first request from an AI agent triggers the HTML-to-markdown conversion. Regular users never see any difference. The package also skips JSON responses, redirects, and streaming, so your API endpoints aren't affected.

Should I worry about AI agents scraping my content?

That's a separate question from making content AI-friendly. You can use robots.txt to block specific AI crawlers if you want. Making your content AI-readable and controlling crawler access are two different decisions. Some developers do both: they block training crawlers but serve markdown to inference-time agents.
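For example, OpenAI documents separate user agents for training crawls and user-triggered fetches, so your robots.txt can treat them differently:

# public/robots.txt
# GPTBot crawls for model training; ChatGPT-User fetches pages during live chats
User-agent: GPTBot
Disallow: /

User-agent: ChatGPT-User
Allow: /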

Does this work with Inertia.js and Vue/React SPAs?

Yes. Spatie's package is smart about this. It only converts full-page HTML responses, not Inertia's XHR requests. The initial page load that an AI agent would see gets converted. Your SPA navigation stays untouched.

The Bottom Line

AI agents are the new browsers. They're browsing your site right now, and most Laravel apps are serving them a mess of HTML they can barely parse.

The fix isn't complicated. Four layers, four hours. Markdown responses give agents clean content. An llms.txt file guides them to what matters. Structured data adds meaning. And coding guidelines keep your AI tools writing code your way.

The developers who set this up now will have a head start when AI-agent traffic really takes off. And based on how fast things are moving, that's not years away. It's months.

If you need help implementing any of this for a production app, get in touch. I've been building Laravel apps for 9+ years and have been deep in the AI integration space since the early days of the SDK.

