
Laravel Query Optimization: From 3 Seconds to 30ms

Real-world case study on Laravel optimization techniques that reduced database query time from 3000ms to 30ms using eager loading and indexes.


Last month, I inherited a Laravel project where the dashboard was taking 3 full seconds to load. Three seconds might not sound terrible, but when users are staring at a loading spinner while your server burns through resources, it's a nightmare.

Here's the thing: Laravel makes database queries incredibly easy to write. So easy that you can accidentally create performance disasters without realizing it. After spending an afternoon with Laravel Debugbar open, I discovered the dashboard was executing 847 database queries to display a simple list of 50 users with their posts and comments.

That's when I knew exactly what I was dealing with: the classic N+1 query problem, combined with missing indexes and some genuinely questionable Eloquent usage.

By the end of that day, I'd reduced those 847 queries down to just 3, and the page load time dropped from 3 seconds to 30 milliseconds. Let me show you exactly how I did it, so you can avoid the same mistakes I see in almost every Laravel project I work on.

Understanding the N+1 Query Problem

Before we dive into solutions, you need to understand why this happens. The N+1 problem is sneaky because your code looks clean and works perfectly in development with 10 test records. Then you deploy to production with 10,000 records, and everything grinds to a halt.

Here's what was happening in that dashboard:

// The problematic code that looked innocent
public function index()
{
    $users = User::all(); // 1 query to get users
    
    return view('dashboard', compact('users'));
}

And in the Blade view:

@foreach($users as $user)
    <div>
        <h3>{{ $user->name }}</h3>
        <p>Posts: {{ $user->posts->count() }}</p> <!-- 1 query per user -->
        <p>Comments: {{ $user->comments->count() }}</p> <!-- 1 query per user -->
    </div>
@endforeach

See the problem? For 50 users, this executes:

  • 1 query to fetch users
  • 50 queries to fetch posts for each user
  • 50 queries to fetch comments for each user

That's 101 queries just for this simple example. Now imagine adding categories, tags, roles, and other relationships. You can see how I ended up with 847 queries.

The term "N+1" comes from this pattern: 1 initial query + N additional queries (where N is the number of records). It's the most common performance killer I encounter in Laravel applications, and I've seen it in probably 80% of the projects I've worked on.
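If you want Laravel to catch this class of bug for you automatically, versions 8.43 and later can throw an exception whenever a relationship is lazy-loaded. A sketch of enabling it in the AppServiceProvider (restricting it to non-production environments is a common choice):

```php
// app/Providers/AppServiceProvider.php
use Illuminate\Database\Eloquent\Model;

public function boot(): void
{
    // Throws a LazyLoadingViolationException outside production whenever
    // code triggers a lazy relationship load, surfacing N+1s immediately
    // in development instead of silently in production.
    Model::preventLazyLoading(! $this->app->isProduction());
}
```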

Eager Loading: Your First Line of Defense

Laravel's solution to N+1 queries is eager loading. Instead of loading relationships as you access them (lazy loading), you tell Laravel upfront which relationships you'll need.

Here's how I fixed the dashboard:

public function index()
{
    // Load users with their posts and comments in one go
    $users = User::with(['posts', 'comments'])->get();
    
    return view('dashboard', compact('users'));
}

That's it. Three queries instead of 101:

  1. SELECT * FROM users
  2. SELECT * FROM posts WHERE user_id IN (1,2,3...)
  3. SELECT * FROM comments WHERE user_id IN (1,2,3...)

The performance difference was immediate — load time dropped from 850ms to 120ms just with this change. But I wasn't done yet.

Nested Eager Loading

Things get trickier when you have nested relationships. In my case, each post had categories, and I needed to display those too:

// This would still cause N+1 queries for categories
$users = User::with(['posts', 'comments'])->get();

And in the Blade view:

@foreach($users as $user)
    @foreach($user->posts as $post)
        Category: {{ $post->category->name }} <!-- 1 query per post -->
    @endforeach
@endforeach

The solution? Nested eager loading with dot notation:

$users = User::with([
    'posts.category', // Load posts AND their categories
    'comments.post'   // Load comments AND their related posts
])->get();

This is something I wish I'd learned earlier in my Laravel journey. You can nest as deep as you need: 'posts.category.parent.owner' works perfectly.

Conditional Eager Loading

Sometimes you don't need all related records. Maybe you only want published posts or recent comments. That's where constrained eager loading comes in:

$users = User::with([
    'posts' => function ($query) {
        $query->where('status', 'published')
              ->orderBy('created_at', 'desc')
              ->limit(5);
    },
    'comments' => function ($query) {
        $query->where('created_at', '>', now()->subDays(30));
    }
])->get();

This technique alone saved me from loading thousands of unnecessary records. In the project I mentioned, some users had 500+ posts, but we only needed to show the 5 most recent ones. One caveat: only Laravel 11+ applies a limit() inside a with() closure per parent model; on older versions it caps the total rows loaded across all users, so you'd need a package like staudenmeir/eloquent-eager-limit to get true per-user limits.

Database Indexes: The Performance Multiplier

Eager loading got me from 850ms to 120ms. But I needed to go faster. That's when I started looking at database indexes.

Here's something that confused me when I first started with Laravel: a plain foreignId() column doesn't get an index on its own. (MySQL does create one when you attach a foreign key constraint with constrained(), but unconstrained columns, and foreign key columns on PostgreSQL, stay unindexed.) I'd been building applications for two years before I realized this. All those user_id, post_id, and category_id columns? Unindexed.

Let me show you what that means in practice. Without indexes, finding all posts for a user requires MySQL to scan every single row in the posts table:

-- Without index: scans all 100,000 posts
SELECT * FROM posts WHERE user_id = 1;
-- Execution time: 450ms

With an index, MySQL can jump directly to the relevant rows:

-- With index: uses index to find rows instantly
SELECT * FROM posts WHERE user_id = 1;
-- Execution time: 8ms

That's a 56x speed improvement from a single line in a migration.

Adding Indexes to Existing Columns

Here's how I added indexes to fix the slow queries:

// Create a new migration
php artisan make:migration add_indexes_to_posts_table

// In the migration file
public function up()
{
    Schema::table('posts', function (Blueprint $table) {
        $table->index('user_id');
        $table->index('category_id');
        $table->index('status');
        $table->index('created_at');
        
        // Composite index for common query patterns; its user_id prefix
        // also covers user_id-only lookups, making the single user_id
        // index above optional
        $table->index(['user_id', 'status', 'created_at']);
    });
}

public function down()
{
    Schema::table('posts', function (Blueprint $table) {
        $table->dropIndex(['user_id']);
        $table->dropIndex(['category_id']);
        $table->dropIndex(['status']);
        $table->dropIndex(['created_at']);
        $table->dropIndex(['user_id', 'status', 'created_at']);
    });
}

Pro tip: When creating new tables, add indexes immediately using ->index() on foreign key columns:

Schema::create('posts', function (Blueprint $table) {
    $table->id();
    $table->foreignId('user_id')->index()->constrained();
    $table->foreignId('category_id')->index()->constrained();
    $table->string('status')->index();
    $table->timestamps();
});

After adding these indexes, my query time dropped from 120ms to 45ms. We're getting close to that 30ms target.

Composite Indexes for Complex Queries

Single-column indexes are great, but composite indexes are where you can really optimize. If you frequently query by multiple columns together, a composite index can be dramatically faster than multiple single-column indexes.

In my dashboard, I had this query running constantly:

Post::where('user_id', $userId)
    ->where('status', 'published')
    ->orderBy('created_at', 'desc')
    ->get();

Instead of three separate indexes, I created one composite index:

$table->index(['user_id', 'status', 'created_at']);

The order matters. MySQL uses composite indexes left-to-right, so this index helps with queries filtering by:

  • Just user_id
  • user_id and status
  • user_id, status, and created_at

But it won't help with queries filtering by just status or just created_at. Keep this in mind when designing your indexes.
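To confirm which query shapes can actually use the composite index, you can inspect MySQL's execution plan straight from Eloquent. A sketch (the index name assumes Laravel's default naming convention):

```php
// Leftmost-prefix match: the plan's "key" column should show
// posts_user_id_status_created_at_index
Post::where('user_id', 1)
    ->where('status', 'published')
    ->orderBy('created_at', 'desc')
    ->explain()
    ->dd();

// Skips user_id, so the composite index can't be used; expect a
// different index (e.g. one on status) or "type: ALL", a full scan
Post::where('status', 'published')->explain()->dd();
```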

Optimizing Query Structure

Even with eager loading and indexes, some queries can still be slow. Here are the techniques I used to squeeze out the last bit of performance.

Select Only What You Need

By default, User::all() selects every column. If you only need the name and email, why pull the entire record?

// Instead of this (pulls all columns)
$users = User::with('posts')->get();

// Do this (only pulls what you need)
$users = User::select(['id', 'name', 'email'])
    ->with(['posts' => function ($query) {
        $query->select(['id', 'user_id', 'title', 'created_at']);
    }])
    ->get();

This reduced my payload size by 60% and shaved off another 10ms. It might not seem like much, but when you're serving thousands of requests per hour, it adds up.

Use Chunking for Large Datasets

If you need to process thousands of records, don't load them all at once. Laravel's chunk() method processes records in batches, keeping memory usage low:

// Bad: loads 50,000 users into memory
User::all()->each(function ($user) {
    $this->processUser($user);
});

// Good: processes 100 users at a time
User::chunk(100, function ($users) {
    foreach ($users as $user) {
        $this->processUser($user);
    }
});

I use this constantly for background jobs and data exports. It's the difference between your script running smoothly and your server running out of memory.
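One related gotcha: if the callback updates the very column you're filtering on, plain chunk() can silently skip rows, because the offset-based pages shift as rows drop out of the result set. chunkById() paginates on the primary key instead, so every row is visited exactly once. A sketch (the active flag is a hypothetical column):

```php
// With chunk(), flipping active to false would shift the remaining
// rows forward in the result set and roughly half would be skipped;
// chunkById() pages on id, so no row is missed.
User::where('active', true)
    ->chunkById(100, function ($users) {
        foreach ($users as $user) {
            $user->update(['active' => false]);
        }
    });
```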

Counting Relationships Without Loading Them

Here's a mistake I made for years: loading entire relationships just to count them.

// Inefficient: loads all posts just to count them
$user->posts->count();

// Efficient: counts at the database level
$user->posts()->count();

Notice the difference? $user->posts (without parentheses) loads all posts into memory. $user->posts() (with parentheses) returns a query builder, letting you count at the database level.

Even better, use withCount() when eager loading:

$users = User::withCount(['posts', 'comments'])->get();

// Now you can access counts without additional queries
$user->posts_count; // No query needed
$user->comments_count; // No query needed

This was a game-changer for my dashboard. Instead of loading thousands of posts just to display counts, I got the counts directly from the database. [link to: Advanced Eloquent Techniques]

How to Identify Slow Queries in Your Application

You can't fix what you can't measure. Here are the tools I use to find performance issues before they become problems.

Laravel Debugbar

This is the first package I install on any Laravel project:

composer require barryvdh/laravel-debugbar --dev

Debugbar shows you every query executed, their execution time, and where they're coming from in your code. It's like having X-ray vision for your application.

The "Queries" tab was what helped me discover those 847 queries on the dashboard. Without it, I'd have had no idea what was happening.

Laravel Telescope

For production monitoring, Laravel Telescope is invaluable:

composer require laravel/telescope
php artisan telescope:install
php artisan migrate

Telescope records every request, query, job, and exception. When users report slow pages, I can see exactly what queries ran and how long they took. It's saved me countless hours of debugging.

Query Logging in Code

Sometimes you need to see queries programmatically. Laravel makes this easy:

use Illuminate\Support\Facades\DB;

// Enable query logging
DB::enableQueryLog();

// Your code here
$users = User::with('posts')->get();

// See what queries ran
dd(DB::getQueryLog());

I use this technique when writing tests to verify I'm not creating N+1 queries.
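As a sketch of what that looks like in a feature test (assuming a standard Laravel PHPUnit setup; the route and the query budget of 5 are illustrative):

```php
// In a feature test class (import Illuminate\Support\Facades\DB at the top)
public function test_dashboard_avoids_n_plus_one_queries(): void
{
    DB::enableQueryLog();

    $this->get('/dashboard')->assertOk();

    // With eager loading in place, the query count should stay flat
    // no matter how many users are in the table.
    $this->assertLessThanOrEqual(5, count(DB::getQueryLog()));
}
```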

The explain() Method

Want to see how MySQL executes your query? Use explain():

User::where('email', 'test@example.com')->explain()->dd();

This shows you MySQL's execution plan, including which indexes it's using (or not using). If you see "type: ALL" in the output, that's a full table scan — usually a sign you need an index.

Common Mistakes to Avoid

Let me save you some headaches by sharing the mistakes I see repeatedly (and have made myself):

Over-indexing: More indexes aren't always better. Each index speeds up reads but slows down writes. I once added indexes to every column and wondered why inserts became sluggish. Only index columns you actually query frequently.

Forgetting ->index() on foreign keys: This is still my most common mistake after years of Laravel development. I create a migration, add foreign key constraints, and forget to index them. Now I have a template I copy-paste to avoid this.

Using ->get() instead of ->first(): If you only need one record, use first(). It adds LIMIT 1 to the query, which can be faster even with indexes.

Not using database transactions for bulk operations: When inserting multiple records, wrap them in a transaction. It's significantly faster and prevents partial writes if something fails.
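A bulk import might look something like this sketch ($rows and the chunk size of 500 are illustrative; note that insert() bypasses Eloquent events and timestamps, so add created_at/updated_at to each row yourself if you need them):

```php
use Illuminate\Support\Facades\DB;

DB::transaction(function () use ($rows) {
    // insert() sends one multi-row INSERT per chunk instead of one
    // query per record, and the transaction makes the whole import
    // all-or-nothing.
    foreach (array_chunk($rows, 500) as $chunk) {
        Post::insert($chunk);
    }
});
```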

Loading relationships in loops: Never do this:

foreach ($posts as $post) {
    $category = $post->category; // N+1 query
}

Always eager load before the loop.

The Results: From 3 Seconds to 30ms

After applying all these techniques, here's what the final dashboard code looked like:

public function index()
{
    $users = User::select(['id', 'name', 'email', 'created_at'])
        ->with([
            'posts' => function ($query) {
                $query->select(['id', 'user_id', 'category_id', 'title', 'status', 'created_at'])
                      ->where('status', 'published')
                      ->latest()
                      ->limit(5);
            },
            'posts.category:id,name'
        ])
        ->withCount([
            'posts',
            'comments' => function ($query) {
                $query->where('created_at', '>', now()->subDays(30));
            }
        ])
        ->latest()
        ->paginate(50);
    
    return view('dashboard', compact('users'));
}

The results:

  • Query count: 847 → 3 queries
  • Page load time: 3000ms → 30ms
  • Memory usage: 45MB → 8MB
  • Database CPU: 78% → 4%

That's a 100x performance improvement. The client was thrilled. Users stopped complaining. And I learned valuable lessons about Laravel optimization that I now apply to every project.

When to Optimize (and When Not To)

Here's something important: don't optimize prematurely. I've wasted hours optimizing queries that ran once a day on 50 records. It wasn't worth the time.

Optimize when:

  • Page load times exceed 300ms
  • You see 10+ queries for a single page
  • Your database CPU is consistently high
  • Users are complaining about speed
  • You're working with 1000+ records

Don't optimize when:

  • Everything feels fast
  • You're working with tiny datasets
  • It's a rarely-used admin page
  • The code complexity would increase significantly

Remember: readable code that's "fast enough" beats unreadable code that's 10ms faster. I optimize when it matters, not just because I can.

What's Next?

Laravel query optimization isn't just about eager loading and indexes. There's so much more you can explore:

  • Query caching with Redis for frequently-accessed data
  • Database replication to separate read and write operations
  • Queue optimization for background jobs that don't need immediate results
  • API response caching to avoid hitting the database entirely

But master these fundamentals first. The techniques I've shared here will solve 90% of your Laravel performance issues. [link to: Redis Caching in Laravel]

If you're working on a Laravel project that feels slow, start with Laravel Debugbar. Find your N+1 queries. Add your indexes. Use eager loading. You'll be amazed how much faster things get.

Need help implementing these optimizations in your Laravel application? I've helped dozens of clients squeeze more performance out of their existing codebases. Let's work together: Contact me


Need Help With Your Laravel Project?

I specialize in building custom Laravel applications, process automation, and SaaS development. Whether you need to eliminate repetitive tasks or build something from scratch, let's discuss your project.

⚡ Currently available for 2-3 new projects

About Hafiz Riaz

Full Stack Developer from Turin, Italy. I build web applications with Laravel and Vue.js, and automate business processes. Creator of ReplyGenius, StudyLab, and other SaaS products.

View Portfolio →