Anton Martyniuk

Posted on • Originally published at antondevtips.com

How I Optimized EF Core Query from 30 Seconds to 30 Milliseconds

Performance is crucial for any application.

Developers often add a caching layer on top of slow database queries, hiding the symptoms instead of fixing the root problem.

In this post, we will take on a challenge: optimizing a slow, real-world EF Core query.
EF Core offers great tools, but it can produce slow queries if used improperly.

I will show you how I optimized an EF Core query step by step, from an unacceptable 30 seconds down to a blazing-fast 30 milliseconds.

Let's dive in!

On my website: antondevtips.com I share .NET and Architecture best practices.
Subscribe to my newsletter to improve your .NET skills.
Download the source code for this newsletter for free.

Challenge and the Slow Query

We will explore a Social Media platform where we have the following entities:

public class User
{
    public int Id { get; set; }
    public string Username { get; set; } = null!;
    public ICollection<Post> Posts { get; set; } = new List<Post>();
    public ICollection<Comment> Comments { get; set; } = new List<Comment>();
}

public class Post
{
    public int Id { get; set; }
    public string Content { get; set; } = null!;
    public int UserId { get; set; }
    public User User { get; set; } = null!;
    public int CategoryId { get; set; }
    public Category Category { get; set; } = null!;
    public ICollection<Like> Likes { get; set; } = new List<Like>();
    public ICollection<Comment> Comments { get; set; } = new List<Comment>();
}

public class Category
{
    public int Id { get; set; }
    public string Name { get; set; } = null!;
    public ICollection<Post> Posts { get; set; } = new List<Post>();
}

public class Comment
{
    public int Id { get; set; }
    public int UserId { get; set; }
    public User User { get; set; } = null!;
    public int PostId { get; set; }
    public Post Post { get; set; } = null!;
    public string Text { get; set; } = null!;
    public DateTime CreatedAt { get; set; }
}

public class Like
{
    public int Id { get; set; }
    public int PostId { get; set; }
    public Post Post { get; set; } = null!;
}

These entities have the following relations:

  • Users: Each user has many posts and many comments.
  • Comments: Each comment belongs to a user and is linked to a post.
  • Categories: Posts are categorized.
  • Posts: Each post has a category and can have multiple likes.
  • Likes: Each like is associated with a post.
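
For reference, here is a minimal sketch of how these entities could be wired up in a DbContext. The class name and configuration details below are assumptions for illustration; EF Core would infer most of these relationships by convention anyway:

// A minimal sketch of a possible DbContext for these entities (class name is assumed)
public class SocialMediaDbContext : DbContext
{
    public SocialMediaDbContext(DbContextOptions<SocialMediaDbContext> options)
        : base(options) { }

    public DbSet<User> Users => Set<User>();
    public DbSet<Post> Posts => Set<Post>();
    public DbSet<Category> Categories => Set<Category>();
    public DbSet<Comment> Comments => Set<Comment>();
    public DbSet<Like> Likes => Set<Like>();

    protected override void OnModelCreating(ModelBuilder modelBuilder)
    {
        // A user has many posts and many comments
        modelBuilder.Entity<Post>()
            .HasOne(p => p.User)
            .WithMany(u => u.Posts)
            .HasForeignKey(p => p.UserId);

        modelBuilder.Entity<Comment>()
            .HasOne(c => c.User)
            .WithMany(u => u.Comments)
            .HasForeignKey(c => c.UserId);

        // A post belongs to a category and has many likes and comments
        modelBuilder.Entity<Post>()
            .HasOne(p => p.Category)
            .WithMany(c => c.Posts)
            .HasForeignKey(p => p.CategoryId);

        modelBuilder.Entity<Like>()
            .HasOne(l => l.Post)
            .WithMany(p => p.Likes)
            .HasForeignKey(l => l.PostId);

        modelBuilder.Entity<Comment>()
            .HasOne(c => c.Post)
            .WithMany(p => p.Comments)
            .HasForeignKey(c => c.PostId);
    }
}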

Here is our challenge requirement:

  • Select the top 5 users who made the most comments in the last 7 days on posts in the ".NET" category.

For each user return:

  • UserId
  • Username
  • Count of comments by a user (only comments on ".NET" posts in the last 7 days)
  • Top 3 ".NET" posts the user commented on, ranked by likes (PostId, LikesCount)
  • Latest 2 comment texts the user wrote on those top posts
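
The query methods below build ActiveUserDto and PostDto results. The post doesn't show these types, but based on how they are constructed, they could look roughly like this (a sketch; the record definitions are my assumption):

// Assumed shape of the DTOs, inferred from how they are constructed in the queries below
public record PostDto(int PostId, int LikesCount);

public record ActiveUserDto(
    int UserId,
    string Username,
    int CommentsCount,
    List<PostDto> TopPosts,
    List<string> RecentCommentTexts);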

Here's the initial very slow query:

public List<ActiveUserDto> GetTopCommenters_Slow()
{
    var since = DateTime.UtcNow.AddDays(-7);

    // 1) Eagerly load every user + all their comments → post → category → likes
    var users = _dbContext.Users
        .Include(u => u.Comments)
            .ThenInclude(c => c.Post)
                .ThenInclude(p => p.Category)
        .Include(u => u.Comments)
            .ThenInclude(c => c.Post)
                .ThenInclude(p => p.Likes)
        .ToList();

    var result = new List<ActiveUserDto>();

    foreach (var u in users)
    {
        // 2) Filter to recent ".NET" comments
        var comments = u.Comments
            .Where(c =>
                c.CreatedAt >= since &&
                c.Post.Category.Name == ".NET")
            .ToList();

        var commentsCount = comments.Count;
        if (commentsCount == 0)
        {
            continue;
        }

        // 3) Top 3 posts by like count
        var topPosts = comments
            .GroupBy(c => c.Post)
            .Select(g => new PostDto(
                g.Key.Id,
                _dbContext.Likes.Count(l => l.PostId == g.Key.Id)))
            .OrderByDescending(p => p.LikesCount)
            .Take(3)
            .ToList();

        // 4) Latest 2 comments on those top 3 posts
        var topPostIds = topPosts.Select(p => p.PostId).ToList();

        var recentTexts = _dbContext.Comments
            .Where(c =>
                c.UserId == u.Id &&
                c.CreatedAt >= since &&
                topPostIds.Contains(c.PostId))
            .OrderByDescending(c => c.CreatedAt)
            .Take(2)
            .Select(c => c.Text)
            .ToList();

        result.Add(new ActiveUserDto(
            u.Id, u.Username,
            commentsCount,
            topPosts,
            recentTexts));
    }

    // 5) Final top-5 in memory
    return result
        .OrderByDescending(x => x.CommentsCount)
        .Take(5)
        .ToList();
}

This implementation loads everything eagerly into memory and then performs extensive filtering, sorting, and aggregation purely client-side.
This approach results in massive datasets being transferred from the database to the application, consuming a large amount of memory and dramatically slowing performance.

This query runs in 29-30 seconds against a Postgres database.

Note: every benchmark result depends on your hardware, the database provider, and the database location.
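
If you want to reproduce these measurements, a minimal BenchmarkDotNet harness could look like the sketch below. The DbContext name, the QueryService wrapper, and the connection string are assumptions for illustration:

// A minimal BenchmarkDotNet sketch for these query methods (setup names are assumed)
[MemoryDiagnoser]
public class TopCommentersBenchmarks
{
    private SocialMediaDbContext _dbContext = null!;
    private QueryService _service = null!;

    [GlobalSetup]
    public void Setup()
    {
        var options = new DbContextOptionsBuilder<SocialMediaDbContext>()
            .UseNpgsql("Host=localhost;Database=devtips;Username=postgres;Password=postgres")
            .Options;

        _dbContext = new SocialMediaDbContext(options);
        _service = new QueryService(_dbContext);
    }

    [Benchmark(Baseline = true)]
    public List<ActiveUserDto> Slow() => _service.GetTopCommenters_Slow();

    [Benchmark]
    public List<ActiveUserDto> Optimization1() => _service.GetTopCommenters_Optimization1_PreFilter();

    // ...one [Benchmark] method per optimization step
}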

Optimization 1: Pre-Filter Users

We loaded every user and all their comments, even those who hadn't commented on ".NET" posts in the last week.

Let's add a filter to return only users who actually made comments on ".NET" posts.

This simple filter cuts the number of records retrieved from the database, so the rest of the work becomes faster:

/// <summary>
/// Optimization 1: Pre-filter users who have any recent .NET comments
/// </summary>
public List<ActiveUserDto> GetTopCommenters_Optimization1_PreFilter()
{
    var since = DateTime.UtcNow.AddDays(-7);

    // Only users with at least one .NET comment in the window
    var users = _dbContext.Users
        .Where(u => u.Comments
            .Any(c =>
                c.CreatedAt >= since &&
                c.Post.Category.Name == ".NET"))
        .Include(u => u.Comments)
            .ThenInclude(c => c.Post)
                .ThenInclude(p => p.Category)
        .Include(u => u.Comments)
            .ThenInclude(c => c.Post)
                .ThenInclude(p => p.Likes)
        .ToList();

    // The same code as in the slow query
}

Let's run the query and compare the performance:

[Screenshot: benchmark results after Optimization 1]

We gained only about 1 second here, but the savings can be larger on bigger datasets.

Let's improve further.

Optimization 2: Limit Top-5 Users

After filtering users in the previous step, the next step is to tell the database to return only the top 5 commenters by activity.
That way, we never load more than the 5 users we actually care about.

/// <summary>
/// Optimization 2: Limit users (take top-5 by comment count early)
/// </summary>
public List<ActiveUserDto> GetTopCommenters_Optimization2_LimitUsers()
{
    var since = DateTime.UtcNow.AddDays(-7);

    // Compute comment count in the database, then take top 5
    var users = _dbContext.Users
        .Where(u => u.Comments
            .Any(c =>
                c.CreatedAt >= since &&
                c.Post.Category.Name == ".NET"))
        .OrderByDescending(u => u.Comments
            .Count(c =>
                c.CreatedAt >= since &&
                c.Post.Category.Name == ".NET")
        )
        .Take(5)
        .Include(u => u.Comments)
            .ThenInclude(c => c.Post)
                .ThenInclude(p => p.Category)
        .Include(u => u.Comments)
            .ThenInclude(c => c.Post)
                .ThenInclude(p => p.Likes)
        .ToList();

    // The same code as in the slow query
}

Here we sort users by their comment count and get only the top 5 users back.

After we run the benchmarks, we see that we are down from 27 s to 17 s.

[Screenshot: benchmark results after Optimization 2]

That's a significant performance boost.

Optimization 3: Limit the Number of JOINs

Instead of JOINing all the tables together (users, comments, posts, categories, likes), let's join only the data we need:

/// <summary>
/// Optimization 3: Filter child comments at query time
/// </summary>
public List<ActiveUserDto> GetTopCommenters_Optimization3_FilterComments()
{
    var since = DateTime.UtcNow.AddDays(-7);

    // Only include comments that match our filter
    var users = _dbContext.Users
        .Include(u => u.Comments.Where(c =>
            c.CreatedAt >= since &&
            c.Post.Category.Name == ".NET"))
        .ThenInclude(c => c.Post)
            .ThenInclude(p => p.Likes)
        .Where(u => u.Comments.Any())
        .OrderByDescending(u => u.Comments
            .Count(c =>
                c.CreatedAt >= since &&
                c.Post.Category.Name == ".NET")
        )
        .Take(5)
        .ToList();

    var result = new List<ActiveUserDto>();
    foreach (var u in users)
    {
        // u.Comments now only contains recent .NET comments
        var comments = u.Comments.ToList();
        var commentsCount = comments.Count;

        var topPosts = comments
            .GroupBy(c => c.Post)
            .Select(g => new PostDto(
                g.Key.Id,
                g.Key.Likes.Count))
            .OrderByDescending(p => p.LikesCount)
            .Take(3)
            .ToList();

        var topPostIds = topPosts.Select(p => p.PostId).ToList();
        var recentTexts = _dbContext.Comments
            .Where(c =>
                c.UserId == u.Id &&
                c.CreatedAt >= since &&
                topPostIds.Contains(c.PostId))
            .OrderByDescending(c => c.CreatedAt)
            .Take(2)
            .Select(c => c.Text)
            .ToList();

        result.Add(new ActiveUserDto(
            u.Id,
            u.Username,
            commentsCount,
            topPosts,
            recentTexts));
    }

    return result;
}

After running the benchmarks, you can see that we are down to 10 s and consume almost half the memory:

[Screenshot: benchmark results after Optimization 3]

Optimization 4: Project Only Required Columns

Let's optimize further and select only the needed columns.

First, we ask for just the top 5 users and their total comment counts.
Then, in a loop, we run two small queries:

  • Top 3 posts they commented on (by likes).
  • Latest 2 comment texts on those top posts.

/// <summary>
/// Optimization 4: Project only required columns - a two-phase approach to avoid deep nested subqueries.
/// </summary>
public List<ActiveUserDto> GetTopCommenters_Optimization4_Projection()
{
    var since = DateTime.UtcNow.AddDays(-7);

    // Get users with comment counts and pre-filter them
    var topUsers = _dbContext.Users
        .AsNoTracking() // Add AsNoTracking for read-only operations
        .Where(u => u.Comments.Any(c =>
            c.CreatedAt >= since &&
            c.Post.Category.Name == ".NET"))
        .Select(u => new
        {
            u.Id,
            u.Username,
            CommentsCount = u.Comments.Count(c =>
                c.CreatedAt >= since &&
                c.Post.Category.Name == ".NET")
        })
        .OrderByDescending(u => u.CommentsCount)
        .Take(5)
        .ToList();

    var result = new List<ActiveUserDto>();

    foreach (var user in topUsers)
    {
        // Get the top posts for this user using projection
        var topPosts = _dbContext.Comments
            .AsNoTracking()
            .Where(c =>
                c.UserId == user.Id &&
                c.CreatedAt >= since &&
                c.Post.Category.Name == ".NET")
            .GroupBy(c => c.PostId)
            .Select(g => new
            {
                PostId = g.Key,
                LikesCount = _dbContext.Posts
                    .Where(p => p.Id == g.Key)
                    .Select(p => p.Likes.Count)
                    .FirstOrDefault()
            })
            .OrderByDescending(p => p.LikesCount)
            .Take(3)
            .Select(p => new PostDto(p.PostId, p.LikesCount))
            .ToList();

        // Get the post IDs to use in the next query
        var topPostIds = topPosts.Select(p => p.PostId).ToList();

        // Get the latest comments for this user on their top posts
        var recentTexts = _dbContext.Comments
            .AsNoTracking()
            .Where(c =>
                c.UserId == user.Id &&
                c.CreatedAt >= since &&
                topPostIds.Contains(c.PostId))
            .OrderByDescending(c => c.CreatedAt)
            .Take(2)
            .Select(c => c.Text)
            .ToList();

        // Add to result
        result.Add(new ActiveUserDto(
            user.Id,
            user.Username,
            user.CommentsCount,
            topPosts,
            recentTexts
        ));
    }

    return result;
}

Despite having an N+1 query pattern, we are down from 10 s to 40 milliseconds:

[Screenshot: benchmark results after Optimization 4]

Here we send 1 + (2 × 5) = 11 simple calls to the database.

Now it's time to get rid of the loop and try to do everything in a single query.

Optimization 5: One-Query Projection

Let's try to put filtering, counting, grouping, sorting, and paging into one LINQ statement:

/// <summary>
/// Optimization 5: One Query
/// </summary>
public List<ActiveUserDto> GetTopCommenters_Optimization5_OneQuery()
{
    var since = DateTime.UtcNow.AddDays(-7);

    var projected = _dbContext.Users
        // 1) Only users with at least one ".NET" comment in the window
        .Where(u => u.Comments
            .Any(c =>
                c.CreatedAt >= since &&
                c.Post.Category.Name == ".NET"))

        // 2) Project everything in one go
        .Select(u => new
        {
            u.Id,
            u.Username,
            CommentsCount = u.Comments
                .Count(c =>
                    c.CreatedAt >= since &&
                    c.Post.Category.Name == ".NET"),

            // 3) Top 3 posts by like count
            TopPosts = u.Comments
                .Where(c =>
                    c.CreatedAt >= since &&
                    c.Post.Category.Name == ".NET")
                .GroupBy(c => new { c.Post.Id, c.Post.Likes.Count })
                .Select(g => new { g.Key.Id, LikesCount = g.Key.Count })
                .OrderByDescending(p => p.LikesCount)
                .Take(3),

            // 4) Latest 2 comments on those top-3 posts
            RecentComments = u.Comments
                .Where(c =>
                    c.CreatedAt >= since &&
                    c.Post.Category.Name == ".NET" &&
                    // subquery filter: only these top IDs
                    u.Comments
                        .Where(d =>
                            d.CreatedAt >= since &&
                            d.Post.Category.Name == ".NET")
                        .GroupBy(d => d.PostId)
                        .OrderByDescending(g => g.Count())
                        .Take(3)
                        .Select(g => g.Key)
                        .Contains(c.PostId))
                .OrderByDescending(c => c.CreatedAt)
                .Take(2)
                .Select(c => c.Text)
        })

        // 5) Order & take top 5 users
        .OrderByDescending(x => x.CommentsCount)
        .Take(5)

        // 6) Shape into our DTO
        .Select(x => new ActiveUserDto(
            x.Id,
            x.Username,
            x.CommentsCount,
            x.TopPosts
                .Select(p => new PostDto(p.Id, p.LikesCount))
                .ToList(),
            x.RecentComments.ToList()
        ))
        .ToList();

    return projected;
}

[Screenshot: benchmark results after Optimization 5]

Interestingly, instead of 11 small queries we now send just 1, yet we gained only about 3 ms.

Let's examine the underlying SQL query:

SELECT u0.id, u0.username, u0.c, s0.id0, s0."Count", s1.text, s1.id, s1.id0, s1.id1
  FROM (
      SELECT u.id, u.username, (
          SELECT count(*)::int
          FROM devtips_optimization_challenge.comments AS c3
          INNER JOIN devtips_optimization_challenge.posts AS p1 ON c3.post_id = p1.id
          INNER JOIN devtips_optimization_challenge.categories AS c4 ON p1.category_id = c4.id
          WHERE u.id = c3.user_id AND c3.created_at >= @__since_0 AND c4.name = '.NET') AS c, (
          SELECT count(*)::int
          FROM devtips_optimization_challenge.comments AS c1
          INNER JOIN devtips_optimization_challenge.posts AS p0 ON c1.post_id = p0.id
          INNER JOIN devtips_optimization_challenge.categories AS c2 ON p0.category_id = c2.id
          WHERE u.id = c1.user_id AND c1.created_at >= @__since_0 AND c2.name = '.NET') AS c0
      FROM devtips_optimization_challenge.users AS u
      WHERE EXISTS (
          SELECT 1
          FROM devtips_optimization_challenge.comments AS c
          INNER JOIN devtips_optimization_challenge.posts AS p ON c.post_id = p.id
          INNER JOIN devtips_optimization_challenge.categories AS c0 ON p.category_id = c0.id
          WHERE u.id = c.user_id AND c.created_at >= @__since_0 AND c0.name = '.NET')
      ORDER BY (
          SELECT count(*)::int
          FROM devtips_optimization_challenge.comments AS c1
          INNER JOIN devtips_optimization_challenge.posts AS p0 ON c1.post_id = p0.id
          INNER JOIN devtips_optimization_challenge.categories AS c2 ON p0.category_id = c2.id
          WHERE u.id = c1.user_id AND c1.created_at >= @__since_0 AND c2.name = '.NET') DESC
      LIMIT @__p_1
  ) AS u0
  LEFT JOIN LATERAL (
      SELECT s.id0, s."Count"
      FROM (
          SELECT p2.id AS id0, (
              SELECT count(*)::int
              FROM devtips_optimization_challenge.likes AS l
              WHERE p2.id = l.post_id) AS "Count"
          FROM devtips_optimization_challenge.comments AS c5
          INNER JOIN devtips_optimization_challenge.posts AS p2 ON c5.post_id = p2.id
          INNER JOIN devtips_optimization_challenge.categories AS c6 ON p2.category_id = c6.id
          WHERE u0.id = c5.user_id AND c5.created_at >= @__since_0 AND c6.name = '.NET'
      ) AS s
      GROUP BY s.id0, s."Count"
      ORDER BY s."Count" DESC
      LIMIT 3
  ) AS s0 ON TRUE
  LEFT JOIN LATERAL (
      SELECT c7.text, c7.id, p3.id AS id0, c8.id AS id1, c7.created_at
      FROM devtips_optimization_challenge.comments AS c7
      INNER JOIN devtips_optimization_challenge.posts AS p3 ON c7.post_id = p3.id
      INNER JOIN devtips_optimization_challenge.categories AS c8 ON p3.category_id = c8.id
      WHERE u0.id = c7.user_id AND c7.created_at >= @__since_0 AND c8.name = '.NET' AND c7.post_id IN (
          SELECT c9.post_id
          FROM devtips_optimization_challenge.comments AS c9
          INNER JOIN devtips_optimization_challenge.posts AS p4 ON c9.post_id = p4.id
          INNER JOIN devtips_optimization_challenge.categories AS c10 ON p4.category_id = c10.id
          WHERE u0.id = c9.user_id AND c9.created_at >= @__since_0 AND c10.name = '.NET'
          GROUP BY c9.post_id
          ORDER BY count(*)::int DESC
          LIMIT 3
      )
      ORDER BY c7.created_at DESC
      LIMIT 2
  ) AS s1 ON TRUE
  ORDER BY u0.c0 DESC, u0.id, s0."Count" DESC, s0.id0, s1.created_at DESC, s1.id, s1.id0

This looks horrible and is subject to a cartesian explosion.
Let's address this issue.

Optimization 6: AsSplitQuery

When you project deep object graphs in one query, EF Core often generates a massive single SQL statement with multiple JOINs.
That can lead to a "cartesian explosion", where the same data is repeated across rows, bloating transfer size and slowing execution.

With .AsSplitQuery(), EF Core breaks one big query into multiple, simpler SQL statements.
This results in additional requests to get the JOINed data:

/// <summary>
/// Optimization 6: Split query to avoid cartesian explosion
/// </summary>
public List<ActiveUserDto> GetTopCommenters_Optimization6_SplitQuery()
{
    var since = DateTime.UtcNow.AddDays(-7);

    var projected = _dbContext.Users
        // The same code as in Optimization 5
        .AsSplitQuery()
        .ToList();

    return projected;
}

The results are surprising:

[Screenshot: benchmark results after Optimization 6]

Instead of a performance boost, our query now runs in 51 ms, which is even slower than our approach with 11 SQL requests.

Let's try something different to optimize our query further.

Optimization 7: Three-Phase Query

Let's try to split our query into three phases:

  • Phase 1: Get the top 5 users with their comment counts
  • Phase 2: Get the top posts for these users by like count
  • Phase 3: Get the most recent comments for each user on their top posts

/// <summary>
/// Optimization 7: Three-phase approach with optimized queries
/// </summary>
public List<ActiveUserDto> GetTopCommenters_Optimization7_ThreePhaseOptimized()
{
    var since = DateTime.UtcNow.AddDays(-7);

    // Phase 1: Get the top 5 users with their comment counts
    // We first identify the users who commented on .NET posts in the last 7 days
    // and get the top 5 based on comment count
    var topUsers = _dbContext.Users
        .Where(u => u.Comments.Any(c =>
            c.CreatedAt >= since &&
            c.Post.Category.Name == ".NET"))
        .Select(u => new
        {
            u.Id,
            u.Username,
            CommentsCount = u.Comments.Count(c =>
                c.CreatedAt >= since &&
                c.Post.Category.Name == ".NET")
        })
        .OrderByDescending(x => x.CommentsCount)
        .Take(5)
        .ToList();

    // Get the user IDs to use in subsequent queries
    var userIds = topUsers.Select(u => u.Id).ToArray();

    // Phase 2: Get the top posts for these users by like count
    // This avoids the cartesian explosion from the one-query approach
    var topPostsPerUser = _dbContext.Comments
        .Where(c =>
            userIds.Contains(c.UserId) &&
            c.CreatedAt >= since &&
            c.Post.Category.Name == ".NET")
        .GroupBy(c => new { c.UserId, c.PostId })
        .Select(g => new { g.Key.UserId, g.Key.PostId, LikesCount = g.First().Post.Likes.Count })
        .ToList()
        .GroupBy(x => x.UserId)
        .ToDictionary(
            g => g.Key,
            g => g.OrderByDescending(x => x.LikesCount)
                .Take(3)
                .Select(x => new PostDto(x.PostId, x.LikesCount))
                .ToList()
        );

    // Get the post IDs for the top posts
    var allTopPostIds = topPostsPerUser
        .SelectMany(kvp => kvp.Value.Select(p => p.PostId))
        .Distinct()
        .ToArray();

    // Phase 3: Get the most recent comments for each user on their top posts
    var recentCommentsPerUser = _dbContext.Comments
        .Where(c =>
            userIds.Contains(c.UserId) &&
            c.CreatedAt >= since &&
            allTopPostIds.Contains(c.PostId))
        .OrderByDescending(c => c.CreatedAt)
        .Select(c => new { c.UserId, c.Text, c.CreatedAt })
        .ToList()
        .GroupBy(c => c.UserId)
        .ToDictionary(
            g => g.Key,
            g => g.OrderByDescending(c => c.CreatedAt)
                .Take(2)
                .Select(c => c.Text)
                .ToList()
        );

    // Combine all the data into the final result
    var result = topUsers
        .Select(u => new ActiveUserDto(
            u.Id,
            u.Username,
            u.CommentsCount,
            topPostsPerUser.TryGetValue(u.Id, out var posts) ? posts : [],
            recentCommentsPerUser.TryGetValue(u.Id, out var comments) ? comments : []
        ))
        .ToList();

    return result;
}

Let's run the benchmark and see the results:

[Screenshot: benchmark results after Optimization 7]

Now we are down to 31 ms.

Optimization 8: Two-Phase Query

Let's try to do it with just two queries:

  • Phase 1: Fetch the top 5 users, their top 3 .NET posts and comment counts
  • Phase 2: Fetch the latest 2 comments each user made on those top posts

/// <summary>
/// Optimization 8: Two-phase approach with optimized queries
/// </summary>
public List<ActiveUserDto> GetTopCommenters_Optimization8_TwoPhaseOptimized()
{
    var since = DateTime.UtcNow.AddDays(-7);

    // Phase 1: Fetch the top 5 users plus their top 3 .NET posts and comment counts
    var summaries = _dbContext.Users
        .Where(u => u.Comments
            .Any(c =>
                c.CreatedAt >= since &&
                c.Post.Category.Name == ".NET"))
        .Select(u => new
        {
            u.Id,
            u.Username,
            CommentsCount = u.Comments
                .Count(c =>
                    c.CreatedAt >= since &&
                    c.Post.Category.Name == ".NET"),
            TopPosts = u.Comments
                .Where(c =>
                    c.CreatedAt >= since &&
                    c.Post.Category.Name == ".NET")
                .GroupBy(c => c.PostId)
                .Select(g => new
                {
                    PostId = g.Key,
                    LikesCount = _dbContext.Likes
                        .Count(l => l.PostId == g.Key)
                })
                .OrderByDescending(p => p.LikesCount)
                .Take(3)
                .ToList()
        })
        .OrderByDescending(x => x.CommentsCount)
        .Take(5)
        .ToList();

    var userIds = summaries.Select(x => x.Id).ToArray();

    var postIds = summaries
        .SelectMany(s => s.TopPosts.Select(tp => tp.PostId))
        .Distinct()
        .ToArray();

    // Phase 2: Fetch the latest 2 comments per (user, top-post) pair
    var recentCommentsLookup = _dbContext.Comments
        .Where(c =>
            c.CreatedAt >= since &&
            c.Post.Category.Name == ".NET" &&
            userIds.Contains(c.UserId) &&
            postIds.Contains(c.PostId))
        .GroupBy(c => new { c.UserId, c.PostId })
        .Select(g => new
        {
            g.Key.UserId,
            g.Key.PostId,
            LatestTwo = g
                .OrderByDescending(c => c.CreatedAt)
                .Take(2)
                .Select(c => new { c.Text, c.CreatedAt })
                .ToList()
        })
        .ToList()
        .ToLookup(x => x.UserId, x => x);

    // Compose final DTOs
    var result = summaries
        .Select(s => new ActiveUserDto(
            s.Id,
            s.Username,
            s.CommentsCount,
            s.TopPosts
                .Select(tp => new PostDto(tp.PostId, tp.LikesCount))
                .ToList(),
            recentCommentsLookup[s.Id]
                .SelectMany(x => x.LatestTwo)
                .OrderByDescending(x => x.CreatedAt) // keep latest-first across the user's top posts
                .Take(2)
                .Select(x => x.Text)
                .ToList()
        ))
        .ToList();

    return result;
}

[Screenshot: benchmark results after Optimization 8]

Now we are down to ~29-30 ms, which is the fastest result so far.

Comparing All the Benchmarks

Let's run and compare all the benchmarks to see the full picture:

[Screenshot: benchmark results for all optimizations compared]

Optimization 8 is the fastest at ~30 ms but uses ~228 KB.

Optimization 5 ("One Query") runs in ~38 ms (9 ms slower) but allocates only ~104 KB, less than half the memory of Optimization 8.

In practice, I would start with Optimization 5 because it's the most balanced option: very fast, with lower allocations.

Always monitor execution time and memory as your data set grows, and be ready to switch to a multi-phase approach (Optimization 7/8) if you see your single-query plan degrade.

Summary

In this post, we took on an EF Core challenge: optimizing a very slow database query.

We started at 30 seconds and ended with a two-phase pattern that runs in around 30 ms.

Here are the key takeaways:

  • Filter data (Opt 1-3), so you only load the data you need.
  • Two-phase projections (Opt 4, 7, 8) break big work into small, focused queries.
  • Single-query projection (Opt 5) balances simplicity, speed, and low memory.
  • Split queries (Opt 6) prevent cartesian blow-up when projecting deep graphs.

Pro tip: consider adding database indexes on the columns the query filters and joins on to make it even faster.
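
For example, indexes that cover the columns this query filters and joins on could be declared in OnModelCreating. This is only a sketch; the exact index set should be validated against your real query plan:

// A sketch of indexes matching this query's filters (verify against your actual query plan)
protected override void OnModelCreating(ModelBuilder modelBuilder)
{
    // Comments are filtered by user and date, and grouped by post
    modelBuilder.Entity<Comment>()
        .HasIndex(c => new { c.UserId, c.CreatedAt, c.PostId });

    // Categories are looked up by name
    modelBuilder.Entity<Category>()
        .HasIndex(c => c.Name);

    // Likes are counted per post
    modelBuilder.Entity<Like>()
        .HasIndex(l => l.PostId);
}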

Finally, remember the golden rule: measure first, then optimize.
Often, you will realize that your queries are pretty fast and you don't need optimization.

Use logs or a profiler to identify the real bottlenecks in your environment, apply the targeted improvements you've learned here, and verify that each change delivers the expected performance improvements.
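
For example, EF Core's built-in logging can surface the generated SQL and its execution time without extra tooling. A minimal sketch (the DbContext name and connection string are assumed; in production you would route this to your logging framework rather than the console):

var options = new DbContextOptionsBuilder<SocialMediaDbContext>()
    .UseNpgsql(connectionString)
    // Log each executed command together with its elapsed time
    .LogTo(Console.WriteLine, new[] { RelationalEventId.CommandExecuted })
    // Show parameter values in the logs (avoid in production)
    .EnableSensitiveDataLogging()
    .Options;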

On my website: antondevtips.com I share .NET and Architecture best practices.
Subscribe to my newsletter to improve your .NET skills.
Download the source code for this newsletter for free.
