
How to Optimize ActiveRecord Queries with Large Data Sets in Rails 7?


I'm working with a large dataset in Rails 7 and some of my Active Record queries are getting slow. I've tried using .includes and .select, but performance still lags. What are the best practices for optimizing Active Record queries with millions of records? Also, are there cases where dropping down to raw SQL or a different tool is the better option?

Any tips or code examples would be appreciated!

Optimize Active Record with .includes, .select, and .in_batches to handle millions of records efficiently. Don't forget to add indexes and use smart caching.

To optimize ActiveRecord in Rails 7 for large datasets:

- Use .select to limit columns
- Use .includes to avoid N+1 queries
- Use .pluck for faster data retrieval
- Use .in_batches to process data in chunks
- Add proper database indexes

Tools like Bullet help catch inefficiencies, and raw SQL or Arel can be faster for complex queries.

I think when working with millions of records in Rails 7, you should combine .in_batches, .load_async, and an explicit .select to avoid loading unnecessary data. If queries are still slow, consider using raw SQL via find_by_sql or connection.execute to improve performance. Additionally, for more complex queries, tools like Sequel or ROM.rb might be better alternatives to ActiveRecord.
