How to Optimize ActiveRecord Queries with Large Data Sets in Rails 7?
I'm working with a large dataset in Rails 7 and some of my Active Record queries are getting slow. I've tried using .includes and .select, but performance still lags. What are the best practices for optimizing Active Record queries with millions of records? Also, is dropping down to raw SQL a better option in some cases?
Any tips or code examples would be appreciated!
Optimize Active Record with .includes, .select, and .in_batches to handle millions of records efficiently. Don't forget to add indexes and use smart caching.
To optimize ActiveRecord in Rails 7 for large datasets:

- Use .select to fetch only the columns you need
- Use .includes to avoid N+1 queries
- Use .pluck when you only need raw values, not model objects
- Use .in_batches to process data in chunks instead of loading everything at once
- Add database indexes on columns used in WHERE, JOIN, and ORDER BY clauses
Tools like the Bullet gem help catch N+1 queries and unused eager loading, and raw SQL or Arel can be faster for complex queries.
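For reference, Bullet is typically enabled in the development environment. The snippet below is a configuration fragment using Bullet's documented settings:

```ruby
# config/environments/development.rb
config.after_initialize do
  Bullet.enable        = true  # turn N+1 detection on
  Bullet.bullet_logger = true  # write findings to log/bullet.log
  Bullet.rails_logger  = true  # also write them to the Rails log
  Bullet.add_footer    = true  # show warnings in a footer in the browser
end
```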
When working with millions of records in Rails 7, combine .in_batches, .load_async, and an explicit .select to avoid loading unnecessary data. If queries are still slow, consider raw SQL via find_by_sql or connection.execute. For more complex queries, libraries like Sequel or ROM.rb can be better alternatives to ActiveRecord.
For large datasets in Rails 7, use find_each for batch loading, joins instead of multiple queries, and update_all when callbacks are not needed. Paginate with limit/offset, push heavy work to background jobs, and use caching or materialized views for complex queries. Always run EXPLAIN to catch slow parts.