Optimizing slow join on large-ish tables
Any suggestions on how to speed up this join?
Activity.joins('JOIN relationships as owner ON owner.person_id = activities.owner_id')
There's an index on both relationships.person_id and activities.owner_id.
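For context, indexes like the ones described would typically come from a migration along these lines (table and column names are taken from the join above; the class name and Rails version are illustrative only, not part of the original post):

# Illustrative sketch of the existing indexes; the real schema may differ.
class AddJoinIndexes < ActiveRecord::Migration[7.1]
  def change
    add_index :activities, :owner_id       # join key on the activities side
    add_index :relationships, :person_id   # join key on the relationships side
  end
end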
EXPLAIN ANALYZE output:
=> EXPLAIN ANALYZE for: SELECT "activities".* FROM "activities" JOIN relationships as owner ON owner.person_id = activities.owner_id
QUERY PLAN
---------------------------------------------------------------------------------------------------------------------------------------------
 Hash Join  (cost=45827.59..598928.22 rows=17869287 width=78) (actual time=376.884..4954.592 rows=2963220 loops=1)
   Hash Cond: (activities.owner_id = owner.person_id)
   ->  Seq Scan on activities  (cost=0.00..164430.24 rows=6484724 width=78) (actual time=0.036..646.086 rows=6484724 loops=1)
   ->  Hash  (cost=25685.15..25685.15 rows=1227715 width=4) (actual time=376.635..376.636 rows=1221946 loops=1)
         Buckets: 131072  Batches: 32  Memory Usage: 2382kB
         ->  Seq Scan on relationships owner  (cost=0.00..25685.15 rows=1227715 width=4) (actual time=106.584..228.241 rows=1221946 loops=1)
 Planning Time: 0.236 ms
 JIT:
   Functions: 10
   Options: Inlining true, Optimization true, Expressions true, Deforming true
   Timing: Generation 1.108 ms, Inlining 2.677 ms, Optimization 61.629 ms, Emission 42.139 ms, Total 107.552 ms
 Execution Time: 5032.112 ms
(12 rows)
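For anyone reproducing this: the "EXPLAIN ANALYZE for: ..." header above looks like the output of ActiveRecord's explain. A minimal sketch, assuming Rails 7.1+ where ActiveRecord::Relation#explain accepts EXPLAIN options; on older versions you can pass raw SQL to the connection instead:

# Rails 7.1+: ask the relation for an EXPLAIN ANALYZE plan directly.
Activity
  .joins('JOIN relationships as owner ON owner.person_id = activities.owner_id')
  .explain(:analyze)

# Older Rails: run EXPLAIN ANALYZE against the generated SQL by hand.
sql = Activity.joins('JOIN relationships as owner ON owner.person_id = activities.owner_id').to_sql
puts ActiveRecord::Base.connection.execute("EXPLAIN ANALYZE #{sql}").map { |r| r['QUERY PLAN'] }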
Sources
Source: Stack Overflow, licensed under CC BY-SA 3.0.