Re: memory explosion on planning complex query - Mailing list pgsql-hackers

From Andrew Dunstan
Subject Re: memory explosion on planning complex query
Date
Msg-id 54766F22.8050409@dunslane.net
In response to memory explosion on planning complex query  (Andrew Dunstan <andrew@dunslane.net>)
Responses Re: memory explosion on planning complex query  (Robert Haas <robertmhaas@gmail.com>)
List pgsql-hackers
On 11/26/2014 05:00 PM, Andrew Dunstan wrote:
>
> Attached is some anonymized DDL for a fairly complex schema from a 
> PostgreSQL Experts client. Also attached is an EXPLAIN query that runs 
> against the schema. The client's problem is that in trying to run the 
> explain, Postgres simply runs out of memory. On my untuned 9.3 test 
> rig (Scientific Linux 6.4 with 24GB of RAM and 24GB of swap), vmstat 
> clearly shows the explain chewing up about 7GB of memory. When it's 
> done, the free memory jumps back to where it was. In a similar case on 
> the client's test rig we saw memory use jump far more.
>
> The client's question is whether this is a bug. It certainly seems 
> like it should be possible to plan a query without chewing up this 
> much memory, or at least to be able to limit the amount of memory that 
> can be grabbed during planning. Going from humming along happily to 
> OOM conditions simply by running "explain <somequery>" is not very 
> friendly.
>

Further data point - thanks to Andrew Gierth (a.k.a. RhodiumToad) for 
pointing this out. The query itself grabs about 600MB to 700MB to run, 
whereas the EXPLAIN takes vastly more - on my system, 10 times more. 
Surely that's not supposed to happen?
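For anyone wanting to repeat the measurement without eyeballing vmstat, here is a minimal sketch (Linux-only, reads /proc, written by me just for this thread) that samples a child process's resident set size once per second while it runs. The psql invocation and file name in the comment are hypothetical placeholders for the attached query.

```shell
#!/bin/sh
# memwatch: run a command and print its VmRSS (resident set size, kB)
# once per second until it exits.  Linux-only: reads /proc/<pid>/status.
memwatch() {
    "$@" &
    pid=$!
    while kill -0 "$pid" 2>/dev/null; do
        # The status file vanishes when the process exits; ignore that race.
        grep VmRSS "/proc/$pid/status" 2>/dev/null
        sleep 1
    done
    wait "$pid"
}

# Hypothetical usage against the attached schema/query:
#   memwatch psql -d testdb -f explain_query.sql
# Demonstration with a harmless stand-in command:
memwatch sleep 2
```

Running the EXPLAIN under something like this should show the peak climbing well past what the query itself needs.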

cheers

andrew



