Thread: Fwd: Query out of memory
Sending to a performance group instead of PLPGSQL.
Hi,
I am running the query below. The table has 21 million records. I get an Out Of Memory error after a while (from both pgAdmin and psql). Can someone review the DB parameters given below?
select t.*, g.column, a.column from
gk_staging g, transaction t, account a
where
g.accountcode is not null AND
g.accountcode::text <> '' AND
length(g.accountcode) = 13 AND
g.closeid::text = t.transactionid::text AND
substring(g.accountcode::text, 8) = a.mask_code::text
Below are system parameters.
shared_buffers=3GB
work_mem=2GB
effective_cache_size=10GB
maintenance_work_mem=1GB
max_connections=250
I am unable to paste the explain plan here due to security concerns.
Regards,
Aditya.
That work_mem value could be way too high depending on how much RAM your server has... which would be a very important bit of information to help figure this out. Also, what Postgres / OS versions?
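For illustration, a minimal sketch of how those details could be gathered, assuming psql access (total RAM itself is an OS-level question, e.g. free -h on Linux):

SELECT version();     -- PostgreSQL version and build platform
SHOW work_mem;        -- settings actually in effect for this session
SHOW max_connections;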
On Tue, 19 Oct 2021 at 05:54, Adam Brusselback <adambrusselback@gmail.com> wrote:
That work_mem value could be way too high depending on how much RAM your server has... which would be a very important bit of information to help figure this out. Also, what Postgres / OS versions?
work_mem is definitely too high. With 250 connections there is no way you could allocate 2GB to each one of them if needed.
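As a back-of-the-envelope illustration only (a sketch using the settings quoted above; actual usage depends on the workload, and a single query can claim several work_mem allowances if its plan has multiple sort/hash nodes):

-- theoretical ceiling if every connection used just one 2GB allowance
SELECT 250 * pg_size_bytes('2GB') / 1024^3 AS worst_case_gb;   -- 500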
Dave Cramer
www.postgres.rocks
On Tue, Oct 19, 2021 at 11:28:46AM +0530, aditya desai wrote:
> I am running the below query. Table has 21 million records. I get an Out Of
> Memory error after a while.(from both pgadmin and psql). Can someone review

Is the out of memory error on the client side ?
Then you've simply returned more rows than the client can support.

In that case, you can run it with "explain analyze" to prove that the server
side can run the query. That returns no data rows to the client, but shows the
number of rows which would normally be returned.

--
Justin
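For instance, a minimal sketch of that check (same query as above; the redacted g./a. output columns are omitted, and EXPLAIN ANALYZE still executes the full join):

-- EXPLAIN ANALYZE runs the query on the server but discards the result rows,
-- so a client-side out-of-memory error cannot occur; the plan output still
-- reports how many rows each node produced.
EXPLAIN ANALYZE
SELECT t.*
FROM gk_staging g, transaction t, account a
WHERE g.accountcode IS NOT NULL
  AND g.accountcode::text <> ''
  AND length(g.accountcode) = 13
  AND g.closeid::text = t.transactionid::text
  AND substring(g.accountcode::text, 8) = a.mask_code::text;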
Hi Justin,
On Tuesday, October 19, 2021, Justin Pryzby <pryzby@telsasoft.com> wrote:
Out of memory on pgAdmin and psql. I executed it with explain analyze. Still going out of memory.
Also, currently 250 user connections are not being made. There are hardly 10 connections to the database. When I run this query it is going out of memory.
Also, this query is part of a view that gets referenced in a procedure. The transaction table is a partitioned table, but due to business requirements the partition key is not part of the WHERE clause.
Regards,
Aditya.
On Tuesday, October 19, 2021, Justin Pryzby <pryzby@telsasoft.com> wrote:
On Tue, Oct 19, 2021 at 11:28:46AM +0530, aditya desai wrote:
> I am running the below query. Table has 21 million records. I get an Out Of
> Memory error after a while.(from both pgadmin and psql). Can someone review
Is the out of memory error on the client side ?
Then you've simply returned more rows than the client can support.
In that case, you can run it with "explain analyze" to prove that the server
side can run the query. That returns no data rows to the client, but shows the
number of rows which would normally be returned.
--
Justin
On Tuesday, 19 October 2021 at 07:58:46, aditya desai <admad123@gmail.com> wrote:
Sending to a performance group instead of PLPGSQL.
Hi,

I am running the query below. The table has 21 million records. I get an Out Of Memory error after a while (from both pgAdmin and psql). Can someone review the DB parameters given below?

select t.*, g.column, a.column from
gk_staging g, transaction t, account a
where
g.accountcode is not null AND
g.accountcode::text <> '' AND
length(g.accountcode) = 13 AND
g.closeid::text = t.transactionid::text AND
substring(g.accountcode::text, 8) = a.mask_code::text

Below are system parameters.
shared_buffers=3GB
work_mem=2GB
effective_cache_size=10GB
maintenance_work_mem=1GB
max_connections=250

I am unable to paste the explain plan here due to security concerns.
You have not provided the schema, explain output, nor the error message.
Without this it's pretty much guesswork...
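As a sketch of what could still be shared without exposing data (identifiers can be renamed by hand before posting):

-- psql meta-commands that print table definitions (columns, types, indexes)
-- without reading any rows
\d gk_staging
\d transaction
\d account

A plain EXPLAIN (without ANALYZE) of the query would likewise show the plan shape and row estimates without executing it or returning any data.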
--
Andreas Joseph Krogh
Check the explain plan, change work_mem to 100MB, and then check the explain plan again. If it changed, then try explain analyze.
The work_mem limit is applied per node in the plan, so especially with partitioned tables, that limit is way too high.
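A minimal sketch of that experiment (session-level only, so the server default is untouched):

SET work_mem = '100MB';
SHOW work_mem;     -- confirm the new value is in effect
-- Re-run EXPLAIN on the query above: a changed plan (e.g. a hash join replaced
-- by a merge join, or an in-memory sort becoming an external sort) means the
-- lower limit altered the planner's choices.
RESET work_mem;    -- restore the session default when done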
Thanks Michael. I will check this further.
On Tue, Oct 19, 2021 at 7:09 PM Michael Lewis <mlewis@entrata.com> wrote:
Check the explain plan, change work_mem to 100MB, and then check the explain plan again. If it changed, then try explain analyze.

The work_mem limit is applied per node in the plan, so especially with partitioned tables, that limit is way too high.
Do you see any issues in the PostgreSQL log files?
Regards,
Ninad Shah
On Tue, 19 Oct 2021 at 16:17, aditya desai <admad123@gmail.com> wrote:
Hi Justin,

Out of memory on pgAdmin and psql. I executed it with explain analyze. Still going out of memory.

Also, currently 250 user connections are not being made. There are hardly 10 connections to the database. When I run this query it is going out of memory.

Also, this query is part of a view that gets referenced in a procedure. The transaction table is a partitioned table, but due to business requirements the partition key is not part of the WHERE clause.

Regards,
Aditya.
On Tuesday, October 19, 2021, Justin Pryzby <pryzby@telsasoft.com> wrote:
On Tue, Oct 19, 2021 at 11:28:46AM +0530, aditya desai wrote:
> I am running the below query. Table has 21 million records. I get an Out Of
> Memory error after a while.(from both pgadmin and psql). Can someone review
Is the out of memory error on the client side ?
Then you've simply returned more rows than the client can support.
In that case, you can run it with "explain analyze" to prove that the server
side can run the query. That returns no data rows to the client, but shows the
number of rows which would normally be returned.
--
Justin