Thread: Never Ending query in PostgreSQL
Hi,

On Sun, Feb 27, 2022 at 04:40:16AM +0000, Kumar, Mukesh wrote:
> Can you please help in tuning the attached query? I am trying to run
> this query and it runs for several hours without giving any output.
>
> I am not able to generate the EXPLAIN ANALYZE plan either; it keeps
> running for several hours and does not give output.
>
> I have attached the query and the explain plan without ANALYZE. Please
> help if anyone has any idea how to tune that query.

You attached the explain plan in both files. Also, even if the query were
there, it wouldn't be enough information to be able to help. Please consult
https://wiki.postgresql.org/wiki/Slow_Query_Questions to provide more
details.
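For reference, a hedged sketch of the kind of information that wiki page
asks for; "some_table" is a placeholder, not one of the poster's tables:

-- PostgreSQL version.
SELECT version();

-- Settings changed from their defaults.
SELECT name, current_setting(name), source
FROM pg_settings
WHERE source NOT IN ('default', 'override');

-- Table and index definitions for each table involved (run in psql);
-- "some_table" is a placeholder for the real table names.
-- \d+ some_table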
Hi Team,

Can you please help in tuning the attached query? I am trying to run this
query and it runs for several hours without giving any output.

I am not able to generate the EXPLAIN ANALYZE plan either; it keeps running
for several hours and does not give output.

I have attached the query and the explain plan without ANALYZE. Please help
if anyone has any idea how to tune that query.

Regards,
Mukesh Kumar
Hi Team Member,
Your attachments are not SQL, they are plans. Judging by the size of the plans, your best course of action is to completely rewrite the queries, probably using CTEs and temporary tables. May the Force be with you.
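As a hedged illustration of that approach (the table and column names below
are hypothetical stand-ins, not taken from the attached plans), one can
materialize the expensive part of the query into a temporary table, analyze
it, and join against it:

-- Hypothetical sketch of breaking a large query apart with a temporary
-- table; "payments" and "vendors" stand in for the real tables.
CREATE TEMP TABLE tmp_recent_payments AS
SELECT payment_id, vendor_id, amount
FROM payments
WHERE payment_date >= DATE '2022-01-01';

-- Give the planner statistics on the temporary table, so the follow-up
-- join is estimated from real row counts rather than guesses.
ANALYZE tmp_recent_payments;

-- The rest of the original query then joins against the materialized set.
SELECT v.vendor_name, SUM(t.amount)
FROM tmp_recent_payments t
JOIN vendors v ON v.vendor_id = t.vendor_id
GROUP BY v.vendor_name;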
Regards
-- Mladen Gogala Database Consultant Tel: (347) 321-1217 https://dbwhisperer.wordpress.com
> Several hours is not all that long.
Well, the pyramids in the Valley of the Kings have lasted for around 4,500 years. Dinosaurs ruled the Earth for approximately 120 million years. The Solar System is 5 billion years old. The cosmos is around 13 billion years old. Compared to those numbers, indeed, several hours isn't that long. Furthermore, you have to account for time dilation: one hour on a planet that's "evolving and revolving at 900 miles an hour" is not the same as one hour of standing still. To make things even more interesting, it's "orbiting at 19 miles a second, so it's reckoned, the sun that is the source of all our power." So, several hours is relative. Each object has its own relative time, so it's not possible to conclude whether several hours is a long time or not.
-- Mladen Gogala Database Consultant Tel: (347) 321-1217 https://dbwhisperer.wordpress.com
On 2/27/22 18:20, Jeff Janes wrote:
> On Sun, Feb 27, 2022 at 7:09 AM Kumar, Mukesh <MKumar@peabodyenergy.com> wrote:
>
>> Hi Team,
>>
>> Can you please help in tuning the attached query? I am trying to run
>> this query and it runs for several hours and it did not give any output.
>
> Several hours is not all that long. Without an EXPLAIN ANALYZE, we
> could easily spend several hours scratching our heads and still get
> nowhere. So unless having this running cripples the rest of your
> system, please queue up another one and let it go longer. But first, do
> an ANALYZE (and preferably a VACUUM ANALYZE) on all the tables. If you
> have a test db which is a recent clone of production, you could do it
> there so as not to slow down production. The problem is that the row
> estimates must be way off (otherwise, it shouldn't take long), and if
> that is the case, we can't use the plan to decide much of anything,
> since we don't trust it.

I'd bet Jeff is right and poor estimates are the root cause. The pattern
with a cascade of "nested loop" nodes in the explain is fairly typical.
This is likely due to the complex join conditions and correlation.

> In parallel you could start evicting table joins from the query to
> simplify it until it gets to the point where it will run, so you can
> then see the actual row counts. To do that it does help if you know
> what the intent of the query is (or for that matter, the text of the
> query -- you attached the plan twice).

Right, simplify the query. Or maybe do it the other way around - start
with the simplest query (the inner-most part of the explain) and add
joins one by one (by following the explains) until it suddenly starts
being much slower.

regards

-- 
Tomas Vondra
EnterpriseDB: http://www.enterprisedb.com
The Enterprise PostgreSQL Company
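A minimal sketch of Jeff's suggestion in SQL; "some_table" is a placeholder
and the query behind EXPLAIN is a stand-in for the real one:

-- Refresh statistics (and clean up dead tuples) on every table in the
-- query; repeat for each table involved.
VACUUM ANALYZE some_table;

-- Then queue up the instrumented run and let it finish, however long it
-- takes; the per-node actual row counts are what make the plan useful.
EXPLAIN (ANALYZE, BUFFERS)
SELECT 1;   -- stand-in for the actual problem query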
Hi Tomas,

Thanks for replying. We have identified a join condition which is creating
a problem for that query. Accept my apologies for pasting the plan twice; I
am attaching the query again in this mail.

We have found that after evicting the view paymenttransdetails_view, the
attached query runs in approx. 10 secs. The view contains multiple
conditions and 1 join as well. I am attaching the view definition too.

Please suggest if there is a workaround for this query to run faster
without evicting the above view from the query.

Thanks and Regards,
Mukesh Kumar
On 3/1/22 16:01, Kumar, Mukesh wrote:
> Hi Tomas,
>
> Thanks for replying. We have identified a join condition which is
> creating a problem for that query.
>
> Accept my apologies for pasting the plan twice. I am attaching the
> query again in this mail.

Queries without explain (or even better "explain analyze") are useless. We
don't have the data, we don't know what the executed plan is, we don't know
what plan might be a better one.

There's a wiki page about reporting slow queries (what info to include, etc.):

https://wiki.postgresql.org/wiki/Slow_Query_Questions

> We have found that after evicting the view paymenttransdetails_view,
> the attached query runs in approx. 10 secs, and the view contains
> multiple conditions and 1 join as well.

You need to add individual tables, not a view which is itself a join of 10+
tables. The idea is that you start with a fast query and add tables one by
one (in the join order from the explain). You'll be able to do EXPLAIN
ANALYZE and watch estimate accuracy, and then at some point it gets much
slower, which is the join that causes trouble. But you might still be able
to do explain analyze.

So looking at the explain plan you shared before, you'd start with a join
of so_vendor_address_base + so_vendor_base, and then you'd add:

- sapecc_lfa1_assoc
- lms_payment_item_vendor_base
- lms_payment_line_item_base
- lms_payment_check_request
- lms_pay_line_item_acct_base
- ...

(in this order). I'd bet "lms_payment_check_request" is where things start
to go south.

regards

-- 
Tomas Vondra
EnterpriseDB: http://www.enterprisedb.com
The Enterprise PostgreSQL Company
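A hedged sketch of that incremental approach follows. The table names are
the ones listed above, but the join columns (vendor_id and the like) are
hypothetical placeholders, since the actual query text was not posted:

-- Step 1: start with the innermost, fast join and confirm the estimates.
EXPLAIN (ANALYZE, BUFFERS)
SELECT *
FROM so_vendor_address_base a
JOIN so_vendor_base v ON v.vendor_id = a.vendor_id;    -- hypothetical key

-- Step 2: add the next table from the plan and re-check estimated vs.
-- actual row counts in the output.
EXPLAIN (ANALYZE, BUFFERS)
SELECT *
FROM so_vendor_address_base a
JOIN so_vendor_base v    ON v.vendor_id = a.vendor_id   -- hypothetical key
JOIN sapecc_lfa1_assoc l ON l.vendor_id = v.vendor_id;  -- hypothetical key

-- Step 3 and onward: keep adding lms_payment_item_vendor_base,
-- lms_payment_line_item_base, lms_payment_check_request, ... one at a
-- time, stopping when the runtime (or the gap between estimated and
-- actual rows) suddenly blows up; that join is the one to investigate.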