Re: Query takes around 15 to 20 min over 20Lakh rows - Mailing list pgsql-general

From David G. Johnston
Subject Re: Query takes around 15 to 20 min over 20Lakh rows
Date
Msg-id CAKFQuwZecBri-p0Zh-Fn8-1T6FNPH94wPYXOZ3QQaON2pk1=6w@mail.gmail.com
In response to Query takes around 15 to 20 min over 20Lakh rows  (Shubham Mittal <mittalshubham30@gmail.com>)
Responses Re: Query takes around 15 to 20 min over 20Lakh rows  (Tom Lane <tgl@sss.pgh.pa.us>)
List pgsql-general
On Thu, Sep 2, 2021 at 3:16 PM Shubham Mittal <mittalshubham30@gmail.com> wrote:
Hi,

Please help in optimizing this query. I need to generate reports daily using it, and it takes almost 15 to 20 minutes to execute because of the joins.
Here, common_details is a jsonb column.

SELECT T.order_id,
       ((T.common_details -> 'commonDetails'::text) -> 'bundle'::text) ->> 'srType'::text       AS product,
       ((T.common_details -> 'commonDetails'::text) -> 'bundle'::text) ->> 'mobileNumber'::text AS msisdn,

Use jsonb_populate_recordset (or one of its siblings) to get rid of as many of these key-based value extraction operations as possible and build a table from the contents of the jsonb.

Possibly into a temporary table to which you add indexes.
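
A minimal sketch of that approach, assuming the untruncated query reads from a table called orders here (the real FROM clause was not quoted) and that srType and mobileNumber are the only bundle keys needed. jsonb_to_record, a sibling of jsonb_populate_recordset, expands the nested object once per row, and the result can be materialized into a temporary table and indexed:

    -- Sketch only: 'orders' and 'report_rows' stand in for the poster's
    -- actual source table and report staging table.
    CREATE TEMP TABLE report_rows AS
    SELECT t.order_id,
           b."srType"       AS product,
           b."mobileNumber" AS msisdn
    FROM   orders AS t
           CROSS JOIN LATERAL jsonb_to_record(
               (t.common_details -> 'commonDetails') -> 'bundle'
           ) AS b("srType" text, "mobileNumber" text);

    -- Index whatever columns the report's joins and filters actually use.
    CREATE INDEX ON report_rows (order_id);

The report's remaining joins would then read from report_rows instead of re-extracting values from the jsonb column key by key.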

David J.
