Re: Consecutive Inserts Freeze Execution of Psycopg3 - Mailing list psycopg

From Adrian Klaver
Subject Re: Consecutive Inserts Freeze Execution of Psycopg3
Date
Msg-id 237c1e3f-4118-6ffc-6e0b-ddce824f1f55@aklaver.com
In response to Consecutive Inserts Freeze Execution of Psycopg3  ("Michael P. McDonnell" <bzaks1424@gmail.com>)
Responses Re: Consecutive Inserts Freeze Execution of Psycopg3  ("Michael P. McDonnell" <bzaks1424@gmail.com>)
List psycopg
On 4/23/23 13:45, Michael P. McDonnell wrote:
> Python 3.10.6
> psycopg library 3.1.8
> 
> Running consecutive inserts sourced in files.
> All inserts are of the same format:
> 
> INSERT INTO _____ (field1, field2, field3)
> SELECT field1, field2, field3 FROM ____, Join ___, join ___ etc...
> 
> The code I've written is this:
> 
> for qi in range(qlen):
>     query = queries[qi]
>     qparams = params[qi]
>     with self.connection.cursor() as conn:
>         conn.execute(query, qparams)

In the above you are running the context manager (with) over the cursor, not 
the connection. This will not automatically commit the transaction. You 
will need to either do an explicit connection.commit() or use the with 
over the connection, per:

https://www.psycopg.org/psycopg3/docs/basic/transactions.html
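
For example, something along these lines (an untested sketch, assuming the 
same self.connection, queries and params objects from your snippet):

    # Option 1: keep the with over the cursor, commit after each query.
    for query, qparams in zip(queries, params):
        with self.connection.cursor() as cur:
            cur.execute(query, qparams)
        self.connection.commit()

    # Option 2: with over the connection, which commits once when the
    # block exits without an exception (see the link above).
    with self.connection:
        with self.connection.cursor() as cur:
            for query, qparams in zip(queries, params):
                cur.execute(query, qparams)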

> 
> When I run the queries in dbeaver - the first query takes 120s (it's 
> 1.9M rows), the second query takes 2s (7000 rows).
> When I run the queries in python - it freezes on the second query.
> 
> Any guidance on how to attack this would be awesome as I have re-written 
> my code a dozen times and am just slinging mud to see what sticks.

-- 
Adrian Klaver
adrian.klaver@aklaver.com
