On 4/23/23 13:45, Michael P. McDonnell wrote:
> Python 3.10.6
> psycopg library 3.1.8
>
> Running consecutive inserts sourced in files.
> All inserts are of the same format:
>
> INSERT INTO _____ (field1, field2, field3)
> SELECT field1, field2, field3 FROM ____, Join ___, join ___ etc...
>
> The code I've written is this:
>
> for qi in range(qlen):
>     query = queries[qi]
>     qparams = params[qi]
>     with self.connection.cursor() as conn:
>         conn.execute(query, qparams)
In the above you are running the context manager (with) over the cursor,
not the connection. That will not automatically commit the transaction.
You will need to either call connection.commit() explicitly or use the
with block over the connection itself, per:
https://www.psycopg.org/psycopg3/docs/basic/transactions.html
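To see why the cursor's context manager is not enough, here is a minimal,
self-contained toy sketch. ToyConnection and ToyCursor are made-up stand-ins
(not psycopg classes) that only mimic the commit semantics described above:
closing a cursor does not commit, while exiting a with block over the
connection does.

```python
class ToyConnection:
    """Hypothetical in-memory 'database' mimicking DB-API commit semantics."""
    def __init__(self):
        self.committed = []   # rows visible after a commit
        self.pending = []     # rows in the currently open transaction

    def cursor(self):
        return ToyCursor(self)

    def commit(self):
        self.committed.extend(self.pending)
        self.pending.clear()

    def rollback(self):
        self.pending.clear()

    # Using the *connection* as a context manager commits on clean exit
    # (and rolls back on error), mirroring psycopg 3's documented behavior.
    def __enter__(self):
        return self

    def __exit__(self, exc_type, exc, tb):
        if exc_type is None:
            self.commit()
        else:
            self.rollback()


class ToyCursor:
    def __init__(self, conn):
        self.conn = conn

    def execute(self, row):
        self.conn.pending.append(row)

    # The *cursor's* context manager only closes the cursor; it never commits.
    def __enter__(self):
        return self

    def __exit__(self, exc_type, exc, tb):
        pass  # close only; no commit


conn = ToyConnection()
with conn.cursor() as cur:      # the pattern from the code above
    cur.execute("row 1")
print(conn.committed)           # [] -- nothing committed yet

with conn:                      # context manager over the connection
    with conn.cursor() as cur:
        cur.execute("row 2")
print(conn.committed)           # ['row 1', 'row 2'] -- committed at block exit
```

The same shape applies to the real code: either add
self.connection.commit() after the loop, or wrap the loop body in
"with self.connection:" so the transaction is committed when the block ends.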
>
> When I run the queries in dbeaver - the first query takes 120s (it's
> 1.9M rows), the second query takes 2s (7000 rows).
> When I run the queries in python - it freezes on the second query.
>
> Any guidance on how to attack this would be awesome as I have re-written
> my code a dozen times and am just slinging mud to see what sticks.
--
Adrian Klaver
adrian.klaver@aklaver.com