Thread: Slow update query


From: elias ghanem
Date:

Hi,

 

Here is the output of EXPLAIN ANALYZE:

"Seq Scan on in_sortie  (cost=0.00..171140.19 rows=114449 width=84) (actual time=15.074..28461.349 rows=99611 loops=1)"

"  Output: type, site_id, fiche_produit_id, numero_commande, ligne_commande, date_sortie, quantite_sortie, date_livraison_souhaitee, quantite_souhaitee, client_ref, valeur, type_mouvement, etat_sortie_annulation, etat_sortie_prevision, etat_sortie_taux_service, date_commande, valide"

"  Filter: (valeur < 0.83)"

"Total runtime: 104233.651 ms"

 

(Although EXPLAIN ANALYZE reports a total runtime of 104233.651 ms, the query itself takes about 2.5 minutes when I run it.)

 

- Concerning the exact version of PostgreSQL I'm using, here is the result of SELECT version():

PostgreSQL 8.4.2 on i686-pc-linux-gnu, compiled by GCC gcc (GCC) 3.4.6 20060404 (Red Hat 3.4.6-10), 32-bit

 

- Concerning postgresql.conf, I have attached the file.

 

- Concerning the query: I'm sorry, it seems I did not explain the problem clearly enough. Here is a better explanation:

This update, shown below, is just one step in a long process. After certain rows have been processed, they have to be flagged so they don't get processed again.

UPDATE IN_SORTIE SET VALIDE = 'O' WHERE VALEUR < 0.83

The [SET VALIDE = 'O'] merely flags the row as already processed.

The WHERE clause that identifies these rows is rather simple: [WHERE VALEUR < 0.83]. It affects around 100,000 records in a table that contains around 3,000,000.
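As an aside, if some of these rows may already carry the flag from a previous run, one common way to reduce the work (a sketch, not part of the original process) is to exclude already-flagged rows in the WHERE clause, since PostgreSQL's MVCC rewrites a full row version for every row an UPDATE touches, even when the new value equals the old one:

```sql
-- Hypothetical variant: skip rows whose VALIDE is already 'O', so PostgreSQL
-- does not create a new row version (and WAL) for no-op updates.
UPDATE IN_SORTIE
SET VALIDE = 'O'
WHERE VALEUR < 0.83
  AND VALIDE IS DISTINCT FROM 'O';
```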

We are running this process on both Oracle and Postgres. I have noticed that, for the same table size and the same number of affected rows, this particular UPDATE statement takes 11 seconds on Oracle but 2.5 minutes on Postgres.

Note that there are no indexes on this table in either database.
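For completeness, a partial index on the filter column could be tried (hypothetical; since the sequential scan accounts for roughly 28 s of the 104 s total, most of the time is spent rewriting rows, so the gain from the index alone may be limited):

```sql
-- Hypothetical partial index covering only the rows the UPDATE's filter
-- matches; the planner can use it instead of scanning all 3M rows.
CREATE INDEX in_sortie_valeur_idx
  ON in_sortie (valeur)
  WHERE valeur < 0.83;
```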

 

So the problem can be summarized as follows: why does a query like UPDATE IN_SORTIE SET VALIDE = 'O' WHERE VALEUR < 0.83 take 2.5 minutes on PostgreSQL, given that it is issued against a table containing around 3,000,000 records and affects around 100,000 of them?

 

Thanks again for your advice.

Attachment