On Friday 05 September 2003 18:07, Edwin Quijada wrote:
> Hi this is my code of sql select
>
[snip]
> This select takes 20 seconds to run. My first 2 tables have 500000 records
> each.
>
> Explain ////////////////
> Merge Join (cost=79.44..7127.72 rows=226 width=347)
> Merge Cond: ("outer".f_wholenum = "inner".f_wholenum)
> -> Nested Loop (cost=0.00..7025.77 rows=1807 width=116)
> -> Nested Loop (cost=0.00..17.13 rows=1 width=44)
> Join Filter: ("outer".f_agente = "inner".f_codigo_agente)
> -> Index Scan using t_poliza_vehiculos_f_wholenum_idx on
> t_poliza_vehiculos a (cost=0.00..17.07 rows=5 width=40)
> Index Cond: (f_wholenum = 'POL000001'::bpchar)
Index scan here - so that's good.
> -> Seq Scan on t_agentes c (cost=0.00..0.00 rows=1 width=4)
> -> Index Scan using f_id on t_clientes b (cost=0.00..6986.05
> rows=1807 width=72)
Index scan here too.
> Index Cond: ("outer".f_codigo_cliente = b.f_codigo_cliente)
> -> Sort (cost=79.44..79.76 rows=125 width=231)
> Sort Key: d.f_wholenum
> -> Hash Join (cost=43.53..75.09 rows=125 width=231)
> Hash Cond: ("outer".f_codigo_marca = "inner".f_idmarca)
> -> Seq Scan on t_marcas_vehiculos g (cost=0.00..20.00
> rows=1000 width=43)
Is 1000 rows a reasonable estimate here?
> -> Hash (cost=43.47..43.47 rows=25 width=188)
> -> Hash Join (cost=18.15..43.47 rows=25 width=188)
> Hash Cond: ("outer".f_idvehiculo =
> "inner".f_keyvehiculo)
> -> Seq Scan on t_vehiculos_asegurados d
> (cost=0.00..20.00 rows=1000 width=28)
And 1000 here too.
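Those seq scans with rows=1000 and cost=0.00..20.00 are what the planner
typically shows for a table it has no statistics on at all, so I suspect
these tables have never been analysed. A quick way to check - reltuples and
relpages in pg_class should roughly match the real table sizes:

  SELECT relname, relpages, reltuples
  FROM pg_class
  WHERE relname IN ('t_marcas_vehiculos', 't_vehiculos_asegurados');

If reltuples comes back as 0 for a table you know has data in it, the planner
is working from defaults rather than real numbers.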
A couple of things to try: first, run ANALYZE to rebuild the planner
statistics and see if that helps.
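Something along these lines (table names taken from your query):

  ANALYZE t_marcas_vehiculos;
  ANALYZE t_vehiculos_asegurados;
  ANALYZE t_clientes;
  -- or just "ANALYZE;" to do every table in the database

Then run EXPLAIN ANALYZE on your select again and compare the estimated row
counts with the actual ones.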
If not, have you changed the configuration settings? The defaults are very
low. You can find more at
http://www.varlena.com/varlena/GeneralBits/Tidbits/index.php
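To give a rough idea only - the right numbers depend on your RAM and workload,
so treat these as illustrative rather than a recommendation - the settings
people usually raise first in postgresql.conf are:

  # illustrative values for a machine with ~512MB RAM
  shared_buffers = 4096          # 8kB pages, so about 32MB
  sort_mem = 8192                # kB per sort - helps the Sort/Hash Join steps
  effective_cache_size = 32768   # 8kB pages you expect the OS to cache (~256MB)

shared_buffers needs a server restart to take effect; sort_mem can also be
set per-session with SET.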
--
Richard Huxton
Archonet Ltd