Re: Query optimizer 8.0.1 (and 8.0) - Mailing list pgsql-hackers
From: Oleg Bartunov
Subject: Re: Query optimizer 8.0.1 (and 8.0)
Date:
Msg-id: Pine.GSO.4.62.0502132001240.21142@ra.sai.msu.su
In response to: Re: Query optimizer 8.0.1 (and 8.0) (pgsql@mohawksoft.com)
Responses:
  Re: Query optimizer 8.0.1 (and 8.0) ("Jim C. Nasby" <decibel@decibel.org>)
  Re: Query optimizer 8.0.1 (and 8.0) (pgsql@mohawksoft.com)
List: pgsql-hackers
Probably off-topic, but I think it's worth seeing what astronomers are doing
with their very big spatial databases. For example, we are working with a
catalog of more than 500,000,000 rows, and we use a special transformation of
coordinates to integer numbers that preserves object closeness. I hope we can
show that PostgreSQL is good enough to be used in astronomy for very big
catalogs; currently, MS SQL is in use. See http://www.sdss.jhu.edu/htm/ for
details. We use another technique.

	Oleg

On Wed, 9 Feb 2005 pgsql@mohawksoft.com wrote:

> I wrote a message called "One Big trend vs multiple smaller trends in table
> statistics" that, I think, explains what we've been seeing.
>
>> pgsql@mohawksoft.com wrote:
>>>
>>> In this case, the behavior observed could be changed by altering the
>>> sample size for a table. I submit that an arbitrary fixed sample size
>>> is not a good base for the analyzer, but that the sample size should
>>> be based on the size of the table or some calculation of its deviation.
>>>
>>
>> Mark,
>>
>> Do you have any evidence that the sample size had anything to do
>> with the performance problem you're seeing?
>
> Sample size is only a bandaid for the issue; however, more samples always
> provide more information.
>
>>
>> I also do a lot with the complete Census/TIGER database.
>>
>> Every problem I have with the optimizer comes down to the
>> fact that the data is loaded (and ordered on disk) by
>> State/County FIPS codes, and then queried by zip code
>> or by city name. Like this:
>>
>>   Alabama  36101  [hundreds of pages with zips in 36***]
>>   Alaska   99686  [hundreds of pages with zips in 9****]
>>   Arizona  85701  [hundreds of pages with zips in 855**]
>>
>> Note that the zip codes are *NOT* sequential.
>
> Again, read "One Big Trend..." and let me know what you think. I think it
> describes exactly the problem that we see.
>
> For now, the solution that works for me is to seriously up the value of
> "targrows" in analyze.c. It makes it take longer, and while the stats are
> not "correct" because they are not designed to detect these sorts of
> patterns, a larger sample allows them to be "less wrong" enough to give
> a better hint to the planner.
>
> ---------------------------(end of broadcast)---------------------------
> TIP 7: don't forget to increase your free space map settings

	Regards,
		Oleg
_____________________________________________________________
Oleg Bartunov, sci.researcher, hostmaster of AstroNet,
Sternberg Astronomical Institute, Moscow University (Russia)
Internet: oleg@sai.msu.su, http://www.sai.msu.su/~megera/
phone: +007(095)939-16-83, +007(095)939-23-83
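[Editor's note: Oleg does not spell out the coordinate transformation he uses
(he explicitly says it differs from the SDSS HTM scheme he links to). The
general idea of mapping 2-D coordinates to integers while preserving object
closeness can be sketched with a Z-order (Morton) curve; the function names,
bit widths, and the degree-based quantization below are illustrative
assumptions, not the SAI or HTM code.]

```python
def interleave_bits(x: int, y: int, bits: int = 16) -> int:
    """Interleave the low `bits` bits of x and y into one Morton code.

    Bits of x land in even positions, bits of y in odd positions, so
    points close in (x, y) tend to get numerically close codes.
    """
    code = 0
    for i in range(bits):
        code |= ((x >> i) & 1) << (2 * i)
        code |= ((y >> i) & 1) << (2 * i + 1)
    return code


def morton_key(ra: float, dec: float, bits: int = 16) -> int:
    """Quantize sky coordinates (degrees) onto a grid, then interleave.

    Illustrative only: real sky-indexing schemes (HTM, and others)
    handle the sphere's geometry properly instead of a flat grid.
    """
    scale = (1 << bits) - 1
    ix = int((ra % 360.0) / 360.0 * scale)   # 0 .. scale
    iy = int((dec + 90.0) / 180.0 * scale)   # 0 .. scale
    return interleave_bits(ix, iy, bits)
```

Because nearby objects receive nearby integer keys, an ordinary B-tree index
on the key clusters neighbours together on disk, which is what makes such a
transformation useful for very large catalogs.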
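[Editor's note: the claim that a larger ANALYZE sample is "less wrong" on
clustered data can be illustrated with a toy simulation. The table layout,
row counts, and sample sizes below are made up for illustration; this is not
PostgreSQL's estimator.]

```python
import random

random.seed(42)  # deterministic for the example

# A "table" of 500,000 rows clustered on disk: 500 runs of 1,000
# identical values, mimicking data loaded in FIPS-code order.
table = [i // 1000 for i in range(500_000)]

# True selectivity of the predicate "value < 50": 50 runs out of 500.
true_frac = sum(1 for v in table if v < 50) / len(table)  # 0.1


def estimate(sample_size: int) -> float:
    """Estimate the predicate's selectivity from a uniform random sample."""
    sample = random.sample(table, sample_size)
    return sum(1 for v in sample if v < 50) / sample_size


small = estimate(300)     # a few hundred rows: noisy estimate
large = estimate(30_000)  # 100x more rows: much tighter estimate
```

The standard error of the estimate shrinks with the square root of the sample
size, so the 30,000-row sample lands within a fraction of a percentage point
of the true 10% selectivity, while the 300-row sample can miss by several
points, which is the "better hint to the planner" the mail describes.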