Greetings-
The database for my current project has grown quite large: four tables with
17, 10,000, 3,000,000, and 5,000,000 records respectively. Working with the
data has therefore become rather cumbersome, as operations on the large-N
tables can take quite a while. Can anyone offer tips on boosting
performance? I've done the obvious, such as building indexes on the columns
used in searches and joins.
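For reference, what I've done so far looks roughly like the following (the
table and column names here are only illustrative, not my real schema); I've
also been running VACUUM ANALYZE and checking plans with EXPLAIN:

    -- hypothetical table/column names, shown only to illustrate the approach
    CREATE INDEX big_table_person_id_idx ON big_table (person_id);
    CREATE INDEX big_table_created_idx   ON big_table (created);

    -- refresh planner statistics so the indexes actually get considered
    VACUUM ANALYZE big_table;

    -- check that the plan uses an index scan rather than a sequential scan
    EXPLAIN SELECT * FROM big_table WHERE person_id = 42;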
The computer is a 1 GHz PIII (IBM NetVista) running Debian GNU/Linux (woody)
and PostgreSQL 7.1.3. It has 512 MB of RAM, and top shows that swap rarely
gets used, so one possibility is to let PostgreSQL keep more of its working
memory in RAM at once. I could also potentially buy more RAM for the
machine.
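On the RAM side, my guess is that the relevant knobs are shared_buffers and
sort_mem in postgresql.conf, though I'm not sure what sensible values would
be for a 512 MB machine. The numbers below are only a starting point I'm
considering, not something I've tested:

    # postgresql.conf -- untested guesses for a 512 MB machine
    shared_buffers = 8192    # 8192 x 8 kB pages = 64 MB of shared buffer cache
    sort_mem = 8192          # kB per sort/hash operation, so kept modest

    # a shared_buffers setting this large may also require raising the
    # kernel's shared memory limit, e.g.:
    #   echo 134217728 > /proc/sys/kernel/shmmax

Does that look like the right direction, or are there other settings I
should be looking at?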
Thanks for any advice.
----------------------------------------------------------------------
Andrew J Perrin - andrew_perrin@unc.edu - http://www.unc.edu/~aperrin
Assistant Professor of Sociology, U of North Carolina, Chapel Hill
269 Hamilton Hall, CB#3210, Chapel Hill, NC 27599-3210 USA