Thread: Large database

Large database

From: Alexander Antonakakis
Date: Wed, 10 Nov 2004 16:10:43 +0200
I would like to ask the more experienced PostgreSQL users a couple of
questions about a database I manage that holds a lot of data. A lot of
data means something like 15,000,000 rows in a table. I will try to
describe the tables and what I will have to do with them :)
There is a table that has product data in the form of
Table product:
product_id varchar(8),
product_name text

and

the product_actions table:

product_id varchar(8),
flow char(1),
who int,
where int,
value float.
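
In DDL terms the two tables look roughly like this (just a sketch; note
that "where" is a reserved word in SQL, so it has to be quoted when used
as a column name):

    CREATE TABLE product (
        product_id   varchar(8),
        product_name text
    );

    CREATE TABLE product_actions (
        product_id varchar(8),
        flow       char(1),
        who        int,
        "where"    int,    -- "where" is reserved, so it must be quoted
        value      float
    );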

I will have to run SQL queries of the form "select value from
product_actions where who='someone' and where='somewhere'", and maybe
also do some calculations on these results. I have already created some
indexes on these tables and a view that joins the two of them, but I
would like to ask you people whether anyone is using such a big database
and how I can speed things up as much as possible. These product_actions
tables exist for each year from 1988 to 2003, so this means a lot of
data...
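
Just to illustrate what I mean, the indexes and the view are along these
lines (only a sketch, the real names are different):

    -- composite index matching queries that filter on who and "where"
    CREATE INDEX product_actions_who_where_idx
        ON product_actions (who, "where");

    -- view joining the actions to the product names
    CREATE VIEW product_actions_v AS
        SELECT p.product_id, p.product_name, pa.flow,
               pa.who, pa."where", pa.value
          FROM product_actions pa
          JOIN product p ON p.product_id = pa.product_id;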

Thanks in Advance



Re: Large database

From: Michael Fuhr
Date:
On Wed, Nov 10, 2004 at 04:10:43PM +0200, Alexander Antonakakis wrote:

> I will have to run SQL queries of the form "select value from
> product_actions where who='someone' and where='somewhere'", and maybe
> also do some calculations on these results. I have already created some
> indexes on these tables and a view that joins the two of them, but I
> would like to ask you people whether anyone is using such a big database
> and how I can speed things up as much as possible.

Can you give us an example of a query you'd like to speed up?  Please
post the EXPLAIN ANALYZE output for the query as well so we can see
what the planner is doing.
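
For example, something like this (with your real values in place of the
placeholders):

    EXPLAIN ANALYZE
    SELECT value
      FROM product_actions
     WHERE who = 42 AND "where" = 7;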

Have you tuned any settings in postgresql.conf?  The following page
has some tuning tips:

http://www.varlena.com/varlena/GeneralBits/Tidbits/perf.html
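
The settings most people end up raising from the conservative defaults
are along these lines (illustrative values only; the right numbers depend
on how much RAM the machine has):

    # postgresql.conf -- illustrative values, not recommendations
    shared_buffers = 10000          # shared buffer cache, in 8kB pages
    sort_mem = 8192                 # memory per sort, in kB
    effective_cache_size = 100000   # planner's estimate of the OS cache, in 8kB pages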

--
Michael Fuhr
http://www.fuhr.org/~mfuhr/