Re: large table support 32,000,000 rows - Mailing list pgsql-admin

From Zhang, Anna
Subject Re: large table support 32,000,000 rows
Date
Msg-id 5511D658682A7740BA295CCF1E1233A635A881@vsvapostal2.bkup3
Whole thread Raw
In response to large table support 32,000,000 rows  (Christopher Smith <christopherl_smith@yahoo.com>)
List pgsql-admin
I have several tables with over 10,000,000 rows; the biggest one has 70,000,000 rows. Each table has several indexes, and almost all columns are varchar2. In my experience, with many indexes on a large table, data insertion is a pain. In my case, I have 30,000 rows to insert into a table every day, and it takes hours per table. If I drop the indexes, insertion speeds up, but recreating those indexes takes 7 hours. Usually querying the data is not the problem, but you must consider performance if, like me, you have to insert and update data frequently.
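
The drop/bulk-load/recreate pattern described above can be sketched roughly as follows. This is a hedged illustration using current PostgreSQL syntax; the table, column, index names, and file path are hypothetical, not from the original post:

```sql
-- Drop the indexes so the bulk load doesn't maintain them row by row.
-- (Names below are illustrative.)
DROP INDEX big_table_col1_idx;
DROP INDEX big_table_col2_idx;

-- Bulk load; COPY is typically much faster than row-by-row INSERTs.
COPY big_table FROM '/path/to/daily_load.csv' WITH (FORMAT csv);

-- Recreate each index in a single pass over the data.
CREATE INDEX big_table_col1_idx ON big_table (col1);
CREATE INDEX big_table_col2_idx ON big_table (col2);
```

Whether this wins depends on the ratio of new rows to table size: as Anna notes, when the table is large enough that recreating the indexes takes many hours, dropping them may not pay off for a relatively small daily load.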
 
Hope it helps!
 
Anna Zhang
-----Original Message-----
From: Christopher Smith [mailto:christopherl_smith@yahoo.com]
Sent: Wednesday, March 20, 2002 5:26 PM
To: pgsql-admin@postgresql.org
Subject: [ADMIN] large table support 32,000,000 rows

I have a set of data that will compose a table with 32 million rows. I currently run PostgreSQL with tables as large as 750,000 rows.

Does anyone have experience with tables this large? In addition, I have been reading about moving PostgreSQL tables to another hard drive; can anyone advise me?
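
In current PostgreSQL (8.0 and later), the supported way to place a table on a different drive is a tablespace. A minimal sketch, assuming a hypothetical mount point `/mnt/disk2/pgdata` (which must exist and be owned by the postgres OS user) and a hypothetical table `big_table`:

```sql
-- Create a tablespace backed by a directory on the second drive.
CREATE TABLESPACE fastdisk LOCATION '/mnt/disk2/pgdata';

-- Move an existing table there (rewrites the table and takes a lock,
-- so plan for downtime on a large table).
ALTER TABLE big_table SET TABLESPACE fastdisk;
```

New tables can also be created directly in the tablespace with `CREATE TABLE ... TABLESPACE fastdisk`.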

Thanks

