On Wed, 2010-07-28 at 11:09 -0600, Bill Thoen wrote:
> I'm building a national database of agricultural information, and one
> of the layers is a bit more than a gigabyte per state. That's 1-2
> million records per state, with a multipolygon geometry, and I've got
> about 40 states' worth of data. I'm trying to store everything in a
> single PG table. What I'm concerned about is: if I combine every state
> into one big table, will performance be terrible, even with
> indexes? On the other hand, if I store the data in several smaller
> files, then if a user zooms in on a multi-state region, I've got to
> build or find a much more complicated way to query multiple files.
>
> So I'm wondering, should I be concerned about building a single
> national-size table (possibly 80-100 GB) for all these records, or
> should I keep the files smaller and hope there's something like
> ogrtindex out there for PG tables? What do you all recommend in this
> case?
80-100 GB isn't that much. However, it may be worth looking into
partitioning by state.
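
For example, here is a minimal sketch of inheritance-based partitioning
with constraint exclusion (table and column names such as ag_parcels and
state are illustrative assumptions, not taken from your schema):

-- Parent table: holds no rows itself, just defines the shape.
CREATE TABLE ag_parcels (
    gid    serial,
    state  char(2) NOT NULL,
    geom   geometry          -- the multipolygon geometries
);

-- One child table per state, each with a CHECK constraint so the
-- planner can skip partitions that cannot match the query.
CREATE TABLE ag_parcels_co (CHECK (state = 'CO')) INHERITS (ag_parcels);
CREATE TABLE ag_parcels_ne (CHECK (state = 'NE')) INHERITS (ag_parcels);
-- ... and so on for the remaining states.

-- Each child gets its own spatial index.
CREATE INDEX ag_parcels_co_geom_idx ON ag_parcels_co USING gist (geom);
CREATE INDEX ag_parcels_ne_geom_idx ON ag_parcels_ne USING gist (geom);

-- constraint_exclusion defaults to 'partition' on 8.4 and later.
SET constraint_exclusion = partition;

-- A query against the parent that filters on state only touches the
-- matching partitions; add your usual spatial predicate on geom too.
SELECT count(*) FROM ag_parcels WHERE state IN ('CO', 'NE');

You can load each state straight into its child table (one shp2pgsql or
ogr2ogr run per state), or put an INSERT trigger on the parent to route
rows. Keep in mind the planner only prunes partitions when the query
actually filters on state; purely spatial queries will still scan every
child's GiST index.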
Sincerely,
Joshua D. Drake
--
PostgreSQL.org Major Contributor
Command Prompt, Inc: http://www.commandprompt.com/ - 509.416.6579
Consulting, Training, Support, Custom Development, Engineering
http://twitter.com/cmdpromptinc | http://identi.ca/commandprompt