Re: Splitting one big table into smaller ones - Mailing list pgsql-general

From Ed Loehr
Subject Re: Splitting one big table into smaller ones
Date
Msg-id 38EE1239.BFAEA600@austin.rr.com
In response to Splitting one big table into smaller ones  (Guillaume Perréal <perreal@lyon.cemagref.fr>)
List pgsql-general
Guillaume Perréal wrote:
>
> I've got a big table of measures like that:
>
> CREATE TABLE measures (
>     stationCode varchar(8),
>     when datetime,
>     value float,
>     quality char,
>     PRIMARY KEY (stationCode, when)
> );
>
> Each station (identified by stationCode) could have up to 10**6 measures. So I
> want to split it into smaller tables to increase performance:
>
> CREATE TABLE measures<1st stationCode> (
>     when datetime,
>     value float,
>     quality char,
>     PRIMARY KEY (when)
> );
> CREATE TABLE measures<2nd stationCode> (
>     ... and so on.
>
> Is there a way to handle that using SQL and PL/pgSQL languages?

Pretty straight-forward to do that with perl/DBI, or even bash & friends.  But I recall that PL/pgSQL has major
problems with CREATE TABLE and other DDL statements.
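
Something like the following is what such a script would generate and feed to psql for each
station (an untested sketch; 'LYON0001' is just a made-up station code, and note that "when"
has to be double-quoted because it's a reserved word):

CREATE TABLE measures_LYON0001 (
    "when"  datetime,
    value   float,
    quality char,
    PRIMARY KEY ("when")
);

-- copy that station's rows over from the big table
INSERT INTO measures_LYON0001
    SELECT "when", value, quality
    FROM measures
    WHERE stationCode = 'LYON0001';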

BTW, I suspect your easiest significant performance improvement would come from adding an integer primary key for
stationCode rather than a varchar key, e.g.,

CREATE TABLE stations (
    id serial PRIMARY KEY,   -- or just integer
    code varchar(8)          -- the old stationCode
    -- ... other station attributes
);

CREATE TABLE measures (
    stationId integer,
    "when"    datetime,      -- "when" is a reserved word, so it needs quotes
    value     float,
    quality   char,
    PRIMARY KEY (stationId, "when")
);
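
In case it helps, the conversion might look something like this (again untested; I'm assuming
you rename the old table to measures_old first, and 'LYON0001' is still a made-up code):

-- load the stations table from the existing data
INSERT INTO stations (code)
    SELECT DISTINCT stationCode FROM measures_old;

-- then rewrite the measures rows against the integer key
INSERT INTO measures (stationId, "when", value, quality)
    SELECT s.id, m."when", m.value, m.quality
    FROM measures_old m, stations s
    WHERE m.stationCode = s.code;

-- queries then go through the small stations table
SELECT m."when", m.value, m.quality
FROM measures m, stations s
WHERE s.code = 'LYON0001'
  AND m.stationId = s.id;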

If you try this, I'd be curious to hear the results.  Your approach is surely faster, though by how much I don't know.
It just looks like a bit of a headache to manage, that's all.

Regards,
Ed Loehr
