Re: Strategies/Best Practises Handling Large Tables - Mailing list pgsql-general

From Igor Romanchenko
Subject Re: Strategies/Best Practises Handling Large Tables
Date
Msg-id CAP95Gq=ypN-mapDnYBnXeGfJUMgXtyonfLQ8j=yC3r_tGNiPDQ@mail.gmail.com
In response to Re: Strategies/Best Practises Handling Large Tables  (Chitra Creta <chitracreta@gmail.com>)
List pgsql-general


On Thu, Nov 15, 2012 at 1:34 PM, Chitra Creta <chitracreta@gmail.com> wrote:
Thanks for your example, Chris. I will look into it as a long-term solution.

Partitioning tables as a strategy worked very well indeed. This will be my short/medium term solution. 

Another strategy that I would like to evaluate as a short/medium term solution is archiving old records in a table before purging them.

I am aware that Oracle provides a tool that exports records to a file or an archive table before purging them, as well as a tool to re-import those records.

Does PostgreSQL have similar tools to export to a file and re-import? 

If PostgreSQL does not have such a tool, does anyone have ideas on what file format would be ideal for easy re-importing into a PostgreSQL table (e.g. a text file with a header row of column names and one record per line)?

Thank you for your ideas.

PostgreSQL has COPY ... TO to export records to a file and COPY ... FROM to re-import them ( http://wiki.postgresql.org/wiki/COPY ).
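As a minimal sketch of the archive-then-purge flow (the table, column, and file names below are hypothetical placeholders; note that COPY with a server-side path needs appropriate privileges, while psql's \copy variant reads and writes files on the client instead):

    -- Export rows older than a cutoff to a CSV file with a header row
    -- of column names, which re-imports cleanly later.
    -- 'big_table', 'created_at', and the paths are placeholders.
    COPY (SELECT * FROM big_table WHERE created_at < '2012-01-01')
      TO '/var/backups/big_table_2011.csv' WITH (FORMAT csv, HEADER true);

    -- Purge the exported rows once the file is safely written.
    DELETE FROM big_table WHERE created_at < '2012-01-01';

    -- Later, re-import into a table with the same column layout.
    COPY big_table FROM '/var/backups/big_table_2011.csv'
      WITH (FORMAT csv, HEADER true);

For the archive-table variant, a writable CTE (PostgreSQL 9.1+) can move and delete the rows in a single statement ('big_table_archive' is likewise a placeholder):

    WITH moved AS (
      DELETE FROM big_table
      WHERE created_at < '2012-01-01'
      RETURNING *
    )
    INSERT INTO big_table_archive SELECT * FROM moved;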
