Seeking Suggestions for Best Practices: Archiving and Migrating Historical Data in PostgreSQL - Mailing list pgsql-admin

From Motog Plus
Subject Seeking Suggestions for Best Practices: Archiving and Migrating Historical Data in PostgreSQL
Date
Msg-id CAL5GnivMgBgRdY9YTLmAQKQa=TQVTRwghiGovK6Q6XxScdGOzg@mail.gmail.com
Responses Re: Seeking Suggestions for Best Practices: Archiving and Migrating Historical Data in PostgreSQL
List pgsql-admin
Hi Team,

We are currently planning a data archival initiative for our production PostgreSQL databases and would appreciate suggestions or insights from the community regarding best practices and proven approaches.

**Scenario:**
- We have a few large tables (several hundred million rows) where we want to archive historical data (e.g., older than 1 year).
- The archived data should be moved to a separate PostgreSQL database (on the same or a different server).
- Our goals are: efficient data movement, minimal downtime, and safe deletion from the source after successful archival.

- PostgreSQL version: 15.12
- Both source and target databases are PostgreSQL.
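For concreteness, the batched move-and-delete pattern we have been sketching looks roughly like the following. All object names are illustrative: `events` is the source table, `created_at` its timestamp column, and `events_archive` is assumed to be a foreign table defined via `postgres_fdw` pointing at the archive database.

```sql
-- Move one batch of rows older than a year, deleting from the source
-- only the rows captured by the same statement.
-- Table, column, and batch-size values are placeholders.
WITH batch AS (
    SELECT ctid
    FROM events
    WHERE created_at < now() - interval '1 year'
    LIMIT 10000
),
moved AS (
    DELETE FROM events e
    USING batch b
    WHERE e.ctid = b.ctid
    RETURNING e.*
)
INSERT INTO events_archive
SELECT * FROM moved;
```

The idea is to run this in a loop until the DELETE affects zero rows, keeping each transaction small. We would welcome corrections if this is not a sound pattern.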

We explored using `COPY TO` and `COPY FROM` with CSV files uploaded to SharePoint or a similar storage system. However, our infrastructure team raised concerns about the computational load of processing large CSV files and the security implications of the file transfers.
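What we prototyped so far looks roughly like this (the file path and table names are illustrative only):

```sql
-- On the source database: export rows older than one year to CSV.
COPY (SELECT * FROM events
      WHERE created_at < now() - interval '1 year')
TO '/var/lib/postgresql/archive/events_old.csv' WITH (FORMAT csv, HEADER);

-- On the target database: load the same file into the archive table.
COPY events_archive
FROM '/var/lib/postgresql/archive/events_old.csv' WITH (FORMAT csv, HEADER);
```

One variant we are weighing is streaming `COPY ... TO STDOUT` on the source directly into `COPY ... FROM STDIN` on the target via piped `psql` sessions, which would avoid landing the CSV file on shared storage at all.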

We’d like to understand:
- What approaches have worked well for you in practice?
- Are there specific tools or strategies you’d recommend for ongoing archival?
- Any performance or consistency issues we should watch out for?

Your insights or any relevant documentation/pointers would be immensely helpful.

Thanks in advance for your guidance!

Best regards,  
Ramzy
