I'll tell you a little about what I need. During the day, records are written to the main database; at the end of the day they are consolidated (accounting closings) and the results are saved back to the database. To keep the main database from growing without bound, it will only retain between 3 months and 1 year of data. For this reason, the consolidated data must be transferred to another database, where it persists over time and can be consulted by other areas. (Today this is done by hand at the end of every day of the year.) The project therefore seeks to carry out this extraction of the consolidated data to another database automatically.
I was thinking of doing this with triggers, or with jobs that would let me carry out these actions. I also considered replicating only the consolidated tables to the new historical database server, but I have not yet settled on a method.
That's why I need to know whether there is a tool that would let me build this database.
I hope this clarifies the scope of the new historical database a little.
Thank you very much in advance. Regards,
Erik R. Serrano Saavedra
Database Administrator
I would first recommend looking into partitioning for managing data retention like this. As Ron says, you'll want to look into the performance implications, but it is the most efficient way to remove old data from PostgreSQL and is typically worth the overhead. Otherwise you're dealing with potentially expensive DELETE operations and the bloat they leave behind, versus simply detaching and dropping a partition.
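As a minimal sketch of what that looks like with declarative partitioning (PostgreSQL 10+); the daily_closings table and its columns are placeholder names for illustration, not anything from your schema:

    -- Parent table, range-partitioned by the closing date
    CREATE TABLE daily_closings (
        closing_date date    NOT NULL,
        account_id   bigint  NOT NULL,
        balance      numeric NOT NULL
    ) PARTITION BY RANGE (closing_date);

    -- One partition per month, created ahead of time or by a job
    CREATE TABLE daily_closings_2024_01
        PARTITION OF daily_closings
        FOR VALUES FROM ('2024-01-01') TO ('2024-02-01');

    -- Retiring old data is a cheap metadata operation, not a DELETE
    ALTER TABLE daily_closings DETACH PARTITION daily_closings_2024_01;
    -- ...archive the detached table elsewhere, then:
    DROP TABLE daily_closings_2024_01;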
From there, you can decide how to process the detached table to move it to another database. The simplest method would be to pg_dump it from the old database, drop it, and then restore it into the "archive" database.
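A minimal sketch of that flow, using the detached partition from above; the host and database names are placeholders:

    # Dump just the detached partition from the main server
    pg_dump -h main-db -d production -t daily_closings_2024_01 \
        -Fc -f daily_closings_2024_01.dump

    # Restore it into the archive database
    pg_restore -h archive-db -d archive daily_closings_2024_01.dump

    # Once the restore is verified, drop it from the main database
    psql -h main-db -d production -c 'DROP TABLE daily_closings_2024_01;'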
For time-based partitioning, pg_partman is a commonly used tool. It also includes features to help manage retention, as well as to reliably dump out old tables so they can be archived or used elsewhere.
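A minimal sketch of that setup, assuming pg_partman 4.x installed in a partman schema and the placeholder table from above (check the version you install, since the create_parent signature has changed across major releases):

    -- Register the parent table for daily native partitions
    SELECT partman.create_parent(
        p_parent_table => 'public.daily_closings',
        p_control      => 'closing_date',
        p_type         => 'native',
        p_interval     => 'daily'
    );

    -- Keep one year of partitions; detach rather than drop anything older
    UPDATE partman.part_config
    SET    retention            = '1 year',
           retention_keep_table = true
    WHERE  parent_table = 'public.daily_closings';

    -- Run periodically (cron, pg_cron, or partman's background worker)
    SELECT partman.run_maintenance();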