On 2/28/22 18:54, Keith Fiske wrote:
On 2/24/22 12:37 PM, Joseph Hammerman wrote:
Hi postgresql-admins,
Has anyone put any thought or effort into figuring out how to measure the total volume of data in a database against how much of it is hot? I'm looking for some automatable approaches. Similarly, is there a way to measure rarely queried columns, or unused functions & triggers?
For monitoring the data itself, I'm not aware of anything built in beyond the statistics views Ron shared. If this is something you really need to track, consider adding a "changed_at" column to the tables that matter and a trigger that automatically updates that timestamp whenever a row is modified. I'm not sure how to track column-level usage.
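A minimal sketch of that approach, assuming a hypothetical table named my_table (any name and schema will do):

```sql
-- Add a last-modified timestamp to the table you want to track.
ALTER TABLE my_table ADD COLUMN changed_at timestamptz;

-- Trigger function: stamp the row with the current time on every write.
CREATE OR REPLACE FUNCTION set_changed_at() RETURNS trigger AS $$
BEGIN
    NEW.changed_at := now();
    RETURN NEW;
END;
$$ LANGUAGE plpgsql;

CREATE TRIGGER trg_set_changed_at
    BEFORE INSERT OR UPDATE ON my_table
    FOR EACH ROW EXECUTE FUNCTION set_changed_at();
```

With that in place, "how much data is hot" becomes an ordinary query, e.g. count rows WHERE changed_at > now() - interval '30 days'. (On PostgreSQL versions before 11, write EXECUTE PROCEDURE instead of EXECUTE FUNCTION.)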
--
One way is proper table design: a TIMESTAMP column that records the time of the last modification. Another way is a table trigger that does the time accounting for you. You could also implement a "clock" algorithm: set a "modified" column to 0, let any application write flip it to 1, and have the clock's "second hand" periodically sweep through, demoting untouched rows as "old" before resetting the flags. That is roughly analogous to how Unix and Linux paging systems track referenced pages.
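A rough sketch of that clock idea, again using a hypothetical my_table with assumed column names (modified, old):

```sql
-- Flag columns: "modified" is flipped by writes, "old" by the sweep.
ALTER TABLE my_table ADD COLUMN modified smallint DEFAULT 0;
ALTER TABLE my_table ADD COLUMN old smallint DEFAULT 0;

-- Trigger: any application write marks the row as recently touched.
CREATE OR REPLACE FUNCTION mark_modified() RETURNS trigger AS $$
BEGIN
    NEW.modified := 1;
    NEW.old := 0;
    RETURN NEW;
END;
$$ LANGUAGE plpgsql;

CREATE TRIGGER trg_mark_modified
    BEFORE INSERT OR UPDATE ON my_table
    FOR EACH ROW EXECUTE FUNCTION mark_modified();

-- The "second hand": run periodically (e.g. from cron).
-- Rows untouched since the last sweep are demoted to "old";
-- touched rows have their flag reset for the next interval.
UPDATE my_table SET old = 1      WHERE modified = 0 AND old = 0;
UPDATE my_table SET modified = 0 WHERE modified = 1;
```

Note that the sweep itself rewrites rows (and would fire the trigger unless you guard against it, e.g. with a WHEN clause or session flag), so on large tables the timestamp-column approach is usually cheaper.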
--
Mladen Gogala
Database Consultant
Tel: (347) 321-1217
https://dbwhisperer.wordpress.com