Re: PoC: history of recent vacuum/checkpoint runs (using new hooks) - Mailing list pgsql-hackers

From Tomas Vondra
Subject Re: PoC: history of recent vacuum/checkpoint runs (using new hooks)
Msg-id 25dce92f-43f4-42b3-8370-313e4be7796b@vondra.me
In response to Re: PoC: history of recent vacuum/checkpoint runs (using new hooks)  (Robert Treat <rob@xzilla.net>)
List pgsql-hackers
On 1/7/25 21:42, Robert Treat wrote:
> On Tue, Jan 7, 2025 at 10:44 AM Bertrand Drouvot
> <bertranddrouvot.pg@gmail.com> wrote:
>>
>> ...
>>
>> Another idea regarding the storage of those metrics: I think that one would
>> want to see "precise" data for recent metrics but would probably be fine with some
>> level of aggregation for historical ones. Something like being able to retrieve
>> "1 day of raw data" and say one year of data aggregated by day (average, maximum,
>> minimum, standard deviation and maybe some percentiles) could be fine too.
>>
> 
> While I'm sure some people are ok with it, I would say that most of
> the observability/metrics community has moved away from aggregated
> data storage towards raw time series data in tools like prometheus,
> tsdb, and timescale in order to avoid the problems that misleading /
> lossy / low-resolution data can create.
> 

That's how I see it too. My primary goal is to provide the raw data,
even if it covers only a limited amount of time, so that it can be
either queried directly, or ingested regularly into something like
prometheus.

I can imagine a more complicated system that later aggregates the data
into a lower resolution (e.g. per day). But that's not a complete
solution, because e.g. what if there are many relations that happen to
be vacuumed only once per day?
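FWIW the kind of day-level aggregation Bertrand described could be done
with a plain query on top of the raw data, so it doesn't necessarily
need to be built in. A rough sketch (the view and column names here are
made up for illustration; the actual names in the patch may differ):

```sql
-- Assumed raw-history view exposing one row per vacuum run, with a
-- start_time timestamp and a duration (interval or numeric) column.
SELECT date_trunc('day', start_time) AS day,
       count(*)                      AS runs,
       avg(duration)                 AS avg_duration,
       min(duration)                 AS min_duration,
       max(duration)                 AS max_duration,
       stddev(duration)              AS stddev_duration,
       percentile_cont(0.95) WITHIN GROUP (ORDER BY duration)
                                     AS p95_duration
FROM pg_stat_vacuum_history          -- hypothetical view name
GROUP BY 1
ORDER BY 1;
```

Of course, that only works as long as the raw data is still around,
which is exactly the retention problem being discussed.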


regards

-- 
Tomas Vondra



