On 5/20/06, Tom Lane <tgl@sss.pgh.pa.us> wrote:
> "Brendan Jurd" <direvus@gmail.com> writes:
> > I noticed a peculiarity in the default postgres aggregate functions.
> > min(), max() and avg() support interval as an input type, but stddev()
> > and variance() do not.
>
> > Is there a rationale behind this, or is it just something that was never
> > implemented?
>
> Is it sensible to calculate standard deviation on intervals? How would
> you handle the multiple components? I mean, you could certainly define
> *something*, but how sane/useful would the result be?
Strictly speaking, there is nothing problematic about intervals here.
The standard deviation of an interval can certainly be useful in
practice, and I can give plenty of examples. Say you want to know the
statistical parameters of a semi-regular periodic process: the average
distance in time between maxima of some value, and the stddev of that
quasi-period -- why not?
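As a minimal sketch of the workaround available today, assuming a
hypothetical table "peaks" with an interval column "quasiperiod"
holding the gaps between successive maxima:

    -- avg() accepts interval directly; since stddev(interval) is
    -- missing, convert the interval to seconds first.
    SELECT avg(quasiperiod)                        AS avg_period,
           stddev(extract(epoch FROM quasiperiod)) AS stddev_seconds
    FROM peaks;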
Regards,
Ivan Zolotukhin