I'm not sure if this is what you intend to retrieve, but try
select distinct on (sample.type, nuclide)
       year as First_year, sample.type, avg(value), nuclide
from sample, measure
where sample.id = sampleid
group by sample.type, nuclide, year
order by sample.type, nuclide, year asc;
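(distinct on is PostgreSQL-specific: it keeps only the first row of each
(sample.type, nuclide) group, and since the order by sorts year ascending,
that first row is the earliest year, so the avg(value) you get is the
average for that first year.)

If you ever need something that also runs outside PostgreSQL, a subquery
on min(year) should do the same job. This is only an untested sketch, and
it assumes the column in measure really is called sampleid, as in the
where clause above:

    select f.type, f.nuclide, f.first_year, avg(m.value)
    from (
        -- earliest year with data for each type/nuclide combination
        select s.type, m.nuclide, min(s.year) as first_year
        from sample s
        join measure m on m.sampleid = s.id
        group by s.type, m.nuclide
    ) f
    join sample s on s.type = f.type and s.year = f.first_year
    join measure m on m.sampleid = s.id and m.nuclide = f.nuclide
    group by f.type, f.nuclide, f.first_year;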
Regards, Christoph
PS If this works, a short reply would be nice.
>
> I have a database for storing information on radioactive contamination in
> various samples. For a set of the samples, I want to dig out the first year
> for which I have information for each sample type and radionuclide, and
> the average activity for that year. The relevant parts of my database are
> two tables, sample and measure, which, a bit simplified, can be described as
>
> create table sample(
> id serial primary key,
> type char(),
> year integer
> )
>
> create table measure(
> id serial primary key,
> sampleid integer references sample(id),
> nuclide char(),
> value float
> )
>
>
> I tried to do a
> select year as First_year,sample.type,avg(value),nuclide
> from sample,measure
> where sample.id = sampleid
> group by sample.type,nuclide
> having year=min(year)
>
> But it told me that year has to be grouped to be used in this way, so I
> grouped by year as well, but then I got averages for every year, and I got
> the same result when I also put a min(year) in the select part of the
> statement...
>
> If I do a
> select min(year) as First_year,sample.type,nuclide
> from sample,measure
> where sample.id = sampleid
> group by sample.type,nuclide
>
> I get the first year for each sample type and nuclide, but then I need the
> average for each of those first years... I thought I should use having for
> that, but I have obviously overlooked or misunderstood something...
>
> Any clues?
>
> Morten
>
> --
> Morten Sickel
> Norwegian Radiation Protection Authority