On further investigation it turns out we have a serious data issue: the small
table is full of 'UNKNOWN' tags, so the join cannot associate the data
correctly and ends up producing 2+ billion rows.
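For what it's worth, the blow-up is easy to reproduce in miniature. This is just an illustrative sketch (table and column names are made up, and it uses SQLite via Python for convenience rather than PostgreSQL): when a tag like 'UNKNOWN' is duplicated on both sides of a join, the join emits the cross product of the matching rows.

```python
import sqlite3

# Hypothetical tables: if a join key appears m times in one table and
# n times in the other, the join produces m * n output rows for that key.
conn = sqlite3.connect(":memory:")
cur = conn.cursor()
cur.execute("CREATE TABLE small_t (tag TEXT)")
cur.execute("CREATE TABLE large_t (tag TEXT)")
cur.executemany("INSERT INTO small_t VALUES (?)", [("UNKNOWN",)] * 1000)
cur.executemany("INSERT INTO large_t VALUES (?)", [("UNKNOWN",)] * 1000)

rows = cur.execute(
    "SELECT count(*) FROM small_t JOIN large_t USING (tag)"
).fetchone()[0]
print(rows)  # 1000 * 1000 = 1,000,000 rows from only 2,000 input rows
```

Scaled up to a multi-million-row fact table, a few thousand duplicated 'UNKNOWN' tags are more than enough to reach billions of output rows.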
Thanks everyone for your help
On May 16, 2008, at 1:38 AM, Simon Riggs wrote:
>
> On Fri, 2008-05-16 at 00:31 -0600, kevin kempter wrote:
>
>> I'm running the join shown below and it takes > 10 hours and
>> eventually runs out of disk space on a 1.4TB file system
>
> Well, running in 10 hours doesn't mean there's a software problem, nor
> does running out of disk space.
>
> Please crunch some numbers before you ask, such as how much disk space
> was used by the query, how big you'd expect it to be, etc., plus provide
> information such as what the primary key of the large table is and
> what your release level is, etc.
>
> Are you sure you want to retrieve an estimated 3 billion rows? Can you
> cope if that estimate is wrong and the true figure is much higher? Do
> you think the estimate is realistic?
>
> --
> Simon Riggs www.2ndQuadrant.com
> PostgreSQL Training, Services and Support
>
>
>
> --
> Sent via pgsql-performance mailing list (pgsql-performance@postgresql.org)
> To make changes to your subscription:
> http://www.postgresql.org/mailpref/pgsql-performance