Re: [GENERAL] BIG Data and Perl - Mailing list pgsql-general

From Lincoln Yeoh
Subject Re: [GENERAL] BIG Data and Perl
Date
Msg-id 3.0.5.32.19991018104801.008d5b10@pop.mecomb.po.my
Responses Re: [GENERAL] BIG Data and Perl
Re: [GENERAL] BIG Data and Perl
List pgsql-general
At 09:52 AM 15-10-1999 -0500, Andy Lewis wrote:
>I've got a fairly good size database that has in one table around 50,000
>records in it.
>
>It starts off and processes the first 300-400 rows fast and then gets
>slower in time and eventually just quits. It'll run for about 4-6 hours
>before it quits.
>
>Any idea what may be going on here?

Maybe you're running out of memory: your Perl script may be pulling too
many rows into memory at once.

When using the Perl DBI module, I get the impression that the script
reads in all the results when you do
$cursor->execute

I don't know if there are any ways around this. It can be a bit
inconvenient if the result set is large ;).
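[Editor's note: one possible workaround, not from the original thread, is to
use a server-side cursor so PostgreSQL hands rows to the client in batches
instead of buffering the whole result set in the Perl process. A minimal
sketch with DBD::Pg follows; the table name ("bigtable"), column names, and
connection parameters are hypothetical.]

#!/usr/bin/perl
# Sketch: stream a large result set through a server-side cursor.
# A plain $sth->execute on the SELECT itself would pull every row
# into client memory before fetchrow_array ever runs.
use strict;
use DBI;

# Connection details are placeholders; cursors require a transaction,
# hence AutoCommit => 0.
my $dbh = DBI->connect("dbi:Pg:dbname=mydb", "", "",
                       { RaiseError => 1, AutoCommit => 0 });

$dbh->do("DECLARE csr CURSOR FOR SELECT id, name FROM bigtable");

while (1) {
    # Only 1000 rows at a time are held in client memory.
    my $sth = $dbh->prepare("FETCH 1000 FROM csr");
    $sth->execute;
    last if $sth->rows == 0;       # cursor exhausted
    while (my ($id, $name) = $sth->fetchrow_array) {
        # ... process one row here ...
    }
}

$dbh->do("CLOSE csr");
$dbh->commit;
$dbh->disconnect;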

Cheerio,

Link.

