Thread: Re: [GENERAL] BIG Data and Perl

Re: [GENERAL] BIG Data and Perl

From: Lincoln Yeoh

At 09:52 AM 15-10-1999 -0500, Andy Lewis wrote:
>I've got a fairly good-sized database that has around 50,000 records in
>one table.
>
>It starts off and processes the first 300-400 rows fast and then gets
>slower over time and eventually just quits. It'll run for about 4-6 hours
>before it quits.
>
>Any idea what may be going on here?

Maybe you're running out of memory. Your perl script may be reading too
much into memory.

When using the perl DBI module, I get the impression that the perl script
reads in all the results when you do
$cursor->execute

I don't know if there are any ways around this. It can be a bit
inconvenient if the result is large ;).
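
One thing that might be worth trying, assuming PostgreSQL and the DBD::Pg
driver, is to declare a server-side cursor and FETCH the rows in batches, so
only a chunk of the result sits in the Perl process at any one time. A rough
sketch (the table, columns and connection details below are made-up
placeholders):

  use strict;
  use DBI;

  # Connection details are placeholders.
  my $dbh = DBI->connect("dbi:Pg:dbname=mydb", "someuser", "somepass",
                         { RaiseError => 1, AutoCommit => 0 });

  # With AutoCommit off we're inside a transaction, which PostgreSQL
  # requires for DECLARE CURSOR.
  $dbh->do("DECLARE big_cur CURSOR FOR SELECT id, name FROM big_table");

  my $sth = $dbh->prepare("FETCH 500 FROM big_cur");
  while (1) {
      $sth->execute;
      my $got = 0;
      while (my ($id, $name) = $sth->fetchrow_array) {
          $got++;
          # process one row here
      }
      last unless $got;    # FETCH returned no rows, so we're done
  }

  $dbh->do("CLOSE big_cur");
  $dbh->commit;
  $dbh->disconnect;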

Cheerio,

Link.


Re: [GENERAL] BIG Data and Perl

From: Charles Tassell

This is slightly unrelated (well, maybe more than slightly), but what is the
advantage of using cursors over normal SELECT statements?  I know from
experience that just using an execute("SELECT...") and fetchrow_array
doesn't go wild with memory usage, as long as you remember to close your
statement handles.
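
Concretely, the sort of thing I mean looks roughly like this (DBD::Pg
assumed; the table, columns and connection details are made up):

  use strict;
  use DBI;

  # Connection and table names are placeholders.
  my $dbh = DBI->connect("dbi:Pg:dbname=mydb", "someuser", "somepass",
                         { RaiseError => 1 });

  my $sth = $dbh->prepare("SELECT id, name FROM big_table");
  $sth->execute;

  # fetchrow_array hands back one row per call, although the driver may
  # still be holding the whole result set on the client side after execute.
  while (my ($id, $name) = $sth->fetchrow_array) {
      # process one row here
  }

  $sth->finish;     # free the statement handle's result set
  $dbh->disconnect;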

At 11:48 PM 10/17/99, Lincoln Yeoh wrote:
>At 09:52 AM 15-10-1999 -0500, Andy Lewis wrote:
>>I've got a fairly good-sized database that has around 50,000 records in
>>one table.
>>
>>It starts off and processes the first 300-400 rows fast and then gets
>>slower over time and eventually just quits. It'll run for about 4-6 hours
>>before it quits.
>>
>>Any idea what may be going on here?
>
>Maybe you're running out of memory. Your perl script may be reading too
>much into memory.
>
>When using the perl DBI module, I get the impression that the perl script
>reads in all the results when you do
>$cursor->execute
>
>I don't know if there are any ways around this. It can be a bit
>inconvenient if the result is large ;).
>
>Cheerio,
>
>Link.
>
>


Re: [GENERAL] BIG Data and Perl

From: Andy Lewis

I've identified the problem.

It's actually a regex that I wrote; I'm in the process of rewriting it.

Thanks.


Andy

On Mon, 18 Oct 1999, Lincoln Yeoh wrote:

> At 09:52 AM 15-10-1999 -0500, Andy Lewis wrote:
> >I've got a fairly good-sized database that has around 50,000 records in
> >one table.
> >
> >It starts off and processes the first 300-400 rows fast and then gets
> >slower over time and eventually just quits. It'll run for about 4-6 hours
> >before it quits.
> >
> >Any idea what may be going on here?
>
> Maybe you're running out of memory. Your perl script may be reading too
> much into memory.
>
> When using the perl DBI module, I get the impression that the perl script
> reads in all the results when you do
> $cursor->execute
>
> I don't know if there are any ways around this. It can be a bit
> inconvenient if the result is large ;).
>
> Cheerio,
>
> Link.
>
>