Re: My Experiment of PG crash when dealing with huge amount of data - Mailing list pgsql-general

From 高健
Subject Re: My Experiment of PG crash when dealing with huge amount of data
Date
Msg-id CAL454F0wXbvXSmiV7qm0dvGtArbR0jp2yrgjMi1uzqm8AE0eig@mail.gmail.com
In response to Re: My Experiment of PG crash when dealing with huge amount of data  (Jeff Janes <jeff.janes@gmail.com>)
List pgsql-general
>To spare memory, you would want to use something like:

>insert into test01 select generate_series,
>repeat(chr(int4(random()*26)+65),1024) from
>generate_series(1,2457600);

Thanks a lot!

What I am worried about is this:
If the data grows rapidly, our customers may end up using too much memory. Is the ulimit command a good idea for PG?
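
One thing I am also considering, to keep the working set of each statement small, is loading the rows in batches instead of one big INSERT. This is only a rough sketch, and it assumes test01 has an integer column and a text column, as in my original INSERT:

DO $$
DECLARE
    batch_size  int := 100000;   -- rows per INSERT; adjust as needed
    total_rows  int := 2457600;
    batch_start int := 1;
BEGIN
    WHILE batch_start <= total_rows LOOP
        -- each INSERT only has to produce batch_size rows at a time
        INSERT INTO test01
        SELECT g, repeat(chr(int4(random()*26)+65), 1024)
        FROM generate_series(batch_start,
                             least(batch_start + batch_size - 1, total_rows)) AS g;
        batch_start := batch_start + batch_size;
    END LOOP;
END $$;

(A DO block cannot commit on its own, so the whole load is still one transaction; this only bounds the memory each INSERT needs, not the transaction size.)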

Best Regards



2013/9/1 Jeff Janes <jeff.janes@gmail.com>
On Fri, Aug 30, 2013 at 2:10 AM, 高健 <luckyjackgao@gmail.com> wrote:
>
>
> postgres=# insert into test01 values(generate_series(1,2457600),repeat(
> chr(int4(random()*26)+65),1024));

The construct "values (srf1,srf2)" will generate its entire result set
in memory up front, it will not "stream" its results to the insert
statement on the fly.

To spare memory, you would want to use something like:

insert into test01 select generate_series,
repeat(chr(int4(random()*26)+65),1024) from
generate_series(1,2457600);

Cheers,

Jeff
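
As a quick check after running the memory-friendly INSERT ... SELECT above, a query like the following (a small sketch; note that pg_total_relation_size also counts TOAST and index space) shows how many rows were loaded and how much space the table takes on disk:

SELECT count(*) AS rows_loaded,
       pg_size_pretty(pg_total_relation_size('test01')) AS on_disk_size
FROM test01;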
