Re: My Experiment of PG crash when dealing with huge amount of data - Mailing list pgsql-general

From: Jeff Janes
Subject: Re: My Experiment of PG crash when dealing with huge amount of data
Msg-id: CAMkU=1xDZ-yaK+mzLweqHL1wNCf0MTgh+Vpe1SwmGzzgicBXBQ@mail.gmail.com
In response to: My Experiment of PG crash when dealing with huge amount of data (高健 <luckyjackgao@gmail.com>)
Responses: Re: My Experiment of PG crash when dealing with huge amount of data (高健 <luckyjackgao@gmail.com>)
List: pgsql-general
On Fri, Aug 30, 2013 at 2:10 AM, 高健 <luckyjackgao@gmail.com> wrote:
>
>
> postgres=# insert into test01 values(generate_series(1,2457600),repeat(
> chr(int4(random()*26)+65),1024));

The construct "values (srf1, srf2)" generates its entire result set
in memory up front; it does not "stream" its rows to the INSERT
statement on the fly.

To spare memory, you would want to use something like:

insert into test01
select generate_series, repeat(chr(int4(random()*26)+65), 1024)
from generate_series(1, 2457600);
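For comparison, here is a sketch of both forms side by side, assuming the test01 table from the original post; the row count is reduced here so the first form completes without exhausting memory:

```sql
-- Form 1: the SRF result is fully materialized before any row is inserted.
insert into test01
values (generate_series(1, 1000), repeat('A', 1024));

-- Form 2: rows flow from the SRF to the insert one at a time.
insert into test01
select g, repeat(chr(int4(random()*26) + 65), 1024)
from generate_series(1, 1000) as g;
```

At 1,000 rows both finish instantly; at the original 2,457,600 rows only the second form keeps memory use flat.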

Cheers,

Jeff

