Re: Insert 1 million data - Mailing list pgsql-admin

From Olivier Gautherot
Subject Re: Insert 1 million data
Msg-id CAJ7S9TX0smQF6_hNf3EbcPGGeKYQpHwdUmyXsnFo7hhASiq3CA@mail.gmail.com
In response to Re: Insert 1 million data  (Sreejith P <sreejith@lifetrenz.com>)
List pgsql-admin
Hi Sreejith,

On Tue, Dec 29, 2020 at 10:56 AM Sreejith P <sreejith@lifetrenz.com> wrote:

Thanks Rohit.


After upgrading the volume, I am getting the following error. It is almost the same as the previous one.


We have increased the backup volume and run the job again. When it reaches 900 thousand records, we get almost the same error again.


  • Do I need to turn off autovacuum?
  • Should I increase maintenance_work_mem?

If you're tight on space, my recommendation would be to run the inserts in small batches (say 10,000 rows at a time). Don't turn off autovacuum, ever :-)
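To illustrate the batching idea, here is a minimal sketch of splitting a large row set into groups of 10,000 so that each INSERT transaction stays small. The helper name and batch handling are illustrative, not from the thread; in practice each batch would be sent as one multi-row INSERT (e.g. via a driver such as psycopg2) with a commit after every batch.

```python
def batches(rows, size=10_000):
    """Yield successive lists of at most `size` rows from an iterable."""
    batch = []
    for row in rows:
        batch.append(row)
        if len(batch) == size:
            yield batch
            batch = []
    if batch:  # flush the final, possibly smaller, batch
        yield batch

# Sketch of the insert loop (connection/insert calls are assumptions):
# for batch in batches(all_rows):
#     cur.executemany("INSERT INTO t (a, b) VALUES (%s, %s)", batch)
#     conn.commit()  # keep each transaction small
```

Committing per batch bounds the WAL and temporary space each transaction can consume, which is the point when disk space is tight.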

That being said, if you're suffering this much just creating your database, my inclination would be to move it, along with its logs, to a disk with more space. As it stands, your server has no room to grow, and you'll hit more dramatic crashes very quickly.

My two cents' worth...
--
Olivier Gautherot
