Thread: BUG #13637: ~2 GB psql import fails with out of memory error on machine with 64 GB RAM

BUG #13637: ~2 GB psql import fails with out of memory error on machine with 64 GB RAM

From: aaron@heyaaron.com
The following bug has been logged on the website:

Bug reference:      13637
Logged by:          Aaron C. de Bruyn
Email address:      aaron@heyaaron.com
PostgreSQL version: 9.3.9
Operating system:   Ubuntu 14.04 x64
Description:

I have a small digital ocean virtual machine w/ 512 MB RAM running a small
Sentry database (http://getsentry.com) for error reporting as well as a few
'dev' databases for websites and applications being designed.

Unfortunately I wasn't running the Sentry 'cleanup' script, and my Sentry
database grew to approximately 2.3 GB (according to the nightly dump from
autopostgresqlbackup).

I was running 9.1.x and decided it was time to upgrade.  I upgraded to 9.3.
During the upgrade, a misunderstanding of the Ubuntu upgrade scripts caused
me to run 'pg_dropcluster --stop 9.3 main' after the scripts had already
upgraded 9.1 to 9.3 on my behalf.
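
(In other words, the sequence was presumably something like the following;
this is a sketch assuming Debian/Ubuntu's postgresql-common tooling, and the
cluster names are assumptions:)

    # The Ubuntu scripts migrated the old cluster automatically, roughly:
    pg_upgradecluster 9.1 main      # upgrades 9.1/main into a new 9.3/main

    # The *old* cluster is what should have been dropped afterwards:
    pg_dropcluster --stop 9.1 main

    # What was actually run, which destroyed the freshly upgraded data:
    pg_dropcluster --stop 9.3 main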

Time to restore databases from my nightly backups.  :)

All my databases restored except for the sentry database.

Running 'psql sentry < sentry.sql' as the postgres user shows tables being
created, rows being inserted, etc...

It runs for a few minutes, and then I either get an out-of-memory error, or
it appears to exit normally without importing the data.
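
(A way to capture the exact failure, as a sketch, assuming the dump is plain
SQL: stop at the first error and keep a full log.)

    # Halt on the first error and record all client/server output
    psql --set ON_ERROR_STOP=on -d sentry -f sentry.sql 2>&1 | tee restore.log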

Thinking it was because my 'dev' server is only 512 MB, I temporarily scaled
it up to 8 GB.  Same error.

I scaled it up to 16 GB.  Same error.

I went whole-hog and scaled it to 64 GB.  Same error.

I kept trying and discovered that the box never gets below ~54 GB free,
meaning less than 10 GB is being used when it reports being out of
memory.
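
(For anyone reproducing this, a simple way to watch usage during the import,
as a sketch:)

    # Overall free memory, refreshed every second, while psql runs
    watch -n 1 free -m

    # Largest resident processes: is it psql or a postgres backend growing?
    ps aux --sort=-rss | head -n 10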

Out of frustration I rebooted, tweaked sysctl settings for memory
overcommit, etc...

No effect.
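
(The overcommit knobs in question, for reference; the values shown are only
illustrative, and these need to be run as root:)

    # Let the kernel overcommit freely (the default heuristic mode is 0)
    sysctl vm.overcommit_memory=1

    # Or strict accounting: commit limit = swap + overcommit_ratio% of RAM
    sysctl vm.overcommit_memory=2
    sysctl vm.overcommit_ratio=90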

Finally I removed all the tweaks, scaled the box back to 512 MB, and
rebooted to start gathering info for a bug report.

I started running 'psql sentry < sentry.sql' and was watching the memory
usage drop when I noticed that I had forgotten to re-create my swap file
after the last reboot.

I ran swapon and watched as psql kept running while never touching my
swapfile, and my box continued to hover around 150K free.
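
(Recreating a swap file after a reboot looks roughly like this; the size and
path here are arbitrary:)

    # Recreate and enable a swap file (not persisted across reboots
    # unless also added to /etc/fstab)
    fallocate -l 4G /swapfile
    chmod 600 /swapfile
    mkswap /swapfile
    swapon /swapfile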

By the time it crashed with an out-of-memory error, it restored most of the
tables in the database.

I tried importing again with the swap file in-place, and it consumed all the
memory and crashed.  I tried disabling swap, starting psql, then enabling
swap and I was able to restore the same majority of tables.

I'm sorry I can't pin this down further, but it's a bit out of my depth at
this point.

I do have the ~2.3 GB backup of the DB, which bzips down to ~550 MB, if any
developers are interested in playing/testing.
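
(The backup and the compressed copy come from something like the following;
a sketch only, since autopostgresqlbackup's exact flags are assumptions
here:)

    # Roughly what the nightly job produces, then compressed for transfer
    pg_dump sentry > sentry.sql
    bzip2 -k sentry.sql      # yields sentry.sql.bz2, ~550 MB from ~2.3 GB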

Re: BUG #13637: ~2 GB psql import fails with out of memory error on machine with 64 GB RAM

From: Tom Lane
aaron@heyaaron.com writes:
> I tried importing again with the swap file in-place, and it consumed all the
> memory and crashed.  I tried disabling swap, starting psql, then enabling
> swap and I was able to restore the same majority of tables.

That's odd.  Was it psql itself that was exhausting memory, or the
connected backend?
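
(One way to answer that, as a sketch: compare the resident set sizes of the
client and the server processes while the import runs.)

    # RSS of the psql client vs. the postgres server processes, in kB
    ps -C psql,postgres -o pid,rss,cmd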

> I do have the ~2.3 GB backup of the DB which bzip's down to ~550 MB if any
> developers are interested in playing/testing.

That sounds too big for email, but if you have another way to get it
to me, I'd take a look.

            regards, tom lane

Re: BUG #13637: ~2 GB psql import fails with out of memory error on machine with 64 GB RAM

From: "Aaron C. de Bruyn"
psql was definitely using up the most memory--but like I said, the
total memory usage on the server was ~10 GB.  There was approximately
54 GB free the entire time.

A few options:

* I can put the DB on my server and send you an HTTP URL
* You can send me an SSH public key and do rsync or sftp
* I can stick it in a folder and send you a bitsync key

I have no preference.  The DB probably doesn't contain anything
private, but it could contain a few dev passwords.  Quite a few of
them have been changed since I decided to have an impromptu test of my
backups.  ;)

-A

On Thu, Sep 24, 2015 at 9:17 PM, Tom Lane <tgl@sss.pgh.pa.us> wrote:
> aaron@heyaaron.com writes:
>> I tried importing again with the swap file in-place, and it consumed all the
>> memory and crashed.  I tried disabling swap, starting psql, then enabling
>> swap and I was able to restore the same majority of tables.
>
> That's odd.  Was it psql itself that was exhausting memory, or the
> connected backend?
>
>> I do have the ~2.3 GB backup of the DB which bzip's down to ~550 MB if any
>> developers are interested in playing/testing.
>
> That sounds too big for email, but if you have another way to get it
> to me, I'd take a look.
>
>                         regards, tom lane
Re: BUG #13637: ~2 GB psql import fails with out of memory error on machine with 64 GB RAM

From: Greg Stark
On Fri, Sep 25, 2015 at 3:06 AM,  <aaron@heyaaron.com> wrote:
> It runs for a few minutes, and then I either get an out of memory error, or
> it appears to exit normally without importing the data.
>
> Thinking it was because my 'dev' server is only 512 MB, I temporarily scaled
> it up to 8 GB.  Same error.

I would suggest posting the actual error. There are various errors
that can be read as "out of memory" but they're not really the same
thing. If you had free memory on the machine, then which error it was
specifically might be important. Also, Postgres prints more information
than just "out of memory", and that additional context could be
helpful.
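
(When the server side hits OOM it also writes a memory-context dump to the
cluster log; on Ubuntu the default location is assumed to be:)

    # Server-side error detail, including the memory-context dump on OOM
    tail -n 100 /var/log/postgresql/postgresql-9.3-main.log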

--
greg

Re: BUG #13637: ~2 GB psql import fails with out of memory error on machine with 64 GB RAM

From: "Aaron C. de Bruyn"
That's the only error I got.  It spit out a few rows showing SQL
commands being processed, then it would die with "out of memory" or
just return to the command prompt.

I ended up destroying the DB and re-creating it since it didn't have
any critical data in it.

-A

On Wed, Oct 14, 2015 at 12:31 AM, Greg Stark <stark@mit.edu> wrote:
> On Fri, Sep 25, 2015 at 3:06 AM,  <aaron@heyaaron.com> wrote:
>> It runs for a few minutes, and then I either get an out of memory error, or
>> it appears to exit normally without importing the data.
>>
>> Thinking it was because my 'dev' server is only 512 MB, I temporarily scaled
>> it up to 8 GB.  Same error.
>
> I would suggest posting the actual error. There are various errors
> that can be read as "out of memory" but they're not really the same
> thing. If you had free memory on the machine then which error it was
> specifically might be important. Also Postgres prints more information
> than just "out of memory" and that additional context could be
> helpful.
>
> --
> greg