It is problematic to produce a small enough subset (a reproducible test
case), due to the large database and the randomness of the situation.
But here is what we see in the server log file; see below.
It seems that
WARNING: ShmemAlloc: out of memory
ERROR: FreeSpaceMap hashtable out of memory
go together. Is this related to the size of shared memory, could an
increase of shared memory solve the problem, or is that just a
coincidence?
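To illustrate why the two messages would arrive together, here is a
minimal standalone C sketch (not PostgreSQL's actual code; the arena,
the names, and the sizes are all made up for illustration): a table
whose entries come from a fixed-size segment has to report "out of
memory" the moment the allocator underneath it does.

#include <stdio.h>

/* A fixed-size arena standing in for a shared memory segment:
 * like shared memory, it cannot grow at run time. All names and
 * sizes here are made up. */
#define ARENA_SIZE (64 * 1024)

static char arena[ARENA_SIZE];
static size_t arena_used = 0;

/* Bump allocator over the arena; fails once the segment is full,
 * the way a ShmemAlloc-style allocator would. */
static void *arena_alloc(size_t size)
{
    void *p;

    if (arena_used + size > ARENA_SIZE)
    {
        fprintf(stderr, "WARNING: arena_alloc: out of memory\n");
        return NULL;
    }
    p = arena + arena_used;
    arena_used += size;
    return p;
}

/* Hypothetical table entry; the hashing itself is omitted because
 * only the allocation path matters for this point. */
struct entry
{
    long key;
};

static struct entry *table_insert(long key)
{
    struct entry *e = arena_alloc(sizeof(struct entry));

    if (e == NULL)
    {
        fprintf(stderr, "ERROR: hashtable out of memory\n");
        return NULL;
    }
    e->key = key;
    return e;
}

int main(void)
{
    long i;

    /* Insert until the fixed segment fills up: the WARNING from the
     * allocator and the ERROR from the table arrive together. */
    for (i = 0;; i++)
    {
        if (table_insert(i) == NULL)
        {
            fprintf(stderr, "failed after %ld inserts\n", i);
            return 1;
        }
    }
}

The only point of the sketch is that both messages can share a single
cause: a segment that cannot grow at run time.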
Another thing (from the code): it seems that hash_search is trying to
insert a new entry into DynHash and cannot allocate memory in the
DynHash context. So another question: is the DynHash context
upper-limited, in other words, unable to grow beyond
8 * 1024 * 1024 bytes?
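To make the 8 * 1024 * 1024 question concrete, here is a standalone
sketch of a doubling-block allocation scheme. The constants and the
behavior here are my assumption for illustration, not taken from
PostgreSQL's allocator: if that figure is a maximum block size rather
than a ceiling on the context, the context total keeps growing past
8 MB.

#include <stdio.h>

/* Sketch of a doubling-block allocation scheme. The assumption
 * (mine, for illustration): 8 * 1024 * 1024 is a maximum *block*
 * size, not a ceiling on the context's total size. */
#define INIT_BLOCK_SIZE (8 * 1024)
#define MAX_BLOCK_SIZE  (8 * 1024 * 1024)

int main(void)
{
    size_t block = INIT_BLOCK_SIZE;
    size_t total = 0;
    int i;

    for (i = 0; i < 16; i++)
    {
        total += block;
        printf("block %2d: %9lu bytes, context total %10lu bytes\n",
               i, (unsigned long) block, (unsigned long) total);
        if (block < MAX_BLOCK_SIZE)
            block *= 2;     /* blocks double up to the cap... */
        /* ...then stay at MAX_BLOCK_SIZE, so under this scheme the
         * context itself can grow far past 8388608 bytes. */
    }
    return 0;
}

Whether PostgreSQL's context actually behaves this way is exactly what
I am asking.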
Any suggestions are welcome,
Thank you.
------------------- LOG
----------------------------------------------------
LOG: all server processes terminated; reinitializing shared memory and semaphores
IpcMemoryCreate: shmget(key=5432001, size=4669440, 03600) failed: Not enough core
This error usually means that PostgreSQL's request for a shared
memory segment exceeded available memory or swap space.
To reduce the request size (currently 4669440 bytes), reduce
PostgreSQL's shared_buffers parameter (currently 256) and/or
its max_connections parameter (currently 128).
The PostgreSQL Administrator's Guide contains more information about
shared memory configuration.
Error 2:
WARNING: ShmemAlloc: out of memory
ERROR: FreeSpaceMap hashtable out of memory
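For a rough sense of where a request like 4669440 bytes comes from,
here is a back-of-the-envelope sketch. Only the three numbers taken
from the log hint are real; the 8 kB page size and the per-connection
split are my guesses for illustration, not PostgreSQL's real
shared-memory accounting:

#include <stdio.h>

int main(void)
{
    /* The three values below come from the log hint; everything
     * derived from them is a rough guess for illustration. */
    long shared_buffers = 256;      /* from the log hint */
    long max_connections = 128;     /* from the log hint */
    long request = 4669440;         /* from the log hint, in bytes */
    long buffer_size = 8192;        /* assuming 8 kB pages */

    long buffers = shared_buffers * buffer_size;
    long other = request - buffers;

    printf("buffer pool:     %ld bytes\n", buffers);
    printf("everything else: %ld bytes\n", other);
    printf("per connection (if the rest scaled with connections): %ld bytes\n",
           other / max_connections);
    return 0;
}

Either knob in the hint shrinks the request, which is presumably why
the hint mentions both.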
-----Original Message-----
From: Tom Lane [mailto:tgl@sss.pgh.pa.us]
Sent: Wednesday, October 01, 2003 2:51 PM
To: Maksim Likharev
Cc: pgsql-hackers@postgresql.org
Subject: Re: [HACKERS] FreeSpaceMap hashtable out of memory
"Maksim Likharev" <mlikharev@aurigin.com> writes:
> Using PG under Cygwin we having following error message during INSERT INTO
> "FreeSpaceMap hashtable out of memory".
Hm, that's not supposed to happen. Can you create a reproducible
example?
regards, tom lane