Hello,
One of our customers faced unexpected behaviour when using pg_dump to
export one of their databases - pg_dump consumed all the server memory and was
terminated with the following error:
....
pg_dump: reading row security enabled for table "public.databasechangelog"
pg_dump: reading policies for table "public.databasechangelog"
pg_dump: reading row security enabled for table "public.OperationAudit"
pg_dump: reading policies for table "public.OperationAudit"
pg_dump: reading large objects
pg_dump: [archiver (db)] query failed: out of memory for query result
pg_dump: [archiver (db)] query was: SELECT oid, (SELECT rolname FROM
pg_catalog.pg_roles WHERE oid = lomowner) AS rolname, lomacl FROM
pg_largeobject_metadata
....
Thanks for reporting the problem.
The pg_dump process first collects metadata for all of the database objects
(tables, indexes, and so on) and only then writes them out to the file/archive.
In your scenario there are very many large objects, so pg_dump allocates memory
for every one of them before anything is dumped, which leads to the
out-of-memory failure. Currently I don't see any alternative other than
excluding large objects from the dump and exporting them separately.
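
As a rough sketch of that workaround (assuming PostgreSQL 10 or later, where
pg_dump has the --no-blobs / -B switch; "mydb" and the blobs/ directory are
placeholders for illustration):

```shell
# Dump schema and table data, but skip large objects entirely
# (--no-blobs requires pg_dump from PostgreSQL 10+).
pg_dump --no-blobs -Fc -f mydb_no_blobs.dump mydb

# Export the large objects one at a time, so the client never has to
# hold metadata for all of them at once. \lo_export is a client-side
# psql meta-command that writes each object to a local file.
mkdir -p blobs
psql -d mydb -At -c 'SELECT oid FROM pg_largeobject_metadata' |
while read -r loid; do
    psql -d mydb -c "\lo_export $loid 'blobs/$loid.bin'"
done
```

On the target server the objects could then be re-imported with \lo_import,
though note that plain \lo_import assigns new OIDs, so any table columns that
reference the old OIDs would need to be remapped.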
Regards,
Haribabu Kommi
Fujitsu Australia