PostgreSQL unsuitable for large datasets? - Mailing list pgsql-cygwin

From Michael Labhard
Subject PostgreSQL unsuitable for large datasets?
Date
Msg-id akr31g$7mo$1@main.gmane.org
Responses Re: PostgreSQL unsuitable for large datasets?
List pgsql-cygwin
    We have a PostgreSQL (7.2.1) database on Cygwin with one table that
contains some 2 x 10**6 records.  The database is about 100 MB total.  This
does not seem so large, but while attempting an INSERT into an empty table
using a SELECT that joins to the large table across all of its rows, the
error

        Memory exhausted in AllocSetAlloc(36)

occurs.  Reviewing posts on this subject as well as the PostgreSQL FAQ did
not reveal any specific solution.  ulimit -a:

$ ulimit -a
core file size        (blocks, -c) unlimited
data seg size         (kbytes, -d) unlimited
file size             (blocks, -f) unlimited
open files                    (-n) 256
pipe size          (512 bytes, -p) 8
stack size            (kbytes, -s) 2046
cpu time             (seconds, -t) unlimited
max user processes            (-u) 63
virtual memory        (kbytes, -v) 2097152

Virtual memory would appear to be adequate.  Increasing the SORT_MEM
parameter in the postgresql.conf file did not help.  Finally I had to
partition the large table into 10 parts and process each part individually.
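The actual query and table names were not posted; purely as a sketch under assumed names (`big`, `other`, and `summary` are invented here, not the reporter's schema), the failing statement and the range-batched workaround would look roughly like this:

```sql
-- Hypothetical schema: "big" stands in for the ~2-million-row table.
-- The failing shape: one INSERT ... SELECT joining against every row of
-- the large table, which the backend tries to satisfy in a single pass.
INSERT INTO summary (id, total)
SELECT b.id, SUM(b.amount)
FROM big b JOIN other o ON o.id = b.id
GROUP BY b.id;

-- The workaround: run the same statement once per key range, so that any
-- single statement's working set stays small.
INSERT INTO summary (id, total)
SELECT b.id, SUM(b.amount)
FROM big b JOIN other o ON o.id = b.id
WHERE b.id >= 0 AND b.id < 200000
GROUP BY b.id;
-- ...repeated for each of the 10 id ranges.
```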

    Seems like this problem would need a ready and definitive solution
before PostgreSQL could be used for large, serious databases.  How large
are the databases with which others are using PostgreSQL?  How do others
avoid this error with large datasets?

-- Michael
