Hello,
I have a database that contains a large number of Large Objects
(>500MB of binary data in total). I am using this database to store
images for an e-commerce website, so I have a simple accessor script
written in Perl that dumps out a blob based on a virtual 'path' stored
in a table (and associated with the large object's OID). This system
seemed to work wonderfully until I put more than ~500MB of binary data
into the database.
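In case it helps, the accessor boils down to something like this (a
simplified sketch: the table and column names are stand-ins, and it
assumes the DBD::Pg lo_* interface via $dbh->func):

  #!/usr/bin/perl -w
  use strict;
  use DBI;

  # Simplified accessor: look up the OID for a virtual path, then
  # stream the large object out in chunks inside a transaction.
  my $path = shift or die "usage: $0 virtual-path\n";

  my $dbh = DBI->connect('dbi:Pg:dbname=images', 'user', 'pass',
                         { AutoCommit => 0, RaiseError => 1 });

  my ($loid) = $dbh->selectrow_array(
      'SELECT image_oid FROM images WHERE path = ?', undef, $path);
  die "no such path: $path\n" unless defined $loid;

  binmode(STDOUT);
  my $fd = $dbh->func($loid, $dbh->{pg_INV_READ}, 'lo_open');
  my $buf;
  while ((my $n = $dbh->func($fd, $buf, 16384, 'lo_read')) > 0) {
      print substr($buf, 0, $n);
  }
  $dbh->func($fd, 'lo_close');
  $dbh->commit;
  $dbh->disconnect;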
Now, every time I run the accessor script (via the web OR the command
line), the postmaster process gobbles up my CPU resources (usually >30%
for a single process - and it's a 1GHz processor with 1GB of RAM!), and
the script takes a very long time to completely dump out the data.
I have the same issue with an import script that reads files from the
hard drive and loads them into Large Objects in the database. Imports
now take a very long time, whereas before they ran extremely fast.
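The import side is just as simple, roughly along these lines (again a
simplified sketch with stand-in names):

  #!/usr/bin/perl -w
  use strict;
  use DBI;

  # Simplified import: copy a file from disk into a new large object
  # with lo_import and record its OID against the virtual path.
  my ($path, $filename) = @ARGV;

  my $dbh = DBI->connect('dbi:Pg:dbname=images', 'user', 'pass',
                         { AutoCommit => 0, RaiseError => 1 });

  my $loid = $dbh->func($filename, 'lo_import')
      or die "lo_import failed: " . $dbh->errstr;
  $dbh->do('INSERT INTO images (path, image_oid) VALUES (?, ?)',
           undef, $path, $loid);
  $dbh->commit;
  $dbh->disconnect;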
Are there any known issues in PostgreSQL with databases that hold a
lot of binary data? I am using PostgreSQL v7.2.3 on a Linux system.
Thanks,
-Jeremy
--
------------------------
Jeremy C. Andrus
http://www.jeremya.com/
------------------------