Carl Sopchak <carl.sopchak@cegis123.com> writes:
> Well, the upgrade to 8.3 seemed to rid me of the command limit, but now I'm
> running out of memory. I have 2Gb physical and 8Gb swap (after adding 4Gb).
What do you mean, you're running out of memory? For the most part Postgres
only runs out of memory if you've configured it to use more memory than your
system can handle -- for example, by setting work_mem or shared_buffers too large.
One area that can cause problems is having too many pending trigger executions
queued up within a single transaction. I don't know whether that's what you're
running into, though.
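To see why a "too large" work_mem matters, note that it is a per-sort/per-hash
limit, not a global one, so the worst case multiplies out across backends. A
rough back-of-the-envelope sketch (the connection count and nodes-per-query
figures here are made-up illustration values, not measurements):

```python
def worst_case_mem_mb(work_mem_mb, max_connections, sort_hash_nodes):
    """Rough upper bound: work_mem can be allocated once per
    sort/hash node in each concurrently running backend."""
    return work_mem_mb * max_connections * sort_hash_nodes

# e.g. 64 MB work_mem, 100 connections, 3 sort/hash nodes per query
print(worst_case_mem_mb(64, 100, 3))  # prints 19200, i.e. ~19 GB worst case
```

The actual peak depends on the query plans actually running, but this is why a
value that looks modest per-operation can exhaust a 2 GB machine.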
> Is there a way for me to run this outside of one huge transaction? This
> really shouldn't be using more than a few hundred megs of RAM (assuming
> cursor records are all stored in memory)...
Personally I find it much more flexible to implement these types of jobs as
external scripts connecting as a client. That lets you start and stop
transactions freely. It also allows you to open multiple connections, or to run
the client-side code on a separate machine, which can have different resources
available.
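A minimal sketch of that client-side approach, assuming Python with the
psycopg2 driver; `big_table`, its `processed` column, and the row source are
hypothetical stand-ins for your actual job. The point is the commit per batch,
which keeps any single transaction (and its queued trigger events) bounded:

```python
from itertools import islice

def batches(iterable, size):
    """Yield lists of up to `size` items from `iterable`."""
    it = iter(iterable)
    while True:
        chunk = list(islice(it, size))
        if not chunk:
            return
        yield chunk

def run(conn, row_ids, batch_size=1000):
    """Process rows in many small transactions instead of one huge one."""
    cur = conn.cursor()
    for chunk in batches(row_ids, batch_size):
        for row_id in chunk:
            # Hypothetical per-row work; substitute your real statement.
            cur.execute(
                "UPDATE big_table SET processed = true WHERE id = %s",
                (row_id,),
            )
        conn.commit()  # ends the transaction; pending trigger state is released

# Typical driver code (not run here):
#   conn = psycopg2.connect("dbname=mydb")
#   run(conn, ids_to_process)
```

Because each batch is its own transaction, you can also stop the script and
resume later, or point several such clients at the server from another machine.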
--
Gregory Stark
EnterpriseDB http://www.enterprisedb.com
Ask me about EnterpriseDB's Slony Replication support!