I'm starting to wonder if my database has finally grown too big for my
computer.
But rather than simply performing badly, I see a number of "bad things"
happen:
* pg_dump fails to run, reporting an "out of memory" error (I don't
understand why this happens, but it does; see the per-table dump
sketch after this list);
* restarting postgres across a reboot results in tables disappearing
without any error message at all (this is *really* disturbing).
postgres itself, however, starts up fine.
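Just so it's clear what I mean by a per-table dump, something like the
following (database and table names here are invented):

    # dump the one big table on its own instead of the whole cluster
    pg_dump --table=big_table mydb > big_table.sql

    # and the schema for everything, without data
    pg_dump --schema-only mydb > schema.sql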
"wc -l" of the output from the last successuful dump_all is around
8million
lines, spread across half a dozen or so tables.
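If exact per-table counts matter, I believe the planner statistics can
give a rough figure without running count(*) over everything, along
these lines (reltuples is only an estimate, refreshed by ANALYZE):

    -- approximate row counts per table, system catalogs included
    SELECT relname, reltuples::bigint AS approx_rows
      FROM pg_class
     WHERE relkind = 'r'
     ORDER BY reltuples DESC;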
I'm afraid that if I restructure the db to have smaller tables, I'd
just be putting off hitting the ceiling.
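If restructuring is the answer anyway, I assume it would be the
inheritance-based partitioning described in the 8.1 docs, roughly along
these lines (table and column names are made up for the example):

    -- parent table; holds no rows itself
    CREATE TABLE readings (
        id       bigint    NOT NULL,
        taken_at timestamp NOT NULL,
        value    real
    );

    -- one child per year; the CHECK constraint lets the planner
    -- (with constraint_exclusion = on) skip irrelevant children
    CREATE TABLE readings_2006 (
        CHECK (taken_at >= '2006-01-01' AND taken_at < '2007-01-01')
    ) INHERITS (readings);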
What can I safely do to get postgres running normally?
I'm currently using 8.1.3; will I fare any better with a more recent
version?
My end goal is to have maybe 50 million records. Will PostgreSQL handle
this with only 1.5GB of RAM, or do I need to look elsewhere for a db of
this size?
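In case it matters, my understanding is that the memory-related
settings below are the main ones to look at on a 1.5GB box (the values
are guesses on my part, not tested recommendations):

    # postgresql.conf excerpts; 8.1 takes shared_buffers in 8kB pages
    # and work_mem/maintenance_work_mem in kB
    shared_buffers = 32768          # ~256MB
    work_mem = 8192                 # per sort/hash, per backend
    maintenance_work_mem = 131072   # used by VACUUM, CREATE INDEX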
Thoughts?
Cheers,
Darren