Thread: Where to start for performance problem?

Where to start for performance problem?

From
"MK Spam"
Date:
I've scanned some of the archives and have learned a lot about different performance tuning practices.  I will be looking into using many of these ideas but I'm not sure they address the issue I am currently experiencing.
 
First, I'm a total newb with postgresql.  Like many before me, I have inherited many responsibilities outside of my original job description due to layoffs.  I am now the sole developer/support for a software product.  *sigh*  This product uses postgresql.  I am familiar with the basics of sql and have worked on the database and code for the software but by no means am I proficient with postgresql.
 
The archives of this list provide many ideas for improving performance, but the problem we are having is gradually degrading performance ending in postgres shutting down.  So it's not a matter of optimizing a complex query to take 5 seconds instead of 60 seconds.  From what I can tell we are using the VACUUM command on a schedule, but it doesn't seem to prevent the database from becoming "congested", as we refer to it.  :]  Anyway, the only way I know to "fix" the problem is to export (pg_dump) the db, drop the database, recreate the database, and import the dump.  This seems to return performance to normal but obviously isn't a very good "solution".  The slowdown and subsequent crash can take as little as a week for databases with a lot of data, or as long as a few weeks to a month for smaller data sets.
 
I don't really know where to start looking for a solution.  Any advice on where to start, understanding that I am a newb, would be greatly appreciated.  Thank you.
 
Nid

Re: Where to start for performance problem?

From
"Nid"
Date:
My apologies for the "From" name of MK Spam.  That references an email account I made for signing up for things on the net.  :]
 
Nid

Re: Where to start for performance problem?

From
Christopher Kings-Lynne
Date:
> The archives of this list provide many ideas for improving performance,
> but the problem we are having is gradually degrading performance ending
> in postgres shutting down.  So it's not a matter of optimizing a complex
> query to take 5 seconds instead of 60 seconds.  From what I can tell we
> are using the VACUUM command on a schedule, but it doesn't seem to
> prevent the database from becoming "congested", as we refer to it.  :]

Our busy website has a cronjob that runs VACUUM ANALYZE once an hour
(vacuumdb -a -q -z).
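
A schedule like that is a one-line crontab entry for the postgres user; this is
a sketch, and the binary path is an assumption - adjust it to your install:

```shell
# Hourly VACUUM ANALYZE across all databases: -a all DBs, -q quiet,
# -z update planner statistics (ANALYZE). Path is hypothetical.
0 * * * * /usr/local/pgsql/bin/vacuumdb -a -q -z
```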

Have you tried running 'VACUUM FULL ANALYZE' (vacuumdb -a -q -z -f)
instead of a dump and reload?

Chris



Re: Where to start for performance problem?

From
"Nid"
Date:
I've been digging around in the code and found where we are executing the
VACUUM command.  VACUUM ANALYZE is executed every 15 minutes.  We haven't
tried VACUUM FULL ANALYZE.  I think I read that using FULL is a good idea
once a day or something.  Just doing a VACUUM ANALYZE doesn't seem to be
preventing our problem.  Thank you for the responses.
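
If you do end up scheduling a once-a-day full vacuum, it might look like the
following crontab sketch (path and time of day are assumptions):

```shell
# Nightly at 3am: VACUUM FULL ANALYZE on all databases (-f adds FULL).
# Note: VACUUM FULL takes exclusive locks, so pick a quiet window.
0 3 * * * /usr/local/pgsql/bin/vacuumdb -a -q -z -f
```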

nid



Re: Where to start for performance problem?

From
Christopher Kings-Lynne
Date:
Nid wrote:

> I've been digging around in the code and found where we are executing the
> VACUUM command.  VACUUM ANALYZE is executed every 15 minutes.  We haven't
> tried VACUUM FULL ANALYZE.  I think I read that using FULL is a good idea
> once a day or something.  Just doing a VACUUM ANALYZE doesn't seem to be
> preventing our problem.  Thank you for the responses.

Try upgrading to PostgreSQL 7.4 and use the new pg_autovacuum daemon.
This daemon will monitor your tables and vacuum and analyze whenever
necessary.
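
For what it's worth, pg_autovacuum in 7.4 ships as a contrib program that runs
alongside the postmaster; a minimal sketch, assuming the contrib binary is
installed and on your PATH (check your contrib README for the exact flags):

```shell
# Run the autovacuum daemon in the background (-D detaches it).
# It watches table activity and issues VACUUM/ANALYZE as needed.
pg_autovacuum -D
```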

Chris



Re: Where to start for performance problem?

From
Rob Fielding
Date:
The problem with giving suggestions about increasing performance is
that one person's increase is another person's decrease.

Having said that, there are a few general suggestions:

Set up some shared memory, about a tenth of your available RAM, and
configure shared_buffers and max_connections correctly. I've used the
following formula, ripped off the net from somewhere. It's not entirely
accurate, as other settings steal a little shared memory, but it works
for the most part:

((1024*RAM_SIZE) - (14.2 * max_connections) - 250) / 8.2

As I say, it should get you a good value; otherwise lower it bit by bit
if you have trouble starting your db.
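
As a worked example of that formula: with 512 MB of RAM and 64 connections
(both hypothetical numbers), it comes out to 63796 8 KB buffers:

```shell
# Evaluate ((1024*RAM_MB) - (14.2*max_connections) - 250) / 8.2
# ram_mb=512 and max_conn=64 are hypothetical values; substitute your own.
awk 'BEGIN { ram_mb = 512; max_conn = 64;
             print int(((1024*ram_mb) - (14.2*max_conn) - 250) / 8.2) }'
```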

Increase effective_cache_size (50%-70% of available RAM) and sort_mem
(about 1/20th of RAM), and lower your random_page_cost to around 2 or less
(as low as 0.3) if you have fast SCSI drives in a RAID10 set-up - this was
a big speedup ;)
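
Spelled out as a postgresql.conf fragment - the numbers below are illustrative
for a machine with roughly 1 GB of RAM, not recommendations:

```
# Hypothetical values for ~1 GB RAM; tune to your workload.
effective_cache_size = 65536   # in 8 KB pages: ~512 MB, about 50% of RAM
sort_mem = 51200               # in KB: 50 MB, roughly 1/20th of RAM
random_page_cost = 2           # lower on fast SCSI RAID; the default is 4
```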

But this might not be the answer. The values detailed above are for
tuning an already stable setup.

Perhaps you need to look at your system resource usage. If performance
is degrading over time, it sounds to me like you are slowly running out
of memory and swap?

Generally, if I take something over, I'll try to get it onto my own
terms. Have you tried importing the DB into a fresh installation, one
where you know sensible defaults are set, so you aren't inheriting any
cruft from the previous sysadmin?

To be honest, though, I've never had pg actually shut down because it
was running so badly - I just wouldn't think it would do that.


--

Rob Fielding
rob@dsvr.net

www.dsvr.co.uk              Development             Designer Servers Ltd


Re: Where to start for performance problem?

From
Bruno Wolff III
Date:
On Mon, Nov 24, 2003 at 16:03:17 -0600,
  MK Spam <mk_spam@comcast.net> wrote:
> The archives of this list provide many ideas for improving performance,
> but the problem we are having is gradually degrading performance ending
> in postgres shutting down.  So it's not a matter of optimizing a complex
> query to take 5 seconds instead of 60 seconds.  From what I can tell we
> are using the VACUUM command on a schedule but it doesn't seem to
> prevent the database from becoming "congested" as we refer to it.  :]
> Anyway, the only way I know to "fix" the problem is to export (pg_dump)
> the db, drop the database, recreate the database and import the dump.
> This seems to return performance back to normal but obviously isn't a
> very good "solution".  The slowdown and subsequent crash can take as
> little as 1 week for databases with a lot of data or go as long as a
> few weeks to a month for smaller data sets.

A couple of things you might look for are index bloat and having the free
space map (FSM) set too small for your plain vacuums. Upgrading to 7.4 may
help with index bloat if that is your problem.
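
The FSM is sized by two postgresql.conf settings; a sketch with placeholder
numbers (size them from your own VACUUM VERBOSE output, not from here):

```
# Free space map sizing - placeholder values, not recommendations.
# If plain VACUUM can't track all the freed pages, tables bloat over time.
max_fsm_pages = 200000      # should exceed pages freed between vacuums
max_fsm_relations = 1000    # at least the number of tables and indexes
```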

Re: Where to start for performance problem?

From
Tom Lane
Date:
MK Spam <mk_spam@comcast.net> wrote:
> ... the problem we are having is gradually degrading
> performance ending in postgres shutting down.

As someone else commented, that's not an ordinary sort of performance
problem.  What exactly happens when the database "shuts down"?

            regards, tom lane