PostgreSQL logging - restrict error messages - Mailing list pgsql-general

From hartrc
Subject PostgreSQL logging - restrict error messages
Date 2012-11-16
Msg-id 1353091257428-5732480.post@n5.nabble.com
List pgsql-general
I'm running PostgreSQL 9.1.6 on Linux SLES 11 SP2

My question is: is it possible to restrict entries to the log based on the
number of entries per second, or to avoid duplicate entries within the same
second?

Some background:

My non default logging parameters in postgresql.conf

#LOGGING
log_directory='/postgresql/pg_log'
logging_collector='ON'
log_line_prefix='%t %r %u %d %a'

log_statement='ddl'
log_destination='stderr,csvlog'
log_connections=on
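
(The rotation parameters aren't in the list above because I've left them at
their defaults, which for 9.1 amount to:)

log_rotation_age = 1d            # start a new log file daily...
log_rotation_size = 10MB         # ...or once the current file reaches 10MB
log_truncate_on_rotation = off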

As you can see, I'm doing some fairly verbose logging. This is intentional:
it gives me good auditing capability and helps spot errors in applications.
The log volume is mostly manageable. However, if a developer runs a statement
with an error in it, presumably in a cursor loop, the logs quickly grow large
and are simply repeats of the same information, e.g.


2012-11-16 08:39:32 ip hsf_web_user pgdev [unknown]ERROR:  current
transaction is aborted, commands ignored until end of transaction block
STATEMENT:  INSERT INTO hse.extract_lpt
        (etc)

I have the above in the log thousands of times, and within a minute or two
the server has to rotate logs. It is useful to know that there is an error on
the insert into hse.extract_lpt, but I don't need to know about every
occurrence of it (particularly multiple instances within the same second).
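
The closest workaround I can think of is routing the server log through
syslog and letting the syslog daemon collapse runs of identical messages into
a single "last message repeated N times" line. A sketch of what I mean
(assuming rsyslog, which I believe is the stock daemon on SLES 11; the
facility and file path are just illustrative):

# postgresql.conf -- add syslog as a log destination
log_destination = 'syslog,csvlog'    # keep csvlog for the audit trail
syslog_facility = 'LOCAL0'
syslog_ident = 'postgres'

# /etc/rsyslog.conf -- suppress consecutive duplicate messages
$RepeatedMsgReduction on
local0.*        /var/log/postgresql.log

That only collapses exact consecutive repeats, though, and multi-line
statements get split into separate syslog messages, so it isn't really a
per-second limit, hence the question.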

Thank you,
Rob





