Thread: Advice on logging strategy

Advice on logging strategy

From: Mike Martin
I have a question on logging strategy

I have logging set to
log_statement = 'all' on a network database, with logging set to CSV so I can import it into a logging table.
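
Roughly this setup in postgresql.conf (values are indicative rather than my exact config):

    logging_collector = on
    log_destination = 'csvlog'   # write CSV log files so they can be loaded into a table
    log_statement = 'all'        # log every statement, including each EXECUTE of a prepared statement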

However, the database is populated via a nightly routine that downloads data via a REST API using prepared statements.

This results in enormous log files which take ages to import using COPY, because each execute statement is logged along with the parameters chosen.
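
The import itself is just a COPY of the CSV log file into a table with the csvlog column layout, something like (table and file names here are only examples):

    COPY postgres_log FROM '/var/log/postgresql/postgresql-2018-10-10.csv' WITH csv;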

Is there any way around this?

I can't find any way to filter DML statements.

thanks

Re: Advice on logging strategy

From: Rob Sargent

> On Oct 11, 2018, at 4:26 AM, Mike Martin <redtux1@gmail.com> wrote:
>
> I have a question on logging strategy
>
> I have logging set to
> log_statement = 'all' on a network database, with logging set to CSV so I can import it into a logging table.
>
> However, the database is populated via a nightly routine that downloads data via a REST API using prepared statements.
>
> This results in enormous log files which take ages to import using COPY, because each execute statement is logged along with the parameters chosen.
>
> Is there any way around this?
>
> I can't find any way to filter DML statements.
>
> thanks
>
Do you want all the log lines in your logging table?
There was a thread yesterday (10 Oct 2018) on COPY which mentioned the possibility of multiple processes COPYing to the same table.

Re: Advice on logging strategy

From: Mike Martin
I suppose the ideal would be to log the prepared statement once, and the detail only on error, rather than once per execution.
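
The closest thing I can see to the "detail only on error" part with the standard settings would be something like this, though I haven't tried it yet:

    log_statement = 'none'            # stop logging every execution
    log_min_error_statement = error   # still log the failing statement, with its parameters, on error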

On Thu, 11 Oct 2018 at 11:33, Rob Sargent <robjsargent@gmail.com> wrote:


> On Oct 11, 2018, at 4:26 AM, Mike Martin <redtux1@gmail.com> wrote:
>
> I have a question on logging strategy
>
> I have logging set to
> log_statement = 'all' on a network database, with logging set to CSV so I can import it into a logging table.
>
> However, the database is populated via a nightly routine that downloads data via a REST API using prepared statements.
>
> This results in enormous log files which take ages to import using COPY, because each execute statement is logged along with the parameters chosen.
>
> Is there any way around this?
>
> I can't find any way to filter DML statements.
>
> thanks       
>
Do you want all the log lines in your logging table?
There was a thread yesterday (10 Oct 2018) on COPY which mentioned the possibility of multiple processes COPYing to the same table.

Re: Advice on logging strategy

From: Jeff Janes
On Thu, Oct 11, 2018 at 6:27 AM Mike Martin <redtux1@gmail.com> wrote:
I have a question on logging strategy

I have logging set to
log_statement = 'all' on a network database, with logging set to CSV so I can import it into a logging table.

However, the database is populated via a nightly routine that downloads data via a REST API using prepared statements.

This results in enormous log files which take ages to import using COPY, because each execute statement is logged along with the parameters chosen.

Is there any way around this?

One option is to convert to using COPY...FROM STDIN rather than prepared INSERTs.
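
With log_statement = 'all' a single COPY is logged as one statement no matter how many rows it loads, so the nightly load could do something like this (table and file names are just placeholders):

    \copy staging_api_data FROM 'api_dump.csv' WITH (FORMAT csv)

or issue COPY staging_api_data FROM STDIN WITH (FORMAT csv) from the client driver and stream the rows.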

Another is to create a user specifically for bulk population, and do an `ALTER USER bulk_load SET log_statement = none` to override the global log_statement setting.
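
That is, something along the lines of:

    CREATE ROLE bulk_load LOGIN;
    ALTER ROLE bulk_load SET log_statement = 'none';   -- per-role override of the global setting

and have the nightly job connect as bulk_load, so everything else keeps full statement logging.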

Cheers,

Jeff

Re: Advice on logging strategy

From: David Steele
On 10/11/18 11:26 AM, Mike Martin wrote:
> 
> This results in enormous log files which take ages to import using COPY,
> because each execute statement is logged along with the parameters chosen
> 
> Is there any way around this?
> 
> I can't find any way to filter DML statements

pgAudit (https://github.com/pgaudit/pgaudit) gives you fine-grained
control over what is logged, by command type, table, or user, as well as
a lot more detail.
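
For example, a configuration along these lines (values are illustrative) would keep auditing DDL and role changes while skipping the bulk DML entirely:

    shared_preload_libraries = 'pgaudit'
    pgaudit.log = 'ddl, role'      # audit DDL and role commands; omit 'write' so INSERT/UPDATE/DELETE are not logged
    pgaudit.log_parameter = off    # don't record bind parameters

pgaudit.log can also be set per database or per role with ALTER DATABASE / ALTER ROLE ... SET.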

-- 
-David
david@pgmasters.net


Re: Advice on logging strategy

From: Mike Martin
Thanks!

On Fri, 12 Oct 2018 at 14:33, David Steele <david@pgmasters.net> wrote:
On 10/11/18 11:26 AM, Mike Martin wrote:
>
> This results in enormous log files which take ages to import using COPY,
> because each execute statement is logged along with the parameters chosen
>
> Is there any way around this?
>
> I can't find any way to filter DML statements

pgAudit (https://github.com/pgaudit/pgaudit) gives you fine-grained
control over what is logged, by command type, table, or user, as well as
a lot more detail.

--
-David
david@pgmasters.net