Re: Change the csv log to 'key:value' to facilitate the user to understanding and processing of logs - Mailing list pgsql-hackers

From Julien Rouhaud
Subject Re: Change the csv log to 'key:value' to facilitate the user to understanding and processing of logs
Date
Msg-id 20220315123834.o2m4ght772fkg7cy@jrouhaud
In response to Change the csv log to 'key:value' to facilitate the user to understanding and processing of logs  ("lupeng" <lpmstsc@foxmail.com>)
List pgsql-hackers
Hi,

On Tue, Mar 15, 2022 at 09:31:19AM +0800, lupeng wrote:
>
> When auditing a PostgreSQL database recently, I found that after configuring
> the log destination as csv, the output log content looks like this: "database
> ""lp_db1"" does not exist",,,,,"DROP DATABASE lp_db1;",,"dropdb,
> dbcommands.c:841","","client backend",,0
> It is very inconvenient to work out the real meaning of each field, and in
> the log content " is escaped as "", which is not friendly to
> regular-expression matching. Therefore, I want to modify the csv log
> function: change its format to key:value, set missing fields to NULL, and
> escape " as \" in the log content. After the modification, the above log
> looks like this: Log_time:"2022-03-15
> 09:17:55.289
> CST",User_name:"postgres",Database_name:"lp_db",Process_id:"17995", [...]

This would make the logs a lot more verbose, and much harder to process with
tools designed for CSV files.
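On that last point: the doubled quotes ("") that csvlog emits are standard RFC 4180 escaping, so any real CSV parser undoes them for you; regular expressions are the wrong tool here. A minimal sketch using Python's stdlib csv module, with an illustrative, abridged log line (a real csvlog line has many more columns; the fragment and field order here are assumptions for the example):

```python
import csv
import io

# Illustrative csvlog fragment (columns abridged and reordered for brevity;
# a real line follows the column list in the PostgreSQL docs).
# The doubled quotes ("") inside the quoted message field are RFC 4180
# escaping, which csv.reader undoes automatically.
sample = ('"2022-03-15 09:17:55.289 CST","postgres","lp_db1","ERROR",'
          '"database ""lp_db1"" does not exist"\n')

row = next(csv.reader(io.StringIO(sample)))
print(row[4])  # -> database "lp_db1" does not exist
```

So the escaping that looks hostile to regexes is exactly what lets an off-the-shelf CSV reader round-trip embedded quotes losslessly.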

You should consider using the newly introduced jsonlog format (available once
pg15 is released), which seems much closer to what you want.
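With jsonlog, each log entry is a single JSON object per line, with named keys rather than positional columns, which gives you exactly the key/value shape the original request asked for. A sketch of consuming such a line (the sample line and its keys are illustrative, modeled on the jsonlog field names, not copied from a real server):

```python
import json

# Hypothetical jsonlog entry: one JSON object per line, fields addressed
# by name instead of by column position.
line = ('{"timestamp":"2022-03-15 09:17:55.289 CST","user":"postgres",'
        '"dbname":"lp_db1","error_severity":"ERROR",'
        '"message":"database \\"lp_db1\\" does not exist"}')

entry = json.loads(line)
print(entry["error_severity"], "-", entry["message"])
```

Note that JSON's backslash escaping also sidesteps the doubled-quote issue entirely, and absent fields can simply be omitted from the object instead of left as empty CSV columns.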


