Thanks Webb and Yury ..
Those are good ideas, but as it stands, I believe the debugging facility is
inadequate for my needs.
I have several databases set up on one machine (postgres calls this a
"cluster"), and I need the log to specify which database, and ideally which
user, each entry is for. I will also need the log to record some kind of
session identifier: I wouldn't want atomic transactions from another session
to be inserted into the middle of a begin/commit/rollback group without a way
to separate them.
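To make that concrete, something like the following is the kind of
per-statement record I have in mind (purely illustrative; the layout and
field names are my own invention, not anything PostgreSQL produces today):

/* Purely illustrative sketch of a redo-log record; the struct and field
 * names are my own, not anything in the PostgreSQL source. */
#include <stdio.h>

typedef struct
{
    const char *database;   /* which database in the cluster */
    const char *username;   /* which user issued the statement */
    long        session_id; /* separates concurrent sessions */
    long        seq_no;     /* global order of statement execution */
    const char *statement;  /* the SQL text, already evaluated */
} RedoLogRecord;

static void
write_redo_record(FILE *log, const RedoLogRecord *r)
{
    /* session_id and seq_no are what keep statements from interleaved
     * transactions separable when the log is replayed. */
    fprintf(log, "%s|%s|%ld|%ld|%s\n",
            r->database, r->username, r->session_id, r->seq_no, r->statement);
}

int main(void)
{
    RedoLogRecord r = {"salesdb", "bryan", 17, 1042,
                       "UPDATE accounts SET balance = balance - 100 WHERE id = 7"};

    write_redo_record(stdout, &r);
    return 0;
}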
By now you've probably figured out that I am hoping to create a redo-log,
for the purpose of recovering a database in a crash scenario, and possibly
for creating a read-only standby database.
I think my best bet from here is to print out the postgres.c source and
begin to rummage through it.
Several things I'm looking for, in case anyone can help me out on this:
1) Build a query buffer which includes the results of functions such as
now() and random(), rather than the function calls themselves.
(This might be the most complex part of the whole thing .. any comments? A
rough sketch of what I mean appears after this list.)
2) Stamp log entries with a session ID and a unique sequential number.
(The session ID would later be used by the process which applies the logs to
fork a separate client session for each original session, and the sequence
number would let that log-applier process preserve the correct order of
statement execution, keeping the replayed data consistent.)
3) Determine if the above two steps are sufficient to accurately recreate
the state of a database if the logs are re-applied to a backup.
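To show what I mean by item 1, here is a rough standalone sketch. It is not
actual backend code, and it only handles now() by naive string matching on
the query text; a real version inside tcop/postgres.c would presumably have
to work from the parse tree so that, for example, now() inside a string
literal is left alone:

/* Rough standalone sketch of step 1: rewrite the query buffer so that
 * now() becomes an evaluated timestamp literal.  Naive string matching
 * only, for illustration; not taken from the PostgreSQL source. */
#include <stdio.h>
#include <string.h>
#include <time.h>

static void
expand_now(const char *query, char *out, size_t outlen)
{
    const char *hit = strstr(query, "now()");
    char        stamp[64];
    time_t      t = time(NULL);

    /* e.g. '2000-08-21 11:29:00', quoted so it drops straight into SQL */
    strftime(stamp, sizeof(stamp), "'%Y-%m-%d %H:%M:%S'", localtime(&t));

    if (hit == NULL)
    {
        snprintf(out, outlen, "%s", query);
        return;
    }
    snprintf(out, outlen, "%.*s%s%s",
             (int) (hit - query), query,     /* text before now()   */
             stamp,                          /* evaluated timestamp */
             hit + strlen("now()"));         /* text after now()    */
}

int main(void)
{
    char buf[256];

    expand_now("INSERT INTO audit VALUES (now(), 'login')", buf, sizeof(buf));
    printf("%s\n", buf);
    return 0;
}

The same kind of substitution would be needed for random() and anything else
whose value depends on the environment of the original session.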
Thanks,
Bryan Ingram
-----Original Message-----
From: Yury Don [mailto:yura@vpcit.ru]
Sent: Monday, August 21, 2000 11:29 AM
To: Ingram, Bryan
Subject: Re: [SQL] Best way to create DML/DDL log?
"Ingram, Bryan" wrote:
>
> I would like to create a log of all successful DML/DDL statements.
>
> Has anyone accomplished this before? No need to reinvent the wheel ...
>
> If not .. I'm wondering what the best approach might be.
>
> I have looked at the tcop/postgres.c source, thinking I might start by
> modifying some of the -dX routines for generating debugging output.
>
> I've also tried working a little with the -E option to the postgres
> backend (passing it through postmaster as -o -E), but can't seem to
> redirect its output.
>
> Finally, another option might be to create a trigger, but I haven't really
> explored this option.
>
> One of the main requirements is that any function which returns a value
> determined by the system environment must be written to the log as its
> evaluated result, and not as the function call itself. E.g. now() returns
> the current datetime as a text string, which would be written to the log in
> place of the literal character string 'now()'.
>
> As far as modifying postgres.c, I'm fairly confident I can create a log of
> the query buffer, but I don't yet have enough exposure to the source to find
> where I should check to see whether or not a statement was successful, or
> from where I might be able to pull text data from evaluated functions.
>
> Any commentary on this would be much appreciated.
>
> Thanks,
> Bryan Ingram
Hello Bryan
You can make a log of all SQL queries by setting pgecho=on and
pgdebug=2 (or > 2) in the postmaster init. You will get all queries and much
other information in postgresql.log; then you can select only the DDL/DML
statements using shell scripts (I have examples of such scripts and, if you
are interested, I'll send them).
Unfortunately, it looks like this is not a solution for your main task,
because the queries will be in their original form, and functions like now()
will appear as function calls rather than as their values.
--
Sincerely yours,
Yury