Hi jr
jr wrote:
> hi Urs,
>
> this is perhaps not what you meant but...
>
Actually, this in-line-filter idea is exactly what I had in mind as the
worst-case laborious workaround when I wrote my email. Laborious?
Laborious because there are 350+ tables that we fetch each night, and I
did not fancy writing filters for, and parsing, 350+ tables on each run.
The idea of a temp table was exactly the sort of hint I hoped to get
from the list. And I got it. ;-) Now, having run this for a few days, we
are starting to realise that we may still have to run everything through
an in-line filter, as we keep getting new errors or overruns in new
places. So we might have to generate 350+ in-line filters and run them
each night.
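Generating one filter per table and applying it nightly could look
something like the sketch below. The directory layout, the `.d` dump
extension and the per-table `.sed` filter files are all hypothetical
assumptions for illustration, not our real setup:

```shell
#!/bin/sh
# Hypothetical sketch: run each nightly table dump through its own
# in-line filter before it goes into the database. Layout is assumed.

filter_dumps() {
    dumps=$1; filters=$2; clean=$3
    for dump in "$dumps"/*.d; do
        table=$(basename "$dump" .d)
        if [ -f "$filters/$table.sed" ]; then
            # A filter exists for this table: strip the offending lines.
            sed -f "$filters/$table.sed" "$dump" > "$clean/$table.d"
        else
            # No known problems with this table: pass it through as-is.
            cp "$dump" "$clean/$table.d"
        fi
    done
}

# Tiny usage example with one table and one filter.
mkdir -p /tmp/fd-demo/dumps /tmp/fd-demo/filters /tmp/fd-demo/clean
printf 'good row\nBAD row\n' > /tmp/fd-demo/dumps/customer.d
echo '/^BAD/d' > /tmp/fd-demo/filters/customer.sed
filter_dumps /tmp/fd-demo/dumps /tmp/fd-demo/filters /tmp/fd-demo/clean
```

Tables with no filter file just pass straight through, so you only ever
write filters for the tables that have actually misbehaved.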
BTW, there is a really neat trick that I stumbled across recently that
we plan to use for logging the actions of the filter. It's a technique
that captures both the stdout and stderr of the filter in a log file,
while also passing stderr through separately, so 'cron', which runs the
filter nightly, can email us the errors or failures.
(((filter-progress-db.sh | tee -a /var/log/progress-db-mirror) \
3>&1 1>&2 2>&3 | tee -a /var/log/progress-db-mirror) \
3>&1 1>&2 2>&3) >/dev/null
When the above command is run from cron, it logs everything to the
file, and if anything goes wrong, stderr also goes to the console, which
cron e-mails to an administrator. If desired, you could also log stdout
and stderr to two distinct files.
It's taken from
http://www.enterprisenetworkingplanet.com/linux_unix/article.php/3870976/Improve-Your-Unix-Logging-with-Advanced-IO-Redirection.htm
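To see the fd-swap in action without the real nightly filter, here is a
minimal self-contained demo. The `demo` function is a stand-in for
filter-progress-db.sh, and the /tmp paths replace /var/log (all
assumptions); the final `2>` capture replaces the stderr that cron would
mail out:

```shell
#!/bin/sh
# Demo of the fd-swap logging trick with a stand-in filter script.

LOG=/tmp/swap-demo.log
ERRS=/tmp/swap-demo.err
: > "$LOG"

# Stand-in for the nightly filter: one progress line, one error line.
demo() {
    echo "progress: table ok"        # normal output -> stdout
    echo "ERROR: table overrun" >&2  # failure -> stderr
}

# The first tee logs stdout; "3>&1 1>&2 2>&3" swaps stdout and stderr;
# the second tee logs what was stderr; the outer swap restores the two
# streams, and >/dev/null discards stdout. Here stderr is captured in
# $ERRS for inspection; under cron it would stay on fd 2 and be mailed.
(((demo | tee -a "$LOG") 3>&1 1>&2 2>&3 | tee -a "$LOG") \
    3>&1 1>&2 2>&3) >/dev/null 2> "$ERRS"
```

Afterwards the log file holds both lines, while the error file (i.e.
what cron would mail) holds only the ERROR line.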
> I'd use a simple script to filter out the offending lines before they go
> into the database, like the following written in Tcl (but you could
> write this in almost any language):
>
> ----------<snip>----------
> #!/usr/bin/tclsh
I guess in my case this would have to be Perl. ;-)
> HTH
Thanks, yes it did. I like this list.
Regards,
--
Urs Rau