Does there currently exist any kind of script that can be run against
Postgres to conduct a complete feature-coverage test with varying
dataset sizes, in order to compare performance between functionality changes?
The interest, of course, is to establish a performance baseline and then
to see how changes such as modifying internal algorithms, adjusting
vacuum frequency, or placing the WAL on a separate physical disk affect
the performance of the various features at various dataset sizes.
If not, how many people would be interested in such a script being
written?
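To make the idea concrete, here is a minimal sketch of what such a
harness might look like (assuming contrib's pgbench is available; the
database name, scale factors, client count, and transaction count below
are placeholders, not part of any existing script):

#!/usr/bin/env python
# Illustrative sketch only: run pgbench at several scale factors and
# print its report so TPS figures can be compared across configurations.

import subprocess

DB = "benchdb"                 # hypothetical target database
SCALES = [1, 10, 100]          # pgbench scale factors (~dataset sizes)
CLIENTS = 4                    # concurrent client connections
TRANSACTIONS = 1000            # transactions per client

for scale in SCALES:
    # Initialize (or re-initialize) the pgbench tables at this scale.
    subprocess.run(["pgbench", "-i", "-s", str(scale), DB], check=True)

    # Run the benchmark and capture pgbench's report, which includes
    # the transactions-per-second figure for this configuration.
    result = subprocess.run(
        ["pgbench", "-c", str(CLIENTS), "-t", str(TRANSACTIONS), DB],
        check=True, capture_output=True, text=True,
    )
    print("scale=%d" % scale)
    print(result.stdout)

# To compare configurations (e.g. a different vacuum frequency, or WAL
# on a separate physical disk), the same loop would be re-run after the
# configuration change and the reported TPS numbers compared.

This only exercises pgbench's simple TPC-B-style workload, so a real
feature-coverage script would need additional test queries per feature.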
Keith Bottner
kbottner@istation.com
"Vegetarian - that's an old Indian word meaning 'lousy hunter.'" - Andy
Rooney