On Wed, 3 Nov 2004, Gavin M. Roy wrote:
> Just offering my 0.02... have you checked to see if the delays are due to
> heavy io?
From the server side of things, I've checked everything that I can think
of ... I ran a 'find . -mmin +1 -print | wc -l' on /usr/local/lib, and it
reported back 28k files processed in ~45sec (641 files/sec) ... I realize
that this doesn't simulate what Alexey is doing, but according to him he's
only reading in a dozen files or so for the index page ... it should be
able to handle that easily enough ...
I even grabbed a recursive directory script off of php.net and ran it
through php ... it did a stat on all 28k files to get mtime (again, not
reading in the whole file, only the metadata), but I was only getting
~200 files/sec off of that ...
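For reference, the metadata-only pass described above can be sketched as a few lines of shell that count the files in a tree, stat each one for mtime, and report a files/sec figure. This is a rough sketch, not the actual php.net script: the target directory is illustrative (the real test ran against /usr/local/lib), and GNU find/stat/date are assumed:

```shell
# Rough sketch of the metadata-only benchmark (GNU coreutils assumed).
# 'dir' is illustrative; the original test ran against /usr/local/lib.
dir=.
start=$(date +%s)
count=$(find "$dir" -type f -print | wc -l)
# stat every file for its mtime only -- metadata, not file contents
find "$dir" -type f -exec stat -c %Y {} + > /dev/null
end=$(date +%s)
elapsed=$(( end - start ))
[ "$elapsed" -eq 0 ] && elapsed=1   # guard against division by zero on fast runs
echo "$count files in ${elapsed}s (~$(( count / elapsed )) files/sec)"
```

Anything much below a few hundred files/sec on a pass like this would point at the filesystem; numbers well above it suggest the slowdown is in the PHP layer instead.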
So, the only thing I can think of is a problem with Alexey's code, which
he swears is impossible ... or a problem with mod_php 4.3.9 itself ...
I even dropped the # of processes on that server by 1/2 this afternoon,
and there was no noticeable improvement in Alexey's code ...
----
Marc G. Fournier Hub.Org Networking Services (http://www.hub.org)
Email: scrappy@hub.org Yahoo!: yscrappy ICQ: 7615664