On 23/02/12 08:39, Reuven M. Lerner wrote:
> (4) I tried "chunking" the deletes, such that instead of trying to
> delete all of the records from the B table, I would instead delete
> just those associated with 100 or 200 rows from the R table. On a 1
> GB subset of the data, this seemed to work just fine. But on the
> actual database, it was still far too slow.
This is the approach I'd take. You don't have enough control or access to
come up with a better solution. Build a temp table with 100 ids to delete,
time that run, and then the next night increase it to 200, and so on, until
a batch takes around three hours.
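Something along these lines, as a rough sketch - the table names r and b,
the key columns id and r_id, and the "oldest ids first" selection are my
guesses at your schema, not taken from your post:

  BEGIN;

  -- Pick tonight's batch of parent rows to purge (names are assumptions).
  CREATE TEMP TABLE ids_to_delete AS
      SELECT id
      FROM r
      ORDER BY id        -- substitute whatever criterion picks rows to purge
      LIMIT 100;         -- batch size; raise it once you have timings

  -- Remove the dependent rows in b for just this batch.
  DELETE FROM b
  USING ids_to_delete t
  WHERE b.r_id = t.id;

  DROP TABLE ids_to_delete;
  COMMIT;

Committing each batch keeps the transactions short, and you can VACUUM b
between runs so the freed space gets reused. Check that b.r_id is indexed,
otherwise every batch will scan the whole of b.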
Oh - and get the Windows admins to take a look at disk activity; the
standard performance monitor can tell you more than enough. If the machine
is swapping constantly, performance will be atrocious, but even if the disks
are merely kept constantly busy, updates and deletes can be very slow.
--
Richard Huxton
Archonet Ltd