I would either validate the data in the script, or batch the inserts into
groups of, say, 100 or 1000 records each. If one of the batches fails,
reprocess only that batch using individual inserts so the bad records can
be isolated.
I believe using transactions of moderate size will be faster than one
enormous transaction. There are of course also transactional-semantics
issues to consider if you have concurrent access going on.
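A minimal sketch of the batch-then-fallback idea, using Python's sqlite3 for
illustration (the table name, columns, and helper name are made up for the
example; the same pattern applies to any database driver):

```python
import sqlite3

def insert_batched(conn, rows, batch_size=100):
    """Insert rows in batches, one transaction per batch.
    If a batch fails, reprocess it with individual inserts so
    only the genuinely bad records are skipped.
    Returns (inserted_count, failed_rows)."""
    inserted, failed = 0, []
    for start in range(0, len(rows), batch_size):
        batch = rows[start:start + batch_size]
        try:
            with conn:  # commits on success, rolls back on error
                conn.executemany(
                    "INSERT INTO items (id, name) VALUES (?, ?)", batch)
            inserted += len(batch)
        except sqlite3.Error:
            # Batch failed: retry row by row to isolate the bad records.
            for row in batch:
                try:
                    with conn:
                        conn.execute(
                            "INSERT INTO items (id, name) VALUES (?, ?)", row)
                    inserted += 1
                except sqlite3.Error:
                    failed.append(row)
    return inserted, failed

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE items (id INTEGER PRIMARY KEY, name TEXT)")
rows = [(1, "a"), (2, "b"), (2, "dup"), (3, "c")]  # (2, "dup") violates the key
inserted, failed = insert_batched(conn, rows, batch_size=2)
# inserted == 3, failed == [(2, "dup")]
```

Only the batch containing the duplicate key falls back to row-at-a-time
inserts; the good batches still go through at full speed.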
-Z-