Ron St-Pierre <rstpierre@syscor.com> writes:
> BTW these updates do take longer than we'd like so I would appreciate more
> input on how this setup could be redesigned.
Where is the input coming from?
One option is to batch the changes. If you just insert into a log table whenever
new data arrives, and then do a batch update of many records at once, you would
have a few advantages:
1) You could have a single updater and therefore no worries about concurrency.
2) The optimizer could choose a merge join or at least a nested loop and avoid
multiple round trips.
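The log table itself could be quite simple. Purely a sketch -- the table and
column names are assumptions about your schema, not something you necessarily
have already:

-- Hypothetical log table; adjust names and types to your schema.
create table stock_price_log (
    stock     text        not null,
    price     numeric     not null,
    timestamp timestamptz not null default now()
);

-- The feed just appends a row whenever a new price arrives.
insert into stock_price_log (stock, price, timestamp)
values ('ACME', 42.17, now());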
The batch update itself would then be something like
update current_stock_price
set price = log.price,
    timestamp = log.timestamp
from stock_price_log log
where current_stock_price.stock = log.stock
and log.timestamp between ? and ?
You can either just use deterministic time ranges like midnight-midnight or
keep careful track of the last time the job was run.
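If you go the last-run route, a one-row bookkeeping table is enough. Again just
a sketch; batch_run and last_run are made-up names:

-- Hypothetical one-row bookkeeping table.
create table batch_run (last_run timestamptz not null);
insert into batch_run values ('-infinity');

-- Batch job, all in one transaction so the filter and the marker
-- use the same now() (transaction start time).
begin;
update current_stock_price
set price = log.price,
    timestamp = log.timestamp
from stock_price_log log
where current_stock_price.stock = log.stock
and log.timestamp > (select last_run from batch_run)
and log.timestamp <= now();
update batch_run set last_run = now();
commit;
-- In practice you may want the upper bound to lag a bit behind now()
-- so rows from still-in-flight inserting transactions aren't skipped.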
You would first have to insert any missing stocks into current_stock_price, but
since you're batching, you again don't have to worry about someone else
inserting them in the middle of your query. And it's more efficient to add lots
of them in one shot than one at a time.
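That set-based insert could look something like this (same assumed names as
above; DISTINCT ON is Postgres-specific and picks the newest log row per
stock):

-- Insert stocks that appear in the log but not yet in current_stock_price.
insert into current_stock_price (stock, price, timestamp)
select distinct on (log.stock) log.stock, log.price, log.timestamp
from stock_price_log log
where log.timestamp between ? and ?
and not exists (select 1 from current_stock_price c
                where c.stock = log.stock)
order by log.stock, log.timestamp desc;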
--
greg