Archives of the TeradataForum
Message Posted: Wed, 08 Apr 2015 @ 14:46:51 GMT
You are asking to update a table in a transactional fashion, but via a mass insert/update coming from multiple concurrent sources/processes.
An UPDATE DML statement (likewise INSERT and DELETE) that sources its data from another table and touches multiple rows will escalate the row-level write lock to an all-AMP, table-level write lock, because the optimizer cannot predict which rows the key lookup will involve and therefore cannot confine the lock to a single AMP and row hash.
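For illustration, a set-based update of this shape is the kind of statement that escalates the lock. This is only a sketch; the table names tgt and src and the columns id and val are hypothetical, assuming tgt has a primary index on id:

    /* Multi-row UPDATE sourced from another table: the optimizer cannot
       know in advance which row hashes are affected, so it takes an
       all-AMP write lock on tgt rather than row-hash locks. */
    UPDATE tgt
    FROM src
    SET val = src.val
    WHERE tgt.id = src.id;

A second concurrent process running the same statement will queue behind that table-level write lock rather than interleave at the row level.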
If you really need to do this in the fashion you describe, you will need to export the data to a file and import it through BTEQ, calling a single-row query or (preferably) a macro that does the work one row at a time and keeps the locking isolated to the row hash. You can also run BTEQ with multiple sessions (or TPump/TPT Stream) to accomplish this in parallel.
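The macro mentioned above could look like the following sketch. The macro name upd_one_row, the table tgt, and the columns id and val are hypothetical, assuming the update predicate hits the primary index so the lock stays at the row-hash level:

    /* Single-row update by primary index: each EXEC touches one row
       hash, so only a row-hash write lock is taken. */
    REPLACE MACRO upd_one_row (p_id INTEGER, p_val VARCHAR(50)) AS (
        UPDATE tgt
        SET val = :p_val
        WHERE id = :p_id;
    );

Each imported record is then applied with EXEC upd_one_row (:p_id, :p_val), and concurrent sessions only block each other when they happen to hit the same row hash.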
Below is an easy BTEQ stub that logs on with five parallel sessions, imports a data file, and uses multi-session updates through the macro to update the table. It also works better if the input data is not sorted in row-key order, particularly when the table is defined with a NUPI rather than a UPI and you have multiple rows to update within a row hash: if the input is sorted by row hash, the row-hash lock could still block some transactions in the parallel process.
.SET SESSIONS 5;
.LOGON DBC/USER,PSWD;
.SET QUIET ON;
.REPEAT 5
DATABASE
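Fleshed out, the stub might look like the sketch below. The file name update_data.dat, the USING field list, and the macro name upd_one_row are hypothetical placeholders for whatever matches your export and macro definition; .REPEAT * drives one macro execution per imported record, spread across the five sessions:

    .SET SESSIONS 5;            /* log on five parallel sessions        */
    .LOGON DBC/USER,PSWD;
    .SET QUIET ON;              /* suppress per-row output              */
    .IMPORT DATA FILE = update_data.dat;
    .REPEAT *                   /* repeat for every record in the file  */
    USING (p_id INTEGER, p_val VARCHAR(50))
    EXEC upd_one_row (:p_id, :p_val);
    .LOGOFF;
    .QUIT;

BTEQ round-robins the imported records across the open sessions, so each single-row macro execution holds only its own row-hash write lock.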
|Copyright 2016 - All Rights Reserved|
|Last Modified: 23 Jun 2019|