Archives of the TeradataForum
Message Posted: Fri, 12 Apr 2002 @ 17:18:09 GMT
Anonymously Posted: Friday, April 12, 2002 13:13
I don't think Teradata would be particularly slow in your example. Any database has to find the rows that need to be updated, and often the most efficient way to do that is a full table scan. In cases where the WHERE clause is extremely selective, a secondary index might be a good choice.
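To make that concrete, here is a hedged sketch of the secondary index case. The table, column, and index names are hypothetical; the point is that with a very selective predicate, the optimizer can use the secondary index instead of scanning every row:

```sql
-- Hypothetical: invoice_nbr is highly selective, so a secondary
-- index lets the optimizer avoid a full table scan for this update.
CREATE INDEX (invoice_nbr) ON sales_db.orders;

UPDATE sales_db.orders
SET    ship_date = DATE '2002-04-12'
WHERE  invoice_nbr = 123456;   -- selective predicate -> index access
```

Whether the optimizer actually chooses the index depends on demographics and statistics, so check the EXPLAIN output before relying on it.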
I know of some cases where a good solution was to create a temporary work table with the same primary index as the target, insert/select the rows to be updated into it, and then do a primary index join to effect the update. However, whether that is more efficient for a particular workload has to be determined for that workload; sometimes a full table scan really is more efficient once you account for all the costs.
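A sketch of that technique, with hypothetical table and column names (assume `prod_db.customer` has `cust_id` as its primary index):

```sql
-- Stage the rows to be updated in a volatile table that shares
-- the target table's primary index.
CREATE VOLATILE TABLE upd_work
  ( cust_id     INTEGER,
    new_balance DECIMAL(18,2) )
PRIMARY INDEX (cust_id)          -- same primary index as the target
ON COMMIT PRESERVE ROWS;

INSERT INTO upd_work
SELECT cust_id, new_balance
FROM   staging_db.balance_feed;

-- PI-to-PI join: matching rows hash to the same AMPs, so the
-- update can proceed without redistributing either table.
UPDATE tgt
FROM prod_db.customer AS tgt, upd_work AS w
SET    balance = w.new_balance
WHERE  tgt.cust_id = w.cust_id;
```

The payoff is that the join is AMP-local; the cost is the extra insert/select step, which is why the full-scan alternative sometimes still wins.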
I think saying that Teradata always advises primary index access is overstating things a bit. Although it is true that primary index operations are faster on a row-by-row basis, a full table scan is often the most economical way to achieve what you want.
|Copyright 2016 - All Rights Reserved|
|Last Modified: 23 Jun 2019|