Archives of the TeradataForum
Message Posted: Thu, 07 Jul 2011 @ 21:35:42 GMT
We have been trying to improve query response and one of the options has been to test the use of secondary indexes.
If I recall correctly, the optimizer will consider using a secondary index when the selectivity of the index is around 2% of the rows in the table.
However, I have come across situations where this rule of thumb did not seem to apply. In one test case, I could not get the optimizer to choose the index even though the only column requested in the query was covered by the index and the requested value qualified fewer than 2% of the rows. The only way I could get the optimizer to choose the index was to specify a value in the WHERE clause that did not actually exist in the index. All statistics on all columns and indexes were up to date.
The only conclusion I could come to involved the other variables that might affect the optimizer's choice to use an index, and I decided that the overall system size and the size of the table itself could be contributing factors. For example, an 800-AMP system and a 10 million row table would give 12,500 rows per AMP with even distribution. The 2% rule would qualify 200,000 rows, far more than the 12,500 rows any single AMP has to scan, so in this case the optimizer might choose a full table scan over the index.
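The arithmetic behind that reasoning can be sketched as a rough per-AMP comparison. This is only an illustration of the numbers in the example above, under the stated assumptions of even distribution; the access-cost framing (random index reads versus a parallel sequential scan) is my own simplification, not a statement of the optimizer's actual costing.

```python
# Rough sizing sketch for the scenario described above.
# Assumptions (illustrative only): rows are evenly distributed across
# AMPs, and the 2% rule of thumb is applied to the total row count.

amps = 800
table_rows = 10_000_000
selectivity = 0.02  # the 2% rule of thumb

rows_per_amp = table_rows // amps                 # rows each AMP holds
qualifying_rows = int(table_rows * selectivity)   # rows the 2% rule qualifies
qualifying_per_amp = qualifying_rows // amps      # qualifying rows per AMP

print(f"rows per AMP:                 {rows_per_amp}")
print(f"qualifying rows (system-wide): {qualifying_rows}")
print(f"qualifying rows per AMP:       {qualifying_per_amp}")

# The point of the example: 200,000 qualifying rows system-wide is far
# more than the 12,500 rows any single AMP has to scan, so a full table
# scan (12,500 sequential reads per AMP, done in parallel) can plausibly
# be cheaper than an index retrieval of 200,000 rows.
```

Running this reproduces the 12,500 rows/AMP and 200,000-row figures from the example, and adds the per-AMP view (250 qualifying rows per AMP) that makes the comparison concrete.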
Does anyone have other guidelines on when it is useful to add secondary indexes that might take into account the overall system size?
Is my reasoning faulty?
|Copyright 2016 - All Rights Reserved|
|Last Modified: 27 Dec 2016|