Archives of the TeradataForum
Message Posted: Thu, 11 Oct 2012 @ 20:15:26 GMT
On your question: Does the Optimizer take into account Compression on Tables?
The optimizer is not sensitive to block-level compression, at least as of the 13.10 release. Nor will you see any additional steps in the EXPLAIN text when a query accesses a compressed table, because decompression is performed transparently at the file-system level. The optimizer also does not account for the extra time or CPU required for decompression when it establishes estimated processing times.
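To see this for yourself, compare plans for a compressed and an uncompressed copy of the same data. The table and column names below are hypothetical; the point is only that the plan text is structurally the same for both, with no decompression step shown:

```sql
-- sales_c is assumed to be a block-level compressed copy of sales.
-- The EXPLAIN output for the two queries is structurally identical;
-- decompression never appears as a step because it happens in the
-- file system layer, below the level the optimizer reasons about.
EXPLAIN SELECT COUNT(*) FROM sales   WHERE sale_dt >= DATE '2012-01-01';
EXPLAIN SELECT COUNT(*) FROM sales_c WHERE sale_dt >= DATE '2012-01-01';
```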
However, the average row size calculated during random-AMP sampling will differ for compressed tables. When random-AMP sampling is performed, one or more cylinder indexes are read. While the cylinder indexes themselves are never compressed, the information they carry reflects the physical characteristics of the underlying data.
Even when full statistics have been collected, random-AMP sampling is still relied upon to determine the average row size used as input to query optimization. Because that row size is based on the compressed image of the data, estimated processing times, which are influenced by row size, may be slightly lower for queries accessing compressed tables. This discrepancy is not expected to be large enough to change the optimizer's decisions in most cases, but the row-size under-estimation with compression could lead to some query plan changes.
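The effect can be sketched with a toy cost model. Everything here is a hypothetical illustration, not Teradata's actual costing: assume estimated processing time is proportional to bytes scanned, so a smaller average row size (the compressed image) produces a lower estimate even though the same rows are processed after decompression.

```python
# Toy sketch (hypothetical numbers) of the row-size effect described above.
# The optimizer's estimate is modelled as time proportional to bytes scanned:
# row count x average row size / I/O rate.

ROWS = 10_000_000  # assumed row count for the example table


def est_processing_time(avg_row_size_bytes, io_bytes_per_sec=200e6):
    """Estimated scan time under the toy model: bytes scanned / I/O rate."""
    return ROWS * avg_row_size_bytes / io_bytes_per_sec


# Logical (uncompressed) row size vs. the smaller compressed image that
# random-AMP sampling would observe on a block-compressed table.
uncompressed = est_processing_time(200)  # 200-byte logical rows
compressed = est_processing_time(120)    # 120-byte compressed image

# The compressed table gets a lower estimate, even though the same number
# of rows must be decompressed and processed at run time.
assert compressed < uncompressed
```

The gap here is modest, which matches the post's observation: the under-estimate usually does not flip a plan choice, but it can when two candidate plans are already close in cost.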
|Copyright 2016 - All Rights Reserved|
|Last Modified: 28 Jun 2020|