Archives of the TeradataForum

Message Posted: Thu, 06 Mar 2008 @ 10:53:39 GMT
I know that bad stats can result in worse query performance than no stats, but until today I had never seen a case where no stats were better than recent stats.

We have a report that's been in production for a long time now. We recently received complaints that when filtering for a certain value, the report was taking over 30 minutes. I captured the SQL the report was using and ran EXPLAIN on it both with and without the offending WHERE clause. What I discovered was that adding this particular WHERE clause resulted in a product join. The behavior was different with different values in the WHERE clause, so I suspected something with the stats.

I dropped the stats on that column and the query ran in about 10 seconds. I collected them again, thinking that maybe they were just old, and ended up killing the query after about 10 minutes. I was surprised that good stats would result in a bad plan. Does anyone have any insight on this? I always feel that identifying the stats to collect is a crap shoot.

James
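
For anyone who wants to try the same investigation, here is a rough sketch of the sequence of statements involved. The table and column names (sales_fact, region_cd) and the filter value are placeholders, not the actual report objects.

    -- 1. Check the plan with the offending predicate in place.
    --    (sales_fact / region_cd are hypothetical names.)
    EXPLAIN
    SELECT *
    FROM   sales_fact
    WHERE  region_cd = 'XYZ';

    -- 2. See what statistics the optimizer currently has on the table.
    HELP STATISTICS sales_fact;

    -- 3. Drop the suspect column statistics and compare the plan again.
    DROP STATISTICS ON sales_fact COLUMN (region_cd);

    EXPLAIN
    SELECT *
    FROM   sales_fact
    WHERE  region_cd = 'XYZ';

    -- 4. Re-collect the statistics, then re-run the EXPLAIN to see
    --    whether the product join comes back.
    COLLECT STATISTICS ON sales_fact COLUMN (region_cd);

Comparing the two EXPLAIN outputs (with and without the column statistics) is what showed the product join appearing only when the stats were present.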