Archives of the TeradataForum
Message Posted: Fri, 25 Apr 2008 @ 15:08:13 GMT
I really like the fact that your solution does not use a join. That cuts down the processing considerably. Every example I had seen used a join of the spool back to the base table. However, since we are storing the entire string in spool, the original table is not needed a second time. Great catch.
I too considered the concatenation approach. My concern was that instead of putting one digit into spool each time another one was found, it appends the digit to all the previous ones and stores the growing string over and over again. So, for an 18-digit number it is storing 18+17+16+...+2+1 = 171 extra bytes in spool per row. If that is an average number of characters and there are 1,000 rows, pretty soon you have chewed up an extra 171,000 bytes of spool (which may or may not be significant depending on the size of the system).
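The per-row figure above is just the triangular number: at recursion level k the spool row carries k digits, so an n-digit value writes 1 + 2 + ... + n = n(n+1)/2 bytes across all levels. A quick sketch of that arithmetic (the function name is mine, not from the thread):

```python
def concat_spool_bytes(n_digits: int) -> int:
    # Total bytes written to spool across all recursion levels when the
    # growing string is re-stored at each level: 1 + 2 + ... + n_digits.
    return n_digits * (n_digits + 1) // 2

per_row = concat_spool_bytes(18)   # 171 bytes for an 18-digit number
total = per_row * 1000             # 171,000 bytes across 1,000 rows
```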
The only thing I might change in your approach is using MAX/OVER instead of ROW_NUMBER/OVER:
qualify lvl = max(lvl) over (partition by vartext)
Rather than generating a sequential number that is not used for anything else, simply evaluate and compare against what is already there.
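For readers less familiar with QUALIFY, here is a plain-Python sketch of the filtering semantics: within each vartext partition, only the row(s) whose lvl equals the partition maximum survive, with no sequential numbering involved. The column names (vartext, lvl) follow the fragment above; the result column and sample rows are illustrative only.

```python
from collections import defaultdict

def qualify_max(rows):
    # rows: list of (vartext, lvl, result) tuples.
    # Mimics: QUALIFY lvl = MAX(lvl) OVER (PARTITION BY vartext)
    max_lvl = defaultdict(int)
    for vartext, lvl, _ in rows:
        max_lvl[vartext] = max(max_lvl[vartext], lvl)
    # Keep only the rows at the deepest recursion level per partition.
    return [r for r in rows if r[1] == max_lvl[r[0]]]

rows = [("abc123", 1, "1"), ("abc123", 2, "12"), ("abc123", 3, "123"),
        ("x9", 1, "9")]
# qualify_max(rows) keeps ("abc123", 3, "123") and ("x9", 1, "9")
```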
Thanks for sharing your idea,
|Copyright 2016 - All Rights Reserved|
|Last Modified: 27 Dec 2016|