Archives of the TeradataForum

Message Posted: Fri, 28 Jul 2006 @ 17:13:00 GMT
A footnote on this discussion...

Teradata estimates the row length of expression results using sizes from the object definitions, and objects with variable length (like VARCHAR) use the maximum length. TRIM does not affect this calculation, but SUBSTRING with constant limits will define a new (smaller) datatype. SUBSTRING with variable limits reverts to the base definition. We opened an incident on this topic some years ago, and NCR convinced us that resolving object sizes during execution would be prohibitively expensive in terms of performance.

     create table t1 (upi integer not null, vstr varchar(64000))
     unique primary index(upi);

     select upi, trim(vstr) || trim(vstr)
     from t1;

(This gives you the 3798 error.)

     select upi, substring(vstr from 1 for 10) || substring(vstr from 1 for 10)
     from t1;

(This works.)

     select upi, substring(vstr from 1 for position(' ' in vstr))
              || substring(vstr from position(' ' in vstr))
     from t1;

(This also gives you the 3798 error.)

That last example is basically what caused us to open the incident. We had a user trying to rearrange the contents of a large VARCHAR field. The user eventually created a number of work tables with reduced field sizes to hold the pieces of the string, and then spliced them back together to produce the final result. I suspect that exporting the data, converting it with a program, and then reloading it would have been easier (at least in this case).

/dave hough
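(A sketch of one possible workaround we did not pursue, offered here as an assumption rather than anything NCR recommended: CAST each variable-limit SUBSTRING back to a constant-length VARCHAR, so the parser sees a small declared result size instead of the full base definition.)

     select upi,
            cast(substring(vstr from 1 for position(' ' in vstr)) as varchar(1000))
         || cast(substring(vstr from position(' ' in vstr)) as varchar(1000))
     from t1;

(The length of 1000 is just a guess at the real data. If I recall correctly, Teradata session mode silently truncates an oversized value on such a cast while ANSI mode raises an error, so this only helps if you actually know how long the pieces can be.)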
Copyright 2016 - All Rights Reserved
Last Modified: 15 Jun 2023