Archives of the TeradataForum

Message Posted: Tue, 07 Oct 2008 @ 13:49:29 GMT
If you literally mean "char" then the answer is yes. Teradata would not do things like converting ä to ae or ö to oe, etc. If, however, you intended to ask "I want to clarify that if a character takes 1 byte in SQL Server, it should also take only 1 byte in Teradata?", then the answer depends on the character sets chosen in SQL Server and Teradata. If you use Unicode or Asian character sets in Teradata, then a 1-byte char in SQL Server will cost you two bytes in Teradata. Internally, Unicode characters are always stored as 16 bits = 2 bytes.
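To illustrate the storage difference, here is a minimal sketch with made-up table and column names (and assuming your session character set passes ÿ through intact). CHAR2HEXINT shows the internal hex encoding: one byte per character for a LATIN column, two for a UNICODE column.

    CREATE TABLE demo_charset
      ( id          INTEGER
      , txt_latin   VARCHAR(10) CHARACTER SET LATIN
      , txt_unicode VARCHAR(10) CHARACTER SET UNICODE
      ) PRIMARY INDEX (id);

    INSERT INTO demo_charset VALUES (1, 'ÿ', 'ÿ');

    SELECT CHAR2HEXINT(txt_latin)       -- expect 'FF'   : one byte
         , CHAR2HEXINT(txt_unicode)     -- expect '00FF' : two bytes
    FROM   demo_charset;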
You did not give us any hints as to what character set is used in the definition of this column. I'd guess from the above that this is not caused by Unicode. Even if the character set is what is called "Latin", this character should not be a problem. Its scientific name is LATIN SMALL LETTER Y WITH DIAERESIS (ÿ) and it is part of the Latin repertoire.

Yet another guess is that the process transferring data from SQL Server to Teradata performs an implicit translation to ASCII, and then ÿ gets converted to whatever the process is capable of coming up with. Apparently, whatever it comes up with increases the overall length of the string. So it is not that Teradata requires two bytes to store that particular character; rather, the character itself gets lost in translation. The latter is bad not only because the string length increases, but also because you no longer know which character it was.

So, the recommendations would be:

1. Check that the character set is indeed Latin.

2. Inspect and correct problems in the data transfer process: avoid all implicit translations leading to data corruption.

Victor
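P.S. For recommendation 1, a query along these lines should show the declared character set of the column (database, table and column names below are placeholders; in DBC.Columns, CharType 1 means LATIN and 2 means UNICODE):

    SELECT ColumnName, CharType, ColumnLength
    FROM   DBC.Columns
    WHERE  DatabaseName = 'your_database'
    AND    TableName    = 'your_table'
    AND    ColumnName   = 'your_column';

And for recommendation 2, looking at the raw bytes of a few rows with CHAR2HEXINT makes any corruption introduced by the transfer process visible:

    SELECT your_column, CHAR2HEXINT(your_column)
    FROM   your_database.your_table
    SAMPLE 10;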