
Archives of the TeradataForum

Message Posted: Fri, 18 Jan 2008 @ 12:22:18 GMT



Subj:   Re: Testing for Multibyte character
 
From:   David Clough

The first thing to remember is that Teradata stores Unicode as UTF-16, i.e. two bytes per character in all cases.

So, if the column's character set is UNICODE, every character is a two-byte character once it reaches Teradata - hence testing for multibyte characters there is not really possible.
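As a quick illustration (Python here, since the thread has no code of its own): in UTF-16, a plain ASCII letter and an accented one occupy the same two bytes per character, so byte width tells you nothing.

```python
# Every Basic Multilingual Plane character is two bytes in UTF-16,
# so 'A', 'é' and 'Ω' are indistinguishable by storage width.
widths = {ch: len(ch.encode("utf-16-be")) for ch in ("A", "é", "Ω")}
print(widths)  # every value is 2
```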

If the character set on the column is Latin, you cannot have a multibyte character.

Any Latin character with a hex value of X'80' or greater will come out as a two-byte sequence in a UTF-8 file.
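You can see the split point directly: below X'80' UTF-8 stays single-byte, at X'80' and above it goes to two bytes.

```python
# UTF-8 width depends on the code point: X'41' ('A') is one byte,
# X'E9' ('é') becomes the two-byte sequence 0xC3 0xA9.
ascii_width = len("A".encode("utf-8"))
accent_width = len("é".encode("utf-8"))
print(ascii_width, accent_width)  # 1 2
```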

Characters may be corrupted moving between systems (between Oracle systems, between Teradata systems or between the two) if you are not careful.

If the column holds multibyte characters on Oracle, and you have the NLS_LANG environment variable set to UTF8, any character over X'80' goes out as two bytes.

If you load this file as UTF8 (specify the character set in Fastload), it will load back correctly. If you do not specify UTF8 to Fastload, you will get two bytes loaded to the field, neither of which is the correct character.
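That "two wrong bytes" failure mode is easy to reproduce: take the UTF-8 bytes for one accented character and decode them as if they were Latin-1.

```python
# 'é' written out as UTF-8 is two bytes, 0xC3 0xA9. Read back as if
# they were Latin-1, those bytes decode to two wrong characters.
utf8_bytes = "é".encode("utf-8")
wrong = utf8_bytes.decode("latin-1")
print(wrong)  # 'Ã©' - two characters, neither the original
```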

Check that you specified the UTF8 character set in Fastload - you need this even if you are loading to a Latin column (if the output file comes from a source with multibyte characters).

If this is OK, extract the column to a file on both systems and check the contents by opening the file in binary mode using something like TextPad or UltraEdit.
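If a hex editor is not handy, a short script can do the same byte-for-byte comparison - dump the offset and hex value of every byte at or above X'80' from each extract and compare the two lists. A minimal sketch (the file name is illustrative):

```python
# List (offset, hex value) for every byte >= X'80' in a file, so the
# extracts from the two systems can be compared byte-for-byte.
def dump_high_bytes(path):
    with open(path, "rb") as f:
        data = f.read()
    return [(i, f"{b:02X}") for i, b in enumerate(data) if b >= 0x80]

# e.g. dump_high_bytes("extract.dat")
```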


Hope that helps

Dave Clough
Database Designer
Express ICS

www.tnt.com



     
Copyright for the TeradataForum (TDATA-L), Manta BlueSky    
Copyright 2016 - All Rights Reserved    
Last Modified: 27 Dec 2016