Archives of the TeradataForum
Message Posted: Wed, 18 Nov 2004 @ 02:23:18 GMT
Lakshmi Narain Chandrasekaran wrote:
In this case, why not use GROUP BY (field_div_500), since they all seem to fit nicely into 500-value buckets?

   sel (mycolumn / 500) as mygroup, ....
   from mytable
   group by (mygroup);

Anyway, back to the original question.

Hi, I guess it depends on what you were using the CASE for. Yesterday I had a situation where a user was trying to convert a UNIX time_t value (for example, 1090486005) to a Teradata timestamp (for example, 2004-07-22 08:46:45).

I wrote a UDF, which seemed cleaner, but that didn't work because they were still on 5.0. So the SQL had to build up a formatted character string, yyyy-mm-dd hh:mi:ss, with lots of string concatenations and then cast it as a TIMESTAMP(0). One section of the SQL needed to figure out whether to pad the hour, minutes and seconds with a leading 0 to make them fit the desired outcome:

   || ' '
   || CASE WHEN ((1090486005 MOD 86400) / 3600) < 10
           THEN '0' || CAST((1090486005 MOD 86400) / 3600 AS VARCHAR(1))
           WHEN ((1090486005 MOD 86400) / 3600) >= 10
           THEN CAST((1090486005 MOD 86400) / 3600 AS VARCHAR(2))
      END
   || ':' ||

I figured out that if you used FORMAT you could simply write:

   || ' '
   || CAST((((1090486005 MOD 86400) / 3600) (FORMAT '99')) AS VARCHAR(2))
   || ':' ||

I don't know if it performs any better, but it looks simpler. I also don't know if there is a performance difference between VARCHAR and CHAR here, but I kept their original intent.
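For what it's worth, here is a minimal, untested sketch of how the whole time_t-to-timestamp expression might come together with the FORMAT approach. It assumes the time_t value is GMT-based (no time-zone or DST adjustment), uses the literal 1090486005 in place of the real column, and leans on the same (expr (FORMAT ...)) idiom for the date portion; I have not checked whether it performs any better than the CASE version.

   -- Sketch only: 1090486005 stands in for the real time_t column.
   -- Date part: whole days since 1970-01-01, formatted as yyyy-mm-dd.
   -- Time part: hours, minutes, seconds, each zero-padded by FORMAT '99'.
   SELECT CAST(
             CAST(((DATE '1970-01-01' + (1090486005 / 86400)) (FORMAT 'YYYY-MM-DD')) AS CHAR(10))
          || ' '
          || CAST((((1090486005 MOD 86400) / 3600) (FORMAT '99')) AS CHAR(2))
          || ':'
          || CAST(((((1090486005 MOD 86400) MOD 3600) / 60) (FORMAT '99')) AS CHAR(2))
          || ':'
          || CAST(((1090486005 MOD 60) (FORMAT '99')) AS CHAR(2))
          AS TIMESTAMP(0));

For 1090486005 that should build the string '2004-07-22 08:46:45' and cast it to TIMESTAMP(0), matching the example above.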