Archives of the TeradataForum

Message Posted: Tue, 28 Oct 2003 @ 19:43:03 GMT
<-- Anonymously Posted: Tuesday, October 28, 2003 13:55 -->

I have the following table:

     CREATE SET TABLE Work_Dev.temp_dept_new ,FALLBACK ,
          NO BEFORE JOURNAL,
          NO AFTER JOURNAL
          (
           department_number SMALLINT NOT NULL,
           budget_amount DECIMAL(10,2))
     UNIQUE PRIMARY INDEX primary_1 ( department_number );

I populate it with six rows:

     Insert into Work_Dev.temp_dept_new (department_number, budget_amount) values (2, 1.55);
     Insert into Work_Dev.temp_dept_new (department_number, budget_amount) values (3, 1.55);
     Insert into Work_Dev.temp_dept_new (department_number, budget_amount) values (7, 1.78);
     Insert into Work_Dev.temp_dept_new (department_number, budget_amount) values (4, 1.55);
     Insert into Work_Dev.temp_dept_new (department_number, budget_amount) values (1, 1.55);
     Insert into Work_Dev.temp_dept_new (department_number, budget_amount) values (6, 1.80);

If I then take the AVG of budget_amount, why do I get an integer result when the column is DECIMAL(10,2)? The following SQL returns 2:

     sel avg(budget_amount) from work_dev.temp_dept_new;

But if I cast the result, I get the expected 1.63:

     sel cast(avg(budget_amount) as decimal(10,2)) from work_dev.temp_dept_new;

The manual was not very clear on this, saying only that averaging an integer returns an integer. By that same logic I would assume that averaging a decimal returns a decimal. Has anyone run into this? Is casting the only solution, or should I open a ticket with NCR?
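For reference, the arithmetic the poster expects can be checked outside Teradata. This is a minimal Python sketch using the standard decimal module to confirm that the exact average of the six inserted values is 1.63; it illustrates the expected result only and says nothing about Teradata's own AVG type rules:

```python
from decimal import Decimal

# The six budget_amount values from the inserts above.
budgets = [Decimal("1.55"), Decimal("1.55"), Decimal("1.78"),
           Decimal("1.55"), Decimal("1.55"), Decimal("1.80")]

# Exact decimal arithmetic: sum is 9.78, and 9.78 / 6 = 1.63.
average = sum(budgets) / len(budgets)
print(average)  # 1.63
```

So the cast is recovering the mathematically correct answer, not changing it; the open question is only why the uncast AVG displays as 2.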
Copyright 2016 - All Rights Reserved
Last Modified: 15 Jun 2023