Archives of the TeradataForum
Message Posted: Wed, 30 Aug 2007 @ 01:19:25 GMT
I think you have problems with your overall design.
Loading as many tables as you are with one script will increase your maintenance, testing, and support effort.
You need to isolate the components of your process as much as possible.
First, verify the data at MQ - either visually through MQ Explorer, or use an application such as Q to copy the data to a text file.
Once you are confident that the data is OK, I would create a copy of your TPump job that execs a single macro doing a straight insert, with no transformations, into a 'staging' table. If that works, add your production macros and tables one by one - or even field by field - until you identify the cause of your problem. I suspect your issue relates to an incorrect data type conversion or something similar.
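A minimal sketch of what that diagnostic job might look like - all database, table, macro, layout, and queue names here are hypothetical, the access module name and init string vary by platform and MQ setup, and column definitions must match your actual message layout:

```
/* Hypothetical straight-insert staging macro: no conversions,
   columns typed to match the incoming message fields exactly. */
CREATE MACRO etl_wrk.stage_ins (p_col1 VARCHAR(50), p_col2 VARCHAR(50)) AS
( INSERT INTO etl_wrk.stage_tbl (col1, col2)
  VALUES (:p_col1, :p_col2); );

/* Diagnostic TPump script: one layout, one DML label, one macro. */
.LOGTABLE etl_wrk.diag_log;
.LOGON tdpid/etl_user,password;

.BEGIN LOAD SESSIONS 4
    ERRORTABLE etl_wrk.diag_err;

.LAYOUT msg_layout;
    .FIELD col1 * VARCHAR(50);
    .FIELD col2 * VARCHAR(50);

.DML LABEL ins_stage;
    EXEC etl_wrk.stage_ins(:col1, :col2);

/* AXSMOD and its init string are placeholders - substitute the
   MQ access module and parameters used at your site. */
.IMPORT INFILE mq_feed
    AXSMOD your_mq_axsmod 'qmgr=QM1 qnam=YOUR.TEST.QUEUE'
    LAYOUT msg_layout
    APPLY ins_stage;

.END LOAD;
.LOGOFF;
```

Once rows land cleanly in etl_wrk.stage_tbl, swap in your production macros one at a time; the first substitution that starts producing error-table rows points at the offending conversion.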
Our design is a dedicated TPump job for each MQ feed, with each job loading no more than four tables. This makes the ETL easy to debug and monitor, and removes the need to re-test objects that are not impacted when new code sets are deployed.
Without seeing your TPump script it is hard to provide much more feedback.
Copyright 2016 - All Rights Reserved
Last Modified: 28 Jun 2020