Archives of the TeradataForum
Message Posted: Thu, 24 May 2001 @ 15:22:00 GMT
I am running a macro that I wrote to load a historical table. The macro chains rows together by setting the revision date of row #1 to the effective date of row #2. We typically do this type of operation with an insert/select into an empty table followed by a rename. The table is almost a billion rows (130 GB) and contains data from 1996 onward. I am running the macro once for each day in our historical table. I see good performance (6-12 minutes per run) and use a low amount of spool (< 5 GB). I am running from within QueryMan using a long list of EXECUTE statements.
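For anyone unfamiliar with the insert/select-then-rename pattern mentioned above, a minimal sketch follows. All table names here are illustrative assumptions, not the actual DDL; an insert/select into an empty target table lets Teradata skip transient journaling, which is why the pattern is popular for big copies.

```sql
-- Hypothetical names: history_work, history, history_old.
-- Step 1: fast-path load into an empty table (no transient journal needed).
INSERT INTO history_work
SELECT * FROM history;

-- Step 2: swap the tables with renames (near-instant dictionary operations).
RENAME TABLE history TO history_old;
RENAME TABLE history_work TO history;
```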
I run the macro non-stop using this method and have been told that I am adversely affecting the system (an 8-node 5200 with 2.1 TB of storage and 500 MHz CPUs). One nightly load process runs long when I am running 24x7. The process that runs long loads a database other than the one I am reading from or writing to; the load process that loads the database I am reading from seems to run fine.
Can anyone provide insight into any aspect of this process that should be reworked, or suggest things to check while the macro is running?
Before running the macro(s), I create a volatile table to hold intermediate results.
The macro does the following:
1. Insert a single row into a status table used to track when the macro starts.
2. Delete all rows from the volatile table.
3. Insert into the volatile table from the big table where day = the day passed to the macro (the metrics are SUM()s).
4. Update the volatile table with totals from the table I am creating (which contains all previous days).
5. Update the final table, setting revision_dt = the new data's date, since the new row is now the one we want active.
6. Insert into the final table from the volatile table.
7. Collect statistics on the final table.
8. Update the status table by changing the status to COMPLETED.
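The steps above can be sketched as a Teradata macro. This is only an illustration of the described flow under assumed names (load_day, macro_status, vt_day_totals, big_table, final_table, acct_id, metric) -- it is not the poster's actual macro, and the row-versioning columns are guesses:

```sql
-- Hedged sketch of the macro steps; all identifiers are hypothetical.
-- Assumes the session has already created the volatile table vt_day_totals.
REPLACE MACRO load_day (run_dt DATE) AS (
  -- 1. Record that this day's run has started.
  INSERT INTO macro_status (run_dt, run_status, start_ts)
  VALUES (:run_dt, 'STARTED', CURRENT_TIMESTAMP);

  -- 2. Clear the intermediate volatile table.
  DELETE FROM vt_day_totals ALL;

  -- 3. Summarize the requested day from the big table.
  INSERT INTO vt_day_totals (acct_id, eff_dt, metric)
  SELECT acct_id, :run_dt, SUM(metric)
  FROM big_table
  WHERE day_dt = :run_dt
  GROUP BY acct_id;

  -- 4. Fold in the running totals accumulated from all previous days.
  UPDATE vt_day_totals
  FROM (SELECT acct_id, SUM(metric) AS prior_total
        FROM final_table
        GROUP BY acct_id) p
  SET metric = vt_day_totals.metric + p.prior_total
  WHERE vt_day_totals.acct_id = p.acct_id;

  -- 5. Close out the currently active rows by stamping their revision date.
  UPDATE final_table
  SET revision_dt = :run_dt
  WHERE revision_dt IS NULL;

  -- 6. Append the new day's rows as the active versions.
  INSERT INTO final_table (acct_id, eff_dt, revision_dt, metric)
  SELECT acct_id, eff_dt, NULL, metric
  FROM vt_day_totals;

  -- 7. Refresh statistics on the growing target table.
  COLLECT STATISTICS ON final_table;

  -- 8. Mark the run complete.
  UPDATE macro_status
  SET run_status = 'COMPLETED'
  WHERE run_dt = :run_dt;
);
```

One design note relevant to the poster's question: steps 5-7 each touch the full final table every run, so per-day cost grows with table size; collecting statistics once per day (or once per batch of days) rather than inside every macro execution is one obvious place to reduce the 24x7 load on the system.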
I have the complete text of the macro and the DDL, but I wasn't sure whether it was appropriate to post it here.
Copyright 2016 - All Rights Reserved
Last Modified: 27 Dec 2016