
Archives of the TeradataForum

Message Posted: Thu, 11 Oct 2001 @ 17:57:56 GMT


Subj:   Re: INSERT INTO big table
From:   Glenn McCall

We have a process that runs a relatively complex query (i.e. one that takes a little while) and needs to populate several tables depending on different variations of the data in that query. The input table is akin to a set of transactions. The transactions are processed according to information in other tables that contain rules about how to process them. In other words, we join the transaction table to a bunch of other tables, and depending on the results of the join there are up to 6 different operations, each of which may change 1, 2 or 3 records in other tables.

What we do is run the complex query in a FastExport. The data goes into a named pipe, which is read by an MLOAD task. A value in the data stream determines which of the 6 activities is performed (i.e. which label is applied); the DML within that label determines what tables are affected and how. This is illustrated by the following (simplified) example.

.DML LABEL transaction1;
Insert into table1 (x,y,z) values (:x, :y, :z);
Insert into table2 (x,y, a, b) values (:x, :y, date, :q);

.DML LABEL transaction2
Insert into table1 (x,y,z) values (:x, :y, :alternate_z);

.IMPORT INFILE <pipe_name>
    LAYOUT realised_data
    APPLY transaction1 WHERE update_cd = 1 AND transaction_type = 'c'
    APPLY transaction2 WHERE update_cd = 1 AND transaction_type = 'c'
    APPLY transaction3 WHERE update_cd = 2;
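The plumbing between the two utilities can be sketched as below. This is not the author's actual scripts: `printf` stands in for the fexp writer and `cat` for the mload reader, and the pipe name is illustrative; in practice the FastExport script's EXPORT OUTFILE and the MultiLoad script's IMPORT INFILE would both name the same pipe.

```shell
# Minimal sketch of feeding one task's output into another through a named
# pipe. 'printf' plays FastExport's role, 'cat' plays MultiLoad's role.
pipe=/tmp/xfer.pipe
mkfifo "$pipe"

# Writer side: stream rows into the pipe in the background.
printf '1|c|a\n2|c|b\n' > "$pipe" &

# Reader side: consume rows as they arrive; blocks until the writer is done.
rows=$(cat "$pipe")
wait
rm "$pipe"
echo "$rows"
```

Because both ends run concurrently, the exported data never lands on disk, which is the point of using the pipe rather than an intermediate file.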

An alternative would have been to simply insert the results of the original complex query into a temp table and then query it several times, once for each table to be modified (each pass being an all-rows scan filtered on update_cd and transaction_type).
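That alternative would look roughly like this. A sketch only: the temp table name and column list are assumed from the simplified example above, and the complex query itself is elided.

```sql
-- Sketch of the temp-table alternative (names assumed, query elided).
INSERT INTO temp_results
SELECT ... ;  /* the original complex query */

-- One all-rows scan of temp_results per activity:
INSERT INTO table1 (x, y, z)
SELECT x, y, z
FROM   temp_results
WHERE  update_cd = 1 AND transaction_type = 'c';

INSERT INTO table2 (x, y, a, b)
SELECT x, y, DATE, q
FROM   temp_results
WHERE  update_cd = 1 AND transaction_type = 'c';
```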

Copyright for the TeradataForum (TDATA-L), Manta BlueSky    
Copyright 2016 - All Rights Reserved    
Last Modified: 28 Jun 2020