Archives of the TeradataForum

Message Posted: Thu, 17 Sep 2009 @ 12:17:52 GMT


     


Subj:   Re: Update of millions of records
 
From:   Victor Sokovin

  I have to update millions of records in a table by matching records from another table. Which of these will be better:  


  1. Direct update to the table through SQL  


  2. Or first export the data from the source table and then update the target table with a MultiLoad of the exported file.  


When you ask a question as generic as this and provide no details, you probably realize yourself that it will be difficult to answer.

Option 1 should be considered first, of course. Why? Because an update of a few million rows should be business as usual for Teradata.

That's what you buy Teradata for. You can store billions of rows in a table, and an update of 0.1% of the rows is not an unusual requirement. You've made your investment in the technology, and now you should be able to concentrate on your business (in this case, doing your updates) without spending too much time on clever work-around strategies at every step like this.
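As a sketch of Option 1 (all table, column, and key names here are hypothetical), a direct join update in Teradata SQL could look like:

```sql
/* Hypothetical names: target_tbl, source_tbl, cust_id, balance.
   Teradata's UPDATE ... FROM syntax updates target rows by
   matching them against rows of another table on the join key. */
UPDATE t
FROM target_tbl AS t, source_tbl AS s
SET balance = s.balance
WHERE t.cust_id = s.cust_id;
```

On recent releases the ANSI MERGE statement can express the same update and may perform better for primary-index matches, but the plain join update above is the traditional form.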

Having said that, the update in question will probably be only part of the activity on the table(s) involved, and you have to think of the consequences of your actions in the global context. Besides performance, consider the level of locking (utility-level vs. table/row-level), the presence of additional structures on the table (indexes, triggers) that may rule out a utility update, the availability of free slots for utility use (every MultiLoad job counts towards the system-wide limit on simultaneous utility jobs), and so on.
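For comparison, Option 2 would look roughly like the following MultiLoad script. This is only a sketch: the file, database, table, and field names are hypothetical, and exact syntax details vary by release.

```
.LOGTABLE workdb.ml_restartlog;           /* restart log required by MLOAD    */
.LOGON tdpid/user,password;
.BEGIN IMPORT MLOAD TABLES target_tbl;    /* takes a utility slot and places  */
                                          /* a utility-level lock on the table */
.LAYOUT exportlayout;
.FIELD cust_id  * INTEGER;
.FIELD balance  * DECIMAL(18,2);
.DML LABEL upd;
UPDATE target_tbl
SET balance = :balance
WHERE cust_id = :cust_id;
.IMPORT INFILE export.dat
        LAYOUT exportlayout
        APPLY upd;
.END MLOAD;
.LOGOFF;
```

Note that MultiLoad refuses to run against tables with unique secondary indexes, join indexes, or triggers, which is exactly the kind of restriction mentioned above.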


Victor



     
 
 
Copyright for the TeradataForum (TDATA-L), Manta BlueSky    
Copyright 2016 - All Rights Reserved    
Last Modified: 15 Jun 2023