
Archives of the TeradataForum

Message Posted: Fri, 18 Sep 2009 @ 22:29:16 GMT


     


Subj:   Re: Update of millions of records
 
From:   Victor Sokovin

  One other thing to consider - if you run the delete as an insert to a new table, you'll have to collect stats and make sure you have a system in place to manage granting rights on the new table. The stats thing may or may not be an issue, since depending on the volume of deletes you might want to recollect on the original table anyway. The grants can be automated (bteq script with SQL that generates grants from dbc.AllUserRights and dbc.AllRoleRights then runs the results), but it's still something you might have to set up.  
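The grant-generation step mentioned above can be sketched roughly as follows. This is a hypothetical illustration, not David's actual script: the dictionary view DBC.AllRights and its AccessRight codes ('R' = SELECT, 'I' = INSERT, 'U' = UPDATE, 'D' = DELETE) follow Teradata's data dictionary, while MyDB, OldTable and NewTable are placeholder names.

```sql
/* Hypothetical sketch: generate GRANT statements for the new table
   from the rights recorded on the original table. AccessRight is a
   code, so it has to be decoded into a privilege keyword. */
SELECT 'GRANT ' ||
       CASE TRIM(AccessRight)
            WHEN 'R' THEN 'SELECT'
            WHEN 'I' THEN 'INSERT'
            WHEN 'U' THEN 'UPDATE'
            WHEN 'D' THEN 'DELETE'
       END ||
       ' ON MyDB.NewTable TO "' || TRIM(UserName) || '";'
FROM   DBC.AllRights
WHERE  DatabaseName = 'MyDB'
  AND  TableName    = 'OldTable'
  AND  AccessRight IN ('R','I','U','D');
```

In BTEQ the generated statements would typically be captured with .EXPORT to a file and then executed with .RUN, which is presumably what "runs the results" refers to.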


David, are you discussing the scenario in which the UPDATE on the original table is done as a series of fast operations on a temp table, followed by a RENAME of the temp table to the original table's name? I believe you are, because otherwise you would not be talking about rebuilding the grants.

Stats sort of "follow" the table after the RENAME. Cf.

http://www.teradataforum.com/teradata/20081124_194101.htm


So, the temp table, which had no stats, will have no stats after it is renamed. Oops.
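For reference, the pattern under discussion might look like this. A hypothetical sketch only: table and column names (MyDB, T, Tmp_T, tx_date) and the delete condition are placeholders, not anything from the original thread.

```sql
-- Hypothetical sketch of "delete as insert into a new table":
-- copy only the surviving rows, then swap tables by renaming.
CREATE TABLE MyDB.Tmp_T AS MyDB.T WITH NO DATA;

INSERT INTO MyDB.Tmp_T
SELECT *
FROM   MyDB.T
WHERE  tx_date >= DATE '2005-01-01';   -- i.e. NOT the delete condition

DROP TABLE MyDB.T;
RENAME TABLE MyDB.Tmp_T TO MyDB.T;

-- The renamed table carries the temp table's (empty) statistics,
-- so stats must be recollected:
COLLECT STATISTICS ON MyDB.T COLUMN (tx_date);
```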


  A question for the list: leaving aside deletion by PPI (where you'd always want to at least collect on PARTITION), would a reasonable rule of thumb be that the row-change threshold for recollecting stats after a DELETE is the same as the threshold for a delete run as an INSERT into a new table?  


Perhaps the above gives the answer to this question as well?


Victor



     
 
 
 
 
 
 
 
 
  
 
 
Copyright for the TeradataForum (TDATA-L), Manta BlueSky    
Copyright 2016 - All Rights Reserved    
Last Modified: 15 Jun 2023