
Archives of the TeradataForum

Message Posted: Fri, 30 Oct 2009 @ 12:35:45 GMT


     

Subj:   Re: Queries on Huge table...how to?
 
From:   Victor Sokovin

  I have a table which has over 380 million records.


  I ran sel count(*) and it took more than 6 minutes to return the result.


  Then I ran sel * on this table, and it has not returned any results after more than 23 minutes.


  I looked at the skew factor of the table, and surprisingly there is practically none, less than 1%.


If everything is normal, performance measurements of this kind should be a function of your system's parameters (= price) and of the concurrent usage. Not knowing those, it is hard to comment on your figures. Having 10 or 1000 AMPs is a big difference; having 1 or 1000 concurrent users is a big difference, too.

Perhaps just one remark on sel *: in addition to the number of rows, check whether the table is "wide", i.e., whether it has long (VAR)CHAR etc. columns.

If you don't need those columns, avoid loading them via your queries; otherwise they will waste a lot of I/O.
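To illustrate that point, a query that projects only the columns it actually needs reads far less data than sel * on a wide table. The table and column names below are made up, just to sketch the idea:

    -- Instead of pulling every column of a wide table:
    --     SELECT * FROM big_table;
    -- project only what the analysis really needs:
    SELECT order_id, order_date, amount
    FROM   big_table
    WHERE  order_date >= DATE '2009-01-01';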


  Actually I had to run a minus query on the table between two of its columns.


  If that's the case, then how would slightly more complex queries run on this table?


  Any advice in such a scenario? Should we break up the table?
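For reference, a minus query between two columns of the same table would presumably look something like the sketch below (table and column names are hypothetical). In Teradata, MINUS is equivalent to the ANSI EXCEPT operator and returns the distinct values of the first select that do not appear in the second:

    -- Values of col_a that never occur in col_b (hypothetical names):
    SELECT col_a FROM big_table
    MINUS
    SELECT col_b FROM big_table;

On a table of this size that means two full scans plus a duplicate-eliminating comparison, so it can reasonably be expected to take noticeably longer than a simple count.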


Check with your DBA how busy the system is during your tests and ask them for your system parameters. If you feel like posting them here, there are probably people out there with similar systems, now or in the past, who would be able to tell you whether the performance is normal.
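As a starting point for that conversation, the per-AMP space usage behind the skew figure quoted above can be pulled from the data dictionary. A rough sketch, assuming the standard DBC.TableSize view and with the database and table names to be filled in:

    -- Per-AMP space usage and a rough skew percentage for one table
    -- ('mydb' and 'big_table' are placeholders):
    SELECT DataBaseName,
           TableName,
           COUNT(*)         AS amp_rows,          -- roughly the number of AMPs
           SUM(CurrentPerm) AS total_perm_bytes,
           100 * (1 - AVG(CurrentPerm) / NULLIFZERO(MAX(CurrentPerm))) AS skew_pct
    FROM   DBC.TableSize
    WHERE  DataBaseName = 'mydb'
    AND    TableName    = 'big_table'
    GROUP BY 1, 2;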


Victor



     