Archives of the TeradataForum

Message Posted: Mon, 09 Jan 2001 @ 01:45:11 GMT


     
Subj:   Re: Technical Data Modelling
 
From:   Eric J. Kohut

I would tend to agree with the "normalize until you are forced to de-normalize" approach.

First, a disclaimer: I am not a logical modeler.

However, you do have a few mitigating issues that seem to be pushing you in the opposite direction; even so, in the long run a normalized model is usually the best choice.

One issue you may be contemplating is that when modeling transactional tables, you rarely get a chance to access the table via the Primary Index if the PI is the transaction number / id. Most of our customers try to access their large tables via the PI as often as possible, saving the remaining system resources for when a table scan is truly required. I've also seen these tables accessed using a mixture of Transaction Id as a UPI and Location and/or Dates as NUSIs.
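For illustration, a transaction table defined along these lines would give you the UPI access path plus NUSI alternatives (table and column names here are made up for the example; adjust to your own model):

   CREATE TABLE txn_history
   ( txn_id        DECIMAL(18,0) NOT NULL   -- transaction number / id
   , txn_date      DATE          NOT NULL
   , store_loc_id  INTEGER                  -- location
   , txn_amount    DECIMAL(12,2)
   )
   UNIQUE PRIMARY INDEX ( txn_id )          -- UPI: single-AMP, single-row access
   INDEX txn_loc_nusi  ( store_loc_id )     -- NUSI on location
   INDEX txn_date_nusi ( txn_date );        -- NUSI on date

Access by txn_id is then a one-AMP operation, while the NUSIs give the optimizer alternate paths for location- or date-qualified queries without forcing a full-table scan.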

As another way to address this, sometimes you can substitute the customer id as a surrogate PI (NUPI) and use this as an alternative access path, but be careful of data skewing if you choose this approach. Try to keep the number of rows per NUPI value to less than what will fit into a 64 KB data block, especially if you must do single-row transaction inserts; possibly much more is tolerable if the inserts are batched. There are also ways of handling the fact that you don't always know the customer for a transaction, but in the banking sector I would think this would be very minimal. If not, sign me up for an account there.
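If you try the surrogate NUPI route, it's easy to check the demographics up front. Something like this (again, names are hypothetical) shows the worst-case rows per NUPI value, which is what drives the data-block concern above:

   SELECT customer_id, COUNT(*) AS rows_per_value
   FROM   txn_history
   GROUP  BY customer_id
   ORDER  BY rows_per_value DESC;

You can look at the same question at the AMP level with HASHROW / HASHBUCKET / HASHAMP to see how evenly the rows would spread across the system:

   SELECT HASHAMP(HASHBUCKET(HASHROW(customer_id))) AS amp_no,
          COUNT(*) AS row_count
   FROM   txn_history
   GROUP  BY 1
   ORDER  BY row_count DESC;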


Thanks,

Eric

EJK
Eric J. Kohut
Senior Solutions Consultant - Teradata Solutions Group - Retail
NCR Corp.



     