Archives of the TeradataForum
Message Posted: Tue, 14 Feb 2006 @ 13:37:00 GMT
Subj: Re: Question on Strategy
From: Meade, Cornelius
Most sites have to deal with space limitations of some sort, so the underlying question would seem to be one of sizing: you need to allocate
enough space for the data you intend to hold, plus a certain amount of overhead. Cutting it too close usually trades away operational
simplicity, demanding more care and feeding of the ETL processes that populate your databases. That in turn can create quality and service
issues if you are constantly shepherding data into or out of your mart by hand because the tolerances are so tight that normal fluctuations in
data volume cause failures or make the ETL problematic. Do not forget that the more elaborate the ETL process, the more resource intensive its
care and feeding is likely to be.
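To make the sizing point concrete, here is a minimal sketch (in Python, purely illustrative) of the kind of headroom check a load job could run before it starts; the function name, the byte figures, and the 20% overhead factor are assumptions of mine, not Teradata-specific values:

# Hypothetical pre-load headroom check. Pads the incoming volume for
# spool/index/row overhead and compares it against the space still
# available in the allocation. All figures are illustrative.

def load_fits(max_perm_bytes: int,
              current_perm_bytes: int,
              incoming_bytes: int,
              overhead_factor: float = 1.2) -> bool:
    """True if the padded incoming data fits in the remaining allocation."""
    needed = int(incoming_bytes * overhead_factor)
    available = max_perm_bytes - current_perm_bytes
    return needed <= available

# Example: 500 GB allocated, 430 GB used, a 40 GB file arriving.
if not load_fits(500 * 2**30, 430 * 2**30, 40 * 2**30):
    print("Insufficient headroom -- purge or re-size before loading")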
Any estimating process you put in place will only be so accurate, since many factors ultimately affect space usage in a TD system and some of
them are difficult to predict (as has already been mentioned), but you can probably base some decision making on the incoming file sizes.
Usually there are concise business rules which dictate which business data should fall off, but in your case it sounds as though you might
consider dynamically adjusting the criteria that identify the data to be purged in order to accommodate the newer (presumably more valuable)
incoming data; that would obviously maximize the space available for any subsequent load activity.
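As a rough illustration of that sliding criterion, the sketch below (again Python, with hypothetical names and figures) walks the purge cutoff forward through the oldest data until enough space is freed for the incoming file; in practice the per-day volumes would come from your own row counts or space accounting rather than a hard-coded map:

from datetime import date

# Illustrative sliding purge criterion: rather than a fixed retention
# period, advance the cutoff date until purging everything on or before
# it frees enough space for the incoming load.

def choose_purge_cutoff(bytes_per_day: dict[date, int],
                        bytes_needed: int) -> date | None:
    """Earliest cutoff date whose purge frees at least bytes_needed,
    or None if even a full purge cannot free enough."""
    freed = 0
    for day in sorted(bytes_per_day):  # oldest data goes first
        freed += bytes_per_day[day]
        if freed >= bytes_needed:
            return day
    return None

# Example: three 15 GB days of history, need to free 30 GB.
history = {date(2006, 1, d): 15 * 2**30 for d in (1, 2, 3)}
print(choose_purge_cutoff(history, 30 * 2**30))  # 2006-01-02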
Cornelius Meade III
EDS - Consulting Services/BI/CRM