Archives of the TeradataForum
Message Posted: Wed, 19 Jun 2002 @ 17:28:50 GMT
Some of your reasons for moving ETL away from the Teradata system are not really valid. For example:
It is always possible, and advisable, to design your data warehouse without batch windows as a requirement. In fact, NCR is moving in that direction with the Active Data Warehouse concept and the improvements in the loading tools. Your design will also be shaped by users' data-availability requirements.
This reason affects both ETL strategies, and I would say an out-of-Teradata strategy is affected more: more business logic could mean more lookup transformations, and bigger data volumes will involve expensive non-parallel I/O to move the data required for the transformation outside Teradata.
This is a good point, and the best solution is to make the user part of the ETL process and help him understand the benefits of this approach. In my experience, users are very happy to have (temporary) access to the original data in the database to validate changing business rules.
That's true; however, an increase in CPU power for your ETL process will also increase the CPU power available to the business process. On the other hand, an ETL server that only works during batch hours is not a very good use of that box and does not improve user response time.
Sure there are, and after testing some of them I couldn't find a match for the inherent parallelism of Teradata (for large data volumes).
However, there are always good reasons to do some transformations outside Teradata (number or date validation is not easy in Teradata), and in those cases I would prefer using INMOD routines, Access Modules, or tools like Genio (Hummingbird) that can stream data directly to FastLoad (doing all sorts of simple row-based transformations on the fly), to avoid the expensive non-parallel I/O.
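To illustrate the kind of simple row-based, on-the-fly validation an INMOD routine or a streaming tool would do before the rows reach FastLoad, here is a minimal sketch in Python. The delimited column layout (id|amount|sale_date) and the validation rules are made-up examples, not something from the original post; the point is only that each row is checked independently, so the step streams and never needs the data landed or sorted.

```python
import sys
from datetime import datetime

# Hypothetical layout for the sketch: id|amount|sale_date, pipe-delimited.
def valid_row(line, sep="|"):
    """Return the cleaned row string, or None if any field fails validation."""
    fields = line.rstrip("\n").split(sep)
    if len(fields) != 3:
        return None
    row_id, amount, sale_date = fields
    # Number validation: amount must parse as a decimal.
    try:
        float(amount)
    except ValueError:
        return None
    # Date validation: must be a real calendar date in YYYY-MM-DD form.
    try:
        datetime.strptime(sale_date, "%Y-%m-%d")
    except ValueError:
        return None
    return sep.join((row_id, amount, sale_date))

def filter_rows(lines):
    """Split an iterable of raw lines into (good, rejected) lists."""
    good, rejected = [], []
    for line in lines:
        cleaned = valid_row(line)
        if cleaned is not None:
            good.append(cleaned)
        else:
            rejected.append(line.rstrip("\n"))
    return good, rejected

if __name__ == "__main__":
    # Read raw rows on stdin, emit clean rows on stdout for the loader,
    # and report the reject count on stderr.
    good, rejected = filter_rows(sys.stdin)
    for row in good:
        print(row)
    print(f"rejected {len(rejected)} rows", file=sys.stderr)
```

Run as a filter in the load pipeline (e.g. between the extract and the loader's input), the script keeps the transformation row-based and streaming, which is exactly what makes it cheap compared with landing the data for a separate transform pass.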
Copyright 2016 - All Rights Reserved
Last Modified: 27 Dec 2016