Home Page for the TeradataForum

Archives of the TeradataForum

Message Posted: Thu, 14 Aug 2003 @ 16:45:14 GMT


Subj:   Re: Informatica and Altered MLoad Control File Considerations
From:   Dwight Etheridge


Yes, using Infa 6.2

Yes, we stay with FileWriter and turn off the Loader connection (not always, but sometimes).

Prior to the last step, I capture the .ctl file produced in the TgtFiles directory and productionize it.

I sometimes choose to do this because:

- Infa does not let you redirect the support tables (UV_, ET_, WT_, ML_) with the default loader; Infa tries to put these tables in the default database.

- We have multiple record types (target output) from Infa. We can deploy multiple imports in one MLoad invocation. Infa doesn't build this type of .ctl file, but you can with a dedicated MLoad script.
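To make both points concrete, here is a minimal sketch of what such a hand-maintained control file might look like. All database, table, field, and file names (StageDB, SALES_A/SALES_B, rec_type, sales.dat) are illustrative assumptions, not from the original post; the shape follows standard MultiLoad syntax, with the support tables redirected out of the default database and two record types applied in a single invocation:

```
.LOGTABLE StageDB.ML_sales;                   /* restart log redirected to StageDB */
.BEGIN IMPORT MLOAD
    TABLES      StageDB.SALES_A, StageDB.SALES_B
    WORKTABLES  StageDB.WT_sales_a, StageDB.WT_sales_b
    ERRORTABLES StageDB.ET_sales_a StageDB.UV_sales_a,
                StageDB.ET_sales_b StageDB.UV_sales_b;

.LAYOUT rec_layout;
    .FIELD rec_type * CHAR(1);
    .FIELD acct_id  * CHAR(10);
    .FIELD amt      * CHAR(12);

.DML LABEL ins_a;
    INSERT INTO StageDB.SALES_A (acct_id, amt) VALUES (:acct_id, :amt);
.DML LABEL ins_b;
    INSERT INTO StageDB.SALES_B (acct_id, amt) VALUES (:acct_id, :amt);

/* One invocation, two record types routed by APPLY ... WHERE */
.IMPORT INFILE sales.dat
    LAYOUT rec_layout
    APPLY ins_a WHERE rec_type = 'A'
    APPLY ins_b WHERE rec_type = 'B';

.END MLOAD;
.LOGOFF;
```

The Infa-generated .ctl file gives you neither the redirected WORKTABLES/ERRORTABLES clauses nor the multiple APPLY routing, which is why capturing and maintaining the file yourself pays off here.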

I have experienced the lockup situation when a target is also a lookup in the same session: Infa spins up the MLoad reading on a pipe while the lookup holds a read lock. Breaking this up relieves the contention.

Scripting: my views on this subject are shaped by years of doing ELT on DBCs. With today's exotic data models, I find situations where SQL that applies staged data to a 3NF data model is appropriate.

For one, the data applies as a transaction (all or none), so there is no potential for operations incorrectly restarting an MLoad that applies data directly to a target. Two, staged data often goes to multiple entities in a data model, so reuse of this data is important. For example, I use an Infa default MLoad on a staging table; then, in Workflow Manager, I kick off three concurrent scripts to move that data out to the data model. One script may do some OLAP functions, one may pivot the data with a cross join, and one may join the staged data to the 3NF model to pick up information.

These things leverage the tool to drive more processes on the DBC (a good thing). SyncScan can kick in too! And you can tune your transformations with collected statistics, secondary indexes, and EXPLAIN inspection. You have all that parallelism on the DBC eager to do the job. Another thing I champion with Informatica is pushing the lookup concept back to the DBC in the source qualifier using a SQL override.
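The staged-apply idea can be sketched as a short BTEQ-style script. Everything here is illustrative (StageDB.SALES_STG and the DW target tables are made-up names, not from the post); the point is that BT/ET brackets the whole apply so it succeeds or rolls back as a unit, and one staging table feeds more than one target:

```
BT;  /* begin transaction: the entire apply is all-or-none */

/* One consumer: apply staged rows with an OLAP function */
INSERT INTO DW.SALES_FACT (acct_id, txn_dt, amt, running_amt)
SELECT acct_id, txn_dt, amt,
       SUM(amt) OVER (PARTITION BY acct_id
                      ORDER BY txn_dt
                      ROWS UNBOUNDED PRECEDING)
FROM   StageDB.SALES_STG;

/* Another consumer: join the same staged data to the 3NF model
   to pick up surrogate keys */
INSERT INTO DW.ACCT_ACTIVITY (acct_key, txn_dt, amt)
SELECT a.acct_key, s.txn_dt, s.amt
FROM   StageDB.SALES_STG s
JOIN   DW.ACCT a
  ON   a.acct_id = s.acct_id;

ET;  /* commit: a restart reruns the script, never half-applies it */
```

Because the staging table is loaded once and read by several concurrent scripts, the DBC's parallelism (and SyncScan on the shared staging-table reads) does the heavy lifting, and each step can be tuned with collected statistics and EXPLAIN.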

A blended environment of mappings & scripts is usually what I end up with. Take advantage of both tools.

Dwight Etheridge
Teradata Certified Master

Copyright for the TeradataForum (TDATA-L), Manta BlueSky    
Copyright 2016 - All Rights Reserved    
Last Modified: 15 Jun 2023