Archives of the TeradataForum
Message Posted: Mon, 20 Jul 2009 @ 14:49:54 GMT
I wonder whether you could rescue my credibility for me ....
Recently, I instructed a developer to use a Macro in place of a Stored Procedure that had been created, basically on the grounds that there was no need for procedural code, and that Macros offer the best chance of good performance.
The change from Procedure to Macro was trivial, just topping and tailing the wrapping code, as the rest of it was just a series of Insert/Select statements.
Basically, what those Insert/Selects do is split what would otherwise be a horrendously complicated statement into more manageable chunks. Each step writes some data into a 'work' table and then uses that 'work' table in the next step, joining it to some other table in the process.
Thing is, the Insert/Selects use a set of 'work' tables (they actually exist in Perm space) and clear them down at the end of the Macro. This seems to have had the effect that, within the Explain Plan, there are no available Stats for the Macro to work with.
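For context, a minimal sketch of the pattern described above (all table, column, and macro names here are hypothetical, not the actual code):

```sql
REPLACE MACRO StagedLoad AS (

  /* Step 1: stage an intermediate result into a permanent 'work' table */
  INSERT INTO work_stage1
  SELECT t.cust_id, SUM(t.amount) AS total_amount
  FROM   txn_detail t
  GROUP  BY t.cust_id;

  /* Step 2: join the work table onwards. At this point work_stage1 was
     empty when the plan was built, so the optimiser may have no useful
     statistics for the join */
  INSERT INTO target_summary
  SELECT w.cust_id, w.total_amount, c.region
  FROM   work_stage1 w
  JOIN   cust_dim c ON c.cust_id = w.cust_id;

  /* Clear the work tables down at the end, as described above */
  DELETE FROM work_stage1;

);
```

Because the work tables are emptied at the end of each run, any statistics (or random AMP sample) taken against them would be taken against empty tables.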
This ended up with the Macro never finishing, which made my insistence on using a Macro look rather foolish.
So, one question I have is: do Macros establish their Explain Plan at time of compile or time of execution?
Additionally, when writing from one 'work' table to another - which are initially empty - will the Macro be able to do a Random AMP sample, or will it only pick up 'no statistics' at the start of the run?
Are there times when Macros (as in this instance) are not a good idea?
We can re-do this in BTEQ, but I'm now starting to question how good Macros are.
Any thoughts would be welcome.
|Copyright 2016 - All Rights Reserved|
|Last Modified: 27 Dec 2016|