 

Archives of the TeradataForum

Message Posted: Fri, 18 Nov 2005 @ 20:13:43 GMT


     


Subj:   Re: Stored procedures results
 
From:   McBrideM

Your comments are well written and I pretty much agree with the basic components of your points, but I'm not sure I would say that SPs and macros are inherently bad (or even non-portable, for that matter). Even if one chooses to ignore ANSI compliance or uniform code construction, I wouldn't go so far as to say a database implementation should avoid SPs and macros, especially since they are basic, performance-enhancing features of the database. That would be like saying don't use any of the OLAP or windowing functions of Teradata to aggregate and analyze (mine) the business data because DB2 doesn't have the same feature.
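To make the comparison concrete, here is a rough, illustrative sketch of the kind of Teradata features I mean. The table SALES_FACT and its columns, and the macro name MONTHLY_SALES, are invented purely for this example; treat the syntax as Teradata-flavored, not as anyone's actual design.

    /* Windowing (OLAP) aggregation: a running total per store.
       SALES_FACT and its columns are hypothetical. */
    SELECT store_id,
           sale_dt,
           sale_amt,
           SUM(sale_amt) OVER (PARTITION BY store_id
                               ORDER BY sale_dt
                               ROWS UNBOUNDED PRECEDING) AS running_total
    FROM   SALES_FACT;

    /* A parameterized Teradata macro wrapping a routine aggregation. */
    CREATE MACRO MONTHLY_SALES (in_store INTEGER) AS (
        SELECT EXTRACT(MONTH FROM sale_dt) AS sale_month,
               SUM(sale_amt)               AS total_amt
        FROM   SALES_FACT
        WHERE  store_id = :in_store
        GROUP  BY 1;
    );

    EXEC MONTHLY_SALES (42);

None of this is offered as the "right" design; it is only a reminder of what one gives up by forbidding platform features in the name of portability.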

As I have remarked on another thread, good development strategy requires a complete separation of the conceptual, logical, and physical design layers.

The intent of the conceptual enterprise data and process model is to define, in business terms, what "things" (I use the term loosely) mean, especially in regard to business processes. It is especially important to define, in no uncertain terms, what information is produced/consumed (data in motion) and retained (data at rest) that will drive a company's profitability or a business unit's efficiency.

The logical model, the next layer of the onion, so to speak, attempts to answer the question of 'form & function'. What are the basic concepts (facets and entities) and relationships? The data model looks at data "at rest", the noun form, while the process model looks at data "in motion", the verb form. It is imperative that I emphasize this next statement, as it cannot be stated too loudly or too often... "A logical model is ALWAYS (I repeat) ALWAYS about the business function, not the technical solution!" It describes data and process completely independent of a physical target or platform.

Therefore, business rules are captured generically, perhaps in the form of decision trees, pseudo-code, or just plain, everyday language (documentation). It is only at this level of abstraction that rules are truly portable and platform independent (in my opinion).

The last layer in the quest for a final business solution is the physical model. Physical models are ALWAYS highly specific, detailed designs with a final technology target in mind. It is here that the "code" for the solution (if you will) is documented in the model. This is where SPs, DDL, triggers, ETL code, and source and report interfaces are actually designed by applying and transforming functional requirements (logical models) into physical systems. To be certain, it would be nice to have a lower CASE tool that can translate the physical design (notice I didn't say logical design - you can't really skip this layer) into "code" built for a specific target database, application server and programming language, but that is not a necessity. The objective of the physical model (layer) is to optimize the organization of data and tune unit processes (function points), and it should leverage any feature contained within the target system.

For example:

Suppose the Conceptual models for a business identify a concept named EMPLOYEE and another concept named JOB ASSIGNMENT. And let's suppose there exists a relationship between these two concepts (along with a concise business definition). Now, as we move to the next level of detail, the Logical models, the business insists that an EMPLOYEE can have many JOB ASSIGNMENTS over time, but can have only ONE assignment at any single point in time, and may have no assignment if on a leave of absence. This is now a discernible, discrete business rule which identifies a required, enforceable cardinality between the two entities. In the final design layer, the Physical model, this business rule will become a database constraint applied to the underlying data tables named EMP and JBA (to keep it simple). This is where the technical architect has to decide on a solution for enforcing the "well understood" business rule. Choices might include the following (a rough SQL sketch of options 2 and 3 follows the list):

1) Enforce the rule external to the database, in ETL (extract, transform and load) processes, using a tool or program written to ensure the rule holds prior to loading (and maintaining) the data.

2) Enforce the rule using hard Referential Integrity declared in the DDL within the target RDBMS.

3) Enforce the rule using a table trigger and (perhaps) a Stored Procedure (SP).

4) Ignore the rule altogether and hope that the source data is always right (trust me, this happens a lot).
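For illustration only, here is a very rough, Teradata-flavored sketch of what options 2 and 3 might look like against the EMP and JBA tables. The column names, data types, and the procedure name ADD_ASSIGNMENT are all invented for this example; a real design would need far more thought.

    /* Option 2: hard referential integrity declared in the DDL
       (columns and types are hypothetical). */
    CREATE TABLE EMP (
        emp_id    INTEGER NOT NULL,
        emp_name  VARCHAR(60),
        PRIMARY KEY (emp_id)
    );

    CREATE TABLE JBA (
        emp_id    INTEGER NOT NULL,
        job_code  CHAR(6) NOT NULL,
        start_dt  DATE    NOT NULL,
        end_dt    DATE,
        PRIMARY KEY (emp_id, start_dt),
        FOREIGN KEY (emp_id) REFERENCES EMP (emp_id)
    );

    /* Option 3 (the SP half of it): route all maintenance through a
       stored procedure that closes any open assignment before adding
       the new one, so an employee never has two open assignments. */
    CREATE PROCEDURE ADD_ASSIGNMENT
        (IN p_emp_id INTEGER, IN p_job_code CHAR(6), IN p_start_dt DATE)
    BEGIN
        UPDATE JBA
        SET    end_dt = :p_start_dt
        WHERE  emp_id = :p_emp_id
        AND    end_dt IS NULL;

        INSERT INTO JBA (emp_id, job_code, start_dt, end_dt)
        VALUES (:p_emp_id, :p_job_code, :p_start_dt, NULL);
    END;

A table trigger could police the same rule at the table level; I have left that out to keep the sketch short.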


Whatever the decision, it should (must) be documented in the detailed design and specification for the final solution contained within the physical model (I use model and design interchangeably here). The targeted platform may dictate the solution, or performance criteria may be the deciding factor, or it may just come down to the personal preference of the technical architect, but, in any case, the solution should take advantage of any and all features and capabilities offered by the target system; else why even have different technologies?

My primary goal here is not to debate the virtues of portability, but to emphasize the need to adequately document the business requirements independent of the technical solution. Therefore, when newer, better, chromier (brighter), faster technologies come along, you can easily migrate to them simply by modifying the technical design without ever revisiting the Conceptual and Logical models (as long as the business itself hasn't changed as well).

Models exist to propagate understanding, communicate requirements, and assist in strategizing design solutions. From top to bottom, the layered model approach provides more and more detail, starting with the very 'general' business concepts right down to the very 'specific' implementation design. Performed properly and effectively within any business context, these models make platform migration and portability a non-issue, in my view.

To debate the value of SPs or macros in a single design solution as making anyone's life easier (or harder) is something like debating the best way to wipe up spilt milk: a napkin or a sponge? In the end, does it really matter, as long as the person who has to actually clean up the mess has at least one or the other!

FOOTNOTE: To implement any technical design solution, whether it includes SPs or not, without the appropriate documentation or an existing model supporting the corresponding business requirement is simply an irresponsible act, and perhaps even insanity. It is tantamount to creating pure chaos for the next person left holding the proverbial 'bag'.


With respect and most humbly submitted,

Michael E. McBride, MSCIS



     