Archives of the TeradataForum
Message Posted: Wed, 06 Jun 2001 @ 07:32:03 GMT
I am considering implementing a Teradata query to run hourly on our V2R3 system from MVS, which can be used over time to compare response times and so give me a (very simple) guide to how the box performs at different periods. (Are these known as "canary queries", or have I made that up?)
I would like the query to run for long enough to be statistically representative, but not so long that it adversely impacts the box in its own right. I was thinking of roughly a one-minute query on an otherwise quiet system.
Ideally I would run this against a static table so that conditions are identical each time it runs, the only difference being that it may be competing for system resources with other jobs running at the same time. Unfortunately, the only tables in the data warehouse big enough to accommodate such a query are all modified daily/weekly etc., and therefore wouldn't provide consistent conditions and results.
If I develop the query as a full table scan that also returns a count of the rows in the table, would calculating rows per second (or some other similar measure) give me a reasonable, consistent and statistically viable metric to use for my analysis, or am I barking up the wrong tree completely?
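For what it's worth, a minimal sketch of what I have in mind (the database, table and column names are purely illustrative) would be a COUNT(*) with a predicate on a non-indexed column, so the optimizer has no choice but to scan every row:

```sql
-- Hypothetical canary query: forces a full table scan and returns a row count.
-- CanaryDB.StaticRefTable would be a static copy kept only for this test;
-- SomeNonIndexedCol is any column not covered by an index, so the predicate
-- cannot be satisfied by index access.
SELECT COUNT(*)
FROM   CanaryDB.StaticRefTable
WHERE  SomeNonIndexedCol <> 'zzz';
```

Rows per second would then simply be the returned count divided by the elapsed wall-clock seconds for the request.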
I would also like these results to be stored in a table. Is it possible to return a start and end time for the query as part of its answer set, so it can be INSERTed into a table? Or do I need a two-step approach: an INSERT of the system DATE and TIME for the start time, followed by an UPDATE with the row count and the system DATE and TIME for the end time from the query itself?
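My current thinking is the two-step shape sketched below (table and variable names are hypothetical, and the exact syntax may need adjusting for V2R3). My understanding is that CURRENT_TIMESTAMP is resolved once at the start of a request, which is why a single statement can't return both a start and an end time:

```sql
-- Hypothetical log table for the canary runs.
CREATE TABLE CanaryDB.CanaryLog
( RunId    INTEGER NOT NULL
, StartTs  TIMESTAMP(0)
, EndTs    TIMESTAMP(0)
, RowsRead INTEGER
) PRIMARY INDEX (RunId);

-- Step 1: stamp the start time.
INSERT INTO CanaryDB.CanaryLog (RunId, StartTs)
VALUES (:RunId, CURRENT_TIMESTAMP(0));

-- Step 2: run the canary scan; the host program (BTEQ/MVS job)
-- captures the count into :RowCount.
SELECT COUNT(*)
FROM   CanaryDB.StaticRefTable;

-- Step 3: stamp the end time and the row count.
UPDATE CanaryDB.CanaryLog
SET    EndTs    = CURRENT_TIMESTAMP(0)
     , RowsRead = :RowCount
WHERE  RunId    = :RunId;
```

Elapsed time per run would then be EndTs - StartTs, ready for trending.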
Has anyone any experience of attempting anything similar, or is there a better way of tracking this sort of thing?
Thanks in advance for any help/suggestions.
|Copyright 2016 - All Rights Reserved|
|Last Modified: 27 Dec 2016|