Archives of the TeradataForum
Message Posted: Mon, 05 Aug 2002 @ 20:09:28 GMT
I don't think that there's any 'best' canary query. You should pick some queries that run in a controlled region and then accumulate a history of how the queries run versus time of day.
Our performance analyst created four queries of different complexity that reflect different aspects of our load. One query is run every twenty minutes and its execution time is compared to a maximum value. If the maximum execution time is exceeded, warning messages are sent to the technical staff: low priority for a single occurrence, high priority if multiple hits occur within an hour.
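The threshold-plus-escalation logic described above can be sketched roughly as follows. This is a minimal illustration, not the poster's actual implementation; the class name, method names, and the escalation rule details are my assumptions.

```python
import time
from collections import deque

class CanaryMonitor:
    """Hypothetical sketch of the alerting scheme: compare each canary run
    against an empirically chosen maximum, and escalate from low to high
    priority when multiple breaches land within a one-hour window."""

    def __init__(self, max_seconds, window_seconds=3600):
        self.max_seconds = max_seconds   # empirically determined alert threshold
        self.window = window_seconds     # escalation window (one hour)
        self.breaches = deque()          # timestamps of recent threshold breaches

    def record(self, elapsed, now=None):
        """Return None, 'low', or 'high' for a run that took `elapsed` seconds."""
        now = time.time() if now is None else now
        if elapsed <= self.max_seconds:
            return None
        # Discard breaches that fell out of the escalation window.
        while self.breaches and now - self.breaches[0] > self.window:
            self.breaches.popleft()
        self.breaches.append(now)
        # Single occurrence -> low priority; multiple within an hour -> high.
        return 'high' if len(self.breaches) > 1 else 'low'

# Example: a three-second query that alerts at thirty seconds.
mon = CanaryMonitor(max_seconds=30)
print(mon.record(3,  now=0))      # None: under the threshold
print(mon.record(45, now=1200))   # 'low': first breach
print(mon.record(50, now=2400))   # 'high': second breach within an hour
```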
The maximum values are determined empirically, and vary from query to query and system to system. The largest query takes about three seconds on our largest production system and alerts at thirty seconds. The largest query takes about 100 seconds on our smallest development system and alerts at 400 seconds. The queries are sized for testing production systems and we let the development systems fall where they may.
One query is a five table join, one is a simple keyed retrieval, one is an update, and the last is a select sum(). At least, I think that's what they are; I haven't looked in a while and the performance analyst is out of the office. All of the queries are run on small tables maintained specifically for that purpose, and no production data is involved.
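For concreteness, the four query shapes might look something like the statements below. Every identifier here (canary_a through canary_e, k, amt) is invented for illustration; the poster's actual tables and columns are not given in the post.

```python
# Hypothetical canary statements matching the four shapes described above.
# Each targets a small table maintained solely for canary timing, so no
# production data is involved.
CANARY_QUERIES = {
    # five-table join across the dedicated canary tables
    "join": (
        "SELECT a.k FROM canary_a a "
        "JOIN canary_b b ON a.k = b.k "
        "JOIN canary_c c ON b.k = c.k "
        "JOIN canary_d d ON c.k = d.k "
        "JOIN canary_e e ON d.k = e.k"
    ),
    # simple keyed retrieval
    "keyed": "SELECT amt FROM canary_a WHERE k = 42",
    # single-row update
    "update": "UPDATE canary_a SET amt = amt + 1 WHERE k = 42",
    # aggregate scan
    "sum": "SELECT SUM(amt) FROM canary_a",
}

print(len(CANARY_QUERIES))  # 4
```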
The query timings are stored in a database (MVS SAS, I think), and the execution times for the last six hours are displayed on a web page. Because the execution times vary widely from query to query, the values are shown as datapoints only (along with the maximums).