Archives of the TeradataForum
Message Posted: Wed, 21 Mar 2007 @ 19:50:38 GMT
You are pondering a most interesting question. Here is what I share from watching a variety of customers.
First of all, there is little right or wrong in any answer to your question. The measure of success is what works well for your system, your requirements and your user expectations.
Next, it is OK to run a Teradata system at 100%. To restate the point: it is OK to run the system at 100%, just not for 100% of the time.
In terms of specific language, I tell customers it is wise to run a system below 85% CPU utilization per day. Some customers like to keep that number at or below 70%, and others run it higher than 85%. I have seen systems run at 100% all day for days on end. That too is OK; the software and hardware hold up.
What tends to fail between 70% and 100% is user expectations, and only you and your team can put that marker in its proper place.
Finally, when drawing the line as to how much is too much, one must allow excess capacity for the normal unexpected events that just happen. For example, one needs capacity to do table restores, database cleanup, reruns of summarizations, corrections to applications that fumbled the data, and so on. Many put that capacity requirement at about 20% per day.
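The arithmetic above (a daily utilization ceiling plus a reserve for unexpected work) can be sketched as a simple check. This is only an illustration of the rule of thumb; the function name and thresholds are my own assumptions, not anything from Teradata tooling, and you would substitute the numbers your team settles on.

```python
# Illustrative sketch of the headroom rule of thumb discussed above.
# All names and default thresholds are assumptions, not Teradata features.

def headroom_ok(daily_cpu_pct, target_ceiling=85.0, reserve_pct=20.0):
    """Return (ok, headroom) for an average daily CPU utilization.

    daily_cpu_pct  : average CPU busy percentage for the day (0-100)
    target_ceiling : the "wise" upper bound discussed above (85% here)
    reserve_pct    : capacity kept free for restores, reruns, cleanup
    """
    headroom = 100.0 - daily_cpu_pct
    ok = daily_cpu_pct <= target_ceiling and headroom >= reserve_pct
    return ok, headroom

# A system averaging 78% busy leaves 22% headroom: within both bounds.
ok, free = headroom_ok(78.0)
print(ok, free)  # True 22.0
```

A system averaging 90% busy would fail both tests here, which matches the point above: nothing breaks at 90%, but the reserve for restores and reruns is gone.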
Like I offered before, little is right or wrong about any number you hear. One needs to discover this answer over time.
Let me know if my thoughts here trigger follow-on questions.
Copyright 2016 - All Rights Reserved
Last Modified: 28 Jun 2020