Server farms such as Google and Yahoo! provide enough compute capacity for the highest request rate of the day. Imagine that most of the time these servers operate at only 60% capacity. Assume further that the power does not scale linearly with the load; that is, when the servers are operating at 60% capacity, they consume 90% of maximum power. The servers could be turned off, but they would take too long to restart in response to more load. A new system has been proposed that allows for a quick start but requires 20% of the maximum power while in this "barely alive" state.
a) How much power savings would be achieved by turning off 60% of the servers?
b) How much power savings would be achieved by placing 60% of the servers in the “barely alive” state?
a) Is the answer 90%? The question says that servers at 60% capacity consume 90% of maximum power. So if 60% of the servers are turned off, does that mean 90% of the power is saved?
b) Is the answer 70%? The barely-alive state requires 20% of maximum power, and originally each server consumes 90%, so power saved = 90% − 20% = 70%?
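To check my numbers, I tried setting the calculation up explicitly, normalizing each server's maximum power to 1.0 and assuming the servers that stay on keep drawing 90% of max power (I'm not sure this is the right interpretation — it gives different numbers than my guesses above, which is part of why I'm confused):

```python
# Sanity check, normalizing each server's maximum power to 1.0.
# Assumption (possibly wrong): servers left on keep drawing 90%
# of max power, since they are still at 60% load per the problem.
n = 100                     # hypothetical farm size; ratios don't depend on it

baseline = n * 0.9          # all servers at 60% load -> 90% of max power each

# a) 60% of servers turned off (0 power), 40% still at 90% of max
power_a = 0.4 * n * 0.9
savings_a = (baseline - power_a) / baseline

# b) 60% of servers "barely alive" at 20% of max, 40% still at 90%
power_b = 0.6 * n * 0.2 + 0.4 * n * 0.9
savings_b = (baseline - power_b) / baseline

print(f"a) savings = {savings_a:.1%}")   # 60.0%
print(f"b) savings = {savings_b:.1%}")   # 46.7%
```

So under this reading I get 60% for (a) and about 46.7% for (b), not 90% and 70%. Am I setting up the baseline wrong?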
Any suggestions are appreciated. Thanks!