> watts * 3.413 = btu
No, that's wrong: watts are power, not energy. Watts * 3.413 gives BTU per hour; the energy conversion is from watt-hours:
.....
You have: watt hour
You want: btu
* 3.4121416
/ 0.29307107
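For anyone who wants to double-check those factors, here's a quick sanity check in Python (the joule values are the standard definitions of the watt-hour and the International Table BTU; nothing here comes from the thread):

    # Sanity check on the watt-hour -> BTU factor (uses the International
    # Table BTU of ~1055.056 J; other BTU definitions differ only in the
    # later decimals).
    JOULES_PER_WATT_HOUR = 3600.0      # 1 W sustained for 3600 s
    JOULES_PER_BTU = 1055.05585        # International Table BTU

    wh_to_btu = JOULES_PER_WATT_HOUR / JOULES_PER_BTU
    print(f"1 watt hour = {wh_to_btu:.7f} btu")          # ~3.4121416
    print(f"1 btu = {1.0 / wh_to_btu:.8f} watt hours")   # ~0.29307107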
Rant:
After I get the low-bid subcontract to manage the place, I'm going
to set aside a special section Down There just for the HVAC folks
who insist on perpetuating that most medieval of units... the BTU.
It belongs in the pile of toxic waste that now holds farthings,
stone, furlongs, and slugs.
Some day, I'll be able to look at EPA Yellow Tags on
water heaters and ACs without grinding my teeth.
/Rant
Well, if you're doing away with that, you can carry on with the "mile" as well, and then lose the pounds and yards and gallons while you're at it.
On the other hand, I have a question I was pondering at the NANOG power session (which was a really good one).
How much of the energy you put into a server as electricity comes back out as heat? My guess would be pretty close to 100%, but is it really so? I've also been told that cooling consumes roughly an extra 1/3 of the energy it removes as heat, so to sustain a 100W server you'd really need approx 130-140W of power once cooling is included. Is this a correct assumption?
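Here's the arithmetic behind that guess, as a rough Python sketch (the ~100% heat conversion and the 1/3 cooling figure are the assumptions from the question above; real cooling efficiency varies a lot with the plant and the climate):

    # Back-of-the-envelope total draw for a 100 W server, assuming
    # essentially all input power becomes heat and that cooling costs
    # roughly 1/3 of the heat energy it removes.
    server_power_w = 100.0
    heat_to_remove_w = server_power_w * 1.0        # ~100% of input ends up as heat
    cooling_power_w = heat_to_remove_w / 3.0       # the "1/3" rule of thumb
    total_power_w = server_power_w + cooling_power_w
    print(f"total draw: ~{total_power_w:.0f} W")   # ~133 W, in the 130-140 W ballpark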
In one of our data centers we use community cooling: we get water at approx 4 C and we're required to raise its temperature by at least 8 C before we return it. The return water is then used in the community power plant to produce hot community water, and I've been told this process is quite efficient. Any thoughts on this? I guess it doesn't work in the boondocks, though...
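For a sense of scale, the required 8 C rise fixes how much of that water a given heat load needs. A sketch (the 100 kW load is an assumed example, the 8 C rise is from the post, the rest is just the specific heat of water):

    # Q = m_dot * c_p * dT  ->  m_dot = Q / (c_p * dT)
    heat_load_kw = 100.0       # assumed example heat load to reject
    c_p_kj_per_kg_k = 4.186    # specific heat of water
    delta_t_k = 8.0            # required temperature rise before returning the water

    flow_kg_per_s = heat_load_kw / (c_p_kj_per_kg_k * delta_t_k)
    print(f"~{flow_kg_per_s:.1f} kg/s (~{flow_kg_per_s:.1f} l/s) per {heat_load_kw:.0f} kW")
    # -> roughly 3 l/s of 4 C water per 100 kW of heat rejected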
There were also plans to use heat exchangers so the cooling water from nuclear power plants could heat community hot water, but politicians chickened out. Now we just spew that cooling water straight out into the ocean.
I guess none of this makes sense in the southern part of the US, but further up north where houses actually need heating and not cooling most of the year, are things like this done?
> How much of the energy you put into a server as electricity comes back out as heat? My guess would be pretty close to 100%, but is it really so? I've also been told that cooling consumes roughly an extra 1/3 of the energy it removes as heat, so to sustain a 100W server you'd really need approx 130-140W of power once cooling is included. Is this a correct assumption?
Based upon my real-world experience, and talking to a few folks, it's very close to 100%. Most people assume 100% when calculating cooling requirements.
However, the very scientific will tell you that some of the power goes into mechanical work, such as moving hard drive heads, which exerts force on your racks, and so on. True, but an irrelevant discussion, really, because it's likely an immeasurable amount.
One could do the exercise of putting a computer in a well-insulated box and measuring power in vs. the rate of rise of temperature. Volunteers?
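A rough sketch of the numbers for that experiment (every figure here is assumed; the point is just the shape of the calculation):

    # In a well-insulated box, essentially all dissipated power goes into
    # raising the temperature of the contents: dT/dt = P / (m * c_p).
    power_w = 100.0                          # measured wall-plug draw (assumed)
    lumped_heat_capacity_j_per_k = 5000.0    # rough heat capacity of box + contents (assumed)

    rise_k_per_hour = power_w / lumped_heat_capacity_j_per_k * 3600.0
    print(f"expected rise: ~{rise_k_per_hour:.0f} K per hour")
    # If the rise implied by the electrical input matches the measured rise,
    # essentially all of the power is ending up as heat.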