10% of household energy usage is consumed by consumer electronics on standby. Given that the average American household uses about 30 kWh every day (roughly 900 kWh per month), this means that 3 kWh/day ($7.29/month @ $0.08098/kWh) is wasted energy.
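The arithmetic is simple enough to sketch in a few lines of Python, using the figures quoted above (the 10% share, the 30 kWh/day household average, and the $0.08098/kWh rate):

```python
# Back-of-envelope standby-power waste, using the figures from the paragraph above.
DAILY_USAGE_KWH = 30.0     # average US household electricity use, kWh/day
STANDBY_FRACTION = 0.10    # share of usage attributed to standby devices
RATE = 0.08098             # electricity price, $ per kWh

wasted_kwh_per_day = DAILY_USAGE_KWH * STANDBY_FRACTION  # 3.0 kWh/day
cost_per_month = wasted_kwh_per_day * 30 * RATE          # ~$7.29/month
print(f"{wasted_kwh_per_day:.1f} kWh/day wasted, ${cost_per_month:.2f}/month")
```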

You can look up just how much various devices use on standby: obviously it's going to be a pain to unplug *everything* when you're not using it, but entertainment centers, cable receivers, and audio equipment are big culprits and putting these things on a power strip saves enough money to buy you a burrito every month.

Here's a fun fact: if you sealed up your house so that no heat could leak out, every watt you use would add a joule of heat every second. There's a fun XKCD post exploring this fact. So although a 50 watt box fan might make you *feel cooler* (due to increasing the heat transfer rate between your body and the air around you), it's adding about 180 kJ of heat to the room every hour.

Since 180 kJ is a little bit arbitrary, I'll note that 1 kJ = 0.000278 kilowatt-hours = 0.239 food Calories (yes, like the ones in muffins). If you were sitting in the room, doing nothing, you'd be adding about 265 kJ (63.5 Calories) of heat to the room every hour.
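These conversions are easy to get wrong (gram calories vs. food Calories especially), so here's a quick sketch. The ~75 W figure for a person sitting at rest is my assumption; it's consistent with the numbers above:

```python
KCAL_PER_KJ = 0.239   # 1 kJ ~= 0.239 food Calories (kilocalories)

# A 50 W box fan running for an hour dumps 50 J/s * 3600 s into the room:
fan_kj_per_hour = 50 * 3600 / 1000            # 180 kJ

# A person sitting at rest dissipates roughly 75 W (assumed figure):
person_kj_per_hour = 75 * 3600 / 1000         # 270 kJ per hour
person_kcal_per_hour = person_kj_per_hour * KCAL_PER_KJ  # ~64.5 food Calories
print(f"fan: {fan_kj_per_hour:.0f} kJ/h, person: {person_kcal_per_hour:.1f} Cal/h")
```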

I thought the cost of cooling your standby electronics would be appreciable too, but it turns out it's not that much.

Air conditioners are rated using a term called seasonal energy efficiency ratio (SEER), which is the ratio of cooling output (in BTUs) to power input (in watt-hours). Typical Energy Star air conditioners today have a SEER of 13. Converting the units, this is an actual efficiency of 3.81 (*if anybody can shed some light on this high efficiency, please let me know*).
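Converting SEER to a dimensionless efficiency (a coefficient of performance, or COP) is just a unit change, since 1 Wh = 3.412 BTU. A COP above 1 isn't a thermodynamics violation, by the way: an air conditioner *moves* heat rather than generating it, so it can move more heat energy than the electricity it consumes.

```python
BTU_PER_WH = 3.412  # 1 watt-hour = 3.412 BTU
SEER = 13           # BTU of cooling delivered per Wh of electricity consumed

cop = SEER / BTU_PER_WH  # ~3.81: units of heat moved per unit of electricity
print(f"COP = {cop:.2f}")
```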

Back to the power strip: if your entertainment center, on standby, uses 1 kWh per day, that's 3600 kJ of additional heat per day your air conditioner has to remove. Given a SEER of 13 (a COP of about 3.81), this means it's going to take an extra 0.26 kWh of electricity per day ($0.64/month @ $0.08098/kWh) to cool the stuff that was on standby.
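Putting the whole chain together (same assumed standby load, rate, and SEER as above):

```python
BTU_PER_WH = 3.412  # 1 Wh = 3.412 BTU
SEER = 13           # typical Energy Star unit
RATE = 0.08098      # $ per kWh

standby_kwh_per_day = 1.0                 # assumed entertainment-center standby load
cop = SEER / BTU_PER_WH                   # ~3.81

# Every kWh of standby electricity ends up as heat the AC must remove:
cooling_kwh_per_day = standby_kwh_per_day / cop          # ~0.26 kWh/day
cooling_cost_per_month = cooling_kwh_per_day * 30 * RATE # ~$0.64/month
print(f"{cooling_kwh_per_day:.2f} kWh/day, ${cooling_cost_per_month:.2f}/month")
```

So the air-conditioning penalty is roughly a quarter of the standby load itself, on top of the electricity the devices draw directly.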