OK, Paul Wheaton simply doesn't know what he's talking about:
"I think that this does produce some savings, but not as much as you might think. If you set your thermostat to a constant 70, the heater works a little at a time throughout the day. If you drop it to 50 at night or in the middle of the day, the heater stops working, but then when the time comes to warm the house again, the heater has to work at full power for a long time to get the temp back up - thus losing a lot of your savings."
If you're running heat constantly, you're maintaining a constant flow of heat from your interior to the exterior. That is, you're maintaining a high heat exchange rate to the exterior, and you're constantly wasting a large portion of heat.
If you're heating only while you need a warm interior, then as the interior temperature falls, the energy flux to the exterior decreases. You're no longer pumping heat into the external environment.
Yes, you'll run your furnace/heating system continuously for a while in raising the interior temperature, but that is largely adding heat to the interior space, not to the exterior.
The net is expending less energy.
Your most efficient strategy is to turn interior heat down to the minimum essential level (ultimately: enough to keep pipes from freezing), or the minimum level the thermostat allows (often ~50F in the US). My own practice is generally to turn any heating system off entirely at night.
From a moisture management perspective, you also win as cold air has a lower absolute humidity, that is, the quantity of water it can hold is lower. Heating cold humid air reduces the relative humidity, allowing walls and surfaces to dry out.
The overnight heat loss is also a very clear sign that Paul Wheaton is dealing with an exceptionally poorly insulated structure. And a very poor grasp of thermodynamics.
The same principle holds for AC as well, though here you want to increase the temperature setting at which the AC comes on, or disable AC entirely while you're out of the home.
A better way of thinking of this is to minimize the energy input (heating or cooling) when it's not needed.
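The argument can be checked with a toy model of Newton's law of cooling. Every number here (loss coefficient k, outdoor temperature, the setback schedule, the infinite-power heater) is an illustrative assumption, not a measurement:

```python
# Toy model of Newton's law of cooling, comparing a constant 70F setpoint
# against an overnight setback to 50F (10pm-6am). All parameters are
# illustrative assumptions.

def heater_energy(setpoint, hours=24, k=0.2, t_out=30.0, dt=0.01):
    """Total heat the heater must supply (arbitrary units) over `hours`."""
    t_in = 70.0
    energy = 0.0
    for i in range(int(hours / dt)):
        hour = (i * dt) % 24
        t_in -= k * (t_in - t_out) * dt      # loss to the exterior
        target = setpoint(hour)
        if t_in < target:                    # thermostat calls for heat;
            energy += target - t_in          # idealized: the deficit is
            t_in = target                    # replaced instantly
    return energy

constant = heater_energy(lambda h: 70.0)
setback = heater_energy(lambda h: 50.0 if (h < 6 or h >= 22) else 70.0)
print(f"constant: {constant:.0f}  setback: {setback:.0f}")
```

However crude, the model captures the physics: whenever the interior is cooler, less heat crosses the envelope, so the recovery burn never costs more than the losses it avoided.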
His article on CFLs (http://www.richsoil.com/CFL-fluorescent-light-bulbs.jsp) is very cringe-worthy as well. As part of his argument, he references a report he heard of claiming that kids gained 20 IQ points by switching back to incandescent bulbs. Seriously.
My thoughts exactly. I started reading the site a couple of weeks ago after an article about lawns on HN (which was reasonably OK and correct). His CFL article starts OK, and he does make some good and valid points, but then the pseudo-science and false logic kick in to levels that make me cry.
wikipedia: "In order to be effective as an insecticide, diatomaceous earth must be uncalcinated (i.e., it must not be heat-treated prior to application)[13] and have a mean particle size below about 12 µm (i.e., food-grade – see below)"
It's used occasionally for deworming people and is considered a low-risk insecticide. Just because you can eat it and it has some benefits in certain use cases doesn't mean you should consume as much of it as you can. It's the classic vitamin snake-oil sales pitch: "X is good for you, thus more of X must be better for you."
I've encountered advocates of DE consumption previously. No real sense of whether it's legit or bogus, though there seems to be some plausibility. I'd have to look into other sources. Wheaton's credibility based on his other statements isn't great, and dosing and/or specific indications would be useful.
It is helpful to realize that in a pre-industrial society, parasites were very common among humans (and still are in less developed nations), including many introduced via food. Whether or not DE could combat that is arguable, but a regular intake might be argued if you're subject to parasites in your food supply. I'm more in favor of alternatives such as cooking.
It could be true if switching from 60 Hz old-style fluorescent tubes to incandescent. Working with 60 Hz lighting for those of us that can clearly see the flicker is very difficult. It's literally like trying to work under a strobe. Even at higher rates these lights have random flickering from age and other reasons. You may not consciously see flickering, but your brain does.
Your theoretical analysis misses an important physical fact: houses are able to exchange air with the outside world, not just heat. As you heat a normal room, it will expel air in order to maintain constant pressure, and similarly, as you allow a room to cool, outside air will enter again to keep the pressure constant.
There was an article about this in the American Journal of Physics [1, 2] a couple years back, and the authors calculated that heating a room from 273K to 300K (32F to 80F) causes it to expel about 10 percent of the air inside it.
Somewhat surprisingly, the authors don't comment on how relevant this is to the question of when to turn your heater off. The fact that allowing a room to cool draws in cold air from the outside means that the practice is worse than you would naively think, but I don't know when, if ever, that actually suggests leaving the heater on a constant setting. I'd be very curious to see someone extend the calculation to give an answer.
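The 10 percent figure follows directly from the ideal gas law: at constant pressure, volume scales with absolute temperature, so the expelled fraction is 1 − T_cold/T_warm:

```python
# Fraction of room air expelled when heating at constant pressure:
# ideal gas, V proportional to T at fixed P and n.
T_cold = 273.0   # K (32F)
T_warm = 300.0   # K (~80F)
fraction_expelled = 1 - T_cold / T_warm
print(f"{fraction_expelled:.1%}")   # → 9.0%
```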
I have been told by multiple HVAC professionals recently that one should keep the fan on their system always running. They have in fact adopted this practice at our offices, and it greatly helps with keeping temperatures consistent from room to room.
The arguments they provided for this were:
[1] Constant air flow creates more consistent temperatures (fewer highs and lows from interior to exterior, or floor to floor).
[2] Allows filtration system to work continuously cleaning the air.
[3] (This is the one I am skeptical of) Starting the fan motor has a large current draw, but leaving it running draws very little current on modern high-efficiency HVAC systems. Also, apparently when the fan starts is the most likely time for fan failure, due to high current and torque on the motor. These fan motors are typically not repairable in small commercial and residential units and are quite expensive.
We adopted the always-on fan with scheduled temperature adjustment here at our office; it is more comfortable for sure and less dusty. I do not know if it affected energy costs. When I tried this at home it was more comfortable, but my electric bill went up by ~$20.
For offices, it's likely more a matter of keeping pockets of hot/cold air (or other quality issues not related to temperature) from forming. Even just with lighting and office equipment, there's a fairly considerable energy flux into many office spaces, and without mixing you'll tend to get uncomfortable pockets forming.
Your #3 point strikes me as fairly valid: large electric motors do impose a very high starting load, and the current draw could be harmful to the motor (and switching/control circuits), while the imposed load might also be hard on other equipment. I doubt the energy usage argument (for cycling the motors off) really carries much weight, and it should be possible to idle such systems at low power. For most commercial buildings, you've got systems for heating, chilling, and air handling which are pretty much all operating simultaneously: you have a need for hot and cold air in different places, there are mixers which will deliver what's needed where it's needed (at least in theory), and the fans blow all the time.
Though I've only a pretty glancing familiarity with HVAC in general, not particularly my area of expertise.
I run the circ fan all the time as well, but the bearings on the motor do have a limited lifetime. Ask me how I know this.
A new motor will cost hundreds of dollars, more so if the blower and fan cage are a single assembly. You'll need to amortize that cost into your comfort level.
It will also quit at the least convenient time, following a combination of Tuttle's and Murphy's Laws. Again, ask me how I know this.
About #3: the idea is right, but they're probably overstating the difference between starting power and running power.
Starting a motor is not something you do lightly for really big motors (subway- or car-sized), but I doubt HVAC fan motors are heavy enough for it to be a big concern.
My bet is that the starting surge amounts to around 5 seconds' worth of running power, tops.
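A back-of-envelope check of that bet, using assumed but typical numbers (locked-rotor current around 6x run current, a surge lasting about a second, and treating the surge as 6x power, which overstates it since the power factor drops during the start):

```python
# Rough estimate of a blower motor's starting surge, expressed as
# equivalent seconds of normal running. All numbers are assumptions.
run_power_w = 500.0    # typical residential blower while running
inrush_factor = 6.0    # locked-rotor current ~6x run current (typical)
inrush_time_s = 1.0    # surge lasts on the order of a second

surge_energy_j = run_power_w * inrush_factor * inrush_time_s
equivalent_run_s = surge_energy_j / run_power_w
print(equivalent_run_s)   # → 6.0 seconds of normal running
```

On these numbers the start surge is worth only a handful of seconds of running, so cycling losses are real but tiny next to hours of continuous operation.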
This effect isn't going to be very significant. A constant exchange of air will have a significant effect, but a one-off exchange of 10% of the air will be negligible. The reason for this is that the air has a very small thermal mass compared to the building structure.
I've got mote sensors in every room in my house, including the hall. When my kids leave the front door open in winter for a few minutes, as kids tend to do, I can see the temperature in the hall dive on the graphs. But close the door again, and the temperature is right back up very close to where it started in just a few minutes - even when the heating is off.
The heat loss due to thermal expansion and contraction of air will be quite minimal.
The volumetric heat capacity of air is going to be ~1000x less than that of solid objects (the specific heat per kg is similar; it's the density that differs).
The specific heat of gypsum (the primary constituent of drywall) is 1.09 kJ/kg.K
For dry air it's 1.0 kJ/kg.K
Air's density is 1.225 kg/m3.
A 6m x 9m x 2.3m (20' x 30' x 7.5') room has a volume of about 130m^3, so a 10% exchange would be 13m^3, or 16kg.
That's about 15 kJ of heat energy per degree C, or roughly 0.0004 liter (0.0001 gallon) of heating oil equivalent.
The drywall would be (in feet) 20x7.5x2 + 30x7.5x2 + 20x30 ft^2 (I'll assume the floor is some perfect insulator for now, and that the room has no doorways), and 1/2 inch thick, or 1.6 m^3. That's about 3600 kg of gypsum, which has a heat capacity of about 4000 kJ per degree C, or about 0.1 liter (0.027 gallons) of heating oil equivalent.
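The arithmetic above is easy to reproduce; the gypsum density is my own assumed figure (~2300 kg/m^3), the rest comes from the numbers already given:

```python
# Heat capacity of a 10% air exchange vs. the room's drywall.
AIR_DENSITY = 1.225      # kg/m^3
AIR_CP = 1.0             # kJ/(kg.K), dry air
GYPSUM_DENSITY = 2300.0  # kg/m^3, approximate (assumed)
GYPSUM_CP = 1.09         # kJ/(kg.K)

room_volume = 6 * 9 * 2.3                              # ~124 m^3
air_heat = 0.10 * room_volume * AIR_DENSITY * AIR_CP   # kJ/K, exchanged air

wall_area = 2 * (6 * 2.3) + 2 * (9 * 2.3) + 6 * 9      # walls + ceiling, m^2
drywall_volume = wall_area * 0.0127                    # 1/2 inch thick
drywall_heat = drywall_volume * GYPSUM_DENSITY * GYPSUM_CP  # kJ/K

print(f"air: {air_heat:.0f} kJ/K  drywall: {drywall_heat:.0f} kJ/K "
      f"({drywall_heat / air_heat:.0f}x)")
```

The drywall alone holds a couple hundred times more heat per degree than the exchanged air, which is why the one-off 10% turnover barely registers.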
Cheap apartments are also poorly air-sealed, which means they're already losing a great deal of their heat through air leaks. An extra one-time 10% turnover won't make a puddle of difference.
In winter in particular, the rate of through-roof air leaks is also greater when temperatures are warmer due to the stack effect. The author is completely wrong on this front - it's far better to let your house cool down and then re-heat.
The only time this isn't true is if you have a two-mechanism heating system such as a heat pump with resistance heat backup, where a large temperature swing invokes the more expensive resistance heater. The author doesn't.
...but the ideal structure is not supposed to exchange air with the outside, particularly because of this. That is why windows and doors have seals on them. I'm no mechanical engineer, but I've worked with enough of them in my career to understand that their goal is usually keeping a structure tightly sealed when ingress/egress paths are closed (of course, opening windows/doors is a choice of the occupant, which, if you're striving to reduce heat loss/gain, you shouldn't do).
An ideal structure should minimize convection, but rooms for people really can't be allowed to undergo large changes in temperature without exchanging air with the outside environment, because that would imply a large change in pressure.
A 10 percent increase over atmospheric pressure might not seem like that much, but it's enough to notice, and would probably be uncomfortable. It's like being under 1 meter of water. It's also enough to break large windows.
I don't think comfort would be much of an issue given a slow adjustment period, but structurally it would be ludicrous. You'd be unable to open or close doors. Your walls would have to be built a hundred times stronger than the walls of a normal house. Any breach in the pressure envelope would be a miniature version of Aloha Airlines 243.
This is, by the way, why hard drives have filtered air holes rather than being completely sealed.
"the ideal structure is not supposed to exchange air with the outside"
No, the ideal structure should minimize heat loss from air exchange.
I'm not fully up on my air exchange rates, but values in the 4-20 range are fairly typical, that is, the interior air is exchanged with the exterior 4-20x per hour.
In my Thorsten Chlupp references elsewhere you'll find he makes extensive use of heat exchangers which minimize thermal losses. He does this via a twofold process for his Fairbanks, AK, homes: entering air is first routed through the ground, where it's warmed from very cold ambient temperatures of as low as -40C / -40F to a temperature closer to freezing (~0F). It's then passed through a heat exchanger where the exiting warm air transfers much of its heat to the entering cold air.
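As a rough sketch of why that recovery matters, here's ventilation heat loss with and without an exchanger; the flow rate and the 80% effectiveness are assumed example values, not Chlupp's numbers:

```python
# Ventilation heat loss with and without heat recovery (assumed values).
AIR_DENSITY = 1.225   # kg/m^3
AIR_CP = 1.0          # kJ/(kg.K)

flow_m3_h = 150.0          # ventilation rate, ~0.5 ACH for a small house
t_in, t_out = 20.0, -40.0  # C: interior vs. a Fairbanks-grade winter
effectiveness = 0.80       # fraction of exhaust heat transferred to intake

loss_kw = flow_m3_h / 3600 * AIR_DENSITY * AIR_CP * (t_in - t_out)
recovered_kw = effectiveness * loss_kw
print(f"raw loss: {loss_kw:.1f} kW  recovered: {recovered_kw:.1f} kW")
```

At a 60-degree (C) differential, the exchanger claws back the large majority of a multi-kilowatt ventilation load.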
The purpose of tightly sealed windows and other possibly entry/exit points isn't to eliminate air exchange so much as to control it: you want air entering and exiting through your designated ventilation systems and transferring heat properly, not traversing the envelope arbitrarily.
Another way of putting it: A well-designed structure should minimize random, unintentional air exchange, but provide sufficient deliberately-engineered ventilation to keep the air and people happy. For efficiency, that ventilation should go through a heat recovery ventilator (HRV) or energy recovery ventilator (ERV) as appropriate to the climate and budget. (An energy recovery ventilator also exchanges moisture).
Beyond Chlupp, Lstiburek is a great engineer and writer on these topics.
Perhaps I lacked the verbosity in my original comment, but this was what I meant to imply: ventilation is part of the design. Unintentional exchange is to be avoided.
>...but the ideal structure is not supposed to exchange air with the outside particularly because of this.
With regard to letting a house cool down at night, that's impossible. If you heat a quantity of air from 273 K to 300 K at constant volume, the pressure increases from 1 atm to 1.1 atm. That may not sound like much, but it's better expressed as a pressure differential of 10 kilonewtons per square meter, and the surface area of your house is such that it could severely damage the walls and blow the door open -- popping it like a balloon.
(The fundamental mathematical error made by those who minimize this effect is ignoring the surface area of the building)
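Under those assumptions (a rigid, perfectly sealed house, ideal gas heated at constant volume), the numbers are straightforward; the door area is an assumed example value:

```python
# Sealed house at constant volume: P proportional to T, then the force
# the resulting pressure differential exerts on an ordinary door.
T_cold, T_warm = 273.0, 300.0   # K
P_ATM = 101_325.0               # Pa

p_warm = P_ATM * T_warm / T_cold
dp = p_warm - P_ATM             # ~10 kPa differential

door_area = 0.9 * 2.0           # m^2, a typical door (assumed)
force_n = dp * door_area
print(f"dp: {dp/1000:.1f} kPa  force on door: {force_n/1000:.1f} kN")
```

Roughly 18 kN on a single door, the weight of a small truck, which is why real houses equalize pressure by exchanging air instead.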
It's most definitely not the goal to suffocate the inhabitants, no, and the goal also isn't to have the natural humidifiers living inside create a tropical climate rife with microorganisms and such. Yes, you want to avoid excessive air exchange because you'd lose a lot of heat that way, but tightly sealing a house would be a very bad idea indeed. If you do make the outer envelope essentially airtight, that gives you very low energy consumption and is part of the passive house concept, but then you need active ventilation, which obviously implies a hole in the envelope and thus keeps the pressure inside equalized with the pressure outside (modulo a small differential caused by the ventilation itself).
Funny how these things have been well solved in places with real winters (or summers) for hundreds of years isn't it?
Central European buildings used to have walls so thick as to achieve a cave effect, which gave them nearly constant temperature through most of the year. With some minimal-ish heating during winter as a side-effect of cooking with wood to combat the effect of bad windows.
Not sure if your comment was a general statement or aimed at the author of the article (Paul), but for the record Montana has "real" winters. I lived a much similar life in rural Utah for many years, doing many of the same lifestyle optimizations that Paul is engaged in.
I don't know what Paul's house is built of, but mine was solid brick, which probably matched the structure and function of the dense brick/cob buildings of Europe. The thermal mass was a blessing, as my only form of central heating was a wood/coal-burning cylinder stove in the center of the home. In the dead of winter (lows often in the -5 to -15F range at night), a short, intense burn first thing in the morning would charge the house, with minimal burning during daylight hours to maintain a comfortable temp around the core living areas (55 to 65F). Just before bed, load and tweak the stove for a long slow burn. By morning, even on the coldest of nights, the temp never dropped below 45F.
As for his heat-the-person-not-the-room method, that's an obvious, old technique which I applaud Paul for using. For me, a thick down comforter on the bed pre-charged with heat via old-fashioned hot water bottle an hour before (they have nice, newfangled silicone models these days) was all it took. The water was, of course, heated on the wood stove.
During the day, wise clothing (layers, wool, hats) and space heating made the place livable. As did the occasional bout of labor to build up heat (an old saying is that chopping wood heats you twice). Yes, there is also some acclimation that takes place. Several years after abandoning the lifestyle, I still cannot tolerate indoor temps much warmer than 70.
Frankly, given that Paul has a rocket mass heater (much, much more efficient than my stove ever was), I am surprised he needs much in the way of spot-heating. Unless his home is of stick/frame construction.
In Austria and parts of Germany, rural houses have huge eaves. Wood is collected and stacked under the eaves providing extra thick walls/insulation during the winter.
Sure, if you have shitty insulation. The proper strategy is to have good insulation, and then to have the heat on constantly using a CV (central heating) system at low temperatures, because heating constantly with water at 30 degrees (C, that is) is much more energy efficient than heating in peaks with much warmer water (60, 70, even 90 degrees 20 years ago). So yes, in houses with crap insulation, warming constantly is less efficient. However, in properly built houses, warming constantly is more efficient. (I'm living for a few months in New Zealand - oh my, the quality of the houses here is atrocious. It's one of the main reasons I couldn't live here permanently. And then they put in a 'heat pump' (an air-to-air one) and call themselves 'green'. Don't get me started.)
How can heating be any less efficient than 100%? I get that engines can be less than 100% efficient, since the remaining energy is dissipated as heat, and heat is not the desired energy in that case.
What happens with the remaining energy if you heat with less than 100% efficiency? Does it start to rotate things?
Heating can be more than 100% efficient. Before you go "what, lol, no" - let me clarify. Obviously nothing exceeds 100% efficiency in the strict thermodynamic sense, but a heat pump, for example, can deliver 3-4 kW of heat from 1 kW of input power. The rest of the energy comes from the environment, obviously, but the naive assumption that 100%-efficient power-into-heat is the best you can do is not true.
Are there any efficient electric heating systems for a household based on a thermodynamic cycle with the environment?
I think most electric heating systems I have seen use the rather brute-force and "inefficient" (only near 100%) method. The basic principle is that they heat a resistor, and something quickly transfers the heat away so the resistor doesn't burn out and/or your house doesn't catch fire.
Another fun method to increase the "efficiency" of electric heating would be heating with Bitcoin miners. It wouldn't make the heating more efficient in the sense that it would still cost the same energy, but at least you could get back some of the money spent on heating.
I'm not quite sure what you mean by your first question, but I think heat pumps using air, ground, or water as heat reservoirs would qualify. They are in common household use, at least where I am located, and have an input-power-to-heat efficiency of up to 400% (the Carnot cycle imposes a limit of roughly 7-10x, depending on which exact scenario you calculate).
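The Carnot figure is easy to reproduce: for heating, the ideal COP is T_hot / (T_hot − T_cold) in absolute temperature. The example temperatures below are assumptions, not measured data:

```python
# Carnot limit on heat-pump COP in heating mode.
def carnot_cop_heating(t_hot_c: float, t_cold_c: float) -> float:
    """Ideal heating COP for hot-side / cold-side temps given in Celsius."""
    t_hot = t_hot_c + 273.15
    t_cold = t_cold_c + 273.15
    return t_hot / (t_hot - t_cold)

print(carnot_cop_heating(35, 0))    # air-source, freezing outdoors: ~8.8
print(carnot_cop_heating(35, 10))   # ground/water reservoir at 10C: ~12.3
```

Real machines deliver maybe a third of the Carnot figure, which is consistent with the ~400% (COP 4) mentioned above.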
'Efficient' not in the thermodynamic sense, but in the sense that you need to put in x % less energy to have the same comfort level (i.e, have the house at a comfortable temperature at the times you need it).
And to add another use of 'efficient' to the mix, just to keep things interesting ;) , when people say 'heating a house with a CV (water) based system powered by natural gas is more efficient than using electricity', what they mean is that A) electricity is generated from other sources and there are losses in the conversion/transport/etc, whereas for gas that's not so; and B) that it's just much cheaper. So yeah, sorry for the confusion, but in energy use land, 'efficient' is a highly overloaded word.
Electric heating has near 100% efficiency regardless of how you use it[0]. Heating with a furnace is less efficient, since some of the heat escapes directly through the chimney, and efficiency typically varies inversely with heating power, so it's better to heat constantly[1].
In South Africa, most homes are built with little consideration for cold weather, since the winters are relatively mild. The problem is that draughts and poor insulation then result in energy usage spiking significantly in winter.
Most houses have poor insulation. Also, adding good insulation without degrading indoor air quality and causing moisture/mildew problems is tricky. So improving the energy efficiency of heating for existing houses with lesser insulation is very useful.
My German apartment has drafty, unsealed wooden window frames, and the window in the kitchen is a single large sheet of glass. We have to hang a rug over that window in the winter, as it feels like you've left a freezer door open if you get within 3 feet of it.
I don't speak German so I cannot look it up, but my guess is these standards are for new buildings or buildings that are being renovated (and probably the numbers are different for private and commercial buildings as well, which is a shame), and owners of an existing building cannot be forced to adhere to the standard.
Thickness of walls is of course related to insulation, but it isn't everything; for example, 20cm of typical insulation still has a much lower U-value than a concrete wall 100cm thick.
It is a common complaint among people who moved from Finland to Germany, how dodgy the German houses are. If the rest of the world is even worse, maybe I should never leave home.
Hell, if you have good insulation, you don't really need any heating at all beyond that produced by cooking, lighting, and body heat of the occupants; except in really exceptional circumstances. You do need air exchange, preferably heat recovering.
Yeah, you'd probably need to spend over 1 million on a home here to get one with good insulation. Housing cost/quality is my biggest complaint about NZ.
This all depends on the type of heating system you have, the weather where you live, whether you have a humidifier on your heating system, and the insulation level of your house.
How much of this is thrown out the window when you have "peak pricing" for energy?
Also, the "straightdope" link doesn't exactly dispute this; it finds that setting the thermostat down ultimately yields a much more modest ~6% energy savings on average. Nothing to scoff at, but ultimately you come out better by taking a holistic approach. If you can afford it, of course.
Peak pricing isn't particularly common in residential power metering, though I suppose it might arrive.
The answer would depend on your heating/cooling cycle and usage. Most usage peaks are bimodal: early morning and early afternoon. Those would tend to correspond to morning and evening heating peaks, assuming your residence is largely uninhabited during the day. Overnight demand is usually low.
This could lead to, say, greater use of steam/water heating using thermal storage. Heating a well-insulated water storage tank with off-peak energy, then transferring that to the structure when it's most needed, would be a form of demand averaging / peak shifting. Depending on the storage methods used, boiler explosions might become an increased risk.
If you're using direct-delivery methods, e.g., radiant electric heat as described here, peak pricing would make some of the options used less beneficial on a cost basis.
I agree with all your theory. I thought the same thing. Then I went out and bought a programmable thermostat, set it quite low overnight and during the day, and warmer from 5-8 AM and 5-8 PM. My heating bill was higher than just maintaining 68-70 all the time. And that's not even counting the $100 I spent on the thermostat (now on a shelf in the garage; I put the old manual one back in its place).
Either the laws of physics are different than well established theory says they are... or your thermostat didn't do a very good job. I vote for the second one.
Or there is more than one physical effect in play: see https://news.ycombinator.com/item?id=8112118 -- there's air movement as well as heat, because your house is not a sealed box.
Putting in heat over a short time, to recover from low temperatures, is different from putting in heat over an extended period. If your heater has very low efficiency at high heat output, then it seems highly possible that extended low output could be better. And it's primarily financial efficiency we're talking about when people are looking to heat their house more efficiently.
If you're heating with electric for example you can sometimes buy electricity far cheaper at night (demand is low and traditional production can be easily spun-down). Thus buying electricity all night and keeping the house warm, whilst using more electricity, could be cheaper than paying for peak rate electric.
Physics alone could explain it in very rare circumstances - board construction that expands in warm air closing gaps, cool air opens gaps and causes more cooling. Heat exchange rate from in- to outside is then possibly greater at lower temperatures. Houses aren't simple to model.
Virtually all thermostat-controlled heating systems I've seen don't have different output levels, but are just on or off. So poor efficiency at high output wouldn't come into play.
The rest is possible, but I'd file using expensive electricity instead of cheap electricity under the thermostat not doing a very good job, or more precisely the person setting it up not doing a good job of setting it up to run more cheaply.
Gas fired combi boilers (with centrally heated [pumped] cycling water) are pretty much the default in the UK. I've never seen one that didn't have output temp adjustments. I'd imagine that set highest (which should reach thermostat set target temp fastest) would produce greatest flue losses but I'm speculating there.
I meant controlled by the thermostat. I'm sure you can adjust it manually, but in normal operation, it's cycling between on and off, and nothing more. Output level when heating your house from 50F to 70F is the same as heating your house from 69F to 70F, it just runs longer.
I'm sure there are exceptions to this somewhere, but it doesn't seem common. Thermostats don't typically have a way to command anything besides on and off.
From a thermal energy perspective this simply isn't possible.
However, some systems do lead to this because of flaws in the implementation. My ground heat pump, for instance, has a backup electric heat system that it will automatically, and unavoidably, utilize if it hasn't reached the target temperature within a set, relatively short period of time. So in my very well insulated home I do indeed see significant cost increases if I do temperature setbacks, as the recovery period sees the more expensive/less efficient electric heat kick in, versus just incrementally using the heat pump through the day. Some fuel-based systems go to a less efficient high-heat stage in the same sort of situation.
You are heating the walls and floors, which take a long time to come up to temp. I liken it to the difference between keeping a large truck at 65 mph compared to getting it up to speed.
"I think that this does produce some savings, but not as much as you might think. If you set your thermostat to a constant 70, the heater works a little at a time throughout the day. If you drop it to 50 at night or in the middle of the day, the heater stops working, but then when the time comes to warm the house again, the heater has to work at full power for a long time to get the temp back up - thus losing a lot of your savings."
WRONG WRONG WRONG WRONG WRONG WRONG WRONG WRONG WRONG
Heat losses are driven by two factors:
1. The temperature differential between the hot and cold sides.
2. The thermal conductivity (or exchange) between the hot and cold sides.
That's straight out of Newton's Law of Cooling / Fourier's Law:
http://en.wikipedia.org/wiki/Convective_heat_transfer#Newton...
See:
http://www.uswitch.com/energy-saving/guides/heating-on-all-t...
http://www.straightdope.com/columns/read/2970/does-turning-d...
http://energy.gov/energysaver/articles/thermostats