
EDR: Did they really paint radiators?

michael_15
michael_15 Member Posts: 231
So like most of you, I've heard the story about dipping radiators in paint and so forth to determine the EDR of radiators. . . .

Is this really true? It doesn't seem to me that this would arrive at the correct answers. The true answers would depend fairly strongly on the shape of the radiator. For example, taller radiators should have lower BTU/h ratings than shorter radiators of the same EDR.

The air in between the columns (or tubes) of my radiators near the top is always warm (naturally). It's well over 100 degrees. This is because the convecting air picks up enough heat on its journey from the bottom of the radiator to the middle of the radiator to reach this temperature.

However, this means that the BTU/h output of the top of the radiator is limited by the temperature of the surrounding air. Say the radiator's output is 30% radiant and 70% convective: of the 240 BTU/h per square foot (steam, 215°), 168 BTU/h is convective, and that's the portion that gets lowered. If the surrounding air near the top of the radiator is, say, 110 degrees (a 105° difference instead of a 145° difference), that convective portion may only be worth 112 BTU/h or something along those lines (roughly 2/3rds the power: these are estimates), meaning the output of the square foot of EDR near the top of the radiator is only 184 BTU/h. (I didn't reduce the radiant portion of the heat loss here.)
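The arithmetic above can be sketched like this (the 30/70 split, the 110° air temperature near the top, and the idea that convection scales with the temperature difference are all the post's own estimates; a strictly linear scaling gives about 122 BTU/h rather than the rougher 2/3rds figure of 112):

```python
# Back-of-envelope sketch of the post's estimate (not published ratings).
STEAM_RATING = 240.0       # BTU/h per sq ft EDR, steam at 215 F
CONVECTIVE_FRACTION = 0.7  # assumed 70% convection / 30% radiation
STEAM_TEMP = 215.0         # F
ROOM_AIR = 70.0            # F, air entering at the bottom of the rad
TOP_AIR = 110.0            # F, pre-warmed air reaching the top sections

convective = STEAM_RATING * CONVECTIVE_FRACTION  # 168 BTU/h convective share
radiant = STEAM_RATING - convective              # 72 BTU/h radiant share

# Assume convective output scales linearly with the rad-to-air temperature
# difference (a simplification; real natural convection is closer to dT**1.25).
scaled_convective = convective * (STEAM_TEMP - TOP_AIR) / (STEAM_TEMP - ROOM_AIR)

top_output = radiant + scaled_convective
print(round(scaled_convective), round(top_output))  # prints 122 194
```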

This, of course, doesn't even get into the fact that the radiant energy coming out of the middle tubes of the radiator isn't living up to its potential.

This lowered heat-loss performance based on shape can be seen, for example, in the published ratings for heat loss from bare steel pipe. A 2-inch steel pipe loses around 405 BTU/h per square foot of pipe surface, but a 4-inch pipe loses only 363 and a 10-inch pipe only 327. (All of which are greater than the 240 BTU/h we use for cast iron radiators.)
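The shape effect in those pipe ratings can be checked with a quick calculation: the bigger pipe loses less per square foot but still more per linear foot, because its surface area grows faster than its per-square-foot rating shrinks. A sketch (per-square-foot figures from the post; the outside diameters are the standard ODs for those nominal pipe sizes):

```python
import math

def per_linear_foot(od_inches, btu_per_sqft):
    """BTU/h lost per linear foot of bare pipe."""
    sqft_per_foot = math.pi * od_inches / 12.0  # surface area of 1 ft of pipe
    return btu_per_sqft * sqft_per_foot

for name, od, rating in [("2-inch", 2.375, 405.0),
                         ("4-inch", 4.5, 363.0),
                         ("10-inch", 10.75, 327.0)]:
    print(f"{name}: {rating:.0f} BTU/sq ft -> "
          f"{per_linear_foot(od, rating):.0f} BTU per linear foot")
```

The per-foot numbers come out roughly 252, 428, and 920 BTU/h: per square foot the big pipe is the weakest, per foot of run it's by far the strongest.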

Anyway, this was just going through my mind as I was sitting on the train today. Thought I'd post it and see what you guys think. . . I hope I made sense.

Comments

  • Al Letellier
    Al Letellier Member Posts: 781
    edr

    I think you spend too much time on the train...seriously though, dipping in paint just gives you the area of heating surface of the irregularly shaped surfaces of the radiator. As a heating contractor with much success working with steam heat, that's close enough for me. I let the engineers worry about all the stuff you mentioned. While it's great to have all that background information, and it can be interesting, it simply doesn't merit a lot of space in my brain.

  • David Efflandt
    David Efflandt Member Posts: 152
    approximation better than nothing

    Other factors are involved too, like the fact that heat transfer is most efficient when the two substances are closest in temperature: you have condensate at the bottom heating the coolest air, and the hottest steam at the top heating the warmest air.

    You would think that color would make a difference too, because dark colors radiate and absorb heat more effectively than light colors. However, other than metallic colors, there is little difference. My guess is that less heat radiated from light colors bounces off of adjacent sections, and more heat radiated by dark colors is absorbed and re-radiated by adjacent dark sections.

    Anyway, it all averages out close enough that an EDR estimate based on actual measurements is better than no measurements at all.
  • Mike T., Swampeast MO
    Mike T., Swampeast MO Member Posts: 6,928


    Makes perfect sense, and you're not the only one who thinks about such things...

    For steam the ratings are very accurate because it's easy to test. If you know the pressure of the steam and measure the quantity and temperature of the condensate, you need only put different radiators in a controlled space for comparison.
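    The condensate method works because the latent heat of steam does the bookkeeping: every pound of steam a radiator condenses gives up roughly 970 BTU at near-atmospheric pressure. A sketch of the arithmetic (the 30 sq ft radiator is a made-up example, not a test result):

    ```python
    LATENT_HEAT = 970.0  # BTU per lb of steam condensed, near atmospheric pressure

    def steam_output_btu_per_hr(condensate_lb_per_hr):
        """Heat delivered by the radiator, inferred from its condensate."""
        return condensate_lb_per_hr * LATENT_HEAT

    # A 30 sq ft EDR radiator at the standard 240 BTU/h per sq ft rating
    # (7,200 BTU/h total) should return about 7200 / 970 = 7.4 lb of
    # condensate per hour:
    print(round(steam_output_btu_per_hr(7.4)))  # about 7,200 BTU/h
    ```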

    Yes, they definitely found that the shape & size of the radiator mattered. Two-column rads have more output than those with more columns (higher percentage of radiation). Short rads have more output than tall ones (more efficient convection). Pipe coil radiators had the highest output of all. Thin tube rads have almost the same output as two-column but packed more output into a given cubic volume (they could be deeper without sacrificing much radiant output).

    With steam they could also construct very elaborate testing chambers to find the effect of various enclosures and radiator placement.

    With water the testing is much more difficult as it's hard to get accurate flow and temperature data. According to "Mechanical Equipment of Buildings", Harding & Willard, 1929, "The difficulties encountered in testing direct hot water radiators are so great that very few tests of such radiators have been made or reported."

    Even when such tests were made, they were concerned with maximum output at fairly high temperatures--at least 180°. I've yet to find any test results at low temperatures.

    For the most part, they seem to have based water radiator ratings on the performance of steam. The steam didn't change in temperature, but by using a higher air temp in the test chamber, a correction could be made. Again, though, I cannot find any numbers for low differential temperatures--say in the 10° - 60° range.
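    For what it's worth, the usual modern way to fill that gap is to scale the steam rating by the rad-to-air temperature difference raised to an empirical exponent, about 1.3 for radiators under European test standards such as EN 442. A sketch, not measured data, and the exponent is an assumption for cast iron:

    ```python
    # Scale the 240 BTU/h*sqft steam rating (at a 145 F difference) down to
    # hot-water temperatures using a characteristic exponent of about 1.3.
    STEAM_RATING = 240.0  # BTU/h per sq ft EDR at a 145 F rad-to-air difference
    EXPONENT = 1.3        # empirical radiator exponent (EN 442-style)

    def water_output(avg_water_temp_f, room_temp_f=70.0):
        """Approximate BTU/h per sq ft EDR at a given average water temp."""
        dt = avg_water_temp_f - room_temp_f
        return STEAM_RATING * (dt / 145.0) ** EXPONENT

    for awt in (110, 140, 170, 180):
        print(awt, round(water_output(awt)))  # roughly 45, 93, 148, 168
    ```

    The 170° figure lands near the 150 BTU/h per sq ft commonly quoted for hot-water cast iron, which is why this kind of correction is popular even though, as noted above, real low-temperature test data is scarce.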

    The Great Depression, WWII, increasing wages, a decreasing supply of inexpensive cast iron, high housing demand, good thermostats and cheap blowers all seem to have combined to stop serious testing of standing iron radiators--pity...

    When based on output per sq ft EDR per degree of temperature difference between the rad and the air, standing iron rads appear to have their highest output at low temperature, when there is very little convection. As the temperature difference increases, radiation seems to decrease as more kinetic energy is required to move the air. As the temp difference increases further, the height of the rad seems to begin limiting convection, with a corresponding increase in radiation. At least that's what my admittedly crude tests seem to be saying...