Quote:
Originally Posted by JLeising
By the way, love your posts, keep up the good work! This is only the second one I disagreed with ... gives you about a 99.9999% hit rate!

Well thank you.
I think I see what you are saying. That as the voltage drops, and since the resistance of the element is fixed, it forces the amperage to decrease as well effectively reducing the efficiency of the heater. IE you won't get the full 1000 Watts of heat transfer to the water. (Like a dim bulb trying to light a room).
I was not using Ohm's Law, V (or E) = IR, for the calculations, but the one for P (power). Power, as you may remember, is the rate at which electrical energy (in joules) is delivered over time (in seconds).
1 joule of energy delivered in 1 second is 1 watt.
The formula to calculate watts is amps times volts, or P = IV. The effective delivery of that electricity over time is P, or power.
So in order to calculate the actual effect of a voltage reduction, we would need to use the equations in this article
(http://en.wikipedia.org/wiki/Power_(physics)) to determine the new current being drawn at the reduced voltage, based on the "mostly fixed" (temperature-dependent) resistance (R) of the heating element core at the lower voltage.
Then you can multiply the actual incoming voltage (say 100 Volts) times the calculated (or observed) amperage to find the actual watts being delivered to the water.
So if the heater at "standard conditions" delivers 1000 watts to the water by drawing 8.33 amps at 120 volts, we can use that amperage to calculate the resistance (R) of the element using Ohm's Law, E = IR, or 120 = 8.33 × R.
This gives the 6-gallon Suburban heating element an R of 120/8.33, or 14.4 ohms. So if the incoming voltage drops to 100 volts and the resistance stays "almost" fixed at 14.4 ohms, then the current drawn will be I = V/R, or 6.9 amps instead of 8.33.
Since only 6.9 amps of current at 100 volts is being delivered to the heating element, the wattage of the element falls to
about 694 watts (an almost 30% reduction in an already inefficient way of heating the water heater).
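To double-check the arithmetic, here is a quick sketch of that same derating calculation in Python, using the numbers from the 6-gallon Suburban example above (1000 W rated at 120 V, with 100 V assumed at the pedestal):

```python
# Derating of a fixed-resistance heating element at reduced voltage.
RATED_WATTS = 1000.0   # nameplate power at standard conditions
RATED_VOLTS = 120.0    # nameplate voltage
LOW_VOLTS = 100.0      # assumed sagging campground voltage

rated_amps = RATED_WATTS / RATED_VOLTS   # I = P / V, about 8.33 A
resistance = RATED_VOLTS / rated_amps    # R = V / I, 14.4 ohms
low_amps = LOW_VOLTS / resistance        # I = V / R, about 6.94 A
low_watts = LOW_VOLTS * low_amps         # P = V * I, about 694 W

print(f"Element resistance: {resistance:.1f} ohms")
print(f"Current at {LOW_VOLTS:.0f} V: {low_amps:.2f} A")
print(f"Power at {LOW_VOLTS:.0f} V: {low_watts:.0f} W")
```

Since P = V²/R for a fixed resistance, the power scales with the square of the voltage: (100/120)² ≈ 0.69, which is where the roughly 30% reduction comes from.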
If I got that right, it makes even MORE sense to use propane to heat the water in the summer, when delivered volts are reduced by the power company and the physics of the campground's power grid.
I assume that since a motor or "inductive" load will not function at all at reduced power, the capacitors keep the watts being delivered pretty much constant, and as such the amperage demand increases as the voltage drops until the breaker pops.
I got it now. Thank you.
HE CAN BE TAUGHT!