Mikebits
Well-Known Member
Here is another thought: if we argue that semiconductors do not follow Ohm's law because their resistance changes with current and temperature rise, can we not apply the same thinking to a piece of wire?
As wire temperature increases, its resistance increases (a positive temperature coefficient), yet we say the wire follows Ohm's law. Of course the wire's temperature coefficient is much smaller and affects its linearity to a far lesser degree, but does the smaller coefficient qualify the wire as obeying Ohm's law? Is the magnitude of the coefficient the deciding pass/fail criterion for meeting Ohm's law?
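For concreteness, here is a minimal sketch of the effect being described, using the standard linear approximation R(T) = R0 * (1 + alpha * (T - T0)). The values are illustrative: R0 = 1 ohm at 20 degC, and alpha ~= 0.00393 per degC, copper's published temperature coefficient. It shows that the wire's V/I ratio is only constant if its temperature is held constant.

```python
# Minimal sketch: how a wire's resistance drifts with temperature.
# Linear approximation: R(T) = R0 * (1 + alpha * (T - T0)).
# Illustrative values; copper's alpha is roughly 0.00393 per degC.

ALPHA_COPPER = 0.00393   # temperature coefficient of copper, 1/degC
R0 = 1.0                 # resistance at reference temperature, ohms
T0 = 20.0                # reference temperature, degC

def wire_resistance(temp_c: float) -> float:
    """Resistance of a copper wire at temp_c (linear approximation)."""
    return R0 * (1.0 + ALPHA_COPPER * (temp_c - T0))

for t in (20.0, 50.0, 100.0):
    print(f"T = {t:5.1f} degC  ->  R = {wire_resistance(t):.4f} ohm")
# T =  20.0 degC  ->  R = 1.0000 ohm
# T =  50.0 degC  ->  R = 1.1179 ohm
# T = 100.0 degC  ->  R = 1.3144 ohm
```

So an 80 degC rise changes a copper wire's resistance by roughly 30 percent, which is small next to a semiconductor junction's behavior but clearly not zero, which is exactly the question: where is the line drawn?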