I would like to measure the temperature of some wire I am using while it carries high current.
I am using the resistance of the copper and a high-accuracy voltmeter.
I am avoiding connectors and connecting my probes directly to the wire,
but I have an issue: as I watch my voltmeter (at a stable current), the voltage is slowly decreasing.
If the wire were heating, the voltage across it would go up, not down... I'm confused.
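For context, here is roughly the calculation I plan to use, as a Python sketch. It assumes annealed copper and uses the standard 234.5 °C inferred-absolute-zero constant for copper; the function name is just mine:

    # Estimate copper wire temperature from its resistance ratio.
    # Standard form for annealed copper: T2 = (R2/R1) * (234.5 + T1) - 234.5
    def copper_temp_from_resistance(r_hot, r_ref, t_ref=20.0):
        """Return wire temperature in deg C from hot and reference resistance."""
        return (r_hot / r_ref) * (234.5 + t_ref) - 234.5

    # Example: a 5% rise in resistance from a 20 deg C reference
    print(copper_temp_from_resistance(1.05, 1.00))  # about 32.7 deg C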
I don't, but it's a nice TDK Lambda power supply, so I would expect it to perform.
Are you saying that maybe the PSU is dropping the current as time progresses?
Monitoring the temperature of a wire? I admit it sounds pretty strange to me. I suppose mounting a temperature sensor on part of the wire is not going to be considered here?
Are you saying that tungsten has a negative temperature coefficient?
If so, please explain why all incandescent lamps don't self-destruct when power is applied to them...
Something is obviously wrong. I just tried the same little experiment using a few lab standards. I set the current source for 10 amps and intentionally used a lightweight (maybe AWG 18) banana patch cord. I measured the voltage drop with a lab DMM.
Applying a fixed and very accurate 10 amps at the start:
V = 0.217 V, so R = 0.0217 Ω
About a minute later:
V = 0.224 V, so R = 0.0224 Ω
Exactly as expected, and the wire was getting warm to the touch. The only way this would not produce these results is if the current were not held exactly constant, or if the wire resistance were decreasing, and that is not going to happen.
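For what it's worth, plugging those two readings into the usual annealed-copper formula (the 234.5 °C constant; room temperature assumed at 23 °C) gives a believable number:

    # Temperature implied by the measured resistance change (assuming a copper cord)
    r1, r2 = 0.0217, 0.0224   # ohms, from the two 10 A readings above
    t1 = 23.0                 # assumed room temperature, deg C
    t2 = (r2 / r1) * (234.5 + t1) - 234.5
    print(round(t2, 1))       # about 31.3 deg C, i.e. warm to the touch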
Well, having a precision power supply doesn't really tell us anything. If it's in constant-voltage mode, you won't get constant current. You have to set it up for constant current. Also, being "nice" doesn't mean it's capable of delivering the required current.
V/I is voltage in volts divided by current in amps; it has units of ohms.
A tungsten lamp's filament resistance is about 10x higher when on than at room temperature.
V = IR is Ohm's law, but if you hold I constant and call it k, it takes the form V = kR. Now, by inspection, if V increases, then R has to increase at constant current.
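A quick numeric illustration, borrowing the patch-cord numbers from above:

    # At constant current, voltage tracks resistance directly: V = k * R
    k = 10.0                        # amps, held constant
    for r in (0.0217, 0.0224):      # cold and warm resistance, ohms
        print(k * r)                # 0.217 V, then 0.224 V: V rises with R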
Lamps don't self-destruct because the filament is in an atmosphere without oxygen. The temperature the wire reaches will depend on the current, the diameter of the wire, and the amount of cooling. As the lamp ages, the filament's diameter shrinks, and thus, for the same voltage, it gets hotter.
Tapping a hot pot with your finger and holding it there are different things. The duration and area of contact determine how serious the burn is.
Either I am misunderstanding some subtlety, or you mean that for tungsten, resistance does not increase with temperature but decreases.
To provoke thought, I asked the question about a lamp filament self-destructing, which it would do through thermal runaway if the resistance decreased as it got hotter.
Remember that K = °C + 273 (approx.),
and R = ρL/A; resistance is resistivity (ρ) multiplied by length and divided by cross-sectional area.
Resistivity is a material property.
Resistance depends on geometry.
The bulb is designed to tolerate the lower cold resistance, but it does cause a blip of higher current draw on start-up.
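To put numbers on both points, here is a short sketch. The copper resistivity, wire length, diameter, and lamp ratings are all assumed values for illustration:

    import math

    # R = rho * L / A: resistance from resistivity and geometry
    rho_cu = 1.68e-8                # ohm·m, copper at 20 deg C (assumed)
    length = 1.0                    # m (assumed)
    d = 1.02e-3                     # m, roughly AWG 18 diameter (assumed)
    area = math.pi * (d / 2) ** 2
    print(rho_cu * length / area)   # about 0.021 ohm per metre

    # Tungsten start-up blip: cold resistance roughly 1/10 of hot
    v = 120.0                       # volts, assumed mains
    r_hot = v ** 2 / 60.0           # a 60 W bulb runs at about 240 ohm hot
    r_cold = r_hot / 10.0           # about 24 ohm cold (the 10x figure above)
    print(v / r_hot, v / r_cold)    # about 0.5 A running vs 5 A at switch-on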
Both sources of information you linked to seem to counter what you stated, KISS. The resistance of tungsten increases with an increase in temperature. Resistance and resistivity are not inversely proportional, so as resistance increases, so does resistivity.
However, if resistance decreased with an increase in temperature, that wouldn't necessarily mean the filament would self-destruct. It could still reach an equilibrium before that event took place.
There is a difference in the behavior of the voltage in constant-voltage vs. constant-current mode. I didn't think the OP was dividing the measured V by the measured I.
Yes, resistance and resistivity are directly proportional for the same geometry.
The (resistance/resistivity) of Tungsten and Copper behave very differently with respect to temperature.
You're probably getting a thermoelectric voltage at the junction of your probes and the hot copper wire.
Try this experiment. Let the current be on for a while. After the voltmeter reading has done the slow decrease, turn the current off and see if your voltmeter reads zero. If it doesn't, you're seeing some thermoelectric voltage included in your reading.
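If it doesn't read zero, you can at least subtract the offset arithmetically. A sketch, where all the readings are hypothetical values for illustration:

    # Remove the thermal-EMF offset from the live reading
    v_on = 0.224    # volts, reading with current flowing (hypothetical)
    v_off = 0.0013  # volts, reading just after switching current off (hypothetical)
    i = 10.0        # amps
    print((v_on - v_off) / i)  # corrected resistance, ohms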