I want to use a thermistor as a crude flow meter and was wondering:
When they give the maximum power rating, is this due solely to damage caused by the thermistor self-heating above its maximum temperature? In other words, if heat transfer away from the thermistor were improved (e.g. with higher air flow or a lower ambient temperature), could you go above the max power rating?
I have seen a few derating curves where the rating is 100% up to 25 °C and then decreases linearly to zero at the maximum temperature, but I don't understand why you couldn't allow higher power at lower temperatures.
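For reference, the derating curve described above can be sketched numerically. The 75 mW / 125 °C figures below are illustrative values (borrowed from the part discussed later in the thread), not from any specific datasheet:

```python
def derated_power_mw(ambient_c, p_rated_mw=75.0, t_knee_c=25.0, t_max_c=125.0):
    """Maximum allowed dissipation (mW) at a given ambient temperature,
    for a linear derating curve: 100% up to t_knee_c, zero at t_max_c."""
    if ambient_c <= t_knee_c:
        return p_rated_mw
    if ambient_c >= t_max_c:
        return 0.0
    return p_rated_mw * (t_max_c - ambient_c) / (t_max_c - t_knee_c)

print(derated_power_mw(25.0))   # full rating: 75.0 mW
print(derated_power_mw(75.0))   # halfway between knee and max: 37.5 mW
print(derated_power_mw(125.0))  # zero at max temperature
```

Note the curve only ever goes down from the 25 °C rating; it never grants extra headroom below 25 °C, which is exactly the question being asked.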
At a rough guess, I'd say you could probably push the power rating a bit if you have a good cooling method. Personally, though, I'd be more conservative and pick a device rated higher than you need, rather than run one above its rating.
Thanks for the replies guys. I'll try and clarify a bit.
Say I design a circuit to keep the thermistor at a constant temperature, 70 °C for example. What if the thermistor is rated OK up to 125 °C and 75 mW, but to hold it at 70 °C it has to dissipate 150 mW (because, say, there is cold air whistling past it)? Is it going to be OK doing that, or is there a problem with the leads or something that stops it carrying that much current?
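The constant-temperature scenario above can be sketched with the dissipation constant δ (mW/°C) from a thermistor datasheet: bead temperature ≈ ambient + P/δ. The δ values below are assumptions for illustration only, but they show why airflow raises the power needed to hold 70 °C, which is the whole flow-meter principle:

```python
def body_temp_c(power_mw, ambient_c, delta_mw_per_c):
    """Steady-state bead temperature from self-heating:
    ambient plus power divided by the dissipation constant."""
    return ambient_c + power_mw / delta_mw_per_c

def power_for_temp_mw(target_c, ambient_c, delta_mw_per_c):
    """Power required to hold the bead at target_c in a given airflow."""
    return (target_c - ambient_c) * delta_mw_per_c

# Assumed dissipation constants: ~1 mW/°C in still air,
# perhaps ~3 mW/°C in a strong airflow (illustrative numbers).
print(power_for_temp_mw(70, 20, 1.0))  # still air: 50 mW
print(power_for_temp_mw(70, 20, 3.0))  # strong airflow: 150 mW
```

So in moving air the bead can dissipate well over its still-air rating while its body stays at a safe 70 °C; the open question is whether some other limit (lead current, for instance) still applies.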
Well, I think there are two factors to consider: the temperature, which you say is within range, and the current passing through it.
You can hold it at 10 degrees, but if you put 0.5 A through it and it's only rated for 0.3 A, you can damage it.