Electro Tech is an online community of over 170,000 members who enjoy talking about and building electronic circuits, projects and gadgets.

Signal noise over longer distances

Status
Not open for further replies.

NJ Roadmap

New Member
I am planning to interface a temperature sensor but have yet to choose one, because of a question I still can't seem to answer:

Analog sensor or digital?

The reason I am straying away from getting an analog one is that the sensor will be connected to a PIC via a fairly long wire (about 15 metres), and I'm worried that this length can introduce a fair amount of noise and hence reduce accuracy. I could go for a sensor with a current output (like the AD590), which would reduce noise problems, but the only ones I've found are too expensive for my project (>£5).

The reason I am not sure about a digital sensor is that it would have to work over the I2C interface (which my PIC supports), and I've heard that I2C is a bit of a pain to get working. The advantage of a digital sensor, on the other hand, is that signal noise won't affect the readings much, because only two widely-spaced voltage levels are being read (i.e. it's digital).

Can anyone help?

Happy Holidays!

p.s. The other option is wireless, but again my prototype should demonstrate complete cost-effectiveness, and I'm sure wireless Rx and Tx circuits will bring the cost up by a fair amount. If not, can someone point me in the right direction for wireless Rx and Tx circuits for digital use? I'm a n00b when it comes to wireless stuff!
 

eng1

New Member
When I suggested the AD590, I knew it was expensive, but as you've realized, distance is an issue. Noise is one problem, and voltage drop along the wire is another; if the output is a current, you get rid of the second one.
If you choose an analog sensor with a voltage output, you can build an RS232 interface near the sensor with another small PIC.
 

evandude

New Member
If you're familiar with PICs, getting an I2C interface working is really not that bad. A couple of months ago I played with an I2C EEPROM for the first time, and all I did was adapt some very simple existing example code I found; it worked just fine within an hour of starting. And that was bit-banging it rather than using the hardware module, which might be even easier.
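For the curious, here's a rough sketch of what bit-banged I2C looks like in C. This is not the code from that EEPROM project: the pin "drivers" here are stand-ins that just record the SDA level at each SCL rising edge (when an I2C slave samples the data) so the routine can be exercised on a PC. On a real PIC they would toggle the port bits open-drain style, and the ACK clock and timing delays are omitted for brevity.

```c
#include <assert.h>
#include <stdint.h>

/* Stand-in pin drivers: on a real PIC these would toggle the port's
   TRIS/LAT bits.  Here they record the SDA level at every SCL rising
   edge, which is when an I2C slave samples the data line. */
static int sda_level = 1, scl_level = 1;
static uint8_t sampled_bits[16];
static int nsampled = 0;

static void set_sda(int v) { sda_level = v; }

static void set_scl(int v) {
    if (v && !scl_level)                 /* rising edge: slave samples SDA */
        sampled_bits[nsampled++] = (uint8_t)sda_level;
    scl_level = v;
}

/* START condition: SDA falls while SCL is high. */
static void i2c_start(void) {
    set_sda(1);
    set_scl(1);
    set_sda(0);
    set_scl(0);
}

/* Clock out one byte, MSB first (ACK clock and delays omitted). */
static void i2c_write_byte(uint8_t b) {
    for (int i = 7; i >= 0; i--) {
        set_sda((b >> i) & 1);
        set_scl(1);                      /* data valid while SCL is high */
        set_scl(0);
    }
}
```

Calling `i2c_start()` and then `i2c_write_byte(0x9A)` produces exactly eight sampled bits that reassemble to 0x9A, which is the whole trick: the rest of a real driver is just addressing, ACK checking, and delays.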

I2C temp sensors are more expensive than their analog counterparts, but it's definitely a more elegant solution - on the measurement end, you just have one small part, instead of a big circuit, and if you need multiple sensors at the end, I2C is a bus so you can connect them all to the same cable as long as they have separate addresses.
 

Optikon

New Member
NJ Roadmap said:
Analog sensor or digital? [...] the sensor will be connected to a PIC via a fairly long wire (about 15 metres), and I'm worried that this length can introduce a fair amount of noise and hence reduce accuracy. [...] Can anyone help?
Since you are worried about the accuracy of your signal, why don't you start by working out what accuracy you need? That will help narrow the choice of sensor; the low-noise implementation comes later. Low-level analog signals can possibly be run over wires of that distance without degrading your system accuracy. Let's not assume it can't be done before you have chosen a sensor.


What accuracy do you need?
 

Analog

New Member
NJ Roadmap said:
+/- 2 degrees C.
The problem is that most analog sensors output 10 mV per degree C (or K), so if the noise is ~20 mV or greater, my readings could be off.

This is something I'd be interested in: a Microchip I2C-compatible temp sensor for £0.78:
http://www.electro-tech-online.com/custompdfs/2006/12/61869-1.pdf

Not that expensive!
Interesting. I use thermocouples that are run over a hundred feet; they are better than the accuracy you need, and they put out voltages < 10 mV! Sure, there is noise, but most of it is common-mode, which can be rejected by a good instrumentation amplifier. Secondly, the temperatures I read don't change more than once per second, so a good low-pass filter easily kills most of the remaining non-common-mode noise. Finally, a TC is a low-impedance device, so it can drive a long distance quite easily. Perhaps you should look into something like that.
 

audioguru

Well-Known Member
Most Helpful Member
Microphones operate over very long cables and don't pick up noise. Their output is about 10 mV, and since you can't hear any interference, it must be less than 10 µV! They use shielded, balanced audio cables to stop interference pickup.
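A few lines of C show why the balanced pair works: the same interference couples onto both conductors, so taking the difference at the receiving end recovers the signal with the noise cancelled. The signal and noise figures in the test are made up for illustration:

```c
#include <assert.h>
#include <math.h>

/* Balanced transmission sketch: the signal is sent as +s/2 on one
   conductor and -s/2 on the other.  Interference couples equally onto
   both, so the receiver's difference cancels it exactly. */
static double receive_balanced(double signal, double common_noise) {
    double v_plus  = +signal / 2.0 + common_noise;
    double v_minus = -signal / 2.0 + common_noise;
    return v_plus - v_minus;   /* common_noise drops out */
}
```

A 10 mV signal comes through unchanged whether the coupled noise is 0.5 V or -2.3 V, which is why an instrumentation amplifier with good common-mode rejection works so well on long runs.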
 

evandude

New Member
I think it's also very important to make your choice based on the actual environment. If you're using it somewhere like an industrial setting or in a car, with lots of electrical interference, it would probably be well worth your while either to go digital or to go with a more robust analog solution, like a current output and/or some good shielded cable. But if you're going to be using it in a relatively "quiet" environment, like your house or back yard, you may not really need to worry about it at all. A 15-metre wire shouldn't cause tremendous problems in and of itself: as long as it's not picking up huge interference, any voltage drop across it should be pretty much constant and easy to take care of with some calibration in software.
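A sketch of what that software calibration might look like, as a simple two-point linear fit; the raw readings and temperatures used below are invented for the example:

```c
#include <assert.h>
#include <math.h>

/* Two-point calibration: take raw readings at two known temperatures,
   fit raw = a*T + b, then invert the line to convert raw readings to
   temperature.  A constant voltage drop along the cable just shifts b,
   so it is calibrated out automatically. */
static double cal_a, cal_b;

static void calibrate(double raw1, double t1, double raw2, double t2) {
    cal_a = (raw2 - raw1) / (t2 - t1);
    cal_b = raw1 - cal_a * t1;
}

static double raw_to_temp_c(double raw) {
    return (raw - cal_b) / cal_a;
}
```

On a PIC you'd store the two calibration constants in EEPROM and apply the same inversion to each ADC reading.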
 

Optikon

New Member
NJ Roadmap said:
+/- 2 degrees C. [...] This is something I'd be interested in: a Microchip I2C-compatible temp sensor for £0.78:
http://www.farnell.com/datasheets/61869.pdf
Not that expensive!
Well, ±2 degrees is a no-brainer; any old 1° or 0.5° accuracy part would do the job. If you like I2C, then go for it. I would recommend you take care not to add too much capacitance with your long cable, though: I2C running at 400 kHz can only tolerate ~1000 pF before poor signal integrity breaks it.
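As a rough sanity check on that: an open-drain bus line charged through a pull-up R into total bus capacitance C has a 10%-90% rise time of about ln(9)·R·C ≈ 2.2·R·C. The pull-up value and the ~100 pF/m cable figure below are assumptions for illustration; the 300 ns limit is the fast-mode (400 kHz) rise-time maximum from the I2C specification:

```c
#include <assert.h>
#include <math.h>

/* 10%-90% rise time of an open-drain bus line pulled up through R
   (ohms) into total bus capacitance C (farads): t_r = ln(9) * R * C. */
static double rise_time_s(double r_ohm, double c_farad) {
    return log(9.0) * r_ohm * c_farad;
}
```

With a 2.2 kΩ pull-up and 15 m of cable at roughly 100 pF/m (1.5 nF total), the rise time comes out around 7 µs, far beyond the 300 ns fast-mode limit, so on a cable that long the clock has to come way down, or the bus needs a buffer near the sensor.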
 

evandude

New Member
Optikon said:
I2C running at 400kHz can only tolerate ~ 1000pF before the poor signal integrity breaks it.
But then again, there's really no reason you should need to run a temperature sensor anywhere near that speed, since the PIC can make the data rate as low as it wants.
 

Sceadwian

Banned
Noise on an analog part can be reduced to very low, if not non-existent, levels just by using shielded wire. 15 metres really isn't that far. If you want cost-effectiveness, just use the analog part and basic shielded wire. Industry has been using setups like that for years, on longer runs than that, with near-100% reliability.
 
