Welcome to our site!

Electro Tech is an online community (with over 170,000 members) who enjoy talking about and building electronic circuits, projects and gadgets. To participate you need to register. Registration is free. Click here to register now.

Converting Signal from Single-Ended to Differential for ADC?

Status
Not open for further replies.

dknguyen

Well-Known Member
Most Helpful Member
If I am feeding a photodiode amplifier with 0-4.096V output to an ADC that can take differential inputs of +/-2.048V, is there any point in running it through a single-ended to differential converter first if the ADC is right next to the photodiode amp? It feels like there would be no benefit unless the signal was natively differential or the amplifier and ADC were far apart from each other.

The alternative is to just leave the signal single-ended and feed it to the + input of the ADC, and feed 2.048V to the - input of the ADC.
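A quick sanity check of that biasing scheme (a sketch assuming an ideal 16-bit two's-complement converter with a ±2.048 V differential full-scale; illustrative only, not figures taken from the ADS1178 data sheet):

```python
def adc_code(v_plus, v_minus=2.048, vref=2.048, bits=16):
    """Ideal output code for a differential ADC with +/-vref full-scale.

    Illustrative model only: a real converter adds offset, gain error
    and noise, and output coding varies by part.
    """
    v_diff = v_plus - v_minus
    full_scale = 2 ** (bits - 1)             # 32768 counts for 16 bits
    code = round(v_diff / vref * full_scale)
    return max(-full_scale, min(full_scale - 1, code))

# A 0-4.096 V single-ended signal with 2.048 V on the - input
# sweeps the full +/-2.048 V differential range:
adc_code(0.0)     # most negative code
adc_code(2.048)   # mid-scale
adc_code(4.096)   # most positive code
```

So with the - input simply tied to a 2.048 V reference, the unipolar signal already exercises the whole differential span.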
 


No, unless you have a significant input offset voltage that gives trouble.
 
Depends upon the accuracy and noise level you need.
A differential ADC input will likely give a lower noise level at the ADC output. Look at the ADC specs.
What is the resolution of the ADC?
 
It's a 16-bit ADC. The ADS1178.

But would a single-ended to differential converter contribute anything if the ADC is already next to the amp and the amp is not natively differential? Because the SE-to-diff converter (ADS8476) literally does nothing else. No scaling, gain, filtering or other signal conditioning.

Are you implying it could give lower noise even if it's right beside the ADC (i.e. on the same PCB)? Because surely the converter will add its own noise, and I'm not sure if simply being differential will outweigh that when there is no distance to be transmitted over. Or are they only really meant for transmitting the signal to an ADC over a distance? I've not been able to find a clear answer for this.
 
Differential comes into its own when there is a possible voltage drop on the wiring from a sensor (eg. due to ground current), or when an input is needed across an ungrounded component or from a bridge-type setup like with strain gauges.

For your application you can wire directly across the sensor 0V and output, so it's fine.

Actually, for a lot of applications you can use differential inputs without any special converters even with remote connections, by using a "four wire" sensor connection - two for power, which have the voltage drop, then sensor ref and out to the differential input, which have near zero current so no offsets.

You can get devices with four-wire connections for that exact use, eg:
https://www.thermometricscorp.com/images/RTD/4_wire_rtdsensor.gif
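To put rough numbers on why the four-wire scheme helps (illustrative figures: a Pt100 RTD with an assumed 0.5 Ω per lead; the ~0.385 Ω/°C sensitivity is the standard Pt100 value):

```python
r_sensor = 100.0   # Pt100 resistance at 0 degC
r_lead = 0.5       # assumed resistance of each lead, in ohms

# Two-wire: both current-carrying leads add to the measured resistance
r_2wire = r_sensor + 2 * r_lead
error_degC = (r_2wire - r_sensor) / 0.385   # Pt100: ~0.385 ohm/degC

# Four-wire: the sense pair carries almost no current, so the
# differential input sees essentially r_sensor with no lead error
r_4wire = r_sensor
```

Even half an ohm per lead turns into a couple of degrees of error in the two-wire case, which the four-wire connection removes entirely.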
 
Now I've wondered the same thing as the OP in the past. With my fairly limited knowledge of ADCs, I can't see any reason why the converter should perform better if presented with a true differential signal than with a single-ended signal and a fixed reference voltage.
Except... that a differential signal has zero common-mode component, whereas using a fixed reference gives a common-mode voltage equal to half the differential voltage. If it were a differential amplifier, with a finite CMRR, we might expect to see an output error dependent upon the common-mode signal.
Could there be any such effect in ADCs (or, more likely, some kinds of ADCs)?
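One way to gauge whether such an effect would matter: treat the common-mode contribution as an input-referred error, Verr ≈ Vcm / CMRR. A sketch with an assumed 90 dB CMRR figure (illustrative, not a number from the ADS1178 data sheet):

```python
def cm_error_volts(v_cm, cmrr_db):
    """Worst-case input-referred error from a common-mode voltage,
    given CMRR in dB. Illustrative model only."""
    return v_cm / (10 ** (cmrr_db / 20))

# 2.048 V common-mode (single-ended drive with 2.048 V on the - input)
# against an assumed 90 dB CMRR:
err = cm_error_volts(2.048, 90)   # on the order of 65 uV

# For comparison, one 16-bit LSB over a 4.096 V span:
lsb = 4.096 / 2 ** 16             # 62.5 uV
```

On these assumed figures, the common-mode error is around one LSB at 16 bits, so whether it matters depends entirely on the CMRR spec of the actual converter.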
 
Looking at the ADS1178 data sheet, it would appear that the main disadvantage of using the input single-ended rather than differential is that you only get half the dynamic range for unipolar signals, so the maximum output is 15 bits instead of 16.
If that's not an issue in your application then single-ended operation should be fine.
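The half-range point in counts (a sketch assuming a two's-complement 16-bit output, which is the usual convention for differential-input converters):

```python
bits = 16
total_codes = 2 ** bits            # 65536 codes across -FS..+FS

# With the - input grounded, a unipolar signal can only produce
# the non-negative half of the code range:
unipolar_codes = 2 ** (bits - 1)   # 32768 codes, i.e. 0..32767
effective_bits = bits - 1          # 15 bits of resolution
```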
 
That's only if you tie the - ADC input to ground. If you tie it to mid-voltage then you should be able to maintain the full range.
 
That's only for a bipolar (±) input and I believe your input is unipolar.

From the data sheet:
[attached data sheet excerpt]
 
Unless I'm missing something, I don't believe they are talking about bipolar voltages referenced to ground when they use the word "bipolar" here. I think they mean that either input can be more positive than the other: the + input can be above the - input, or the - input above the + input. The ADC's input range can only accept signals between AVdd and GND, so I don't see how it could ever actually accept a truly bipolar input voltage.

It would make some things a lot easier if it could, though.
 