I have a transducer with a voltage output range of 5.62 Vdc down to 2.0 Vdc,
and I am trying to interface it to a meter that has a range of 0 Vdc to 1 Vdc. The supply voltage I have to use is single-sided, 12 Vdc.
As the transducer is used in different applications, its output range will vary: the low end is always 2 Vdc, but the high end can fall anywhere between 2 Vdc and 6 Vdc.
The circuit must be adjustable so that whatever the highest output voltage happens to be, it is scaled to 1 Vdc, and whatever the lowest output voltage happens to be, it is scaled to 0 Vdc (or near 0 Vdc). I haven't the foggiest notion how to accomplish this, and any help would really be appreciated.
This could be the answer to a million different posts. You could use a PIC chip: read the sensor voltage in on an ADC pin, convert it with a formula or a lookup table, and then use a PWM pin (smoothed with an RC low-pass filter) to put out 0 to 1 V.