I am trying to create a circuit that produces an automated, accurate variable output voltage in the range of roughly 1.5 V to 4 V from a 5 V supply.
The circuit must step the voltage down from 4 V on day one to 1.5 V approximately 90 days later, so the change is extremely gradual. The timing of each step within the 90 days is not too important, and neither is the size of the voltage steps (although the finer the resolution, the better).
One way I thought of doing this is by programming a microcontroller to output PWM. The output voltage would be set by the average value of the PWM signal, which depends on the duty cycle. Maybe a capacitor could be used to smooth this signal into a constant analogue voltage?
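As a quick sanity check on that idea: for an ideal RC-filtered PWM signal, the average output is simply duty cycle times supply voltage (V_out = D x V_supply). The sketch below (Python, with assumed illustrative values, not a tested design) maps the elapsed day to a linearly ramped target voltage and the duty cycle a 5 V rail would need:

```python
# Sketch: map elapsed day (0..90) to a target voltage ramping
# linearly from 4.0 V down to 1.5 V, and the PWM duty cycle
# needed from a 5 V supply. All values are illustrative.

V_SUPPLY = 5.0
V_START, V_END = 4.0, 1.5
DAYS = 90

def target_voltage(day):
    """Linear ramp: V_START at day 0 down to V_END at day DAYS."""
    day = min(max(day, 0), DAYS)          # clamp to the 90-day window
    return V_START + (V_END - V_START) * day / DAYS

def duty_cycle(v_out):
    """Ideal filtered-PWM relation: V_out = D * V_supply."""
    return v_out / V_SUPPLY

for day in (0, 45, 90):
    v = target_voltage(day)
    print(f"day {day:2d}: target {v:.2f} V, duty {duty_cycle(v)*100:.1f} %")
```

With 8-bit PWM (256 duty steps on a 5 V rail) the resolution is about 20 mV per step, far finer than a 2.5 V ramp spread over 90 days requires, so the duty cycle only needs to change by one step every few hours.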
Do you have any suggestions? I am a complete novice with little experience.
Thanks!