I was assigned to make a voltmeter using a breadboard and an ATmega328 that measures both AC and DC voltages. I am struggling with the AC part of the voltmeter because I am required to do it without a transformer. Everything else works fine. Can anyone recommend what to do? I am very confused. Thanks a lot.
So am I supposed to make another DC-meter circuit with a rectifier in front of it, or is it possible to add the rectifier to the same circuit? Sorry if the question seems dumb; I am only a student and have little knowledge of electronics.
Is your restriction that you need to power your meter from the AC supply, or is the restriction just on scaling the voltage down to a level you can feed into the ADC on the ATmega328?
I think it is just scaling down, but I have attached the task description regarding AC signal sensing. Again, sorry for the hassle; this is my first ever electronics project and my background in electronics is weak.
You basically assume you have a sine-wave input (e.g. 50-60 Hz), precision-rectify it, average, and multiply by a fudge factor so that the reading comes out as RMS. The average here is just the mathematical average of the rectified waveform.
The PDF seems to have little or no relation to your original question, and it actually explains what you need to do: a simple attenuator, plus a DC offset.
After a quick look at the PDF, it looks like they don't want a rectifier. They just want the micro to read voltages in the ±10 V range. The ADC can't read that directly, so a circuit is added to map -10 V to 0 V, 0 V to 2.5 V, and +10 V to +5 V (remember the ADC can only see 0 to 5 V).
There are several ways to do this.
C1 strips off any DC from the signal to be measured.
R2 and R3 bias the ADC input toward 2.5 V, i.e. half the supply.
R1 and (R2//R3) form a voltage divider that attenuates the signal down to what the ADC can read.
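To put some example numbers on that divider (purely illustrative values, not taken from the PDF): R2 = R3 = 100 k biases the node to 2.5 V, R2//R3 = 50 k, and choosing R1 = 150 k gives a gain of 50/(50+150) = 0.25, so a ±10 V swing becomes ±2.5 V around the 2.5 V bias, i.e. exactly 0 to 5 V.

```c
/* Gain of the attenuator as seen by the AC signal: R1 in series,
   with R2 // R3 to the (AC-grounded) supply rails. */
double divider_gain(double r1, double r2, double r3)
{
    double rp = (r2 * r3) / (r2 + r3);   /* R2 // R3 */
    return rp / (rp + r1);               /* classic divider ratio */
}
```

With the example values above, `divider_gain(150e3, 100e3, 100e3)` returns 0.25. Any resistor set with the same ratio works; larger values load the source less but make the node more sensitive to ADC input leakage.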
From the PDF you posted: the -10 to +10 V signal gets shifted and scaled to 0 to +5 V for the ADC to read.
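Undoing that mapping in firmware is then one line of arithmetic. A sketch, where the 0.25 gain and 2.5 V offset are assumed example values and `adc_to_input_volts` is a made-up helper name; the ATmega328's 10-bit ADC returns 0..1023 for 0..Vref:

```c
/* Recover the original input voltage from a raw 10-bit ADC reading:
   convert counts to volts, subtract the mid-supply offset, then
   divide by the attenuator gain. */
double adc_to_input_volts(unsigned adc, double vref, double gain, double offset)
{
    double vadc = ((double)adc / 1023.0) * vref;  /* counts -> volts at the pin */
    return (vadc - offset) / gain;                /* undo offset, then attenuation */
}
```

So with vref = 5.0, gain = 0.25, offset = 2.5, a full-scale reading of 1023 maps back to +10 V and a reading of 0 maps back to -10 V.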