I am trying to build a variable AC current source using an AC voltage supply and large-valued sourcing resistors. (I know a more 'active' source is a better design, and I will be building one as well; in the meantime this should be an effective way to achieve what I want.)
HOWEVER, there seems to be some frequency dependence in my output and I can't figure out why. The design is a simple current divider, with the resistor in the top branch switchable between five values (i.e. a 5-position switch through five different resistors): 1 MOhm, 10 MOhm, 100 MOhm, 1 GOhm, and 10 GOhm, producing output currents of 1 uA, 100 nA, 10 nA, 1 nA, and 0.1 nA respectively. The first three resistor values work fine, but the setup breaks down on the 1 GOhm and 10 GOhm ranges. There I see a huge frequency dependence in the output: an input frequency of ~10 Hz gives the expected output current, but as I increase the frequency the output current rises with it, reaching, for example, about 15 times the expected output at only a few hundred Hz...
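For concreteness, here's the arithmetic I'm working from, as a quick sanity check (a minimal sketch; the ~1 V RMS drive is my assumption here, since it's what makes the resistor-to-current mapping above come out right):

```python
# Sanity check of the intended range scaling, plus the effective
# impedance implied by the anomaly on the 1 GOhm range.
# ASSUMPTION: the drive is ~1 V RMS (not stated above; it's just
# what makes 1 MOhm -> 1 uA etc. come out right).

V_RMS = 1.0  # assumed source amplitude, volts RMS

ranges = {  # nominal series resistor (ohms) per switch position
    "1 MOhm": 1e6,
    "10 MOhm": 1e7,
    "100 MOhm": 1e8,
    "1 GOhm": 1e9,
    "10 GOhm": 1e10,
}

for name, R in ranges.items():
    print(f"{name:>9}: expected I = {V_RMS / R:.1e} A")

# On the 1 GOhm range at a few hundred Hz I see roughly 15x the
# expected current, which would mean the effective source impedance
# has somehow dropped to about:
print(f"implied |Z| at a few hundred Hz: {1e9 / 15:.2e} Ohm")
```

So whatever is happening, the 1 GOhm branch is behaving like roughly 67 MOhm at a few hundred Hz.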
I have no idea why it would behave this way... is there some obvious underlying circuit theory I'm totally unaware of?!?
I've racked my brains and do not understand this at all... Please let me know if you have any suggestions, including any references that might be helpful.
Many thanks.