superbrew
Member
Hi, I am studying parallel resonant circuits in school now and am having some trouble with some of the results from a lab I was assigned. The assignment was: given a 2.2 mH inductor, calculate the capacitor required for the resonant frequency to equal 10 kHz.
The inductor that I used measured 2.197 mH and had a DC resistance of 4 ohms. The value of C that I calculated was approximately 115 nF. The actual value of C that I used was 112.76 nF. These values calculate to a resonant frequency of 10.1118 kHz.
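For reference, here is the calculation written out as a quick Python sketch of the standard resonance formula f0 = 1/(2*pi*sqrt(L*C)); the variable names are just my own:

import math

def resonant_frequency(L, C):
    # Ideal LC resonant frequency: f0 = 1 / (2*pi*sqrt(L*C))
    return 1.0 / (2.0 * math.pi * math.sqrt(L * C))

def capacitance_for(f0, L):
    # Capacitance needed to resonate with L at f0: C = 1 / ((2*pi*f0)^2 * L)
    return 1.0 / ((2.0 * math.pi * f0) ** 2 * L)

L_nominal = 2.2e-3      # 2.2 mH nominal inductor value
L_measured = 2.197e-3   # measured inductance, in henries
C_used = 112.76e-9      # measured capacitance, in farads

print(capacitance_for(10e3, L_nominal))        # ~1.15e-7 F, i.e. about 115 nF
print(resonant_frequency(L_measured, C_used))  # ~10112 Hz, i.e. about 10.11 kHz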
The circuit was constructed using a 4.7 kΩ resistor in series with the LC circuit. The source had a 50 Ω output impedance. When I measured the maximum voltage across the LC using a DMM and an oscilloscope, I came up with a frequency of 10.83 kHz. Why is there such a large difference in frequency? Without using a network analyzer, what is the best way to measure the resonant frequency? Thanks for your help.