
Welcome to our site!

Electro Tech is an online community (with over 170,000 members) who enjoy talking about and building electronic circuits, projects and gadgets.


I need help with a transformer-harmonics question

Status
Not open for further replies.

diane98284

New Member
I did an experiment where you look at the sinusoidal waveforms of a small transformer. When I decreased the frequency the waveforms distorted, and when I then decreased the voltage the distortion disappeared. Why does it behave this way? Someone told me it was because the transformer is cheap and light on iron. A friend also mentioned a harmonic phenomenon, but couldn't elaborate. What is a harmonic phenomenon? Could someone please help me out? I am so confused.
 
Harmonics are higher-frequency copies of the original signal at integer multiples of its frequency (this is common in radio transmitters). For example, if an RF transmitter transmits at 100 MHz, the 2nd harmonic would be 200 MHz, the 3rd 300 MHz, and the 4th 400 MHz.

Cheap radio transmitters suffer from this, so that a transmitter's 100 MHz signal can also be picked up at 200 MHz...
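As a quick arithmetic sketch of the integer-multiple relationship described above (the 100 MHz figure is just the example from the post, and the helper function is made up for illustration):

```python
# Harmonics sit at integer multiples of the fundamental frequency.
def harmonic_frequencies(fundamental_hz, count):
    """Return the frequencies of the 2nd through (count+1)th harmonics."""
    return [n * fundamental_hz for n in range(2, count + 2)]

# Example from the post: a transmitter with a 100 MHz fundamental.
print(harmonic_frequencies(100e6, 3))  # 2nd, 3rd and 4th harmonics
```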
 
This may or may not apply to your situation. I was reading about RF transformers, specifically baluns used to connect high-impedance balanced feedlines to low-impedance coaxial lines/inputs. The author noted some conditions where saturation of the core could occur, and that at saturation the waveforms started to look like square waves, with resultant harmonics.
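A sketch of why a saturated, square-wave-like output is rich in harmonics: the Fourier series of an ideal square wave contains only odd harmonics, with amplitudes falling off as 1/n (this is standard Fourier analysis, not something specific to the article mentioned above):

```python
import math

def square_wave_harmonic_amplitude(n):
    """Fourier-series amplitude of the n-th harmonic of an ideal
    unit-amplitude square wave: 4/(n*pi) for odd n, 0 for even n."""
    return 4 / (n * math.pi) if n % 2 == 1 else 0.0

# Print the first few harmonic amplitudes: only odd ones are non-zero.
for n in range(1, 8):
    print(n, round(square_wave_harmonic_amplitude(n), 3))
```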
 
It's probably because the core of your transformer is saturating. Generally, the higher the frequency, the smaller the core you can use for the same power. When you decrease the frequency, the flux density in the core increases; every transformer has a maximum flux density, and above it the core saturates, resulting in distortion. Reducing the voltage reduces the flux density again, which is why the distortion disappears when the voltage is reduced.


Transformers are designed to operate in certain frequency ranges: standard mains transformers are designed to operate at 50 Hz, while transformers used in SMPS operate at much higher frequencies, sometimes into the MHz range.
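The frequency/voltage behaviour described above follows from the transformer EMF equation, B_max = V / (4.44 · f · N · A). A minimal sketch, where the turns count and core area are made-up illustrative values, not from the original experiment:

```python
def peak_flux_density(v_rms, freq_hz, turns, core_area_m2):
    """Transformer EMF equation: B_max = V / (4.44 * f * N * A)."""
    return v_rms / (4.44 * freq_hz * turns * core_area_m2)

# Hypothetical small mains transformer: 230 V, 1000 turns, 10 cm^2 core.
b_50 = peak_flux_density(230, 50, 1000, 1e-3)   # at the design frequency
b_25 = peak_flux_density(230, 25, 1000, 1e-3)   # halving f doubles B -> saturation
b_low = peak_flux_density(115, 25, 1000, 1e-3)  # halving V brings B back down
print(b_50, b_25, b_low)
```

Halving the frequency doubles the peak flux density, pushing the core past its saturation limit; halving the voltage restores the original flux density, which matches the disappearing distortion in the experiment.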
 
