
Transformer question

Status
Not open for further replies.
Think of it as a rectangle: the height is the voltage, the width is the current, and the area inside is the power. Power in = power out, so the area has to stay constant. If you increase the height (voltage), then the width (current) must decrease, and vice versa.
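The rectangle analogy can be checked numerically: for an ideal transformer the volt-ampere product is the same on both sides. A minimal sketch (the specific voltages and currents are made-up example figures):

```python
# Ideal transformer: power in = power out (the "rectangle area" stays constant).
def secondary_current(v_primary, i_primary, v_secondary):
    """Given primary voltage/current and the secondary voltage, return the
    secondary current an ideal (lossless) transformer would deliver."""
    power = v_primary * i_primary   # area of the rectangle
    return power / v_secondary      # same area, new height -> new width

# Step 230 V down to 11.5 V (a 20:1 ratio): the current scales up by 20.
i_sec = secondary_current(230.0, 0.5, 11.5)
print(i_sec)  # 10.0 A: voltage divided by 20, current multiplied by 20
```

A real transformer delivers a little less than this because of winding resistance and core losses, which is exactly the "regulation" problem discussed further down the thread.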
 
The people at "over unity" do not believe it. They think they can "make" power.
 
What will change if we apply a high current or a high voltage (AC) to the transformer's terminals?
 
Your question is not clear. A transformer has voltage and current ratings that must never be exceeded.

If the voltage fed to the transformer is too high, its core may saturate and the winding insulation may arc over and be destroyed.
If the output current is too high, the windings will overheat from resistive (copper) losses.
 
Why does the output voltage drop when we connect a load across it, and how do we stabilize it?
Are you talking about a mains transformer or a simple inverter circuit?
1) A very cheap low current mains transformer has resistance in its windings. The resistance causes its voltage to drop when it is loaded.
I have some Chinese wall-wart AC-DC adapters rated at 9V, 200mA, but the output rises to 17V when little current is drawn.
I also have some wall-wart AC-DC adapters that are regulated so the output voltage is always 5.0V. They are small and lightweight because they use a high frequency switching PWM circuit.
2) A voltage regulator or good inverter uses negative feedback to automatically adjust the output voltage.
 
Resistance of the windings plays a very small part in the problem, as I outlined in one of my articles:

One of the biggest problems with small transformers is a figure called "regulation": the output voltage drops considerably as more current is drawn from the device.
The voltage may start at 12V under no load and drop to 8V when full current is drawn. Since the project requires 12V for operation, the transformer is "overwound" with additional turns so that the final voltage is 12V when full current is delivered.
This means the no-load voltage is about 12V + 4V = 16V (and it can be higher).
This is why these types of plug pack are not a good choice.
If your project takes a small current, it will be supplied with a voltage above the recommended value and it may be damaged.
The drop in voltage is due to two things.
1. The secondary winding has a small resistance and a voltage drop occurs due to this.
2. The main reason the output voltage drops is magnetic flux losses. Some of the flux generated by the primary winding never cuts the secondary turns (leakage flux), and some energy is lost as heat in the iron core through hysteresis and eddy currents; these are the "iron losses." Both the primary and secondary windings also heat up and add copper losses. All of these combine to produce a transformer with POOR REGULATION.
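The overwinding arithmetic above can be written out explicitly. Regulation is commonly quoted as (V_no-load − V_full-load) / V_full-load; the function names below are my own, but the numbers match the 12V/16V example in the text:

```python
def regulation_pct(v_no_load, v_full_load):
    """Transformer regulation: how far the voltage sags from no load to
    full load, as a percentage of the full-load voltage."""
    return 100.0 * (v_no_load - v_full_load) / v_full_load

def no_load_voltage(v_full_load, reg_pct):
    """Overwind the secondary so it delivers v_full_load at full current;
    the no-load voltage then rises by the regulation figure."""
    return v_full_load * (1 + reg_pct / 100.0)

# The example in the text: 12 V wanted at full load, 4 V of sag -> 16 V off-load.
reg = regulation_pct(16.0, 12.0)
print(round(reg, 1))                        # 33.3 (percent)
print(round(no_load_voltage(12.0, reg), 2)) # 16.0 (volts)
```

This is why an unregulated plug pack can damage a light load: the project sees something close to the 16V no-load figure, not the 12V rating.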
 
Simple inverters do not have voltage regulation. The load current causes the output voltage to drop.
Here is a 70W simple square-wave inverter for Japan, where the mains is 100V. The inverter's output is 109V with no load, 100V with a 40W load and only 90V with a 140W load.
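From those measurements you can estimate the inverter's effective source resistance, assuming the simple model V_out = V_no-load − I_load × R_out (the model is my assumption, not something from the attachment):

```python
# Rough source-resistance model for the square-wave inverter described above.
# All voltages and powers are the measurements quoted in the text.
v_no_load = 109.0                 # volts, no load
v_heavy, p_heavy = 90.0, 140.0    # volts and watts at the heaviest load

i_heavy = p_heavy / v_heavy                # load current at 140 W (~1.56 A)
r_out = (v_no_load - v_heavy) / i_heavy    # effective output resistance

print(round(r_out, 1))  # 12.2 ohms

# Sanity check against the middle point (40 W at 100 V): the fixed-resistance
# model predicts ~104 V, but 100 V was measured, so the real droop is worse
# than a constant resistance - typical of an unregulated inverter.
i_mid = 40.0 / 100.0
print(round(v_no_load - i_mid * r_out, 1))  # 104.1 volts predicted
```

The mismatch at the middle point is the practical meaning of "no voltage regulation": there is no feedback pulling the output back toward 100V.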
 

Attachment: simple inverter performance.PNG
I was generating a frequency from a 555 IC to feed the transformer, but the transformer needs a large current, and if I draw that current from the IC it will be permanently damaged. Please tell me how to take the output from it.
 
Your English is so bad that I cannot understand what you are asking.
Maybe you should speeky zee language of your country on a website in your country.
 
I am making a small inverter. For this I want to generate a 50Hz signal, but driving the transformer needs a high current, and the 555's maximum supply is 15V with only a limited output current; if I exceed that, the IC will be damaged. My question is: how do I get a high-current output from a 555?

Thanks in advance!
 
Why don't you look at the datasheet of a 555??
Its max allowed supply voltage is 16V but it is spec'd at 15V.
Its max output current is only 200mA. If a 555 drives a transformer in an inverter then the max output power is only 2W.
Its output will be an unregulated square-wave like inverters that were made 40 years ago.

You asked how to increase the output current of a 555, which shows you still need to learn the basics of electronics.
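The usual answer to the question is a buffer stage: let the 555 supply only the base (or gate) drive, and let a power transistor carry the transformer current. A sizing sketch, where the load current, transistor gain, and overdrive factor are all illustrative assumptions rather than figures from this thread:

```python
# Why a bare 555 can't drive an inverter transformer, and sizing a buffer.
I_555_MAX = 0.2   # amps - the 555's maximum output current (from its datasheet)

i_load = 5.0      # amps the transformer primary needs - an assumed figure
beta_min = 50     # assumed worst-case gain of a power transistor
overdrive = 3     # drive the base ~3x harder than beta suggests, to keep
                  # the transistor hard in saturation as a switch

i_base = overdrive * i_load / beta_min
print(i_base)                 # 0.3 A of base drive needed
print(i_base <= I_555_MAX)    # False: still too much for a single transistor,
                              # so use a Darlington pair or a MOSFET, whose
                              # gate needs almost no steady current
```

The same arithmetic explains the ~2W ceiling quoted above: 200mA from a 15V supply simply cannot deliver more, no matter what the transformer ratio is.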
 