Hi there.
I'm wondering, what's the difference between sending 4000 V at 50 Hz through a long cable and 4000 V at 400 Hz through the same cable? The load at the end is not too much...
The only difference is that at higher frequencies less iron is required for the transformer that steps the high voltage down to the normal 230 V.
In aviation the generators normally supply a 400 Hz voltage, which is transformed down to 24 or 48 V to feed all the avionics and hydraulic actuators, thus saving weight.
On land lines, high voltages are used to reduce the current, resulting in lower losses.
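To put a rough number on the iron saving: the standard transformer EMF equation V_rms = 4.44 · f · N · A · B_max implies that, for a given voltage, turn count and flux density, the core cross-section scales as 1/f. A minimal Python sketch, with the turns and flux density being assumed illustrative values:

```python
# Sketch of the transformer EMF equation V_rms = 4.44 * f * N * A * B_max:
# for fixed voltage, turns and peak flux density, the required core
# area scales as 1/f. N and B_max below are assumed, not from the thread.

def required_core_area(v_rms, f_hz, turns, b_max):
    """Core cross-section in m^2 needed to support v_rms without saturating."""
    return v_rms / (4.44 * f_hz * turns * b_max)

V, N, B = 230.0, 100, 1.2  # 230 V winding, 100 turns, 1.2 T peak flux (assumed)

a_50 = required_core_area(V, 50.0, N, B)
a_400 = required_core_area(V, 400.0, N, B)
print(f"core area at  50 Hz: {a_50 * 1e4:.1f} cm^2")   # ~86.3 cm^2
print(f"core area at 400 Hz: {a_400 * 1e4:.1f} cm^2")  # ~10.8 cm^2
print(f"ratio: {a_50 / a_400:.0f}x smaller at 400 Hz")
```

The 400/50 = 8x reduction in core area is why 400 Hz is so attractive when weight matters, as in aircraft.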
Well... it's easier for you to send 50 Hz because your mains supply is already at that frequency and you would just step up the voltage with a transformer...
I'm sure sending 400 Hz through a long cable would cause some interference in nearby devices... It would help if the cable was shielded.
Edit: If you do decide to switch it at 400 Hz, you basically have a switch-mode power supply on your hands, and you need to convert it back to AC... it would be easy to keep it a square wave, but then you are complicating things if you want to filter it back to a sine.
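To see why filtering the square wave back to a sine is the annoying part: an ideal square wave contains only odd harmonics, and they fall off only as 1/n, so there is substantial energy well above the fundamental. A quick sketch (the 400 Hz fundamental is just the frequency from this thread):

```python
import math

# Fourier series of an ideal square wave: odd harmonics only, each with
# amplitude 4/(n*pi), i.e. the nth harmonic is 1/n of the fundamental.

fundamental_hz = 400.0
for n in (1, 3, 5, 7, 9):
    rel = (4 / (n * math.pi)) / (4 / math.pi)  # amplitude relative to fundamental
    print(f"harmonic {n} at {n * fundamental_hz:4.0f} Hz: {rel * 100:5.1f}%")
```

The 3rd harmonic at 1200 Hz still carries a third of the fundamental's amplitude, so the output filter has real work to do.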
Forgive me for not googling this, but I have often wondered how the mains frequency is kept so accurate. Isn't the frequency all in the speed the generators are turning?
I have also wondered how the US settled on 60 Hz mains. Why not a round number like the rest of the world?
Apart from historical and legacy issues, there are technical reasons:
>>>> The inductive reactance of long power lines (and the capacitive reactance in underground cables):
At 400 Hz the series inductive reactance is higher (reactance = inductance x 2 x Pi x frequency), giving more voltage drop.
And the parallel capacitive reactance is lower, which gives more leakage current.
>>>> 400 Hz is well into the audio spectrum, so a 400 Hz grid (and its harmonics) would cause more interference in "plain old analog" telephone circuits.
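The reactance point can be put in numbers. A quick sketch with assumed per-kilometre line constants (order-of-magnitude values for an overhead line, not from any particular installation):

```python
import math

# Series and shunt reactance of a line at 50 Hz vs 400 Hz.
# X_L = 2*pi*f*L (voltage drop), X_C = 1/(2*pi*f*C) (leakage path).
L_PER_KM = 1.0e-3   # H/km series inductance (assumed)
C_PER_KM = 12e-9    # F/km shunt capacitance (assumed)
LENGTH_KM = 100.0

for f in (50.0, 400.0):
    x_l = 2 * math.pi * f * L_PER_KM * LENGTH_KM        # series reactance, ohms
    x_c = 1 / (2 * math.pi * f * C_PER_KM * LENGTH_KM)  # shunt reactance, ohms
    print(f"{f:5.0f} Hz: series X_L = {x_l:7.1f} ohm, shunt X_C = {x_c:8.1f} ohm")
```

Going from 50 Hz to 400 Hz makes the series reactance 8x larger (more voltage drop) and the shunt reactance 8x smaller (more leakage current), exactly as described above.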
The mains frequency is the timing reference for "electrical clocks":
Once upon a time they were synchronous motors driving a mechanical clock;
nowadays there are lots of electronic clocks that use the mains zero crossing as a reference: standalone clocks, microwave ovens, VCRs, etc.
The frequency may have a little short-term drift, but the utilities correct it in the long term by comparing an "electrical clock" with a good time reference.
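A quick sketch of how a small frequency error turns into clock error (the 49.99 Hz daily average is an assumed value, just for illustration):

```python
# A mains-referenced clock counts cycles and assumes each one lasts 1/50 s,
# so any deviation of the average frequency shows up directly as time error.

NOMINAL_HZ = 50.0
actual_hz = 49.99            # assumed average frequency over the day
seconds_per_day = 24 * 3600

cycles_counted = actual_hz * seconds_per_day   # what the clock actually sees
clock_reads = cycles_counted / NOMINAL_HZ      # seconds it displays
error = clock_reads - seconds_per_day
print(f"clock error after one day: {error:+.1f} s")  # about -17.3 s slow
```

This is why the utilities nudge the frequency up or down later: making the *average* exactly 50 Hz pulls the accumulated error back to zero.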
Could the higher skin effect (higher line resistance) of using 400 Hz on very long power lines vs. 50/60 Hz be a factor in selecting the lower frequency for power distribution?
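For scale: skin depth goes as 1/sqrt(f), so at 400 Hz current is confined to a noticeably thinner shell of the conductor than at 50 Hz. A quick sketch for copper:

```python
import math

# Skin depth: delta = sqrt(rho / (pi * f * mu)).
RHO_CU = 1.68e-8           # ohm*m, resistivity of copper
MU0 = 4 * math.pi * 1e-7   # H/m, permeability of free space (mu_r ~ 1 for copper)

def skin_depth_m(f_hz):
    """Depth at which current density falls to 1/e of its surface value."""
    return math.sqrt(RHO_CU / (math.pi * f_hz * MU0))

for f in (50.0, 400.0):
    print(f"{f:5.0f} Hz: skin depth = {skin_depth_m(f) * 1000:.1f} mm")
```

Skin depth drops from about 9 mm at 50 Hz to about 3 mm at 400 Hz (a factor of sqrt(8)), so large conductors would see appreciably higher effective AC resistance at the higher frequency.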
This I realize, but how do the power companies in advanced countries keep this so accurate? What happens when you have more than one power plant or generator on a grid? If the frequency isn't synced perfectly, won't you end up with a goofy harmonic?
The generators have to be synchronised before connecting to the grid. Once connected, the generators are effectively locked together, so the whole grid runs at the same speed.
The generators are synchronous, so they run at the speed of the grid, whatever torque is applied to them.
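A quick sketch of the locking behaviour: for a synchronous machine on a stiff grid, extra driving torque advances the rotor (load) angle delta rather than the speed, and the power transferred follows P = (E · V / X) · sin(delta). The machine values below are made up purely for illustration:

```python
import math

# Power-angle relation for a synchronous generator on a stiff grid:
# more torque -> larger load angle -> more power, at the SAME speed.
E = 11.0e3   # V, internal EMF (assumed)
V = 11.0e3   # V, grid voltage (assumed)
X = 5.0      # ohm, synchronous reactance (assumed)

for delta_deg in (5, 15, 30):
    p_mw = (E * V / X) * math.sin(math.radians(delta_deg)) / 1e6
    print(f"delta = {delta_deg:2d} deg -> P = {p_mw:5.1f} MW")
```

So as long as delta stays in the stable region, pushing harder on the turbine simply delivers more power into the grid; the frequency stays locked to everyone else's.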
I've no idea how it's done now, but years ago in the UK they used to have two clocks in the power station: one was an accurate standard clock, the other was fed from the generated mains. During the course of the day, as load varied, it would run slightly fast or slightly slow, but at midnight they would speed up (or slow down) the generators to make it read the correct time again.
This meant mains-powered clocks had (and still do have) excellent long-term accuracy, but they may vary a second or two over a 24-hour period.
I've always wondered (and been amazed) that they can keep the whole grid synchronised; presumably it's cleverly designed to make it fairly automatic?