It's probably because the core of your transformer is saturating. Generally, the higher the frequency, the smaller the core you can use for the same power. When you decrease the frequency, the flux density in the core increases; every transformer has a maximum flux density, and above this the core saturates, the magnetizing current rises sharply, and the waveform distorts. Peak flux density is set by the applied voltage divided by the frequency (the volt-seconds per cycle), so reducing the voltage reduces the flux density in the same proportion, which is why the distortion disappears when the voltage is reduced.
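The voltage/frequency/flux relationship can be sketched with the standard transformer EMF equation for a sine wave, V_rms = 4.44·f·N·A·B_peak. The turn count and core area below are made-up illustrative values, not taken from any particular transformer:

```python
def peak_flux_density(v_rms, freq_hz, turns, core_area_m2):
    """Peak flux density (tesla) from the transformer EMF equation
    for a sine wave: V_rms = 4.44 * f * N * A * B_peak."""
    return v_rms / (4.44 * freq_hz * turns * core_area_m2)

# Hypothetical 230 V mains transformer: 600 primary turns, 14.4 cm^2 core.
v, n, area = 230.0, 600, 14.4e-4

b_50 = peak_flux_density(v, 50.0, n, area)        # at the design frequency
b_25 = peak_flux_density(v, 25.0, n, area)        # at half the frequency
b_25_half_v = peak_flux_density(v / 2, 25.0, n, area)  # half frequency, half voltage

print(f"B_peak at 230 V, 50 Hz: {b_50:.2f} T")        # ~1.20 T, below saturation
print(f"B_peak at 230 V, 25 Hz: {b_25:.2f} T")        # ~2.40 T, well into saturation
print(f"B_peak at 115 V, 25 Hz: {b_25_half_v:.2f} T") # ~1.20 T, distortion gone
```

Halving the frequency doubles the flux density, pushing a typical silicon-steel core (which saturates around 1.6-1.8 T) well past its limit; halving the voltage as well brings it back to the design point, matching the behaviour described above.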
Transformers are designed to operate over certain frequency ranges: standard mains transformers are designed for 50 Hz (or 60 Hz, depending on region), while transformers used in SMPS operate at much higher frequencies, sometimes into the MHz range.