I think a transformer works by Faraday's law, e = dΦ/dt, which states that the voltage generated depends on the rate of change of the magnetic flux, which in turn comes from the changing current, and that of course involves frequency. So the higher the frequency, the more output voltage we should expect.
The primary current will not be constant if the supply frequency changes. Because the primary is an inductance, its reactance rises with frequency, so the primary (magnetizing) current is inversely proportional to the frequency.
If you double the frequency with the same supply voltage, the current will halve. The magnetic field will halve, but it will get to that value in half the time, so the rate of change of magnetic flux will be the same, and the output voltage will be the same.
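As a quick sanity check on that scaling, here is a small Python sketch. The supply voltage and primary inductance are made-up figures, chosen only to show the proportions, not values from any particular transformer:

```python
import math

V_rms = 230.0   # supply voltage (V), hypothetical
L = 10.0        # primary (magnetizing) inductance (H), hypothetical

for f in (50.0, 100.0):
    X_L = 2 * math.pi * f * L              # reactance rises with frequency
    I_mag = V_rms / X_L                    # magnetizing current falls as 1/f
    # For a sine supply, the peak flux linkage is V_peak / (2*pi*f),
    # so doubling f halves the flux ...
    flux_peak = (V_rms * math.sqrt(2)) / (2 * math.pi * f)
    # ... but the peak rate of change of flux linkage is still V_peak,
    # so the induced (and hence output) voltage is unchanged.
    dflux_dt_peak = 2 * math.pi * f * flux_peak
    print(f"{f:>5.0f} Hz: I = {I_mag*1000:.1f} mA, "
          f"peak flux linkage = {flux_peak:.2f} Wb-turns, "
          f"peak dPhi/dt = {dflux_dt_peak:.0f} V")
```

Doubling the frequency halves the magnetizing current and the peak flux, but the peak rate of change of flux, and so the voltage, stays the same.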
In most transformers, the winding resistances are small. The changing magnetic flux in the core generates voltage in both the primary and the secondary. The voltage generated in the primary is nearly as large as the supply voltage. If the rate of change of magnetic flux were too small, the primary would draw a lot of current, which would increase the flux, so the voltage generated would increase until it balanced the supply voltage.
So the voltage generated in the primary is just about equal to the supply voltage. The voltage generated in the secondary is larger or smaller by the turns ratio.
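In the ideal-transformer approximation that amounts to equal volts per turn on both windings. A minimal sketch, with hypothetical turns counts:

```python
def secondary_voltage(v_primary, n_primary, n_secondary):
    # The same flux links every turn, so volts per turn are equal on both windings.
    return v_primary * n_secondary / n_primary

# Hypothetical windings, for illustration only.
print(secondary_voltage(230.0, n_primary=1000, n_secondary=52))    # ~12 V step-down
print(secondary_voltage(230.0, n_primary=1000, n_secondary=2000))  # 460 V step-up
```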
Two things to note. Firstly, at higher frequencies there is less flux, so the cores can be far smaller. That is one reason why 400 Hz is used on aircraft, and it is why switch-mode power supplies, with transformers running at 20 kHz or more, are far smaller and lighter than linear power supplies.
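To put a rough number on the core-size point, the usual transformer EMF equation V_rms ≈ 4.44 f N B_max A can be rearranged for the core cross-section. The voltage, turns and flux density below are made-up figures (and a real 20 kHz design would use ferrite at a lower B_max), so this only shows the trend:

```python
V_rms = 230.0   # winding voltage (V), hypothetical
N = 500         # turns, hypothetical
B_max = 1.2     # peak flux density (T), roughly silicon-steel territory

for f in (50.0, 400.0, 20_000.0):
    A_core = V_rms / (4.44 * f * N * B_max)   # required core cross-section (m^2)
    print(f"{f:>7.0f} Hz: core area ~ {A_core * 1e4:.2f} cm^2")
```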
Secondly, this thread
https://www.electro-tech-online.com/threads/quick-transformer-connection-question.119432/#post982194 has the results from when I reversed one half of the primary winding (with a suitable safety resistor). There was no voltage generated by the primary winding, and the current would have been huge without the resistor that was there for the test. The point is that transformers rely on the voltage generated in the primary winding.
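A back-of-the-envelope sketch of why the current would have been huge (all the numbers are made up, they just show the orders of magnitude): with half the primary reversed, the flux from the two halves cancels, the back-EMF disappears, and only resistance is left to limit the current:

```python
import math

V_rms = 230.0        # supply voltage (V), hypothetical
R_winding = 2.0      # primary winding resistance (ohms), hypothetical
R_safety = 100.0     # safety resistor used for the test (ohms), hypothetical
X_L = 3000.0         # magnetizing reactance with the winding connected correctly

I_normal = V_rms / math.hypot(R_winding, X_L)       # tens of milliamps
I_reversed_bare = V_rms / R_winding                 # over a hundred amps
I_reversed_safe = V_rms / (R_winding + R_safety)    # a couple of amps
print(f"connected correctly:      {I_normal*1000:.0f} mA")
print(f"half reversed, bare:      {I_reversed_bare:.0f} A")
print(f"half reversed, resistor:  {I_reversed_safe:.2f} A")
```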