Lac said:
Hmm. Is there any connection between the voltage and current you get from a power source? Like a 4:3 ratio or something like that?
No, there's no standard relationship between the voltage and current from a power source; it depends entirely on the individual design.
Basically it comes down to the mains transformer (assuming the power supply uses one!). The output voltage depends on the number of turns on the winding, and the current rating on the thickness of the wire.
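To put a number on the voltage side, the ideal-transformer turns relation is simple enough to sketch. The mains voltage and winding counts below are made-up illustrative values, and note that the current rating isn't captured by any such formula, since that's set by the wire gauge and how much heat the windings can shed:

```python
# Minimal sketch of the ideal (no-load) transformer relation
# V_secondary = V_primary * N_secondary / N_primary.
# All numbers here are hypothetical, just to show the ratio at work.

def secondary_voltage(v_primary: float, n_primary: int, n_secondary: int) -> float:
    """Ideal secondary voltage from the turns ratio."""
    return v_primary * n_secondary / n_primary

print(secondary_voltage(230.0, 2300, 120))  # -> 12.0 V from a 230 V primary
```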
Or do most power sources have a custom ratio between voltage and current? And if so, how am I able to find the current that a source outputs if I know the voltage? Connect a load, and use a multimeter and Ohm's law?
It may be a problem with your English, but you mention "current that a source outputs"; that terminology isn't correct. It should be either "current the load requires" or "maximum current the supply can provide".
The first is simple to measure: just place an ammeter in series with the load and read the current. The second is more difficult to measure; the easiest way is to check the ratings of the components. If the transformer is rated at 1A, then the maximum is 1A (near enough), but the lowest-rated component in the PSU is the one that limits the current - a 1A transformer feeding a 100mA regulator would only provide 100mA maximum.
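As a rough sketch of those two points (the voltage, load and component ratings below are hypothetical, not from any real supply):

```python
# Load current from Ohm's law, and maximum supply current taken as the lowest
# component rating in the chain. All values are made up for illustration.

def load_current(voltage: float, load_resistance: float) -> float:
    """Current the load draws: I = V / R."""
    return voltage / load_resistance

def max_supply_current(ratings_amps: dict[str, float]) -> float:
    """The lowest-rated component sets the usable maximum."""
    return min(ratings_amps.values())

print(load_current(12.0, 120.0))                                   # 0.1 A for a 120 ohm load at 12 V
print(max_supply_current({"transformer": 1.0, "regulator": 0.1}))  # 0.1 A, limited by the regulator
```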
To actually measure it you would need to apply different loads, monitoring the voltage and current while keeping a careful eye on the temperature of the various components - transformer, rectifier, regulator etc. There's always a chance that you could overload the PSU and damage something.
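If you do try that, a quick planning sketch like this (again, made-up numbers) helps pick test-load resistor values and their wattage ratings before connecting anything:

```python
# Plan a load test: for each target current, the resistor value from Ohm's law
# and its dissipation from P = V * I, so the test loads themselves don't overheat.

def test_loads(voltage: float, target_currents_amps: list[float]) -> list[tuple[float, float]]:
    """Return (resistance in ohms, dissipation in watts) for each target current."""
    return [(voltage / i, voltage * i) for i in target_currents_amps]

for r, p in test_loads(12.0, [0.05, 0.1, 0.2, 0.5]):
    print(f"{r:6.1f} ohm, dissipates {p:3.1f} W")
```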