Several of my power supplies have two modes: constant voltage (CV) and constant current (CC). They have two knobs: one sets the maximum voltage the supply will deliver (i.e., into an open circuit); the other sets the maximum current the supply will deliver (i.e., into a short circuit).
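To make the two limits concrete, here's a minimal sketch (Python, with illustrative numbers of my choosing) of which mode an idealized CV/CC supply settles into for a simple resistive load. The supply regulates whichever limit the load would otherwise exceed:

```python
def supply_output(v_set, i_limit, r_load):
    """Operating point of an idealized CV/CC supply into a resistor.

    Returns (volts, amps, mode). Assumes zero output impedance and
    instant mode switching -- a sketch, not a real supply.
    """
    if v_set / r_load <= i_limit:
        return v_set, v_set / r_load, "CV"   # voltage limit governs
    return i_limit * r_load, i_limit, "CC"   # current limit governs

# With the 14.2V / 3A settings used below:
for r in (100.0, 10.0, 1.0):
    v, i, mode = supply_output(14.2, 3.0, r)
    print(f"R={r:6.1f} ohm -> {v:5.2f} V, {i:5.3f} A, {mode} mode")
```

A light load (100 ohms) draws well under 3A, so the supply sits in CV mode at 14.2V; a heavy load (1 ohm) would demand 14.2A, so the supply drops into CC mode and the voltage falls to whatever the load allows.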
For example, I can use such a supply to charge a lead-acid battery. Suppose you want to limit the battery charging voltage to 14.2V: set the supply to 14.2V open-circuit. Further suppose you want to limit the maximum charging current into the battery to 3A: short the supply terminals (before connecting the battery) and adjust the current-limit knob until the supply puts 3A into the short circuit. Then connect the supply to the battery.
If the battery is mostly discharged (~11.5V), the supply starts out in constant-current mode and puts 3A into the battery, whose terminal voltage slowly rises as the battery accumulates charge. Eventually (after several hours) the supply/battery voltage rises to 14.2V, where constant-voltage mode takes over: the voltage now stays at 14.2V, and over the next several hours the current flowing into the battery slowly tapers to a few tens of mA as the battery finishes charging.
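If it helps to see that CC-to-CV handoff numerically, here is a rough simulation. The battery model is a loud assumption on my part: an EMF that rises linearly with accumulated charge, behind a fixed internal resistance, with made-up values for capacity and resistance. Real lead-acid chemistry is messier, but the two-phase shape of the curve comes out the same:

```python
# Crude simulation of a CV/CC supply charging a battery.
# Battery model and all parameter values are illustrative assumptions,
# not measurements of a real battery.

V_SET = 14.2        # supply voltage limit (V)
I_LIMIT = 3.0       # supply current limit (A)
R_INT = 0.05        # assumed battery internal resistance (ohms)
EMF_EMPTY = 11.5    # assumed battery EMF when discharged (V)
CAPACITY_AH = 20.0  # assumed charge to raise the EMF to V_SET (Ah)

dt_h = 0.1  # time step (hours)
q_ah = 0.0  # accumulated charge (amp-hours)

for step in range(int(24 / dt_h)):
    emf = EMF_EMPTY + (V_SET - EMF_EMPTY) * min(q_ah / CAPACITY_AH, 1.0)
    # The supply enforces whichever limit it hits first:
    i = min(I_LIMIT, (V_SET - emf) / R_INT)
    v_term = emf + i * R_INT
    q_ah += i * dt_h
    if step % 10 == 0:
        print(f"t={step * dt_h:5.1f} h  V={v_term:5.2f} V  I={i:5.3f} A")
```

Running it shows a long stretch at a flat 3A while the voltage creeps up, then a switch to a flat 14.2V while the current decays toward zero, just as described above.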
An example of a constant-current source is your ohmmeter: depending on the range it is set to, it drives a constant current through the resistor being measured and reads the resulting voltage drop across it. Typical ohmmeter test currents range from about 10uA to 10mA.
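For instance (illustrative numbers, not from any particular meter): on a range that sources 1mA, a measured drop of 4.70V across the unknown resistor implies R = V/I = 4.70V / 0.001A = 4.7k ohms.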