G'day Mate,
Ideally, the supply has electronic current-limiting. If it does not, there is a chance you will overheat it or blow a fuse if you set it to a fixed voltage like 13.6V and then connect it to a depleted 12V lead-acid battery. The reason is that the battery will happily suck 20 to 30A when fed from a well-regulated constant-voltage supply set higher than the battery's own terminal voltage.
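To put a rough number on that (the ~50mΩ internal resistance below is an illustrative assumption for a depleted car battery, not a measured value):

$$ I \approx \frac{V_\mathrm{supply} - V_\mathrm{battery}}{R_\mathrm{internal}} \approx \frac{13.6\,\mathrm{V} - 12.2\,\mathrm{V}}{50\,\mathrm{m\Omega}} \approx 28\,\mathrm{A} $$

which lands right in that 20 to 30A range, far more than a small bench supply can source.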
I use a lab supply, which has two knobs: one sets the open-circuit voltage of the supply, the other sets the maximum current it can deliver (its short-circuit current limit). This way, while the battery terminal voltage is below the supply's open-circuit voltage, the current is limited to a value that is safe for the supply. My settings for a car battery would be 14.5V open-circuit and a 3A current limit. It takes about 24 hours to fully charge a 70Ah battery.
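The 24-hour figure is just amp-hour arithmetic, plus some margin for charge inefficiency and for the current tapering off near the end:

$$ t \approx \frac{70\,\mathrm{Ah}}{3\,\mathrm{A}} \approx 23\,\mathrm{h} $$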
If your supply does not handle over-current gracefully (it overheats or blows fuses), then you have two options:
The first is to measure the battery terminal voltage before connecting the supply. Say it is 12.0V (mostly discharged). Set the supply to an open-circuit voltage of 12.0V and then connect it to the battery. Now slowly increase the supply voltage until the supply current approaches 6A. Leave it that way until the current drops to, say, 3A, then increase the supply voltage a bit more, and so on. Stop charging once the supply voltage reaches 14.5V and the current has dropped below 1A.
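The reason small steps matter: with only the battery's internal resistance in the loop, each voltage bump translates directly into current. Assuming the same illustrative ~50mΩ internal resistance as above:

$$ \Delta I = \frac{\Delta V}{R_\mathrm{internal}} \approx \frac{0.3\,\mathrm{V}}{50\,\mathrm{m\Omega}} = 6\,\mathrm{A} $$

so a bump of only a few tenths of a volt is enough to swing the current by several amps.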
The second option is to go looking for a cement or wire-wound power resistor of about 1Ω rated at about 15W. Hook it between the positive supply terminal and the positive battery post. It will get hot, but now you can start with the supply cranked up to 14V, come back 15 to 30 hours later, and not hurt the supply or the battery.
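A quick sanity check on that resistor rating (again treating the battery's internal resistance as roughly 50mΩ, an assumed figure): even against a deeply discharged battery sitting at 11.5V, the worst-case current and dissipation at switch-on are

$$ I = \frac{14\,\mathrm{V} - 11.5\,\mathrm{V}}{1\,\Omega + 50\,\mathrm{m\Omega}} \approx 2.4\,\mathrm{A}, \qquad P = I^2 \times 1\,\Omega \approx 5.7\,\mathrm{W} $$

so a 15W part runs well inside its rating, and both the current and the dissipation fall as the battery voltage rises.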
ps: Spent 10 years of my life down under, as a "New Australian"