Well, you're close. The current that a load (your LEDs, in this case) draws is the minimum current that your power supply should be capable of supplying. So your 700 mA supply will be just fine to power your LEDs.
Normally, a load will draw only the current that it needs to perform its function. So, you could just as well power the LEDs with a 20 amp supply with no problem. Even though the power supply is capable of supplying a much higher current, the load will pull only what it needs.
That said, powering LEDs is a bit more involved than just connecting a power source and turning it on. LEDs need to have the current through them limited in some way. Normally, that means using a resistor in series with the LED, which, based on the resistor's value, will limit the current through the LED to a safe value. That value is a bit different for different types of LEDs. You'll have to read the data sheet for the LED for that data.
An LED will have a Forward Voltage specification. That's the voltage the LED drops across itself when energized. Subtract that voltage from your supply voltage, then use Ohm's law to calculate the resistor value that limits the current to the recommended value: R = (Vsupply - Vf)/I, where R = resistor value in ohms, Vsupply = the supply voltage, Vf = the LED's Forward Voltage spec, and I = the recommended current value in amps.
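As a quick sketch of that calculation (the supply voltage, forward voltage, and current below are made-up example values, not from your setup):

```python
# Hypothetical example: 12 V supply, one LED with Vf = 2.0 V, target 20 mA.
supply_v = 12.0   # supply voltage (assumed)
vf = 2.0          # LED forward voltage from the data sheet (assumed)
i = 0.020         # recommended forward current, in amps (assumed)

# Ohm's law: the resistor must drop whatever voltage the LED doesn't.
r = (supply_v - vf) / i
print(r)  # 500.0 ohms
```

In practice you'd round up to the next standard resistor value (510 ohms here), which errs on the side of slightly less current.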
For an array, use the manufacturer's spec for current. If the LEDs are all in parallel, you'll have to add up the currents for all the LEDs. A series/parallel arrangement is a bit more complicated: the Forward Voltage specs add along each series leg, and the leg currents add across the parallel legs.
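To illustrate the series/parallel bookkeeping, here's a sketch with made-up numbers (three parallel legs of four LEDs each; your array will differ):

```python
# Hypothetical array: 3 parallel legs, each a series string of 4 LEDs.
vf = 2.0           # forward voltage per LED (assumed)
i_leg = 0.020      # current through each series leg (assumed)
leds_per_leg = 4
legs = 3

v_leg = vf * leds_per_leg   # forward voltages add along a series leg
i_total = i_leg * legs      # leg currents add across parallel legs
print(v_leg, i_total)       # 8.0 (volts per leg), 0.06 (amps total)
```

So in this example the supply must provide more than 8 V and at least 60 mA, and each leg still needs its own current-limiting resistor sized for the leftover voltage.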
Cheers,
Dave M