I use a 100 Ω resistor in series with the + terminal of a dual-meter (V/I) lab supply set to an open-circuit voltage of 5.0 V.
Connect the LED between the free end of the resistor and the - terminal of the power supply. I am assuming you do not know which lead on the LED is the anode, and which is the cathode.
If you get lucky and the LED's cathode is connected to the supply's − terminal, the LED should light with reasonable brightness. If it doesn't light, try reversing the LED's leads. If the LED still doesn't light but the supply shows some current, then the LED is shorted, or it may be an IR LED. The reason I set the supply to only 5.0 V is that the reverse breakdown of most LEDs is ~5 V, so you are less likely to blow up a reversed LED.
With the LED illuminated, check the supply's current. Say you see 20 mA. This means that if you want to use the LED as an indicator, it needs about 20 mA for that relative brightness. You can also estimate Vf from this measurement: the resistor drops I × R, and the LED gets the rest, so Vf ≈ 5.0 V − (0.020 A × 100 Ω) = 3.0 V.
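The Vf estimate above is just Ohm's law across the series resistor. A minimal sketch, assuming hypothetical readings of 20 mA on a 5.0 V supply with the 100 Ω resistor (substitute your own measurements):

```python
# Hypothetical measurements -- replace with what your supply actually shows.
V_SUPPLY = 5.0    # supply open-circuit voltage (V)
R_SERIES = 100.0  # series resistor (ohms)
I_LED = 0.020     # current read from the supply's meter (A)

# The resistor drops I * R; whatever is left appears across the LED.
vf_estimate = V_SUPPLY - I_LED * R_SERIES
print(f"Estimated Vf: {vf_estimate:.1f} V")  # -> Estimated Vf: 3.0 V
```

A ~3 V result would be typical for a blue or white LED; red LEDs usually come out closer to 2 V, which you would see as a higher current for the same supply voltage.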
If it is too bright, you might be able to reduce the current. Try it by momentarily reducing the supply voltage below 5.0 V. Say the LED is still bright enough at 5 mA, as displayed on the supply's current meter; now you have what you need to calculate a suitable current-limiting resistor if the LED is to be used as an indicator from a higher or lower supply voltage.
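Sizing that resistor is again Ohm's law: R = (Vsupply − Vf) / I. A sketch, assuming the hypothetical values found above (Vf ≈ 3.0 V, 5 mA bright enough):

```python
# Assumed values from the earlier test -- substitute your own.
VF = 3.0          # estimated LED forward voltage (V)
I_TARGET = 0.005  # 5 mA was judged bright enough (A)

def series_resistor(v_supply, vf=VF, i=I_TARGET):
    """R = (Vsupply - Vf) / I : the resistor must drop the excess voltage."""
    return (v_supply - vf) / i

# Running the same indicator LED from a 12 V rail:
print(f"{series_resistor(12.0):.0f} ohms")  # -> 1800 ohms
```

In practice you would round to the nearest standard value (1.8 kΩ here happens to be one) and accept the small change in current.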
Further reduce the supply voltage to where the LED just barely lights; that is a good measure of Vf.