That pesky Ohm guy again!
Carry on! There is no need for an LED - the 555 will work without it.
For your calculations...
The standard 555 timer can source or sink up to 200mA from its output. A typical IR LED seems to be rated at up to 100mA and will drop about 1.3V at that current. The 555 can happily drive the LEDs then, you just need to calculate that resistor. :?
I assume you have a single LED and a 9V power supply.
If you connect directly to the 555 output you need a series resistor ...
R = V/I. The resistor will drop 7.7V, the LED dropping the other 1.3V. I guess you want 100mA (the max the LED can stand) for maximum 'brightness'.
The resistor will therefore be...
7.7V / 0.1A = 77 Ohms
For two LEDs in series (dropping 1.3V each, 2.6V total), the resistor will be 6.4V / 0.1A = 64 Ohms.
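If it helps, here's the same arithmetic as a quick Python sketch (the 9V supply, 1.3V per LED and 100mA target are the assumptions from above, not fixed properties of your circuit):

```python
def series_resistor(v_supply, v_led_drop, n_leds, i_led):
    """Series resistor (Ohms) for n LEDs in series at i_led amps.

    The resistor drops whatever the LEDs don't: R = (Vsupply - n*Vled) / I.
    """
    return (v_supply - n_leds * v_led_drop) / i_led

# Single LED: (9 - 1.3) / 0.1 = 77 Ohms
print(round(series_resistor(9.0, 1.3, 1, 0.1), 1))

# Two LEDs in series: (9 - 2.6) / 0.1 = 64 Ohms
print(round(series_resistor(9.0, 1.3, 2, 0.1), 1))
```

Round up to the nearest standard value (e.g. 82 and 68 Ohms) if you want a safety margin on the LED current.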
The circuit you linked to shows an output transistor (PNP) with two diodes from the base to the positive line. That configuration is a constant-current supply - but I calculate it to be providing only 3.5mA :!:
Don't copy this if you want your circuit to work ... I reckon the 180 Ohm emitter resistor shown should be changed for a 7 Ohm one to supply 100mA.
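For anyone wondering where those numbers come from: with two diodes from base to the positive rail, the voltage across the emitter resistor is roughly two diode drops minus one Vbe, i.e. about one diode drop. A rough sketch, assuming ~0.65V for each junction (the exact drops depend on the actual parts, which is why my figure came out at 3.5mA rather than a round number):

```python
def cc_source_current(v_diode, v_be, r_emitter, n_diodes=2):
    """Output current of the two-diode PNP constant-current source.

    Voltage across the emitter resistor = n_diodes * Vdiode - Vbe,
    and the collector (LED) current is roughly that voltage / Re.
    """
    return (n_diodes * v_diode - v_be) / r_emitter

# With the 180 Ohm emitter resistor shown: only a few mA
print(cc_source_current(0.65, 0.65, 180))  # roughly 3.6mA with these drops

# With a 7 Ohm emitter resistor instead: up near the 100mA target
print(cc_source_current(0.65, 0.65, 7))    # roughly 93mA with these drops
```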