No, you cannot get the same light output from the same bulb at a lower voltage.
A bulb [tungsten lamp] rated for 230V will work at 110V, but the light output will be much lower.
Use Ohm's law for the 230V and 110V cases, then work out the power.
Oops, I misread his question and answered a slightly different one.
Ohm's law isn't sufficient though. The resistance of the bulb will vary greatly with voltage across the filament.
Then... my question is: how do you know at what temperature you have to operate a 100W bulb? 230V/100V... fully confused.
If a bulb is rated for 100w at 230v, then when you put 230v across the bulb it will dissipate 100w of power. If you lower the voltage, the current will decrease (obviously), the temperature will decrease (a little less obvious), and the resistance of the filament will decrease (a little less obvious too). This makes it difficult to know what voltage gives you what power.
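To see why the naive approach goes wrong, here is a sketch of the constant-resistance estimate for the 100W/230V bulb above. The numbers are just the Ohm's-law arithmetic; the comment notes where the assumption breaks down.

```python
# Sketch: what Ohm's law alone predicts for a 100 W / 230 V bulb at 110 V,
# under the (wrong) assumption that filament resistance stays constant.
RATED_POWER = 100.0   # W
RATED_VOLTS = 230.0   # V

# Hot resistance at rated conditions: R = V^2 / P
r_hot = RATED_VOLTS ** 2 / RATED_POWER        # 529 ohms

# Naive constant-R prediction at 110 V: P = V^2 / R
p_naive = 110.0 ** 2 / r_hot                  # ~22.9 W

print(f"R at 230 V: {r_hot:.0f} ohms")
print(f"Naive power at 110 V: {p_naive:.1f} W")
# In reality the filament runs cooler at 110 V, so its resistance drops
# and the actual power is noticeably higher than this estimate.
```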
Recently, I did a project in which I needed to be able to control the power to a 250w 110vAC bulb. My first assumption was that the bulb resistance was relatively constant and would introduce little error across the 110v range. I quickly learned that the assumption was WAAAY wrong. So I made a table in Excel and filled it in with various voltage and current measurements.
For instance:
0.41A @ 5.39v
0.51A @ 8.67v
0.55A @ 10.74v
0.93A @ 28.8v
1.20A @ 45.25v
1.55A @ 70.29v
2.09A @ 120v
I took over 30 measurements to build my table; those were just a few of them.
I was then able to calculate power (P = V * I) and resistance (R = V / I).
I then graphed voltage vs. resistance and saw that it made a VERY nice curve.
For the bulb I used, bulb resistance = 6.61 * voltage^0.454
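A quick way to check that kind of fit is a least-squares line through (log V, log R), since R = a * V^b is a straight line on log-log axes. A sketch in Python using only the seven points posted above; with just these points the coefficients come out near, but not equal to, the quoted 6.61 and 0.454 (the author's 30+ measurements give the tighter fit):

```python
import math

# The seven (volts, amps) pairs posted above
data = [(5.39, 0.41), (8.67, 0.51), (10.74, 0.55), (28.8, 0.93),
        (45.25, 1.20), (70.29, 1.55), (120.0, 2.09)]

# Resistance at each point, then a least-squares line through
# (log V, log R):  log R = log a + b * log V   ->   R = a * V^b
xs = [math.log(v) for v, i in data]
ys = [math.log(v / i) for v, i in data]
n = len(data)
xbar = sum(xs) / n
ybar = sum(ys) / n
b = sum((x - xbar) * (y - ybar) for x, y in zip(xs, ys)) / \
    sum((x - xbar) ** 2 for x in xs)
a = math.exp(ybar - b * xbar)

print(f"R = {a:.2f} * V^{b:.3f}")
```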
Based on that information, you can calculate power vs. RMS voltage of a bulb. You just have to take the time to figure out the relationship.
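For this particular bulb, that relationship makes the power calculation direct: P = V^2 / R(V) = V^2 / (6.61 * V^0.454). A sketch, using the fitted coefficients quoted above:

```python
def bulb_power(v_rms: float) -> float:
    """Estimated power for the bulb above, using the fitted
    resistance curve R = 6.61 * V^0.454 (volts RMS in, watts out)."""
    resistance = 6.61 * v_rms ** 0.454
    return v_rms ** 2 / resistance

# Sanity check against the measured 120 V point (2.09 A * 120 V ~ 251 W)
print(f"{bulb_power(120):.0f} W")   # close to the bulb's 250 W rating
```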