Usually, in most circuits you don't need more than 1/4 W...
Like kinjal said, it doesn't matter what supply voltage you use for the circuit...
All that matters is the current passing through the resistor. For example, if you have a resistor of value R and the current through it is I, the power dissipated in it is P = I*I*R.
Or, if you know the voltage U across it, P = U*U/R.
For example, if you have a 100 ohm resistor and the current through it is 40 mA, then P = (40/1000)*(40/1000)*100 = 0.16 W. That means you must use a 0.25 W resistor (the maximum power the resistor can dissipate must be greater than the power actually dissipated in your circuit).
Now, on the other hand, if you apply 4 V across the 100 ohm resistor, the power dissipated is P = 4*4/100 = 0.16 W, so again you would use a 1/4 W resistor.
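If it helps, here is a minimal Python sketch of those two formulas, using the numbers from the examples above (the function names are just mine for illustration, not anything standard):

    def power_from_current(current_a, resistance_ohm):
        # P = I * I * R, with current in amps and resistance in ohms
        return current_a * current_a * resistance_ohm

    def power_from_voltage(voltage_v, resistance_ohm):
        # P = U * U / R, with voltage in volts
        return voltage_v * voltage_v / resistance_ohm

    # 100 ohm resistor carrying 40 mA -> 0.16 W
    print(power_from_current(0.040, 100))  # 0.16
    # Same resistor with 4 V across it -> 0.16 W
    print(power_from_voltage(4.0, 100))    # 0.16
    # 0.16 W < 0.25 W, so a 1/4 W part is fine here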
Now, you said your circuit is powered by 3 V. That means the maximum voltage across any resistor is 3 V (usually it is much smaller). Taken in reverse: if you use 0.25 W resistors, then for that much power to be dissipated in one, its value would have to be R = U*U/P = 3*3/0.25 = 36 ohms.
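In code, that reverse calculation looks like this (again just a sketch, with my own hypothetical function name):

    def min_safe_resistance(max_voltage_v, rating_w):
        # Smallest R that still keeps U * U / R at or below the power rating
        return max_voltage_v * max_voltage_v / rating_w

    # 3 V supply, 0.25 W rating -> anything 36 ohms or above is safe
    print(min_safe_resistance(3.0, 0.25))  # 36.0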
So, as a conclusion: if the circuit doesn't use more than 3 V and you don't use resistors smaller than 36 ohms, 1/4 W types will do...
Hope you're not more confused than you were before I sent you this message.