I've read everything I could find about calculating resistors, but after entering the calculated resistor values into Livewire I get different voltages/currents.
What's going on?
I use the V = IR formula, but I get results like the attachment.
In the attachment, I was trying to get an LED forward voltage of 1.5 V at 30 mA.
If you look at the datasheet for an LED you will see that its voltage-versus-current curve is exponential.
For example, if you gradually increase the voltage across an LED from zero, it will start to glow when the voltage across it reaches around 1.6 to 1.7 V.
So you need to look at the voltage versus current graph for the LED you are using and see what the voltage is at the current you wish to put through it. (or measure one with a multimeter)
You can then use Ohm's Law to calculate the series resistance.
e.g. LED voltage 1.8 V @ 30 mA with a 5 V supply.
R = (5 - 1.8)/0.03 = 106.7 Ohm, so use either a 100 Ohm or a 120 Ohm resistor.
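The arithmetic above can be sketched in a few lines. This is just an illustration of the Ohm's law step, plus a lookup against the standard E12 resistor series (the function names are mine, not from any library):

```python
def led_resistor(v_supply, v_led, i_led):
    """Ideal series resistance from Ohm's law: R = (Vs - Vf) / If."""
    return (v_supply - v_led) / i_led

def nearest_e12(r):
    """Closest standard E12 preferred value to r (decades 1 Ohm .. 8.2 MOhm)."""
    e12 = [1.0, 1.2, 1.5, 1.8, 2.2, 2.7, 3.3, 3.9, 4.7, 5.6, 6.8, 8.2]
    candidates = [m * 10**d for d in range(7) for m in e12]
    return min(candidates, key=lambda c: abs(c - r))

r = led_resistor(5.0, 1.8, 0.030)
print(round(r, 1))     # 106.7
print(nearest_e12(r))  # 100.0
```

Note that 100 Ohm is nearer to the ideal value than 120 Ohm, but 120 Ohm errs on the side of slightly less current, which is usually the safer choice.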
Take ljcox's advice: start from the V-I characteristic of the LED you're using. That means either knowing the LED model or running a simulation and plotting the curve. You can't choose both the voltage drop across the LED and the current arbitrarily.
If you want to apply Ohm's law to the resistor, measure the current (Id) in the simulator. You'll find an operating point for the LED (Id, ~2.3 V), and you'll then be able to reuse your equations after changing the supply voltage, for example.
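The operating point the simulator finds for you can also be computed numerically. The sketch below solves the circuit equation Vs = R*Id + Vd together with a Shockley-style diode model Id = Is*(exp(Vd/(n*Vt)) - 1) by bisection; the parameters Is and n are made-up example values, not for any particular LED:

```python
import math

def led_operating_point(v_supply, r, i_s=1e-18, n=2.0, v_t=0.02585):
    """Find (Vd, Id) where the diode curve crosses the resistor load line.

    Bisects on Vd: below the crossing the resistor can supply more
    current than the diode draws, above it the diode wants more than
    the resistor can deliver.  i_s and n are illustrative only.
    """
    lo, hi = 0.0, v_supply
    for _ in range(100):
        vd = (lo + hi) / 2
        i_diode = i_s * math.expm1(vd / (n * v_t))   # Shockley model
        i_resistor = (v_supply - vd) / r             # load line
        if i_diode < i_resistor:
            lo = vd
        else:
            hi = vd
    return vd, (v_supply - vd) / r

vd, i_d = led_operating_point(5.0, 100.0)
print(vd, i_d)
```

This is exactly what the simulator does internally (with a better solver): the voltage drop and the current are not independent choices, they come out together as the intersection of the two curves.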