To expand somewhat on Ron's answer: the LED and the resistor would keep the same voltage drops regardless of the resistor value, but the current through both would change. This is because LEDs (like other diodes) have a roughly constant voltage drop across their usable current range. So if the LED in question is a 1.8 V LED, it will always drop about 1.8 V, and the other components in series with it have to take the rest of the supply voltage.
The downside is, of course, that you need another component to take up that extra voltage as well as to limit the current. Once their knee voltage is reached, diodes have a very low dynamic resistance, so even a small increase in voltage produces a large jump in current. On top of that, they can only handle a small amount of current (a standard 1.8 V red LED is rated somewhere around 22 mA). Connecting the diode directly to exactly 1.8 V is not practical: with such a low 'on' resistance (less than 20 Ω), any slight overshoot in supply voltage would lead to overcurrent and a blown diode.
To remedy this, you add a resistor. To find the right value, use the following formula:
Resistor = (power supply voltage − diode forward voltage) / rated current. The rated current is on the diode's datasheet; if you don't know it and it's a typical-looking LED, go for 20 mA.
So, for a 5V supply and a 20mA LED, it would look like this:
R = (5 V − 1.8 V) / 20 mA = 3.2 V / 0.02 A = 160 Ω
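If you'd rather let a script do the arithmetic, here's a minimal Python sketch of that same formula (the function name `led_resistor` is just illustrative; the values are the example numbers from above):

```python
def led_resistor(v_supply, v_forward, i_led):
    """Series resistor value: R = (Vsupply - Vf) / I."""
    return (v_supply - v_forward) / i_led

# Example from above: 5 V supply, 1.8 V red LED, 20 mA target current
print(led_resistor(5.0, 1.8, 0.020))  # 160.0 ohms
```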
If we change that to a 330 Ω resistor, the current would drop to about 10 mA (3.2 V / 330 Ω ≈ 9.7 mA), but the voltage drops would stay the same.
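Rearranging the same formula gives the current for whatever resistor you pick; a quick sketch, again with hypothetical names and the same example values:

```python
def led_current(v_supply, v_forward, r):
    """Current through the series circuit: I = (Vsupply - Vf) / R."""
    return (v_supply - v_forward) / r

# Same 5 V supply and 1.8 V LED, but with a 330 ohm resistor
i = led_current(5.0, 1.8, 330.0)
print(f"{i * 1000:.1f} mA")  # 9.7 mA
```

Either way, the LED's 1.8 V drop is unchanged; only the current scales with the resistor.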
It's like paying your monthly bills: the phone company doesn't get all of your money just because its bill arrives first. You pay what you owe them, then move on to the next bill. Same thing here. The LED takes the voltage it needs, and the rest is left for the other components in series.