You simply measure the voltage drop across a resistor in series with the load. It's simple Ohm's law: a 1 ohm resistor drops 1 V for every amp of current.
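If you want that arithmetic spelled out, here's a minimal C sketch of the Ohm's law conversion; the 1 ohm shunt and the measured drop are example values:

```c
#include <stdio.h>

/* Recover the load current from the drop across a series shunt
   using Ohm's law, I = V / R. Values are examples, not a design. */
int main(void)
{
    const double r_shunt = 1.0;   /* shunt resistance in ohms */
    const double v_drop  = 0.150; /* measured drop in volts (example) */

    double current = v_drop / r_shunt;   /* I = V / R */
    printf("%.3f V across %.1f ohm -> %.3f A\n", v_drop, r_shunt, current);
    return 0;
}
```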
As you probably can't afford to lose 5 V off your supply rail, it's usual to use a smaller resistor and amplify the voltage drop with an op-amp. This also lets you reference the measurement to 0 V (by using a differential op-amp circuit).
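To see how the smaller shunt plus amplifier scales, here's a quick sketch; the 0.1 ohm shunt, the gain of 10, and the output voltage are all assumed example figures:

```c
#include <stdio.h>

/* With a differential amplifier the output is Vout = I * R_shunt * G,
   so the current is I = Vout / (R_shunt * G). Example values only. */
int main(void)
{
    const double r_shunt = 0.1;  /* small shunt: only 0.1 V dropped per amp */
    const double gain    = 10.0; /* assumed differential amplifier gain */
    const double v_out   = 1.25; /* amplified, 0 V-referenced output (example) */

    double current = v_out / (r_shunt * gain);
    printf("Vout %.2f V -> load current %.2f A\n", v_out, current);
    return 0;
}
```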
And be sure to check the wattage of the resistor: if there is going to be 1 A through the 1 ohm resistor, then you need at least a 1 W resistor. Remember that the power dissipation varies with the square of the current (P = I²R).
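Here's a quick sketch of P = I²R for that 1 ohm shunt, just to show how fast the dissipation climbs with current:

```c
#include <stdio.h>

/* Power dissipated in the shunt: P = I^2 * R. Doubling the current
   quadruples the dissipation, hence the wattage warning above. */
int main(void)
{
    const double r_shunt = 1.0;   /* 1 ohm shunt, as in the example */

    for (double i = 0.5; i <= 2.0; i += 0.5) {
        double power = i * i * r_shunt;   /* P = I^2 * R */
        printf("%.1f A through %.0f ohm -> %.2f W\n", i, r_shunt, power);
    }
    return 0;
}
```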
What are you thinking of trying to measure? Am I correct that you are not trying to measure 5 A of current? Hopefully?
I have used a 1 ohm resistor on many occasions to measure current in the 0 to 200 mA range.
If you set the reference on the ADC correctly, to 1/2 your estimated maximum input voltage, you should be fine.
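As a rough sketch of that sizing, assuming the ADC0804's VREF/2 pin sets half the full-scale span (so the span is twice the VREF/2 voltage); the 2 V maximum input is just an example:

```c
#include <stdio.h>

/* Pick a VREF/2 voltage so the ADC's full scale matches the largest
   expected input. Assumes span = 2 * VREF/2, per the VREF/2 pin idea
   above; the 2 V maximum is an example value. */
int main(void)
{
    const double v_in_max = 2.0;              /* estimated maximum input */
    double v_ref_half     = v_in_max / 2.0;   /* apply this to VREF/2 */
    double lsb            = v_in_max / 256.0; /* resulting step size */

    printf("Max input %.2f V -> VREF/2 = %.2f V, LSB = %.4f V\n",
           v_in_max, v_ref_half, lsb);
    return 0;
}
```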
What I am really trying to say is that you should study the ADC you intend to use, because every ADC has a minimum voltage step that it can resolve.
Don't quote me on this, but 19 mV comes to mind for the ADC0804 (5 V / 256 steps ≈ 19.5 mV).
This means that 0 to 18 mV outputs 00000000 on the 0804, and 19 mV to 37 mV or so produces 00000001, and so on.
Anyway, check the LSB voltage; this will give you an idea of what you can expect.
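Here's a small sketch of that ideal 8-bit staircase, assuming a 5 V full-scale span (LSB = 5 V / 256 ≈ 19.5 mV); a real part adds offset and gain error on top of this ideal behavior:

```c
#include <stdio.h>

/* Ideal 8-bit quantization, assuming a 5 V full-scale span
   (LSB = 5 V / 256 ~= 19.5 mV). Real ADCs add offset and gain
   error, so treat this as the ideal staircase, not a guarantee. */
int main(void)
{
    const double v_full_scale = 5.0;
    const double lsb = v_full_scale / 256.0;     /* ~19.5 mV per code */
    const double test_mv[] = { 10.0, 20.0, 40.0, 100.0 };

    for (int i = 0; i < 4; i++) {
        int code = (int)(test_mv[i] / 1000.0 / lsb); /* floor(Vin / LSB) */
        if (code > 255) code = 255;                  /* clamp at full scale */
        printf("%.1f mV -> code %d\n", test_mv[i], code);
    }
    return 0;
}
```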