Hello there,
You didn't specify what resolution you need, and the choice also depends on how much voltage you can afford to lose across the sense resistor. For example, if you can afford a full 1 volt then you might use a 0.1 ohm resistor, although it would have to be able to handle the power too, and you'd still need an amplifier.
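To make that trade-off concrete, here is a minimal sketch that tabulates the voltage drop and power dissipation for a few candidate sense resistor values, assuming a 10 amp full-scale current (the current figure is an assumption for illustration; adjust it for your application):

```python
# Trade-off check for candidate sense resistor values.
I_MAX = 10.0  # amps, assumed full-scale current

for r_sense in (0.1, 0.01, 0.001):  # ohms
    v_drop = I_MAX * r_sense       # voltage lost across the resistor
    p_diss = I_MAX ** 2 * r_sense  # power the resistor must dissipate
    print(f"{r_sense} ohm: drop {v_drop:.3f} V, dissipates {p_diss:.2f} W")
```

The 0.1 ohm option drops the full 1 volt but has to dissipate 10 watts at 10 amps, which is why the smaller values below are usually preferred.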
With a 0.010 ohm sense resistor, 10 amps produces 0.1 volts, and that multiplied by a gain of 50 gives you 5 volts, so the resolution with a 10-bit ADC (5 V / 1024 steps) is around 5 mV per step, which represents about 10 mA:
10 amps, 5 volts
1 amp, 0.5 volts
0.1 amp, 0.050 volts
0.01 amp, 0.005 volts
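The resolution figures above can be checked with a short calculation, using the numbers from the text (0.010 ohm sense resistor, gain of 50, 5 V full scale at a 10-bit ADC):

```python
# Resolution of the amplified sense voltage with a 10-bit ADC.
R_SENSE = 0.010  # ohms
GAIN = 50
V_FULL = 5.0     # volts at the ADC after amplification
ADC_BITS = 10

lsb_volts = V_FULL / (2 ** ADC_BITS)     # ~4.88 mV per ADC step
lsb_amps = lsb_volts / (GAIN * R_SENSE)  # current represented by one step
print(f"1 LSB = {lsb_volts * 1000:.2f} mV = {lsb_amps * 1000:.2f} mA")
```

One ADC step works out to about 4.88 mV, or just under 10 mA of measured current.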
Note however that when we get down to this kind of resolution, the input offset of the op amp used for the amplifier becomes important. You should use a chopper-stabilized (auto-zero) op amp for best results, because even a small input offset like 1 mV creates an error of 50 mV on the output, which reads as a 100 mA current error. If you don't need super accuracy then you can get away with a different op amp.
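Here is the same offset arithmetic for a few representative offset voltages (1 mV for an ordinary op amp, down to a few microvolts for a chopper-stabilized part; the specific offset values are illustrative, not from any particular datasheet):

```python
# Effect of op-amp input offset voltage on the current reading.
# Same figures as above: gain of 50, 0.010 ohm sense resistor.
GAIN = 50
R_SENSE = 0.010  # ohms

for v_offset in (1e-3, 100e-6, 5e-6):  # 1 mV, 100 uV, 5 uV input offset
    v_out_err = v_offset * GAIN           # offset amplified to the output
    i_err = v_out_err / (GAIN * R_SENSE)  # equivalent current error
    print(f"{v_offset * 1e6:.0f} uV offset -> {v_out_err * 1000:.2f} mV out, "
          f"{i_err * 1000:.2f} mA error")
```

A 1 mV offset shows up as 100 mA of phantom current, while a 5 uV chopper-class offset contributes only 0.5 mA.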
Also, the sense resistor should be grossly overrated for power to prevent excessive self-heating, which shifts its resistance. Ideally a resistor made for current sensing should be used, possibly a four-wire (Kelvin) type.
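As a rough sizing sketch: with the 0.010 ohm resistor at 10 amps, the dissipation is 1 watt, and a common rule of thumb (an assumption here, not a hard spec) is to pick a part rated several times that:

```python
# Rough power-rating check for the sense resistor, with a derating factor.
R_SENSE = 0.010  # ohms
I_MAX = 10.0     # amps, assumed continuous current
DERATE = 5       # overrating factor (rule-of-thumb assumption)

p_actual = I_MAX ** 2 * R_SENSE  # watts dissipated at full current
p_rating = p_actual * DERATE     # minimum rating to shop for
print(f"dissipates {p_actual:.1f} W -> choose a resistor rated >= {p_rating:.0f} W")
```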
You can also investigate the existing circuit before adding a resistor, to see if there is already a series resistor in there. If there is, you can measure the voltage across that to get the current.