It depends on the sense voltage vs. the common-mode voltage. The circuit in the post is kind of an extreme example. There we have a maximum instantaneous current of about 420 mA. With the 0.2 Ω sense resistor, that develops a maximum sense voltage of about 85 mV. After it is divided down by 11, we are left with only about 8 mV. That 420 mA flows when the common-mode voltage is about 32 V.

Now let's say there is a 1% mismatch: instead of 10k/1k and 10k/1k, the dividers are 10k/1k and 10.1k/1k. The top of the sense resistor sits at 32 V + 85 mV = 32.085 V, which the 10k/1k divider brings down to 2.917 V. The bottom sits at 32 V, which the 10.1k/1k divider brings down to 2.883 V. So instead of 8 mV, we get about 34 mV, more than four times what it should be. Now, if you used a 2 Ω sense resistor, the maximum sense voltage would be 850 mV, making for only about a 30% error under the same circumstances (you would need an amp with less gain; about 16 would be good for that).
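To make that arithmetic easy to check, here is a quick Python sketch of the numbers above. The component and current values are the ones from the example; `divider()` is just a throwaway helper, not anything from a real library:

```python
def divider(v, r_top, r_bot):
    # Voltage at the tap of a resistive divider from v down to ground.
    return v * r_bot / (r_top + r_bot)

V_CM = 32.0    # common-mode voltage at the sense resistor, V
I_MAX = 0.42   # maximum instantaneous current, A

def readings(r_sense):
    # Differential voltage seen by the amp: ideal dividers vs. one
    # divider with a 1% high top resistor (10.1k instead of 10k).
    v_top = V_CM + I_MAX * r_sense           # top of the sense resistor
    ideal = divider(v_top, 10e3, 1e3) - divider(V_CM, 10e3, 1e3)
    mismatched = divider(v_top, 10e3, 1e3) - divider(V_CM, 10.1e3, 1e3)
    return ideal, mismatched

for r in (0.2, 2.0):
    ideal, actual = readings(r)
    err = (actual - ideal) / ideal
    print(f"{r} ohm sense: ideal {ideal*1e3:.1f} mV, "
          f"actual {actual*1e3:.1f} mV, error {err:.0%}")
```

Running it reproduces the figures above: roughly 34 mV instead of 8 mV with the 0.2 Ω sense, and about a 30% error with the 2 Ω sense.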
You don't have to get 0.1% resistors; there are a few ways to handle it. An easy way is to get a bunch of 1% resistors and set up bins (for 10k, say a 9.989k bin, a 9.991k bin, and so on), then measure each resistor and place it in the appropriate bin. They don't need to be exactly 10k; the resistor in divider 1 just needs to be the same resistance as the corresponding one in divider 2. Another method is to get them fairly close, then calibrate out the remaining difference in software, as sketched below.
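For the software route, here is a minimal sketch of what that calibration could look like, assuming a simple two-point procedure (one reading at zero current, one at a known reference current). The function names and the specific voltages are hypothetical; the numbers are the divided-down voltages from the 2 Ω worked example above, taken at unity amp gain to keep the arithmetic simple:

```python
def calibrate(v_zero, v_ref, i_ref):
    # Two-point calibration: v_zero is the reading at 0 A, v_ref the
    # reading with a known current i_ref flowing.
    # Returns (offset, amps_per_volt).
    return v_zero, i_ref / (v_ref - v_zero)

def current(v_meas, offset, amps_per_volt):
    # Convert a raw reading to current using the stored constants.
    return (v_meas - offset) * amps_per_volt

# Divided-down voltages from the 2-ohm example (unity amp gain):
offset, scale = calibrate(v_zero=0.0262, v_ref=0.1026, i_ref=0.42)
print(current(0.0644, offset, scale))   # ~0.21 A, despite the mismatch
```

One caveat: the mismatch offset scales with the common-mode voltage, not with the current, so a fixed offset like this only holds while the supply stays near the voltage it was calibrated at.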