All any power source can do is limit its output current; it cannot force current to flow through a disconnected load.
If you need a minimum load current on the output, you will need to provide an alternate path for it. The simplest way is to put a 10mA current sink across the output. But that means that when the load is connected, the output current will be the sum of the load and the sink: 20mA.
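The simplest current sink is just a resistor. A quick sketch of the sizing arithmetic, assuming a hypothetical 5 V output and the 10mA minimum mentioned above (adjust for your actual supply):

```python
# Sketch: sizing a resistor as a fixed dummy load (the simplest current sink).
# V_OUT is an assumed example value, not from the original question.

V_OUT = 5.0    # output voltage, volts (assumed for illustration)
I_MIN = 0.010  # required minimum load current, amps

R_dummy = V_OUT / I_MIN  # resistor that draws 10 mA at 5 V
P_dummy = V_OUT * I_MIN  # power it must dissipate continuously

print(f"R = {R_dummy:.0f} ohm, P = {P_dummy * 1000:.0f} mW")
# -> R = 500 ohm, P = 50 mW
```

Note the resistor burns that 50 mW all the time, loaded or not, which is the price of this approach.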
If you need the load current to always be 10mA, then you need additional circuitry to measure the load current and adjust the current sink to make up the difference.
But do you really need a 10mA minimum load? Or is that just what your existing supply needs to prevent the output voltage from rising when unloaded? If so, then fixing the source of the problem would be a better choice. Few PWM controllers can maintain a constant voltage with absolutely no load, but many can do so with the small load that the feedback voltage divider resistors provide. It comes down to how close to 0% duty cycle the converter can run.
As for accuracy, what do you really need? If you use a trim pot in your feedback network, you can trim the output to whatever accuracy your voltmeter can guarantee. (Note) But do you just need accuracy at room temperature? Or do you need accuracy across a wide operating temperature range? And for how long? Even the best voltage references drift over time.
The stated initial voltage accuracy of the MC34063 is 2%. But the accuracy of the complete circuit depends on other factors as well, such as the tolerance of the feedback divider resistors.
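A rough worst-case error budget illustrates the point. This sketch assumes the MC34063's 1.25 V internal reference with the 2% figure above, a hypothetical divider for a 5 V output (Vout = Vref × (1 + R2/R1)), and 1% resistors; the resistor values and tolerances are example assumptions, not from the question:

```python
# Sketch: worst-case output error from reference and divider tolerances.
# R1/R2 values and the 1% resistor tolerance are assumed for illustration.
from itertools import product

V_REF = 1.25            # MC34063 internal reference, volts
TOL_REF = 0.02          # 2% stated reference tolerance
R1, R2 = 1200.0, 3600.0 # hypothetical divider: Vout = 1.25 * (1 + 3) = 5 V
TOL_R = 0.01            # assumed 1% resistor tolerance

def vout(vref, r1, r2):
    # Standard non-inverting feedback relation: Vout = Vref * (1 + R2/R1)
    return vref * (1 + r2 / r1)

nominal = vout(V_REF, R1, R2)
corners = [vout(V_REF * (1 + sr), R1 * (1 + s1), R2 * (1 + s2))
           for sr, s1, s2 in product((-TOL_REF, TOL_REF),
                                     (-TOL_R, TOL_R),
                                     (-TOL_R, TOL_R))]
err = max(abs(v - nominal) for v in corners) / nominal
print(f"nominal {nominal:.2f} V, worst-case error about {err * 100:.1f}%")
```

So even with 1% resistors, the untrimmed worst case lands noticeably beyond the 2% reference spec alone, which is why a trim pot (or tighter resistors) matters if you need better than a few percent.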
(Note) Don't expect an accurate measurement from a cheap voltmeter. And don't confuse resolution with accuracy.