An ammeter typically has very low resistance, so attaching it directly across a voltage source effectively shorts out that source. The DC source may then go into a protection mode such as current limiting, fold-back, or crowbar. You really should not make a measurement this way unless you know exactly how the DC source responds to a short circuit. If the source has built-in current limiting it would be okay, but many DC sources have no limiting, and shorting their output can push far more than the rated current through the meter. In that case the meter's fuse can open, the meter can be damaged, or even the source itself can be damaged. I think the best way to measure the current from the DC source is to put a load resistor across it (25 ohms would be appropriate here, by Ohm's law) and then place your ammeter in series with that resistor.
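If the source here is 25 V (my assumption, since that is what the 25-ohm figure implies for a 1 A test current), a quick sanity check of the numbers might look like this:

```python
# Sanity check of the series-ammeter setup.
# ASSUMPTION: a 25 V source, which is what the 25-ohm figure implies.
V_SOURCE = 25.0  # volts (assumed)
R_LOAD = 25.0    # ohms, chosen load resistor

current = V_SOURCE / R_LOAD     # Ohm's law: I = V / R
power = V_SOURCE**2 / R_LOAD    # P = V^2 / R, dissipated in the load

print(f"Expected current: {current:.2f} A")  # 1.00 A
print(f"Load dissipation: {power:.1f} W")    # 25.0 W -- needs a power resistor, not a 1/4 W part
```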
In the second case, using Ohm's law, the current to be expected through the 10 megohm resistor is 0.0000025 A (2.5 microamps), which is likely too little for the meter to measure. Why not try a 1 kilohm resistor instead? That would draw 25 mA, which is comfortably within a typical meter's range.
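A short sketch comparing the two resistor choices (again assuming a 25 V source, which is what the 2.5 µA figure implies):

```python
# Compare expected currents for the two resistor choices.
V_SOURCE = 25.0  # volts (assumed from the 2.5 uA figure above)

for r in (10e6, 1e3):
    i = V_SOURCE / r  # Ohm's law: I = V / R
    print(f"R = {r:>10,.0f} ohm -> I = {i * 1e6:>9,.1f} uA")

# R = 10,000,000 ohm -> I =       2.5 uA  (below most meters' resolution)
# R =      1,000 ohm -> I =  25,000.0 uA  (25 mA, easy to measure)
```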
Ohm's law is very useful in these cases. It says that the voltage across a resistance equals the current through it times the resistance: V = IR. Simple algebra lets you rearrange this two other ways, I = V/R and R = V/I, so whenever you know two of the three quantities you can always calculate the third.
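As a minimal illustration of that "know two, solve for the third" idea, here is a tiny helper (the function name and interface are my own, not from any library):

```python
def ohms_law(v=None, i=None, r=None):
    """Given exactly two of voltage (V), current (A), and resistance (ohms),
    return the missing third quantity."""
    if (v, i, r).count(None) != 1:
        raise ValueError("provide exactly two of v, i, r")
    if v is None:
        return i * r  # V = I * R
    if i is None:
        return v / r  # I = V / R
    return v / i      # R = V / I

print(ohms_law(v=25.0, r=1e3))    # 0.025 -> 25 mA through a 1 kilohm load
print(ohms_law(v=25.0, i=1.0))    # 25.0  -> ohms needed for a 1 A test
print(ohms_law(i=2.5e-6, r=1e7))  # 25.0  -> volts across the 10 megohm resistor
```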