Obviously, if the voltmeter looked like a short circuit, the battery current would be very high (limited only by the battery's internal resistance) and there would be no voltage to measure, since the voltage across a short is zero. So voltmeters are designed to have as high an input impedance as possible so they don't disturb the voltage due to any source impedance. The manual for the meter will tell you what that value is; 10 megohms is typical for a multimeter measuring DC voltages.
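As a quick sanity check of the loading effect, here's a short sketch (with made-up example values: a 5 V source behind 100 kilohms, read by a 10-megohm meter) showing that the meter and the source impedance form a voltage divider:

```python
# Loading error when a voltmeter's input impedance forms a divider
# with the source impedance. All values are illustrative assumptions.

R_source = 100e3   # assumed source impedance: 100 kilohms
R_meter = 10e6     # typical multimeter input impedance: 10 megohms
V_actual = 5.0     # assumed open-circuit voltage of the source

# The meter reads the divider output, not the open-circuit voltage.
V_measured = V_actual * R_meter / (R_source + R_meter)
error_pct = 100 * (V_actual - V_measured) / V_actual

print(f"Measured: {V_measured:.4f} V  (error {error_pct:.2f}%)")
```

With these numbers the error is about 1%; the higher the source impedance relative to the meter's input impedance, the worse the reading gets.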
Alternatively, if you place a multimeter in current-measurement mode, it exhibits a very low impedance, since it is connected in series with the load, and you don't want any significant voltage drop across the meter, which would disturb the measurement accuracy (the maximum drop is typically a couple of tenths of a volt at full scale).
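The series-drop error can be sketched the same way (again with assumed example values: a 5 V supply, a 100-ohm load, and a meter that drops 0.2 V at a 200 mA full scale, i.e. roughly a 1-ohm shunt):

```python
# Error from an ammeter's series resistance (its "burden").
# All values are illustrative assumptions.

V_supply = 5.0   # assumed supply voltage
R_load = 100.0   # assumed load resistance
R_shunt = 1.0    # assumed meter resistance: 0.2 V drop at 200 mA full scale

I_ideal = V_supply / R_load                 # current with an ideal (0-ohm) meter
I_measured = V_supply / (R_load + R_shunt)  # current with the meter in series
error_pct = 100 * (I_ideal - I_measured) / I_ideal

print(f"Measured: {I_measured * 1000:.2f} mA  (error {error_pct:.2f}%)")
```

Here the meter's resistance is small compared to the load, so the error is about 1%; in low-voltage, low-resistance circuits the same drop can matter much more.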