That's a lot of info, so if you're unsure, I'll clear up some of the things you read - and if I'm wrong, hopefully another member will put me right. You may or may not need this information to repair your flashlight, but here are a few things to help you along.
Batteries. Standard alkaline cells are generally 1.5V. They may be a touch above this when fresh, with nothing drawing current from them. Like all batteries, their voltage will drop as you draw more current from them - that's 'internal resistance'. So if you draw, say, 100mA from 4 AA cells, it's likely to be ~6V. Draw 1000mA (1 Amp) and it'll sag to maybe 5V. When the current stops flowing, the voltage will slowly creep back up to 6V.
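If you want to play with those numbers, here's a rough Python sketch of that sag, modelling each cell as an ideal voltage source behind an internal resistance. The 0.25 ohm per cell figure is just an assumption for illustration, not a measured value:

```python
def terminal_voltage(emf_v, internal_ohms, load_amps):
    """Simple linear battery model: V_terminal = EMF - I * R_internal."""
    return emf_v - load_amps * internal_ohms

# 4 fresh AA alkalines: 4 x 1.5 V behind 4 x 0.25 ohm (assumed)
emf = 4 * 1.5     # 6.0 V total
r_int = 4 * 0.25  # 1.0 ohm total internal resistance (assumption)

print(terminal_voltage(emf, r_int, 0.1))  # ~5.9 V at 100 mA
print(terminal_voltage(emf, r_int, 1.0))  # sags to 5.0 V at 1 A
```

Real cells aren't this linear (internal resistance rises as they deplete), but it's good enough to see the effect.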
Rechargeables like NiMH generally have an 'average' voltage of 1.2V per cell. They use 'average' because once fully charged a cell can be as high as 1.4V, but usually 1.3V. As current is drawn it drops to ~1.2V, eventually down to 1.1V; at 0.9V they are considered 'flat' and should be recharged. Like the alkalines, their voltage will sag under heavy current load, but they have lower internal resistance and are designed to kick out a lot of current in a short space of time. 4 x 1.2V = 4.8V. When providing an amp it's likely to be 4.5V-ish.
Because of the simple way in which the flashlight powers the LEDs (a paralleled group with a single current-limiting resistor), the battery voltage has a direct influence on the LED current. The LEDs drop a roughly fixed voltage, with the resistor 'taking the rest'. And because V = IR, I = V/R. As V increases (the voltage across the resistor, which is the battery voltage minus the LED voltage), the current through it increases. So in your flashlight alkalines may be 'slightly' brighter, although their voltage will drop considerably under load, perhaps bringing them down to a voltage similar to the NiMH.
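To put numbers on that, a quick sketch of I = V/R for the whole array. The 3.5V forward drop and 0.56 ohm resistor are assumed values for illustration, not measured from your flashlight:

```python
def led_current(v_batt, v_led, r_ohms):
    """Total array current: I = (V_batt - V_led) / R."""
    return (v_batt - v_led) / r_ohms

V_LED = 3.5  # assumed white-LED forward drop (V)
R = 0.56     # assumed value for the single limiting resistor (ohms)

print(led_current(6.0, V_LED, R))  # 4 alkalines (6.0 V): ~4.5 A total
print(led_current(4.8, V_LED, R))  # 4 NiMH (4.8 V): ~2.3 A total
```

Notice how a 1.2V difference in battery voltage nearly doubles the current - that's the 'direct influence' in action.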
Either way, I'm sure both types of batteries will work, with rechargeables being better for convenience and a longer run time (NiMH maintain a high capacity under high current loads, whereas alkalines' capacity drops as you increase the load current).
When you say 'two amps on two batteries', how are you measuring them? If you connect them directly to your multimeter to measure current, the multimeter is effectively shorting them, and you're reading the absolute maximum current the battery can deliver (its voltage will drop to something like 0.5V) - it's a bad idea. To measure current, place your meter in series with the battery and a load - such as your flashlight. That will tell you how much the flashlight is drawing. Turning your Fluke to 'voltage' and placing the probes across the battery tells you voltage. So measure current in series, going 'through' something, and measure voltage 'across' something (in parallel).
LEDs. LEDs do indeed have a wide variety of forward voltage drops, dependent on the materials used (which also determine colour). White LEDs are generally just blue LEDs with a yellow phosphor coating which emits the red/yellow part of the spectrum when excited by the blue light. The red/yellow and blue mix to form a fairly good 'white'. The chart you found was correct in that the range of forward voltages of LEDs is generally 1.7 - 4V, with 1.7V being high-efficiency red, yellow ~1.9V, some greens anywhere from 2.0 to 2.4V, and blue/white/UV >3.3V.
The voltage drop of a white LED is usually between 3.4 and 3.7V. If you've read '12V' or '18V', then such devices either have built-in resistors (usually in the leads) or come with a power regulator. The bare LED will require a minimum of 3.5V. Connecting it to a power supply of a higher voltage (say 9V) without a resistor will blow it, because the LED will 'hold' its voltage drop of 3.5V, and your power supply will kick out 9V. The 'difference' would usually be across a resistor, whose value determines the current: I = V/R, where 'V' is the voltage across the resistor. Without a resistor, the leads/wires are the only resistance in the circuit, and they are probably <0.1 ohm. That means lots of current flowing, which means a dead LED. As for '20mA - 60mA', that depends on the LED package. Usually the 5mm round ones are 20mA max continuous, but can be 'pulsed' at much higher current, because the off time allows cooling. Some larger devices can be 350mA, 750mA, and there are even 5A LEDs now (yes, 5 AMPS). I'm willing to bet yours are 5mm, as they are the most common, cheapest, and make great 'arrays'. I would say 20mA max, but to give a margin, run them at 15-17mA.
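As a worked example of picking that resistor, assuming the 9V supply and 3.5V white LED from above, with the 15mA safety-margin target:

```python
def series_resistor(v_supply, v_led, i_amps):
    """Single-LED resistor formula: R = (V_supply - V_led) / I."""
    return (v_supply - v_led) / i_amps

r = series_resistor(9.0, 3.5, 0.015)
print(r)  # ~367 ohms; in practice you'd fit the next standard value up
```

The resistor 'absorbs' the spare 5.5V, and its value sets the current through the whole loop.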
Putting LEDs in parallel. Generally it's a bad idea, but it is used in some circumstances, and can work if the devices are well matched. Here's why.
The forward voltages of the LEDs will not all be exactly the same. They may all be the same colour/type, and perhaps all from the same batch from the same manufacturer, but they may vary by a few tenths of a volt. This means that if you have, say, 20 LEDs in parallel, the ones with lower forward voltage will draw more current than the ones with higher forward voltage. If the variation is great, the LED with the lowest Vf will draw too much current and blow, perhaps going open circuit (disconnecting). Now there are 19 LEDs in parallel, so each LED gets slightly more current than before; once again the weakest one blows, leaving 18 LEDs taking more current. The result is a series of blown devices. Even if devices don't fail, the variation may cause some LEDs to be brighter than others (not an issue if they are all clustered).
That said, with 100+ LEDs, each LED is using ~1% of the total current drawn. So if one LED fails, the 1% it was taking gets split between all the other LEDs. Because of the vast number it really doesn't make much difference, so believe it or not, putting MANY LEDs in parallel isn't too bad. Bad practice? Perhaps, but it's cheap, and obviously works. All the LEDs in the cluster will be of the same type, so it shouldn't matter too much for flashlight apps.
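You can see why one failure barely matters with a quick sketch. This assumes the series resistor holds the total current roughly constant and the LEDs share it evenly, which is near enough true for this argument:

```python
def per_led_current(total_amps, n_leds):
    """Each parallel LED's share, assuming an even split."""
    return total_amps / n_leds

before = per_led_current(2.0, 100)  # 0.02 A = 20 mA each
after = per_led_current(2.0, 99)    # one LED has failed open
print((after / before - 1) * 100)   # each survivor takes only ~1% more
```

Run the same numbers with 5 LEDs instead of 100 and each survivor jumps 25% - which is exactly why small parallel groups cascade and huge ones don't.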
In series, the current flowing through one LED is exactly the same as the current flowing through another, and another, in a long string. This has the advantage of controlling the exact current for ALL the LEDs, with the disadvantage of requiring a higher voltage. 100 white LEDs in series requires a minimum voltage of 100 * 3.5 = 350V! Sometimes manufacturers make series-parallel arrays, say parallel groups of 3 LEDs in series. Each 'group' requires 3 * 3.5V = 10.5V, and so uses a series resistor on each group for use with 12V (the resistor is chosen so that I = 20mA: with V = 12 - 10.5 = 1.5V, R = 1.5/0.02 = 75 ohms). Because series groups can be stacked up to get close to the power supply voltage, there's less voltage across the resistor, which can make 'series' groups more efficient than parallel = longer battery life and less heat.
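That group calculation, spelled out with the values straight from the example above:

```python
v_supply = 12.0
v_group = 3 * 3.5             # three white LEDs in series drop 10.5 V
i = 0.020                     # target 20 mA through each group
r = (v_supply - v_group) / i  # the resistor takes the remaining 1.5 V
print(r)                      # 75.0 ohms per group
```

Only 1.5V of the 12V is 'wasted' in the resistor here, versus 8.5V wasted if each LED were run individually from 12V - that's the efficiency win of series groups.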
In your flashlight, although perhaps 'not ideal', it's a very simple way to power up many LEDs. It's also very cheap, requiring only ONE component in addition to the LEDs. The downside is that the resistor dissipates a lot of power, so it's inefficient, and is prone to failure. And should its resistance drop, it risks popping a few of the LEDs in the parallel array, which *may* cause a cascade of popping LEDs - that's doubtful though, given the number of them. It also means the LED current (which equates to brightness and LED lifetime) is directly related to the battery voltage, and so isn't particularly well controlled.
As for the resistor: wire-wound package = good. 3-5 watts = good. And you're spot on about the 'poor design'. Electrically it's cheap, but serves its purpose. Thermally it sounds like a nightmare, especially considering that LEDs also generate heat, and if the resistor is on the back of the LED array, it'll just add to that heat. The surroundings of the resistor get warm, so it can't dissipate as much heat, and gets hotter = fail. I believe this also happens on very expensive LED flashlights that use a switch-mode power supply to power a 3-watt LED (single device, 3.5V) from a large rechargeable battery. To reduce size they stick the electronics on the back of the LED - both of which generate a fair bit of heat, so lumping them together increases the risk of one failing.
I guess you could just buy some 0.68, 0.56, 0.47 ohm 3-5 watt resistors and experiment to see which is best in terms of brightness/battery life. And add a clamp/heatsink to the resistor. Even if it can't 'pump out' heat outside of the flashlight, at least it can increase the surface area of the resistor and spread the heat around a bit.
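For what it's worth, the heat each candidate resistor has to shed is just P = I squared R. The 2A figure below is the current you mentioned, used here as an assumption:

```python
def resistor_power(i_amps, r_ohms):
    """P = I^2 * R: power the resistor dissipates as heat."""
    return i_amps ** 2 * r_ohms

for r in (0.68, 0.56, 0.47):
    print(f"{r} ohm -> {resistor_power(2.0, r):.2f} W")  # all under 3 W at 2 A
```

So at 2A all three sit under 3W - within a 3-5 watt rating, but still plenty of heat to get rid of inside a sealed flashlight body, hence the heatsink suggestion.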
If there's room in the flashlight you can get TO-220 packaged resistors, like a transistor, which are very convenient for mounting to a heatsink, but for <2W that might be overkill.
Sorry for the disjointed ramble. I'm not sure how much you know about electronics - you've obviously worked out what's wrong with it (quite well, I might add), but I thought I'd add some general info about LEDs. If I seem patronizing, apologies, but more information is good, eh? You can just 'not read' the boring bits.
BT