"The fact remains the picture you posted shows a normal temination on temperature"
I guess I should have been clearer: the Figure 2 that I posted above is a good example of why simple temperature measurement is not good for NI-MH: the cell temp rises to 41C. It's from a Linear Tech data sheet pimping one of their products. If you don't mind running the cells above 40C at end of charge, it's great... but ask a battery maker how long the cells will last if you cook them that hot on every charge cycle.
The point is that even as far back as 1992, when Gates Energy invented the first NI-MH cell, they were warning charger makers not to try to use simple temperature rise to terminate fast charge (as had been industry standard practice for NI-CD), and the curves show why. The best method is to measure the rate of temperature rise and terminate on the inflection point where the rate suddenly increases. That saves a lot of overcharging and temperature stress, since it occurs at a cell temp of about 34C, not 40C. It's also what differentiates a "smart charger" from a dumb one.
However, it's a lot harder to make a chip that can do that unless you have a uC that can datalog readings at fixed time intervals and calculate the rate of change. Obviously, a linear IC house like Linear Tech isn't building any such thing.
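To make the dT/dt idea concrete, here's a minimal sketch of what that uC loop might look like: sample cell temperature at fixed intervals, compute the rate of rise, and terminate when the rate jumps at the inflection point. All names, the threshold value, and the sample readings are my own illustrative assumptions, not figures from Gates Energy or any datasheet.

```python
from collections import deque

def should_terminate(samples, interval_s=30.0, threshold_c_per_min=2.0):
    """Return True once the temperature rise rate exceeds the threshold.

    samples: recent temperature readings (deg C), oldest first,
             taken every interval_s seconds. Threshold is a made-up
    example value; a real charger would tune it to the pack.
    """
    if len(samples) < 2:
        return False
    # Rate of change over the last interval, converted to C/minute.
    rate = (samples[-1] - samples[-2]) / interval_s * 60.0
    return rate >= threshold_c_per_min

# Simulated charge: temperature creeps up slowly, then the rate kicks up
# near full charge -- the inflection point described above.
readings = [25.0, 25.5, 26.1, 26.8, 27.6, 28.5, 30.2, 32.5, 35.5]

log = deque(maxlen=3)  # a uC would keep only a short history
terminated_at = None
for t in readings:
    log.append(t)
    if should_terminate(list(log)):
        terminated_at = t
        break

print(terminated_at)  # charge stops in the low 30s C, not at 40C+
```

Note the termination fires while the cell is still in the low 30s C, which is exactly the advantage over a fixed absolute-temperature cutoff.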
The Linear Tech article, which incidentally was written by somebody I shared an office with for four years, lumps NI-CD and NI-MH together to try to make you think chargers for the former are fine to use with the latter. God help anybody who believes that; I hope their house is fireproof. Anyway, they think their part is the cat's pajamas, but >40C is pretty hot to run a battery during charging.