Hi
There are mathematical criteria for deciding whether a given method converges, but let me ask about it informally: can successive iterations tell us whether a method is converging for a particular problem?

Suppose we are finding a root of a function f(x). At iteration #13 we have f(x_13) = 0.04, at iteration #14 f(x_14) = 0.5, at iteration #15 f(x_15) = 0.6, but at iteration #19 f(x_19) = 0.0001. Iterations #14 and #15 might suggest that the method has started to diverge, because the result at iteration #13 was closer to the root, and one might stop iterating and conclude that the method is not applicable. But I think that would be premature. I have sometimes seen a few successive iterations look as if the method were diverging, and then suddenly there is a change and convergence becomes apparent.

So, in a nutshell: to judge whether a method is converging, one should look at a larger set of iterations, say 20, 30, or 100, rather than just two or three. For example, if over a range of 50 iterations there has been no refinement, or the results keep getting worse, then the method is diverging.

Is my thinking correct? Please let me know. Thank you.
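To make the idea concrete, here is a minimal sketch (my own illustration, not a standard test) of a windowed convergence check: instead of comparing two consecutive residuals, it asks whether the best |f(x)| seen in the last few iterations improved on the best seen before that. The residual history, window size, and helper name are all assumptions for illustration.

```python
def best_residual_improved(residuals, window):
    """Return True if the best |f(x)| in the last `window` iterations
    improved on the best |f(x)| seen before that window."""
    if len(residuals) <= window:
        return True  # not enough history yet; keep iterating
    earlier_best = min(residuals[:-window])
    recent_best = min(residuals[-window:])
    return recent_best < earlier_best

# A hypothetical residual history mimicking the situation in the question:
# progress, an apparent "divergence" at iterations 4-5, then sudden improvement.
history = [0.8, 0.3, 0.04, 0.5, 0.6, 0.2, 0.01, 0.0001]

# Judged on three consecutive iterations (0.04 -> 0.5 -> 0.6) it looks
# divergent, but over a wider window the method is clearly still converging.
print(best_residual_improved(history, window=5))  # True: 0.0001 beats 0.04
```

With a window of 5, the check compares the recent best (0.0001) against the earlier best (0.04) and reports that the iteration is still improving, even though a pairwise comparison at iterations #14 and #15 would have suggested divergence.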
Regards
PG