As a concrete example, I define *F* : **R**^{2} → **R**^{2}
by

*F*(*x*) = (*x*_{1}^{2} + *x*_{2}^{2} − 1, *x*_{2} − *x*_{1}^{2}).

The solutions of *F*(*x*) = 0 are the points of intersection of the circle *x*_{1}^{2} + *x*_{2}^{2} = 1 and the parabola *x*_{2} = *x*_{1}^{2}. One solution is *x*^{*} ≈ (0.786151, 0.618034) (the other solution is the reflection of *x*^{*} across the *x*_{2}-axis). Applying Newton's method with
*x*^{(0)} = (0.5, 0.5), I obtain the results
shown in Table 1. Several comments about these results are in
order. First of all, the computations were carried out in IEEE double
precision arithmetic, which translates to about 16 decimal digits of
precision. Therefore, for example, where the results show that
*x*^{(5)} = *x*^{*}
and yet
*F*(*x*^{(5)}) ≠ 0,
the apparent
discrepancy is due to round-off error. Five iterations of Newton's method
were sufficient to compute the solution exactly to the given precision,
but when
*F*(*x*^{(5)}) is computed in floating point arithmetic, round-off
error caused the result to differ from zero by a very small amount.
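This computation can be sketched in a few lines of Python with NumPy. The system below, *F*(*x*) = (*x*_{1}^{2} + *x*_{2}^{2} − 1, *x*_{2} − *x*_{1}^{2}), is the circle/parabola example assumed above; any other system could be substituted along with its Jacobian.

```python
import numpy as np

# Newton's method on the circle/parabola system (assumed for illustration):
#   F(x) = (x1^2 + x2^2 - 1, x2 - x1^2)
def F(x):
    return np.array([x[0]**2 + x[1]**2 - 1.0, x[1] - x[0]**2])

def J(x):
    # Jacobian of F
    return np.array([[2.0*x[0], 2.0*x[1]],
                     [-2.0*x[0], 1.0]])

x = np.array([0.5, 0.5])
for k in range(5):
    # Newton step: solve J(x) s = -F(x), then update x
    x = x + np.linalg.solve(J(x), -F(x))
print(x)  # approximately (0.786151, 0.618034)
```

Note that the step is computed by solving the linear system *J*(*x*)*s* = −*F*(*x*) rather than by forming the inverse of the Jacobian, which is both cheaper and more accurate.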

Second, the convergence of
||*x*^{(k)} − *x*^{*}|| to zero follows a definite
pattern: each error is approximately a constant multiple of the square of the previous error,
which suggests that the ratio

||*x*^{(k+1)} − *x*^{*}|| / ||*x*^{(k)} − *x*^{*}||^{2}

is asymptotically constant as *k* → ∞. It is difficult to verify this conjecture numerically, since the error so quickly falls below round-off level. For example, I would predict that ||*x*^{(5)} − *x*^{*}|| would be roughly the square of ||*x*^{(4)} − *x*^{*}||,
but in fact this error is below the precision of the machine, and all I can verify is that ||*x*^{(5)} − *x*^{*}|| is less than about 10^{-16}.

The following definitions describe such behavior precisely. Suppose {*x*^{(k)}} is a sequence converging to *x*^{*}.

1. The sequence is said to converge linearly (or q-linearly)
if there exists *c* ∈ (0, 1)
such that

   ||*x*^{(k+1)} − *x*^{*}|| ≤ *c*||*x*^{(k)} − *x*^{*}|| for all *k* sufficiently large.

2. The sequence is said to converge superlinearly (or
q-superlinearly) if

   ||*x*^{(k+1)} − *x*^{*}|| / ||*x*^{(k)} − *x*^{*}|| → 0 as *k* → ∞.

3. The sequence is said to converge quadratically (or
q-quadratically) if there exists *C* > 0
such that

   ||*x*^{(k+1)} − *x*^{*}|| ≤ *C*||*x*^{(k)} − *x*^{*}||^{2} for all *k* sufficiently large.
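The practical difference between these rates is easy to see numerically. The following Python sketch evolves three model error sequences; the initial error of 0.5 and the particular recurrences are assumptions chosen purely for illustration.

```python
# Model error sequences illustrating the three rates, each starting
# from an assumed initial error of 0.5.
e_lin, e_sup, e_quad = 0.5, 0.5, 0.5
for k in range(5):
    e_lin *= 0.5        # linear:      e_{k+1} = 0.5 * e_k
    e_sup **= 1.5       # superlinear: e_{k+1} = e_k ** 1.5
    e_quad **= 2        # quadratic:   e_{k+1} = e_k ** 2
    print(f"k={k+1}: {e_lin:9.2e}  {e_sup:9.2e}  {e_quad:9.2e}")
```

The linear column gains a fixed number of correct digits per step, while the quadratic column roughly doubles the number of correct digits per step, which is why only a handful of Newton iterations are ever needed once the iterates are close to the solution.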

Below I will prove the following theorem: if *x*^{*} is a solution of *F*(*x*) = 0 such that

1. the Jacobian *J*(*x*^{*}) of *F* at *x*^{*} is nonsingular, and
2. *J* is Lipschitz continuous on a neighborhood of *x*^{*},

then, provided *x*^{(0)} is sufficiently close to *x*^{*}, Newton's method converges quadratically to *x*^{*}.

Quadratic convergence has two important consequences. First of all, it
guarantees that if a point close to the solution can be found, then
Newton's method will rapidly home in on the exact solution. Second, it
provides a *stopping test* for Newton's method. Since Newton's method
is an iterative algorithm, it is necessary to have some criterion for deciding
whether the current approximation *x*^{(k)} is sufficiently close to the
solution *x*^{*} that the algorithm can be halted. There is a simple
stopping test for any superlinearly convergent sequence, which I will now
derive.

I assume that it is desired to find *x*^{(k)} such that
||*x*^{(k)} − *x*^{*}|| ≤ ε,
where ε is a given error tolerance,
and I also assume that
*x*^{(k)} → *x*^{*} superlinearly. I will now
show that

||*x*^{(k+1)} − *x*^{(k)}|| / ||*x*^{(k)} − *x*^{*}|| → 1 as *k* → ∞.  (2)

This implies that ||*x*^{(k+1)} − *x*^{(k)}|| (which is a computable quantity) is a good estimate of ||*x*^{(k)} − *x*^{*}|| when *k* is sufficiently large. The resulting stopping test is to halt as soon as ||*x*^{(k+1)} − *x*^{(k)}|| ≤ ε. This test is not foolproof, since there is no way to know in advance how large *k* must be before the estimate in (2) becomes accurate. However, it works well in practice because Newton's method usually does not take small steps unless it is close to the solution. Moreover, having verified that ||*x*^{(k+1)} − *x*^{(k)}|| ≤ ε holds, so that ||*x*^{(k)} − *x*^{*}|| ≤ ε is expected to hold, the algorithm then returns *x*^{(k+1)}, which should be an even better approximation to *x*^{*} than *x*^{(k)}. For these reasons, the stopping test

||*x*^{(k+1)} − *x*^{(k)}|| ≤ ε

is quite reliable.
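As a sketch, the stopping test can be implemented in Python as follows. The function name `newton` and the circle/parabola test problem are assumptions for illustration; any *F* and its Jacobian could be supplied.

```python
import numpy as np

def newton(F, J, x0, eps=1e-12, max_iter=50):
    """Newton's method with the step-based stopping test:
    halt when ||x^(k+1) - x^(k)|| <= eps."""
    x = np.asarray(x0, dtype=float)
    for _ in range(max_iter):
        s = np.linalg.solve(J(x), -F(x))  # Newton step
        x = x + s
        if np.linalg.norm(s) <= eps:
            # The step length estimates the error at the previous
            # iterate, so the newer iterate x^(k+1) is returned.
            return x
    raise RuntimeError("no convergence within max_iter iterations")

# Example use on the circle/parabola system (an assumed test problem):
F = lambda x: np.array([x[0]**2 + x[1]**2 - 1.0, x[1] - x[0]**2])
J = lambda x: np.array([[2.0*x[0], 2.0*x[1]], [-2.0*x[0], 1.0]])
root = newton(F, J, [0.5, 0.5])
```

Returning the newer iterate *x*^{(k+1)} costs nothing extra, since it has already been computed in order to evaluate the step length.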

To prove (2), I simply use the triangle
inequality,^{1}
the reverse triangle inequality,^{2}
and the definition of superlinear convergence. First of all,

||*x*^{(k+1)} − *x*^{(k)}|| ≤ ||*x*^{(k+1)} − *x*^{*}|| + ||*x*^{(k)} − *x*^{*}||.

Second,

||*x*^{(k+1)} − *x*^{(k)}|| ≥ ||*x*^{(k)} − *x*^{*}|| − ||*x*^{(k+1)} − *x*^{*}||.

Therefore,

1 − ||*x*^{(k+1)} − *x*^{*}|| / ||*x*^{(k)} − *x*^{*}|| ≤ ||*x*^{(k+1)} − *x*^{(k)}|| / ||*x*^{(k)} − *x*^{*}|| ≤ 1 + ||*x*^{(k+1)} − *x*^{*}|| / ||*x*^{(k)} − *x*^{*}||,

and so letting *k* → ∞, where superlinear convergence gives ||*x*^{(k+1)} − *x*^{*}|| / ||*x*^{(k)} − *x*^{*}|| → 0,
yields (2).
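The limit (2) can also be observed numerically. The short Python check below again assumes the circle/parabola system; its exact solution follows from *x*_{2}^{2} + *x*_{2} − 1 = 0 with *x*_{1} = √*x*_{2}.

```python
import numpy as np

# Check numerically that ||step|| / ||error|| -> 1 for Newton iterates.
# Test problem (assumed): F(x) = (x1^2 + x2^2 - 1, x2 - x1^2), whose
# positive solution has x2* = (sqrt(5) - 1)/2 and x1* = sqrt(x2*).
F = lambda x: np.array([x[0]**2 + x[1]**2 - 1.0, x[1] - x[0]**2])
J = lambda x: np.array([[2.0*x[0], 2.0*x[1]], [-2.0*x[0], 1.0]])

x2_star = (np.sqrt(5.0) - 1.0) / 2.0
x_star = np.array([np.sqrt(x2_star), x2_star])

x = np.array([0.5, 0.5])
for k in range(4):
    s = np.linalg.solve(J(x), -F(x))
    ratio = np.linalg.norm(s) / np.linalg.norm(x - x_star)
    print(f"k={k}: ||step|| / ||error|| = {ratio:.8f}")
    x = x + s
```

The ratio approaches 1 within a few iterations, after which the error falls below round-off level and the comparison is no longer meaningful.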