All three methods were used in a variety of cases; all performed well. On average, the secant method required significantly fewer evaluations than Newton-Raphson, and somewhat fewer than the finite-difference secant. Both the secant method and Newton-Raphson converged quickly even from very poor initial guesses. The author's conclusion is that while Newton-Raphson is a very good method, it may be more powerful than is necessary for this problem, so that the extra evaluations are not warranted. In particular, the solution is constrained to lie in a small finite interval (the parametric span of the B-spline), which simplifies the minimization problem considerably. Thus the secant method seems to provide the correct balance between efficiency and accuracy, and became the method of choice, although the finite-difference method was only slightly less effective.

2.4 Smoothing Approximation: Another Alternative

Dierckx has proposed a smoothing metric for B-spline approximations of order k which minimizes a combination of the least-squares error and the magnitudes of the jump discontinuities in the (k-1)st derivative at the internal knots \lambda_j, j = 1, \ldots, g:

$$\sum_{i=1}^{n} w_i \left\| f(t_i) - c_i \right\|^2 \;+\; \frac{1}{p} \sum_{j=1}^{g} \lim_{t \to 0} \left\| f^{(k-1)}(\lambda_j + t) - f^{(k-1)}(\lambda_j - t) \right\|^2, \qquad (2.4)$$

where the c_i are data points, the t_i are the corresponding parameter values, and the parameter p, which will be referred to as the "fit factor," controls the trade-off between smoothness and closeness of fit. This metric, which will be referred to as the "jump criterion," effectively reduces the degrees of freedom, but it is not clear that the result is always smoother than the unsmoothed least-squares approximation. Sometimes the effect is good, but in other cases the only apparent result is that the error is greater.
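To make the jump criterion concrete, the following is a minimal sketch of evaluating (2.4) for a given spline using SciPy's BSpline. Note that SciPy's `k` attribute is the polynomial degree, i.e. the order of the text minus one, so the (k-1)st derivative of the text corresponds to `spl.derivative(spl.k)`. The function name `jump_criterion` and the one-sided evaluation via a small `eps` are illustrative assumptions, not the author's implementation.

```python
import numpy as np
from scipy.interpolate import BSpline

def jump_criterion(spl, t_data, c_data, w, p, eps=1e-8):
    """Sketch of the smoothing metric (2.4): weighted least-squares
    error plus (1/p) times the sum of squared jumps of the highest
    continuous-derivative's successor at the internal knots."""
    # Weighted least-squares term: sum_i w_i * (f(t_i) - c_i)^2
    resid = spl(t_data) - c_data
    lsq = np.sum(w * resid**2)

    # Jump term: for a spline of degree spl.k (order spl.k + 1), the
    # degree-th derivative is piecewise constant with jumps at knots.
    d = spl.derivative(spl.k)
    internal = spl.t[spl.k + 1 : -(spl.k + 1)]  # internal knots only
    jumps = sum((d(l + eps) - d(l - eps))**2 for l in internal)

    return lsq + jumps / p
```

With no internal knots the jump term vanishes and the criterion reduces to the pure weighted least-squares error, matching the limiting behaviour discussed below.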
Consider the limiting cases: as the fit factor becomes larger, the result approaches the least-squares B-spline approximation, which closely approximates the data, but may not be smooth.
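Returning to the root-finding comparison at the start of the section, the secant iteration preferred there can be sketched as follows. This is a generic textbook version (the function name, tolerance, and iteration cap are illustrative), not the author's code; in the application described above, each iterate would additionally be clamped to the B-spline's parametric span.

```python
def secant(f, x0, x1, tol=1e-10, max_iter=50):
    """Secant root-finder: like Newton-Raphson, but the analytic
    derivative is replaced by the finite slope through the last two
    iterates, so each step costs only one new function evaluation."""
    f0, f1 = f(x0), f(x1)
    for _ in range(max_iter):
        if f1 == f0:  # flat secant line: cannot take a step
            break
        x2 = x1 - f1 * (x1 - x0) / (f1 - f0)
        if abs(x2 - x1) < tol:
            return x2
        x0, f0 = x1, f1
        x1, f1 = x2, f(x2)
    return x1
```

For example, `secant(lambda x: x**2 - 2, 1.0, 2.0)` converges to the square root of 2 in a handful of evaluations, illustrating why the method compares favourably with Newton-Raphson when derivative evaluations are expensive.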