
Computing Solutions

We have seen how the original infinite-dimensional, nondifferentiable, constrained convex minimization problem can be converted into an equivalent finite-dimensional, (usually) differentiable, unconstrained concave maximization problem, which in turn is tackled by solving a system of nonlinear equations.
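To make this chain of reformulations concrete, consider the maximum entropy (ME) case referred to below; the notation $a_i$, $b_i$, $\lambda$ is introduced here purely for illustration. The primal problem

\[ \min_{p \ge 0} \int p(t)\ln p(t)\,dt \quad \text{subject to} \quad \int a_i(t)\,p(t)\,dt = b_i, \quad i=1,\dots,n, \]

has the finite-dimensional, unconstrained, concave dual

\[ \max_{\lambda \in \mathbb{R}^n} \; D(\lambda) = \lambda^T b - \int \exp\Big(\sum_{i=1}^n \lambda_i a_i(t) - 1\Big)\,dt, \]

and setting $\nabla D(\lambda) = 0$ yields the system of nonlinear equations

\[ \int a_j(t)\,\exp\Big(\sum_{i=1}^n \lambda_i a_i(t) - 1\Big)\,dt = b_j, \quad j=1,\dots,n. \]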

In general, these nonlinear equations involve integrals, which must be evaluated numerically by some quadrature rule. There are special cases (for example, when the underlying functions are piecewise linear) where exact integration can be performed, and our methods should exploit these situations, but the usual case requires a fast, robust integrator.
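As a sketch of the quadrature step, assuming the ME dual written above: the adaptive integrator from scipy and the function names here are our choices, standing in for whatever fast, robust integrator is ultimately adopted. A root finder such as Newton's method would then drive these residuals to zero.

    import numpy as np
    from scipy.integrate import quad

    def dual_gradient(lam, a_funcs, b, lo, hi):
        """Residuals b_j - int a_j(t) exp(sum_i lam_i a_i(t) - 1) dt,
        each evaluated with adaptive quadrature on [lo, hi]."""
        def density(t):
            return np.exp(sum(l * a(t) for l, a in zip(lam, a_funcs)) - 1.0)
        residuals = np.empty(len(b))
        for j, a_j in enumerate(a_funcs):
            integral, _err = quad(lambda t: a_j(t) * density(t), lo, hi)
            residuals[j] = b[j] - integral
        return residuals

    # Illustrative data: constant and linear moment functions on [0, 1].
    a_funcs = [lambda t: 1.0, lambda t: t]
    b = np.array([1.0, 0.5])
    print(dual_gradient(np.array([1.0, 0.0]), a_funcs, b, 0.0, 1.0))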

In some cases, maximum entropy (ME) for example, the Hessian of the dual functional, needed e.g. in Newton's method, is highly structured, and such structure should be exploited in our Hessian generation schemes. Also, it is sometimes possible to reduce a Newton step algebraically to a simple computation, and this should be implemented when possible.
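For instance, under the same assumed ME dual, the Hessian entries are $H_{jk} = -\int a_j(t)\,a_k(t)\,p_\lambda(t)\,dt$, a symmetric weighted Gram matrix, so only the upper triangle need be integrated. The sketch below exploits this in a single Newton step; it is our illustration under those assumptions, not the project's actual Hessian generation scheme.

    import numpy as np
    from scipy.integrate import quad

    def newton_step(lam, a_funcs, b, lo, hi):
        """One Newton step for the assumed ME dual.  The Hessian
        H[j, k] = -int a_j(t) a_k(t) p_lam(t) dt is symmetric, so only
        its upper triangle is computed by quadrature."""
        n = len(b)
        def p_lam(t):
            return np.exp(sum(l * a(t) for l, a in zip(lam, a_funcs)) - 1.0)
        grad = np.array([b[j] - quad(lambda t: a_funcs[j](t) * p_lam(t), lo, hi)[0]
                         for j in range(n)])
        H = np.zeros((n, n))
        for j in range(n):
            for k in range(j, n):
                H[j, k] = -quad(lambda t: a_funcs[j](t) * a_funcs[k](t) * p_lam(t),
                                lo, hi)[0]
                H[k, j] = H[j, k]                 # symmetry fills the lower triangle
        return lam - np.linalg.solve(H, grad)     # lam_new = lam - H^{-1} grad

When $n = 1$ the update collapses to a single scalar division, a small example of the kind of algebraic reduction of a Newton step mentioned above.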



Ron Haynes
Thu Aug 8 16:22:34 PDT 1996