The user must provide subroutines to define
the nonlinear parts of the objective function and nonlinear constraints.
They are passed to snopt as external parameters funcon
and funobj.
(A dummy subroutine must be provided if the objective or constraints are
purely linear.)
Be careful when coding the call to snopt: the parameters are ordered alphabetically as funcon, funobj, and the first call to each function routine is made in that order.
In general, these subroutines should return all function and gradient values on every entry except perhaps the last. This provides maximum reliability and corresponds to the default setting, Derivative level = 3.
In practice it is often convenient not to code gradients. SNOPT is able to estimate gradients by finite differences, by making a call to funobj or funcon for each variable xj whose partial derivatives need to be estimated. However, this reduces the reliability of the optimization algorithms, and it can be very expensive if there are many such variables xj.
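To see why finite-difference estimation costs one extra function evaluation per variable, here is a minimal sketch in Python (not SNOPT's internal code; the function and step size are illustrative):

```python
import numpy as np

def estimate_gradient(f, x, h=1e-7):
    """Estimate the gradient of f at x by forward differences.

    One extra call to f is made per variable x_j, which is why
    finite differencing becomes expensive when many partial
    derivatives must be estimated.
    """
    x = np.asarray(x, dtype=float)
    f0 = f(x)
    g = np.empty_like(x)
    for j in range(x.size):      # one perturbed evaluation per x_j
        xp = x.copy()
        xp[j] += h
        g[j] = (f(xp) - f0) / h
    return g

# Example: f(x) = x1^2 + 3*x2; the exact gradient at (1, 2) is (2, 3).
g = estimate_gradient(lambda x: x[0]**2 + 3.0*x[1], [1.0, 2.0])
```

The truncation error of the forward difference also limits accuracy, which is part of the reliability loss mentioned above.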
As a compromise, SNOPT allows you to code as many gradients as you like. This option is implemented as follows. Just before a function routine is called, each element of the gradient array is initialized to a specific value. On exit, any element retaining that value must be estimated by finite differences.
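The compromise scheme can be sketched as follows. This is an illustration of the mechanism, not SNOPT's implementation: the sentinel value `UNSET` and the routine names are hypothetical.

```python
import numpy as np

UNSET = -1.11e22   # hypothetical sentinel; SNOPT uses its own internal value

def call_with_partial_gradients(grad_routine, f, x, n, h=1e-7):
    """Pre-fill the gradient array with a sentinel, let the user routine
    overwrite the elements it knows, then finite-difference whatever
    still holds the sentinel on exit."""
    g = np.full(n, UNSET)
    grad_routine(x, g)                 # user fills in known gradients only
    f0 = f(x)
    for j in range(n):
        if g[j] == UNSET:              # not supplied -> estimate it
            xp = x.copy()
            xp[j] += h
            g[j] = (f(xp) - f0) / h
    return g

# The user knows df/dx1 = 2*x1 but not df/dx2 for f = x1^2 + exp(x2).
def my_grad(x, g):
    g[0] = 2.0 * x[0]                  # known gradient element; g[1] untouched

x = np.array([1.0, 0.0])
g = call_with_partial_gradients(my_grad, lambda x: x[0]**2 + np.exp(x[1]), x, 2)
```

Here `g[0]` is returned exactly as coded, while `g[1]` is estimated by a forward difference.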
Some rules of thumb follow.
1. For maximum reliability, compute all function and gradient values.

2. If the gradients are expensive to compute, specify Nonderivative linesearch and use the input parameter mode to avoid computing them on certain entries.

3. If not all gradients are known, you must specify Derivative level < 3. You should still compute as many gradients as you can. (It often happens that some of them are constant or even zero.)

4. Again, if the known gradients are expensive, don't compute them if mode = 0.

5. Use the input parameter nState to test for special actions on the first or last entries.

6. While the function routines are being developed, use the Verify option to check the computation of gradient elements that are supposedly known. The Start and Stop options may also be helpful.

7. The function routines are not called until the linear constraints and bounds on x are satisfied. This helps confine x to regions where the nonlinear functions are likely to be defined. However, be aware of the Minor feasibility tolerance if the functions have singularities.

8. Set mode = -1 if the functions are undefined. The linesearch will shorten the step and try again.
9. Set mode <= -2 if you want snopt to stop.
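The mode and nState conventions from the rules above can be sketched in a stand-in function routine. This is a Python illustration only: the real funobj is a Fortran subroutine with a fixed argument list, and the nState codes and objective function used here are assumptions for the example.

```python
import numpy as np

FIRST_ENTRY, LAST_ENTRY = 1, 2      # assumed nState codes for illustration

def funobj(mode, x, nState):
    """Stand-in for a user objective routine; returns (mode, fObj, gObj).

    mode on entry: 0 means gradients are not required on this call.
    mode on exit:  -1 means f is undefined at x, so the linesearch
                   should shorten the step and try again.
    """
    if nState == FIRST_ENTRY:
        pass                         # e.g. open files, precompute data
    if x[0] <= 0.0:                  # log is undefined here
        return -1, None, None        # ask the linesearch to back off
    fObj = np.log(x[0]) + x[1]**2
    gObj = None
    if mode != 0:                    # gradients wanted on this entry
        gObj = np.array([1.0 / x[0], 2.0 * x[1]])
    return mode, fObj, gObj

# A normal entry with gradients requested, and an entry at an undefined point.
m_ok, f_ok, g_ok = funobj(2, np.array([1.0, 3.0]), 0)
m_bad, _, _ = funobj(2, np.array([-1.0, 0.0]), 0)
```

Skipping the gradient computation when mode = 0 is exactly the economy that rules 2 and 4 recommend.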