chrisb wrote: ↑Sat Apr 06, 2019 12:40 pm
If you found *that* it should be easy to find any error too.

The first thing to determine is whether it was intentional or not.

The thing is that the book indicates that, to avoid the Maratos effect (an effect that impedes progress in the minimisation problem), a second-order correction step may be used.

About this second-order correction, the book says:

The second-order correction step requires evaluation of the constraints c_i(x_k + p_k) for i ∈ E ∪ I, and therefore it is preferable not to apply it every time the merit function increases. One strategy is to use it only if the increase in the merit function is accompanied by an increase in the constraint norm.

It can be shown that when the step p_k is generated by the SQP method (18.11) then, near a solution satisfying second-order sufficient conditions, the algorithm above takes either the full step p_k or the corrected step p_k + p̂_k. The merit function does not interfere with the iteration, so superlinear convergence is attained, as in the local algorithm.
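To make the quoted strategy concrete, here is a minimal sketch of the trigger condition it describes: apply the (expensive) second-order correction only when the trial step increases both the merit function and the constraint norm. All names here (should_apply_soc, constraint_norm, and the sample numbers) are my own illustration, not code from any actual solver.

```python
def constraint_norm(c):
    """Euclidean norm of the constraint values c_i(x)."""
    return sum(ci * ci for ci in c) ** 0.5

def should_apply_soc(merit_old, merit_new, c_old, c_new):
    """Trigger the second-order correction only if the merit function
    increased AND the constraint norm increased, as the book suggests."""
    merit_increased = merit_new > merit_old
    norm_increased = constraint_norm(c_new) > constraint_norm(c_old)
    return merit_increased and norm_increased

# Merit went up and constraints got worse -> apply the correction.
print(should_apply_soc(1.0, 1.2, [0.1, 0.0], [0.3, 0.1]))  # True
# Merit went up but constraints improved -> skip the expensive step.
print(should_apply_soc(1.0, 1.2, [0.3, 0.1], [0.1, 0.0]))  # False
```

The point of the AND condition is exactly what the quote says: evaluating all the constraints again is costly, so the correction is reserved for steps where the constraint violation itself got worse.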

From the code, I understand that logari, at least at the beginning, intended to execute the second-order step only once, on the first iteration.

The book appears to suggest not applying it more often because it has to evaluate several constraints, so I understand the computational cost is high.

Now, I do not know whether logari later decided to execute this step on every iteration, or whether he simply forgot to set the boolean to false after the first iteration.

I have PM-ed logari.

BTW, I like the book. I might buy a hard copy; I do not like reading almost 700 pages on the computer.