Feature-Request for Default Derivatives (Gradient and Jacobian) #573
Comments
There is a jacobian_approximation option. There is nothing to approximate the gradient of the objective function. So if you have a nonlinear objective without gradients, you will have to move it into the constraints.
I believe the jacobian_approximation option you speak of is only used when the derivative checker is enabled. It cannot be used outside of that feature, in the actual optimization portion of IPOPT. Also, it would be very beneficial for verifying the correctness of an NLP implementation if IPOPT had default derivatives, like fmincon does.
From a look at https://github.com/coin-or/Ipopt/blob/stable/3.14/src/Interfaces/IpTNLPAdapter.cpp#L2738, I would say that this option does not only affect the derivative tester. Maybe it is confusing that the option is located in that section of the documentation. Further, if I put a printf into eval_jac_g of the HS071 test, i.e.,

```diff
--- a/examples/hs071_cpp/hs071_nlp.cpp
+++ b/examples/hs071_cpp/hs071_nlp.cpp
@@ -228,7 +228,7 @@ bool HS071_NLP::eval_jac_g(
    else
    {
       // return the values of the Jacobian of the constraints
-
+      printf("eval_jac_g() called for values\n");
       values[0] = x[1] * x[2] * x[3]; // 0,0
       values[1] = x[0] * x[2] * x[3]; // 0,1
       values[2] = x[0] * x[1] * x[3]; // 0,2
```

then I get this log. If I also set this option, eval_jac_g() from HS071 is no longer called to evaluate the Jacobian.
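For reference, a minimal sketch of how that option can be enabled programmatically; it assumes the option being set above is jacobian_approximation and reuses the HS071_NLP class and IpoptApplication driver from the bundled hs071 example:

```cpp
#include "IpIpoptApplication.hpp"
#include "hs071_nlp.hpp"

using namespace Ipopt;

int main()
{
   SmartPtr<TNLP> mynlp = new HS071_NLP();
   SmartPtr<IpoptApplication> app = IpoptApplicationFactory();

   // Build the constraint Jacobian from finite differences instead of
   // calling eval_jac_g() for its values.
   app->Options()->SetStringValue("jacobian_approximation", "finite-difference-values");

   ApplicationReturnStatus status = app->Initialize();
   if( status != Solve_Succeeded )
   {
      return (int) status;
   }

   status = app->OptimizeTNLP(mynlp);
   return (int) status;
}
```

The same effect can be obtained without recompiling by putting the line `jacobian_approximation finite-difference-values` into an `ipopt.opt` file.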
Ok, thanks for the example. Please let me test this out on my own before we close this issue. I should get back to you by tomorrow morning. Thanks,
Yes, this is working for me as shown in the sample example. I wish I had the same option for the gradient of the objective as well. I guess you can't win them all 😄
- new option gradient_approximation - requested in #573, shouldn't really be used
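The ChangeLog line above refers to the gradient_approximation option added in response to this issue. A short, hedged sketch of how both approximations could be switched on together, assuming `app` is a SmartPtr<IpoptApplication> as in the sketch above and that the installed Ipopt version already ships the new option:

```cpp
// Both options accept "exact" (the default) or "finite-difference-values".
// Assumes 'app' is a SmartPtr<IpoptApplication> as in the earlier sketch.
app->Options()->SetStringValue("jacobian_approximation", "finite-difference-values");
app->Options()->SetStringValue("gradient_approximation", "finite-difference-values");
```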
Hello,
I have a feature request for IPOPT. Is it possible to implement default derivatives in IPOPT for the objective function and the constraints, using the same finite differences method used in the derivative checker feature? For example, fmincon in MATLAB does this: in order to use fmincon, one does not need to supply these derivative functions explicitly. Since there is already a finite differences method available for the derivative checker feature, I am assuming it can be repurposed to act as default derivatives, without the user having to explicitly implement the functions for the gradient(s) and the Jacobian.
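To illustrate the kind of default derivative being requested, here is a minimal, self-contained sketch of a forward finite-difference gradient in C++. It is not Ipopt API; the function name, relative step size, and perturbation rule are simplifying assumptions:

```cpp
#include <algorithm>
#include <cmath>
#include <functional>
#include <vector>

// Forward-difference approximation of the gradient of f at x:
//   grad[i] ~= ( f(x + h*e_i) - f(x) ) / h
std::vector<double> approx_gradient(
   const std::function<double(const std::vector<double>&)>& f,
   std::vector<double> x,
   double rel_step = 1e-8)
{
   const double fx = f(x);
   std::vector<double> grad(x.size());
   for( std::size_t i = 0; i < x.size(); ++i )
   {
      // Scale the step with the variable magnitude to avoid
      // under-perturbing large components.
      const double h  = rel_step * std::max(1.0, std::fabs(x[i]));
      const double xi = x[i];
      x[i] = xi + h;
      grad[i] = (f(x) - fx) / h;
      x[i] = xi;
   }
   return grad;
}
```

A Jacobian approximation would apply the same per-variable perturbation to the vector of constraint values, ideally exploiting the sparsity structure the user already declares.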
Please let us know if this feature request can be honored and, if so, whether there is anything I can help with for the implementation.
Thanks,
Dhathri