Posted to dev@commons.apache.org by François Laferrière <fr...@yahoo.fr.INVALID> on 2022/10/20 14:35:59 UTC

[math] contribution proposal for multivariate functions optimization (2)

Hello,
Sorry, previous message was a mess....
Based on Apache Commons Math, I have implemented some commonplace optimization algorithms that could be integrated into ACM. These include:

   - Gradient Descent (https://en.wikipedia.org/wiki/Gradient_descent)

   - Newton-Raphson (https://en.wikipedia.org/wiki/Newton's_method_in_optimization)

   - BFGS (https://en.wikipedia.org/wiki/Broyden–Fletcher–Goldfarb–Shanno_algorithm)

They are implemented in such a way that other algorithms of the same family (Newton-type methods) can be implemented easily from existing building blocks.
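To make the family concrete, here is a minimal gradient descent sketch. This is a hypothetical standalone illustration, not the proposed contribution: the class and method names are invented, and no Commons Math API is used.

```java
import java.util.function.Function;

// Hypothetical sketch of gradient descent; not the proposed ACM code.
public class GradientDescentSketch {

    /**
     * Minimizes a function by stepping along the negative gradient.
     * grad maps a point to the gradient of the objective at that point.
     */
    static double[] minimize(Function<double[], double[]> grad,
                             double[] start, double learningRate,
                             int maxIterations, double tolerance) {
        double[] x = start.clone();
        for (int iter = 0; iter < maxIterations; iter++) {
            double[] g = grad.apply(x);
            double normSq = 0;
            for (double gi : g) {
                normSq += gi * gi;
            }
            if (Math.sqrt(normSq) < tolerance) {
                break; // gradient small enough: converged
            }
            for (int i = 0; i < x.length; i++) {
                x[i] -= learningRate * g[i];
            }
        }
        return x;
    }

    public static void main(String[] args) {
        // Minimize f(x, y) = (x - 1)^2 + (y + 2)^2, whose gradient
        // is (2(x - 1), 2(y + 2)); the minimum is at (1, -2).
        double[] result = minimize(
                p -> new double[] {2 * (p[0] - 1), 2 * (p[1] + 2)},
                new double[] {0, 0}, 0.1, 1000, 1e-10);
        System.out.printf("%.6f %.6f%n", result[0], result[1]); // prints "1.000000 -2.000000"
    }
}
```

A Newton-type method would replace the fixed learning-rate step with a step scaled by the inverse Hessian, which is where shared building blocks of the same family come in.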
I cloned http://gitbox.apache.org/repos/asf/commons-math.git but I am a bit lost in the module structure. Should I put my code in an existing commons-math4-* module (if so, which one?) or should I create a new module (for instance commons-math-optimization)?
Many thanks in advance
François Laferrière
 
  

Re: [math] contribution proposal for multivariate functions optimization (2)

Posted by Gilles Sadowski <gi...@gmail.com>.
Hello.

I haven't looked at the code, but thanks for your interest in contributing
to Commons Math.
The more straightforward path is indeed to adapt your code to what is
currently in the "...legacy" module.  However, you could also consider
the other avenue, i.e. a design specific to this family of optimizers,
defined in a separate module.  [In the current "legacy" code we tried
to share as much code as possible, but this led to various design
issues.]  Which is best is certainly not clear-cut.
My hope is that starting from a clean slate may help us make progress
on several aspects of the library that we have wanted to change anyway [1]
(as per the JIRA records).

Regards,
Gilles

[1] For example, usage of external matrix algebra implementations.

On Fri, 21 Oct 2022 at 15:30, François Laferrière
<fr...@yahoo.fr.invalid> wrote:
>
>   Hello Alex,
> I checked the architecture of the MultivariateOptimizer family to compare it to what I have done so far. I think that my code can be refactored to fit into the general framework as an extension of GradientMultivariateOptimizer, although the main differences are:
>
>    - There is no need to provide an explicit gradient function, as it can be computed with finite differences (a secant approximation to the partial derivatives).
>
>    - The Hessian matrix is needed; in the same way, it is computed with finite differences.
>
>    - My approach uses an extension of MultivariateFunction that has methods providing the gradient and Hessian. But that is perhaps not such a good idea: it would be better to provide generic gradient and Hessian classes that are operators applied to a plain MultivariateFunction.
>
> I have also already written tests covering at least all nominal cases and a few error cases. Now that I know where to put my code, I will integrate it into my ACM clone (with all tests) and try to refactor it until it fits seamlessly with the ACM classes and concepts.
>
> This may take a while.
>
> Yours truly
> François Laferrière
>
>     On Thursday, 20 October 2022 at 20:18:46 UTC+2, Alex Herbert <al...@gmail.com> wrote:
>
>  Hi,
>
> Thanks for the interest in Commons Math.
>
> Currently all the optimisation code is in commons-math-legacy. I think
> the gradient based methods would fit in:
>
> org.apache.commons.math4.legacy.optim.nonlinear.scalar.gradient
>
> Can your implementations be adapted to work with the existing
> interfaces? The decision to move the entire 'optim' package to a new
> module allows a redesign of interfaces. The old and new can coexist
> but ideally we would want to support only one optimisation
> architecture. Have a look at the current classes and let us know what
> you think.
>
> Regards,
>
> Alex
>
>
>
> On Thu, 20 Oct 2022 at 15:36, François Laferrière
> <fr...@yahoo.fr.invalid> wrote:
> >
> > Hello,
> > Sorry, previous message was a mess....
> > Based on Apache Commons Math, I have implemented some commonplace optimization algorithms that could be integrated into ACM. These include:
> >
> >    - Gradient Descent (en.wikipedia.org/wiki/Gradient_descent)
> >
> >    - Newton Raphson (https://en.wikipedia.org/wiki/Newton's_method_in_optimization)
> >
> >    - BFGS (https://en.wikipedia.org/wiki/Broyden–Fletcher–Goldfarb–Shanno_algorithm)
> >
> > They are implemented in such a way that other algorithms of the same family (Newton) can be implemented easily from existing building blocks.
> > I cloned http://gitbox.apache.org/repos/asf/commons-math.git but I am a bit lost in the module structure. Should I put my code in an existing commons-math4-* module (if so, which one?) or should I create a new module (for instance commons-math-optimization)?
> > Many thanks in advance
> > François Laferrière
> >
> >
>
> ---------------------------------------------------------------------
> To unsubscribe, e-mail: dev-unsubscribe@commons.apache.org
> For additional commands, e-mail: dev-help@commons.apache.org
>
>



Re: [math] contribution proposal for multivariate functions optimization (2)

Posted by François Laferrière <fr...@yahoo.fr.INVALID>.
Hello Alex,
I checked the architecture of the MultivariateOptimizer family to compare it to what I have done so far. I think that my code can be refactored to fit into the general framework as an extension of GradientMultivariateOptimizer, although the main differences are:

   - There is no need to provide an explicit gradient function, as it can be computed with finite differences (a secant approximation to the partial derivatives).

   - The Hessian matrix is needed; in the same way, it is computed with finite differences.

   - My approach uses an extension of MultivariateFunction that has methods providing the gradient and Hessian. But that is perhaps not such a good idea: it would be better to provide generic gradient and Hessian classes that are operators applied to a plain MultivariateFunction.

I have also already written tests covering at least all nominal cases and a few error cases. Now that I know where to put my code, I will integrate it into my ACM clone (with all tests) and try to refactor it until it fits seamlessly with the ACM classes and concepts.
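For reference, the finite-difference approximations mentioned above can be sketched as follows. This is a hypothetical illustration only: FiniteDifferenceSketch and its method names are invented, and java.util.function.ToDoubleFunction stands in for MultivariateFunction so the sketch stays self-contained.

```java
import java.util.Arrays;
import java.util.function.ToDoubleFunction;

// Hypothetical sketch of finite-difference derivatives; not ACM code.
public class FiniteDifferenceSketch {

    /** Central-difference (secant) approximation of the gradient of f at x. */
    static double[] gradient(ToDoubleFunction<double[]> f, double[] x, double h) {
        double[] g = new double[x.length];
        for (int i = 0; i < x.length; i++) {
            double[] plus = x.clone();
            double[] minus = x.clone();
            plus[i] += h;
            minus[i] -= h;
            g[i] = (f.applyAsDouble(plus) - f.applyAsDouble(minus)) / (2 * h);
        }
        return g;
    }

    /** Forward second-difference approximation of the Hessian of f at x. */
    static double[][] hessian(ToDoubleFunction<double[]> f, double[] x, double h) {
        int n = x.length;
        double f0 = f.applyAsDouble(x);
        double[] fi = new double[n]; // fi[i] = f(x + h e_i)
        for (int i = 0; i < n; i++) {
            double[] xi = x.clone();
            xi[i] += h;
            fi[i] = f.applyAsDouble(xi);
        }
        double[][] hess = new double[n][n];
        for (int i = 0; i < n; i++) {
            for (int j = 0; j < n; j++) {
                double[] xij = x.clone();
                xij[i] += h;
                xij[j] += h; // when i == j this becomes x + 2h e_i
                double fij = f.applyAsDouble(xij);
                hess[i][j] = (fij - fi[i] - fi[j] + f0) / (h * h);
            }
        }
        return hess;
    }

    public static void main(String[] args) {
        // f(x, y) = x^2 + 3xy + 2y^2: gradient at (1, 2) is (8, 11),
        // and the Hessian is the constant matrix [[2, 3], [3, 4]].
        ToDoubleFunction<double[]> f =
                p -> p[0] * p[0] + 3 * p[0] * p[1] + 2 * p[1] * p[1];
        System.out.println(Arrays.toString(gradient(f, new double[] {1, 2}, 1e-6)));
        System.out.println(Arrays.deepToString(hessian(f, new double[] {1, 2}, 1e-4)));
    }
}
```

Note that gradient and hessian here are plain operators applied to an ordinary function, in the spirit of the third bullet above, rather than methods on an extended MultivariateFunction.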

This may take a while.

Yours truly 
François Laferrière

    On Thursday, 20 October 2022 at 20:18:46 UTC+2, Alex Herbert <al...@gmail.com> wrote:
 
 Hi,

Thanks for the interest in Commons Math.

Currently all the optimisation code is in commons-math-legacy. I think
the gradient based methods would fit in:

org.apache.commons.math4.legacy.optim.nonlinear.scalar.gradient

Can your implementations be adapted to work with the existing
interfaces? The decision to move the entire 'optim' package to a new
module allows a redesign of interfaces. The old and new can coexist
but ideally we would want to support only one optimisation
architecture. Have a look at the current classes and let us know what
you think.

Regards,

Alex



On Thu, 20 Oct 2022 at 15:36, François Laferrière
<fr...@yahoo.fr.invalid> wrote:
>
> Hello,
> Sorry, previous message was a mess....
> Based on Apache Commons Math, I have implemented some commonplace optimization algorithms that could be integrated into ACM. These include:
>
>    - Gradient Descent (en.wikipedia.org/wiki/Gradient_descent)
>
>    - Newton Raphson (https://en.wikipedia.org/wiki/Newton's_method_in_optimization)
>
>    - BFGS (https://en.wikipedia.org/wiki/Broyden–Fletcher–Goldfarb–Shanno_algorithm)
>
> They are implemented in such a way that other algorithms of the same family (Newton) can be implemented easily from existing building blocks.
> I cloned http://gitbox.apache.org/repos/asf/commons-math.git but I am a bit lost in the module structure. Should I put my code in an existing commons-math4-* module (if so, which one?) or should I create a new module (for instance commons-math-optimization)?
> Many thanks in advance
> François Laferrière
>
>


  

Re: [math] contribution proposal for multivariate functions optimization (2)

Posted by Alex Herbert <al...@gmail.com>.
Hi,

Thanks for the interest in Commons Math.

Currently all the optimisation code is in commons-math-legacy. I think
the gradient based methods would fit in:

org.apache.commons.math4.legacy.optim.nonlinear.scalar.gradient

Can your implementations be adapted to work with the existing
interfaces? The decision to move the entire 'optim' package to a new
module allows a redesign of interfaces. The old and new can coexist
but ideally we would want to support only one optimisation
architecture. Have a look at the current classes and let us know what
you think.

Regards,

Alex



On Thu, 20 Oct 2022 at 15:36, François Laferrière
<fr...@yahoo.fr.invalid> wrote:
>
> Hello,
> Sorry, previous message was a mess....
> Based on Apache Commons Math, I have implemented some commonplace optimization algorithms that could be integrated into ACM. These include:
>
>    - Gradient Descent (en.wikipedia.org/wiki/Gradient_descent)
>
>    - Newton Raphson (https://en.wikipedia.org/wiki/Newton's_method_in_optimization)
>
>    - BFGS (https://en.wikipedia.org/wiki/Broyden–Fletcher–Goldfarb–Shanno_algorithm)
>
> They are implemented in such a way that other algorithms of the same family (Newton) can be implemented easily from existing building blocks.
> I cloned http://gitbox.apache.org/repos/asf/commons-math.git but I am a bit lost in the module structure. Should I put my code in an existing commons-math4-* module (if so, which one?) or should I create a new module (for instance commons-math-optimization)?
> Many thanks in advance
> François Laferrière
>
>
