Posted to dev@madlib.apache.org by Ed Espino <es...@apache.org> on 2017/09/20 15:13:49 UTC

Status of : SVM: Implement c++ functions for training multi-class svm in mini-batch #75

I'm more curious than anything: what is the current status of the following
PR?

PR: SVM: Implement c++ functions for training multi-class svm in mini-batch
#75
#75 opened on Nov 14, 2016 by mktal
https://github.com/apache/madlib/pull/75

Thanks,
-=e

-- 
*Ed Espino*

Re: Status of : SVM: Implement c++ functions for training multi-class svm in mini-batch #75

Posted by Frank McQuillan <fm...@pivotal.io>.
Good question, Ed.

This was some good work by mktal to build a multi-class svm module.

The issue with the PR is that the mini-batching is embedded in the SVM
code, whereas we would prefer to add mini-batching as a general capability
in the stochastic gradient descent framework, so that it can be used by
other modules as well.
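To illustrate the design point (this is only a hedged sketch, not MADlib's actual IGD API): the idea is that the mini-batch loop lives in a generic optimizer, and each model, SVM included, plugs in only its gradient function. The `minibatch_sgd` and `hinge_grad` names below are hypothetical, chosen for illustration.

```python
import random

def minibatch_sgd(data, grad_fn, dim, lr=0.1, batch_size=10, epochs=20, seed=0):
    """Generic mini-batch SGD: the model enters only through grad_fn,
    so the same loop could serve SVM, regression, or other modules."""
    rng = random.Random(seed)
    w = [0.0] * dim
    for _ in range(epochs):
        rng.shuffle(data)
        for start in range(0, len(data), batch_size):
            batch = data[start:start + batch_size]
            # Accumulate the gradient over the mini-batch.
            g = [0.0] * dim
            for x, y in batch:
                for j, gj in enumerate(grad_fn(w, x, y)):
                    g[j] += gj
            # Take one averaged gradient step per batch.
            w = [wj - lr * gj / len(batch) for wj, gj in zip(w, g)]
    return w

def hinge_grad(w, x, y):
    """Subgradient of the hinge loss for a linear SVM, plugged in
    as one possible model for the generic optimizer above."""
    margin = y * sum(wj * xj for wj, xj in zip(w, x))
    if margin < 1:
        return [-y * xj for xj in x]
    return [0.0] * len(w)

# Tiny linearly separable example: label follows the sign of x[0].
data = [([1.0, 1.0], 1), ([2.0, 0.5], 1),
        ([-1.0, -1.0], -1), ([-2.0, 0.5], -1)]
w = minibatch_sgd(data, hinge_grad, dim=2)
```

Because the SVM-specific code is confined to `hinge_grad`, swapping in another loss requires no change to the batching logic, which is the generality the IGD-framework JIRA below is after.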

There is work in progress on this for the next release:

Add mini-batching to IGD framework
https://issues.apache.org/jira/browse/MADLIB-1048

When that JIRA is done, we can then finish:

Multi-class SVM with mini-batching
https://issues.apache.org/jira/browse/MADLIB-1037

which is related to the PR you asked about.

I will add this comment into the PR as well.

Frank


On Wed, Sep 20, 2017 at 8:13 AM, Ed Espino <es...@apache.org> wrote:

> I'm more curious than anything: what is the current status of the following
> PR?
>
> PR: SVM: Implement c++ functions for training multi-class svm in mini-batch
> #75
> #75 opened on Nov 14, 2016 by mktal
> https://github.com/apache/madlib/pull/75
>
> Thanks,
> -=e
>
> --
> *Ed Espino*
>