Posted to dev@opennlp.apache.org by Jörn Kottmann <ko...@gmail.com> on 2013/03/22 15:32:45 UTC

Liblinear (was: OpenNLP 1.5.3 RC 2 ready for testing)

Sounds interesting, I hope we will find the time to do that in OpenNLP
after the 1.5.3 release too. We already discussed this, and I think we had
consensus on making the machine learning pluggable and then offering a few
addons for existing libraries.

Good to know that liblinear works well. As far as I know it's written in
C/C++; did you use the Java port of it, or did you write a JNI interface?

Jörn

On 03/22/2013 03:08 PM, Jason Baldridge wrote:
> BTW, I've just recently finished integrating Liblinear into Nak (which is
> an adaptation of the maxent portion of OpenNLP). I'm still rounding some
> things out, but so far it is producing more accurate models that are
> trained in less time and without using cutoffs. Here's the code:
> https://github.com/scalanlp/nak
>
> It is still mostly Java, but the liblinear adaptors are in Scala. I've kept
> things such that liblinear retrofits to the interfaces that were in
> opennlp.maxent, though given how well it is working, I'll be stripping
> those out and going with liblinear for everything in upcoming versions.
>
> Happy to answer any questions or help out with any of the above if it might
> be useful!


Re: Liblinear (was: OpenNLP 1.5.3 RC 2 ready for testing)

Posted by Jason Baldridge <ja...@gmail.com>.
I used the Java port. I actually pulled it into Nak as nak.liblinear,
because the model write/read code serialized to text files and I needed
access to the Model member fields to do the serialization the way I wanted.
Otherwise it remains as is. With a little bit of adaptation, you could
provide a Java wrapper in OpenNLP that follows the same pattern as my Scala
code. You'd just need to make it implement AbstractModel, which shouldn't
be too hard. (I have it implement LinearModel, which is just a slight
modification of MaxentModel, and I changed all uses of AbstractModel to
LinearModel in Chalk [the opennlp.tools portion].) -j
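The adapter pattern described above can be sketched in plain Java: a class that exposes a MaxentModel-style eval(), returning a probability distribution over outcomes, while delegating the scoring to a linear model underneath. This is a minimal, self-contained illustration only; the names MaxentStyleModel and LiblinearAdapter are hypothetical stand-ins, and a real wrapper would implement OpenNLP's actual AbstractModel and delegate scoring to liblinear's Java port rather than the toy weight table used here.

```java
import java.util.Arrays;

// Hypothetical stand-in for OpenNLP's MaxentModel interface:
// eval() maps a feature context to a distribution over outcomes.
interface MaxentStyleModel {
    double[] eval(String[] context);
    String getOutcome(int index);
}

// Toy linear scorer standing in for liblinear's Java port: one weight
// vector per outcome, with scores normalized via a softmax. A real
// adapter would call into liblinear for the scoring step instead.
class LiblinearAdapter implements MaxentStyleModel {
    private final String[] outcomes;
    private final String[] featureNames;
    private final double[][] weights;   // [outcome][feature]

    LiblinearAdapter(String[] outcomes, String[] featureNames,
                     double[][] weights) {
        this.outcomes = outcomes;
        this.featureNames = featureNames;
        this.weights = weights;
    }

    @Override
    public double[] eval(String[] context) {
        double[] scores = new double[outcomes.length];
        for (int o = 0; o < outcomes.length; o++) {
            for (String feat : context) {
                int f = Arrays.asList(featureNames).indexOf(feat);
                if (f >= 0) scores[o] += weights[o][f];
            }
        }
        // Softmax (shifted by the max score for numerical stability)
        // turns raw linear scores into a probability distribution.
        double max = Arrays.stream(scores).max().orElse(0.0);
        double sum = 0.0;
        for (int o = 0; o < scores.length; o++) {
            scores[o] = Math.exp(scores[o] - max);
            sum += scores[o];
        }
        for (int o = 0; o < scores.length; o++) {
            scores[o] /= sum;
        }
        return scores;
    }

    @Override
    public String getOutcome(int index) {
        return outcomes[index];
    }

    public static void main(String[] args) {
        LiblinearAdapter m = new LiblinearAdapter(
            new String[] {"pos", "neg"},
            new String[] {"good", "bad"},
            new double[][] {{2.0, -1.0}, {-1.0, 2.0}});
        double[] probs = m.eval(new String[] {"good"});
        int best = probs[0] >= probs[1] ? 0 : 1;
        System.out.println(m.getOutcome(best));  // prints "pos"
    }
}
```

Because callers only see the MaxentModel-style interface, existing OpenNLP tools would not need to know whether the distribution came from GIS-trained maxent or from liblinear, which is the point of making the machine learning pluggable.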

On Fri, Mar 22, 2013 at 9:32 AM, Jörn Kottmann <ko...@gmail.com> wrote:

> Sounds interesting, I hope we will find the time to do that in OpenNLP
> after the 1.5.3 release too. We already discussed this, and I think we had
> consensus on making the machine learning pluggable and then offering a few
> addons for existing libraries.
>
> Good to know that liblinear works well. As far as I know it's written in
> C/C++; did you use the Java port of it, or did you write a JNI interface?
>
> Jörn
>
> On 03/22/2013 03:08 PM, Jason Baldridge wrote:
>
>> BTW, I've just recently finished integrating Liblinear into Nak (which is
>> an adaptation of the maxent portion of OpenNLP). I'm still rounding some
>> things out, but so far it is producing more accurate models that are
>> trained in less time and without using cutoffs. Here's the code:
>> https://github.com/scalanlp/nak
>>
>> It is still mostly Java, but the liblinear adaptors are in Scala. I've
>> kept
>> things such that liblinear retrofits to the interfaces that were in
>> opennlp.maxent, though given how well it is working, I'll be stripping
>> those out and going with liblinear for everything in upcoming versions.
>>
>> Happy to answer any questions or help out with any of the above if it
>> might
>> be useful!
>>
>
>


-- 
Jason Baldridge
Associate Professor, Department of Linguistics
The University of Texas at Austin
http://www.jasonbaldridge.com
http://twitter.com/jasonbaldridge