Posted to dev@opennlp.apache.org by Gustavo Knuppe <gu...@gmail.com> on 2015/05/26 01:54:02 UTC

Deserialization problem?

Hello people,

I was debugging the latest changes and noticed a possible problem.

When I load a parser model (e.g. the public model en-parser-chunking.bin,
v1.5.0), the sub-models are reported as version 1.6.1-SNAPSHOT, and their
properties differ from those of the original sub-models.

Is this intentional?

Here's a simple test:

import java.io.FileInputStream;
import java.io.InputStream;

import opennlp.tools.chunker.ChunkerModel;
import opennlp.tools.parser.ParserModel;
import opennlp.tools.postag.POSModel;
import opennlp.tools.util.Version;

         InputStream is = new FileInputStream("en-parser-chunking.bin");
         ParserModel model = new ParserModel(is);

         ChunkerModel chunker = model.getParserChunkerModel();
         POSModel posmodel = model.getParserTaggerModel();

         Version v1 = model.getVersion(); /* returns: 1.5 */
         Version v2 = chunker.getVersion(); /* returns: 1.6.1-SNAPSHOT */
         Version v3 = posmodel.getVersion(); /* returns: 1.6.1-SNAPSHOT */

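In case it helps to reproduce the property mismatch, here is a rough sketch
(the dumpManifest helper name is just for illustration) that re-serializes a
loaded sub-model and prints the manifest.properties entry of the resulting
zip, so it can be diffed against the manifest inside the original
en-parser-chunking.bin:

import java.io.ByteArrayInputStream;
import java.io.ByteArrayOutputStream;
import java.util.Properties;
import java.util.zip.ZipEntry;
import java.util.zip.ZipInputStream;

import opennlp.tools.chunker.ChunkerModel;

/* Sketch only: re-serialize a loaded sub-model and print the
   manifest.properties entry of the resulting zip, so its properties
   can be compared with those stored in the original model file. */
static void dumpManifest(ChunkerModel chunker) throws Exception {
    ByteArrayOutputStream buffer = new ByteArrayOutputStream();
    chunker.serialize(buffer);

    try (ZipInputStream zip = new ZipInputStream(
            new ByteArrayInputStream(buffer.toByteArray()))) {
        ZipEntry entry;
        while ((entry = zip.getNextEntry()) != null) {
            if ("manifest.properties".equals(entry.getName())) {
                Properties manifest = new Properties();
                manifest.load(zip);
                manifest.list(System.out);
            }
        }
    }
}
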
Thanks.
Gustavo K.