Posted to user@mahout.apache.org by Jerry Ye <je...@yahoo-inc.com> on 2010/01/20 03:40:59 UTC

org.apache.hadoop.io.serializer.SerializationFactory.getSerializer(SerializationFactory.java:73

Hi,
I'm getting the following error when trying to create vectors from a Solr index.  I've also tried using the arff to mvc utility and I'm getting the exact same error.

Exception in thread "main" java.lang.NullPointerException
    at org.apache.hadoop.io.serializer.SerializationFactory.getSerializer(SerializationFactory.java:73)
    at org.apache.hadoop.io.SequenceFile$Writer.init(SequenceFile.java:910)
    at org.apache.hadoop.io.SequenceFile$RecordCompressWriter.<init>(SequenceFile.java:1074)
    at org.apache.hadoop.io.SequenceFile.createWriter(SequenceFile.java:397)
    at org.apache.hadoop.io.SequenceFile.createWriter(SequenceFile.java:284)
    at org.apache.hadoop.io.SequenceFile.createWriter(SequenceFile.java:265)
    at org.apache.mahout.utils.vectors.lucene.Driver.getSeqFileWriter(Driver.java:226)
    at org.apache.mahout.utils.vectors.lucene.Driver.main(Driver.java:197)

Any ideas?  Thanks.

- jerry

Re: org.apache.hadoop.io.serializer.SerializationFactory.getSerializer(SerializationFactory.java:73

Posted by mgainer <m....@comcast.net>.
Just ran into this myself.  It turns out that your split object gets created
on the job master node and then needs to be serialized out to the worker
nodes.  Thus, your split type needs to implement Writable.  See
.../src/mapred/org/apache/hadoop/mapreduce/lib/input/FileSplit.java for an
example.
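
To illustrate the Writable contract mentioned above: a split serializes its
own fields in write() and restores them in readFields(), mirroring what
Hadoop's FileSplit does.  The sketch below is self-contained (it declares a
local stand-in for Hadoop's Writable interface rather than depending on the
Hadoop jars), and MyFileSplit and its fields are illustrative names, not
anything from Mahout or Hadoop.

```java
import java.io.ByteArrayInputStream;
import java.io.ByteArrayOutputStream;
import java.io.DataInput;
import java.io.DataInputStream;
import java.io.DataOutput;
import java.io.DataOutputStream;
import java.io.IOException;

// Local stand-in for org.apache.hadoop.io.Writable, so the sketch compiles
// without Hadoop on the classpath.
interface Writable {
    void write(DataOutput out) throws IOException;
    void readFields(DataInput in) throws IOException;
}

// A FileSplit-style split: the fields written in write() are read back,
// in the same order, in readFields().
class MyFileSplit implements Writable {
    String path;
    long start;
    long length;

    MyFileSplit() {}  // no-arg constructor: the framework instantiates the
                      // split on the worker, then calls readFields()

    MyFileSplit(String path, long start, long length) {
        this.path = path;
        this.start = start;
        this.length = length;
    }

    @Override
    public void write(DataOutput out) throws IOException {
        out.writeUTF(path);
        out.writeLong(start);
        out.writeLong(length);
    }

    @Override
    public void readFields(DataInput in) throws IOException {
        path = in.readUTF();
        start = in.readLong();
        length = in.readLong();
    }
}

public class WritableSketch {
    public static void main(String[] args) throws IOException {
        // Round-trip the split through a byte buffer, the way it would be
        // shipped from the master to a worker node.
        MyFileSplit split = new MyFileSplit("/data/part-00000", 0L, 67108864L);

        ByteArrayOutputStream buf = new ByteArrayOutputStream();
        split.write(new DataOutputStream(buf));

        MyFileSplit copy = new MyFileSplit();
        copy.readFields(new DataInputStream(
                new ByteArrayInputStream(buf.toByteArray())));

        System.out.println(copy.path + " " + copy.start + " " + copy.length);
    }
}
```

The key points are the no-arg constructor and keeping write()/readFields()
field-for-field symmetric; getting either wrong is a common source of
deserialization failures.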


-- 
View this message in context: http://n3.nabble.com/org-apache-hadoop-io-serializer-SerializationFactory-getSerializer-SerializationFactory-java-73-tp641003p728499.html
Sent from the Mahout User List mailing list archive at Nabble.com.

Re: org.apache.hadoop.io.serializer.SerializationFactory.getSerializer(SerializationFactory.java:73

Posted by Rob Ennals <ro...@gmail.com>.
Good to hear I'm not the only person getting this exception. I sent a
message a couple of days ago saying I was getting this exception
trying to create vectors from Lucene.

Anyone got any ideas what is causing this?


-Rob

On Tue, Jan 19, 2010 at 6:40 PM, Jerry Ye <je...@yahoo-inc.com> wrote:
> Hi,
> I'm getting the following error when trying to create vectors from a Solr index.  I've also tried using the arff to mvc utility and I'm getting the exact same error.
>
> Exception in thread "main" java.lang.NullPointerException
>    at org.apache.hadoop.io.serializer.SerializationFactory.getSerializer(SerializationFactory.java:73)
>    at org.apache.hadoop.io.SequenceFile$Writer.init(SequenceFile.java:910)
>    at org.apache.hadoop.io.SequenceFile$RecordCompressWriter.<init>(SequenceFile.java:1074)
>    at org.apache.hadoop.io.SequenceFile.createWriter(SequenceFile.java:397)
>    at org.apache.hadoop.io.SequenceFile.createWriter(SequenceFile.java:284)
>    at org.apache.hadoop.io.SequenceFile.createWriter(SequenceFile.java:265)
>    at org.apache.mahout.utils.vectors.lucene.Driver.getSeqFileWriter(Driver.java:226)
>    at org.apache.mahout.utils.vectors.lucene.Driver.main(Driver.java:197)
>
> Any ideas?  Thanks.
>
> - jerry
>