Posted to user@spark.apache.org by JoeWass <jo...@afandian.com> on 2014/12/10 16:49:34 UTC

KryoException: Buffer overflow for very small input

I have narrowed down my problem to some code plus an input file with a single
very small input (one line). I'm getting a
"com.esotericsoftware.kryo.KryoException: Buffer overflow. Available: 0,
required: 14634430", but as the input is so small I think there's something
else up. I'm not sure what. Can anyone help?
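For anyone puzzling over how to read that message: Kryo serializes into a fixed-capacity output buffer, and "Available: X, required: Y" means a write needed Y more bytes when only X remained before the cap. The following is a toy Python sketch of that bookkeeping (illustrative names only, not Kryo's real API; the 2 MB cap is an assumption based on Spark 1.1's default `spark.kryoserializer.buffer.mb` of 2):

```python
# Toy model of a fixed-capacity serialization buffer, to show how
# "Buffer overflow. Available: 0, required: 14634430" is produced.
# This is NOT Kryo itself; class and method names are illustrative.

class BufferOverflow(Exception):
    pass

class FixedOutput:
    def __init__(self, max_capacity):
        self.max_capacity = max_capacity  # hard cap, like Kryo's maxCapacity
        self.position = 0                 # bytes written so far

    def write(self, data):
        available = self.max_capacity - self.position
        if len(data) > available:
            raise BufferOverflow(
                f"Buffer overflow. Available: {available}, "
                f"required: {len(data)}")
        self.position += len(data)

# Assume a 2 MB cap (Spark 1.1 default of spark.kryoserializer.buffer.mb=2).
out = FixedOutput(2 * 1024 * 1024)
out.write(b"x" * (2 * 1024 * 1024))  # fills the buffer exactly
try:
    out.write(b"y" * 14634430)       # next write reports "Available: 0"
except BufferOverflow as e:
    print(e)
```

So "Available: 0" suggests the buffer was already full when Kryo tried to write a ~14 MB object, which is why such a large "required" value from a one-line input looks suspicious.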

I'm using the Flambo Clojure wrapper (but I don't think it makes much
difference) and spark-core_2.10 "1.1.1". 

Here's the program that crashes. It's in Clojure, but it's very
straightforward, as I've narrowed it down to a minimum crashing example. If I
remove any one line it works. I understand that for each of the `map`s the
file will be re-read (all one line of it).

I could just set `spark.kryoserializer.buffer.mb` to a large number, but I
don't think the default should break with such a small input (each object
I'm dealing with should weigh in at no more than 100 bytes), so I want to
understand what's wrong before tweaking the value.
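For reference, the workaround I'm describing would be a spark-defaults style setting like the one below (the value of 64 is just an arbitrary example, not a recommendation; in Spark 1.1 this property is in megabytes):

```
spark.kryoserializer.buffer.mb  64
```

But again, I'd rather understand why a one-line input needs more than the default before resorting to this.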

Thanks for your help in advance.





--
View this message in context: http://apache-spark-user-list.1001560.n3.nabble.com/KryoException-Buffer-overflow-for-very-small-input-tp20606.html
Sent from the Apache Spark User List mailing list archive at Nabble.com.

---------------------------------------------------------------------
To unsubscribe, e-mail: user-unsubscribe@spark.apache.org
For additional commands, e-mail: user-help@spark.apache.org