Posted to user@spark.apache.org by jaredtims <ja...@yahoo.com> on 2015/03/13 22:06:02 UTC

spark flume tryOrIOException NoSuchMethodError

I am trying to process events from a Flume Avro sink, but I keep getting the
same error. I am just running it locally using Flume's avro-client, with the
following commands to start the job and the client. Since it is a
NoSuchMethodError it looks like it should be a configuration problem, but as
far as I can tell everything is in place.
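A NoSuchMethodError on a Spark internal like Utils.tryOrIOException usually
points at a version mismatch: the assembly jar was compiled against one Spark
version while spark-submit runs another. One thing worth double-checking is
that every Spark artifact in the build uses the same version, roughly like
the following build.sbt sketch (the version number here is an illustrative
assumption, not taken from the actual project):

```scala
// build.sbt -- illustrative sketch; the Spark version is an assumption.
// Key point: spark-core, spark-streaming, and spark-streaming-flume must
// all be the SAME version, and it must match the Spark that spark-submit runs.
val sparkVersion = "1.2.1"

libraryDependencies ++= Seq(
  // "provided" because spark-submit supplies Spark at runtime; bundling a
  // second, different copy is a common cause of NoSuchMethodError.
  "org.apache.spark" %% "spark-core"            % sparkVersion % "provided",
  "org.apache.spark" %% "spark-streaming"       % sparkVersion % "provided",
  // The Flume connector is not shipped with Spark, so it must go into the
  // assembly jar -- but at the same version as the Spark artifacts above.
  "org.apache.spark" %% "spark-streaming-flume" % sparkVersion
)
```

Comparing the version in the build file against what `spark-submit --version`
reports is another quick sanity check.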

Job command:
spark-submit --master local[2] --class com.streaming.SparkFlume
./target/scala-2.10/streaming.jar localhost 7777

Client command:
flume-ng avro-client --conf $FLUME_BASE/conf -H localhost -p 7777 -F
/etc/passwd


The error:

15/03/13 16:55:10 INFO ReceiverTracker: Stream 0 received 0 blocks
15/03/13 16:55:10 INFO JobScheduler: Added jobs for time 1426280110000 ms
15/03/13 16:55:10 INFO JobScheduler: Starting job streaming job
1426280110000 ms.0 from job set of time 1426280110000 ms
15/03/13 16:55:10 INFO DefaultExecutionContext: Starting job:
CallSite(getCallSite at
DStream.scala:294,org.apache.spark.SparkContext.getCallSite(SparkContext.scala:1077)
org.apache.spark.streaming.dstream.DStream$$anonfun$getOrCompute$1.apply(DStream.scala:294)
org.apache.spark.streaming.dstream.DStream$$anonfun$getOrCompute$1.apply(DStream.scala:288)
scala.Option.orElse(Option.scala:257)
org.apache.spark.streaming.dstream.DStream.getOrCompute(DStream.scala:285)
org.apache.spark.streaming.dstream.ForEachDStream.generateJob(ForEachDStream.scala:38)
org.apache.spark.streaming.DStreamGraph$$anonfun$1.apply(DStreamGraph.scala:115)
org.apache.spark.streaming.DStreamGraph$$anonfun$1.apply(DStreamGraph.scala:115)
scala.collection.TraversableLike$$anonfun$flatMap$1.apply(TraversableLike.scala:251)
scala.collection.TraversableLike$$anonfun$flatMap$1.apply(TraversableLike.scala:251)
scala.collection.mutable.ResizableArray$class.foreach(ResizableArray.scala:59)
scala.collection.mutable.ArrayBuffer.foreach(ArrayBuffer.scala:47)
scala.collection.TraversableLike$class.flatMap(TraversableLike.scala:251)
scala.collection.AbstractTraversable.flatMap(Traversable.scala:105)
org.apache.spark.streaming.DStreamGraph.generateJobs(DStreamGraph.scala:115)
org.apache.spark.streaming.scheduler.JobGenerator$$anonfun$2.apply(JobGenerator.scala:221)
org.apache.spark.streaming.scheduler.JobGenerator$$anonfun$2.apply(JobGenerator.scala:221)
scala.util.Try$.apply(Try.scala:161)
org.apache.spark.streaming.scheduler.JobGenerator.generateJobs(JobGenerator.scala:221)
org.apache.spark.streaming.scheduler.JobGenerator.org$apache$spark$streaming$scheduler$JobGenerator$$processEvent(JobGenerator.scala:165))
15/03/13 16:55:10 INFO DefaultExecutionContext: Job finished:
CallSite(getCallSite at
DStream.scala:294,org.apache.spark.SparkContext.getCallSite(SparkContext.scala:1077)
org.apache.spark.streaming.dstream.DStream$$anonfun$getOrCompute$1.apply(DStream.scala:294)
org.apache.spark.streaming.dstream.DStream$$anonfun$getOrCompute$1.apply(DStream.scala:288)
scala.Option.orElse(Option.scala:257)
org.apache.spark.streaming.dstream.DStream.getOrCompute(DStream.scala:285)
org.apache.spark.streaming.dstream.ForEachDStream.generateJob(ForEachDStream.scala:38)
org.apache.spark.streaming.DStreamGraph$$anonfun$1.apply(DStreamGraph.scala:115)
org.apache.spark.streaming.DStreamGraph$$anonfun$1.apply(DStreamGraph.scala:115)
scala.collection.TraversableLike$$anonfun$flatMap$1.apply(TraversableLike.scala:251)
scala.collection.TraversableLike$$anonfun$flatMap$1.apply(TraversableLike.scala:251)
scala.collection.mutable.ResizableArray$class.foreach(ResizableArray.scala:59)
scala.collection.mutable.ArrayBuffer.foreach(ArrayBuffer.scala:47)
scala.collection.TraversableLike$class.flatMap(TraversableLike.scala:251)
scala.collection.AbstractTraversable.flatMap(Traversable.scala:105)
org.apache.spark.streaming.DStreamGraph.generateJobs(DStreamGraph.scala:115)
org.apache.spark.streaming.scheduler.JobGenerator$$anonfun$2.apply(JobGenerator.scala:221)
org.apache.spark.streaming.scheduler.JobGenerator$$anonfun$2.apply(JobGenerator.scala:221)
scala.util.Try$.apply(Try.scala:161)
org.apache.spark.streaming.scheduler.JobGenerator.generateJobs(JobGenerator.scala:221)
org.apache.spark.streaming.scheduler.JobGenerator.org$apache$spark$streaming$scheduler$JobGenerator$$processEvent(JobGenerator.scala:165)),
took 3.9692E-5 s
15/03/13 16:55:10 INFO JobScheduler: Finished job streaming job
1426280110000 ms.0 from job set of time 1426280110000 ms
15/03/13 16:55:10 INFO JobScheduler: Total delay: 0.008 s for time
1426280110000 ms (execution: 0.003 s)
15/03/13 16:55:10 INFO FilteredRDD: Removing RDD 12 from persistence list
15/03/13 16:55:10 INFO BlockManager: Removing RDD 12
15/03/13 16:55:10 INFO MappedRDD: Removing RDD 11 from persistence list
15/03/13 16:55:10 INFO BlockManager: Removing RDD 11
15/03/13 16:55:10 INFO BlockRDD: Removing RDD 10 from persistence list
15/03/13 16:55:10 INFO BlockManager: Removing RDD 10
15/03/13 16:55:10 INFO FlumeInputDStream: Removing blocks of RDD
BlockRDD[10] at createStream at SparkFlume.scala:45 of time 1426280110000 ms
15/03/13 16:55:11 INFO NettyServer: [id: 0xdcdce18e, /127.0.0.1:50172 =>
/127.0.0.1:7777] OPEN
15/03/13 16:55:11 INFO NettyServer: [id: 0xdcdce18e, /127.0.0.1:50172 =>
/127.0.0.1:7777] BOUND: /127.0.0.1:7777
15/03/13 16:55:11 INFO NettyServer: [id: 0xdcdce18e, /127.0.0.1:50172 =>
/127.0.0.1:7777] CONNECTED: /127.0.0.1:50172
15/03/13 16:55:12 WARN BlockManager: Putting block input-0-1426280112000
failed
15/03/13 16:55:12 ERROR BlockGenerator: Error in block pushing thread
java.lang.NoSuchMethodError:
org.apache.spark.util.Utils$.tryOrIOException(Lscala/Function0;)V
	at
org.apache.spark.streaming.flume.SparkFlumeEvent.writeExternal(FlumeInputDStream.scala:96)
	at
java.io.ObjectOutputStream.writeExternalData(ObjectOutputStream.java:1459)
	at
java.io.ObjectOutputStream.writeOrdinaryObject(ObjectOutputStream.java:1430)
	at java.io.ObjectOutputStream.writeObject0(ObjectOutputStream.java:1178)
	at java.io.ObjectOutputStream.writeObject(ObjectOutputStream.java:348)
	at
org.apache.spark.serializer.JavaSerializationStream.writeObject(JavaSerializer.scala:42)
	at
org.apache.spark.serializer.SerializationStream.writeAll(Serializer.scala:110)
	at
org.apache.spark.storage.BlockManager.dataSerializeStream(BlockManager.scala:1047)
	at
org.apache.spark.storage.BlockManager.dataSerialize(BlockManager.scala:1056)
	at org.apache.spark.storage.MemoryStore.putArray(MemoryStore.scala:93)
	at org.apache.spark.storage.BlockManager.doPut(BlockManager.scala:745)
	at org.apache.spark.storage.BlockManager.putArray(BlockManager.scala:625)
	at
org.apache.spark.streaming.receiver.ReceiverSupervisorImpl.pushArrayBuffer(ReceiverSupervisorImpl.scala:113)
	at
org.apache.spark.streaming.receiver.ReceiverSupervisorImpl$$anon$2.onPushBlock(ReceiverSupervisorImpl.scala:96)
	at
org.apache.spark.streaming.receiver.BlockGenerator.pushBlock(BlockGenerator.scala:140)
	at
org.apache.spark.streaming.receiver.BlockGenerator.org$apache$spark$streaming$receiver$BlockGenerator$$keepPushingBlocks(BlockGenerator.scala:113)
	at
org.apache.spark.streaming.receiver.BlockGenerator$$anon$1.run(BlockGenerator.scala:57)
15/03/13 16:55:12 WARN ReceiverSupervisorImpl: Reported error Error in block
pushing thread - java.lang.NoSuchMethodError:
org.apache.spark.util.Utils$.tryOrIOException(Lscala/Function0;)V
15/03/13 16:55:12 WARN ReceiverTracker: Error reported by receiver for
stream 0: Error in block pushing thread - java.lang.NoSuchMethodError:
org.apache.spark.util.Utils$.tryOrIOException(Lscala/Function0;)V
	at
org.apache.spark.streaming.flume.SparkFlumeEvent.writeExternal(FlumeInputDStream.scala:96)
	at
java.io.ObjectOutputStream.writeExternalData(ObjectOutputStream.java:1459)
	at
java.io.ObjectOutputStream.writeOrdinaryObject(ObjectOutputStream.java:1430)
	at java.io.ObjectOutputStream.writeObject0(ObjectOutputStream.java:1178)
	at java.io.ObjectOutputStream.writeObject(ObjectOutputStream.java:348)
	at
org.apache.spark.serializer.JavaSerializationStream.writeObject(JavaSerializer.scala:42)
	at
org.apache.spark.serializer.SerializationStream.writeAll(Serializer.scala:110)
	at
org.apache.spark.storage.BlockManager.dataSerializeStream(BlockManager.scala:1047)
	at
org.apache.spark.storage.BlockManager.dataSerialize(BlockManager.scala:1056)
	at org.apache.spark.storage.MemoryStore.putArray(MemoryStore.scala:93)
	at org.apache.spark.storage.BlockManager.doPut(BlockManager.scala:745)
	at org.apache.spark.storage.BlockManager.putArray(BlockManager.scala:625)
	at
org.apache.spark.streaming.receiver.ReceiverSupervisorImpl.pushArrayBuffer(ReceiverSupervisorImpl.scala:113)
	at
org.apache.spark.streaming.receiver.ReceiverSupervisorImpl$$anon$2.onPushBlock(ReceiverSupervisorImpl.scala:96)
	at
org.apache.spark.streaming.receiver.BlockGenerator.pushBlock(BlockGenerator.scala:140)
	at
org.apache.spark.streaming.receiver.BlockGenerator.org$apache$spark$streaming$receiver$BlockGenerator$$keepPushingBlocks(BlockGenerator.scala:113)
	at
org.apache.spark.streaming.receiver.BlockGenerator$$anon$1.run(BlockGenerator.scala:57)

15/03/13 16:55:12 INFO NettyServer: [id: 0xdcdce18e, /127.0.0.1:50172 :>
/127.0.0.1:7777] DISCONNECTED
15/03/13 16:55:12 INFO NettyServer: [id: 0xdcdce18e, /127.0.0.1:50172 :>
/127.0.0.1:7777] UNBOUND
15/03/13 16:55:12 INFO NettyServer: [id: 0xdcdce18e, /127.0.0.1:50172 :>
/127.0.0.1:7777] CLOSED
15/03/13 16:55:12 INFO NettyServer: Connection to /127.0.0.1:50172
disconnected.





---------------------------------------------------------------------
To unsubscribe, e-mail: user-unsubscribe@spark.apache.org
For additional commands, e-mail: user-help@spark.apache.org