Posted to issues@spark.apache.org by "jason kim (JIRA)" <ji...@apache.org> on 2015/10/12 09:07:05 UTC

[jira] [Created] (SPARK-11061) baidu

jason kim created SPARK-11061:
---------------------------------

             Summary: baidu
                 Key: SPARK-11061
                 URL: https://issues.apache.org/jira/browse/SPARK-11061
             Project: Spark
          Issue Type: Question
            Reporter: jason kim


Hello, I have a problem. My streaming application fails to start with the following exception:

java.io.NotSerializableException: DStream checkpointing has been enabled but the DStreams with their functions are not serializable
Serialization stack:

	at org.apache.spark.streaming.StreamingContext.validate(StreamingContext.scala:550)
	at org.apache.spark.streaming.StreamingContext.liftedTree1$1(StreamingContext.scala:587)
	at org.apache.spark.streaming.StreamingContext.start(StreamingContext.scala:586)
	at com.bj58.spark.streaming.KafkaWordCountTest$.main(KafkaWordCountTest.scala:70)
	at com.bj58.spark.streaming.KafkaWordCountTest.main(KafkaWordCountTest.scala)
	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
	at java.lang.reflect.Method.invoke(Method.java:597)
	at org.apache.spark.deploy.SparkSubmit$.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:664)
	at org.apache.spark.deploy.SparkSubmit$.doRunMain$1(SparkSubmit.scala:169)
	at org.apache.spark.deploy.SparkSubmit$.submit(SparkSubmit.scala:192)
	at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:111)
	at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)
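Since the report does not include the source of KafkaWordCountTest, here is a minimal sketch of how this error typically arises: checkpointing is enabled on the StreamingContext, and a DStream transformation closure captures a non-serializable object from the driver, so StreamingContext.validate() fails when start() is called. The NonSerializableHelper class, the socket source, the port, and the checkpoint path below are illustrative assumptions, not the reporter's actual code.

import org.apache.spark.SparkConf
import org.apache.spark.streaming.{Seconds, StreamingContext}

// Hypothetical non-serializable helper (stands in for a client, connection,
// or similar driver-side object); not part of the original report.
class NonSerializableHelper {
  def tag(s: String): String = "tagged:" + s
}

object CheckpointSerializationSketch {
  def main(args: Array[String]): Unit = {
    // local[2] is only for running the sketch locally
    val conf = new SparkConf().setAppName("CheckpointSerializationSketch").setMaster("local[2]")
    val ssc = new StreamingContext(conf, Seconds(10))
    // Enabling checkpointing means the DStream graph (including closures)
    // must be serializable; validate() checks this at start().
    ssc.checkpoint("/tmp/spark-checkpoint")

    val helper = new NonSerializableHelper()  // lives only on the driver

    val lines = ssc.socketTextStream("localhost", 9999)
    // This closure captures `helper`, which is not Serializable, so start()
    // throws: "DStream checkpointing has been enabled but the DStreams with
    // their functions are not serializable".
    val tagged = lines.map(line => helper.tag(line))
    tagged.print()

    ssc.start()            // NotSerializableException is raised here
    ssc.awaitTermination()
  }
}

Typical fixes are to make the captured object implement java.io.Serializable, to create it inside the closure (for example per partition via mapPartitions or inside foreachRDD), or to mark the offending field @transient and recreate it lazily on the executors.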



