Posted to user@spark.apache.org by hager <lo...@yahoo.com> on 2018/09/17 19:58:12 UTC

Why is this error displayed?

I run this code using:

spark-submit --jars spark-streaming-kafka-0-8-assembly_2.10-2.0.0-preview.jar kafka2.py localhost:9092 test
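
For reference, that assembly jar targets Scala 2.10 and Spark 2.0.0-preview, and a 0-8 connector jar built against one Spark/Scala version can fail with errors such as java.lang.AbstractMethodError when run on a different Spark build. A sketch of the same submit using --packages instead, assuming (not confirmed from this setup) an installed Spark 2.2.0 built for Scala 2.11:

spark-submit --packages org.apache.spark:spark-streaming-kafka-0-8_2.11:2.2.0 kafka2.py localhost:9092 test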


import sys

from pyspark import SparkContext, SparkConf
from pyspark.streaming import StreamingContext
from pyspark.streaming.kafka import KafkaUtils
from uuid import uuid1

if __name__ == "__main__":
    sc = SparkContext(appName="PythonStreamingReceiverKafkaWordCount")
    ssc = StreamingContext(sc, 2) # 2 second window

    broker, topic = sys.argv[1:]
    kvs = KafkaUtils.createStream(ssc,
                                  broker,
                                  "raw-event-streaming-consumer",
                                  {topic: 1})

    lines = kvs.map(lambda x: x[1])
    counts = lines.flatMap(lambda line: line.split(" ")) \
                  .map(lambda word: (word, 1)) \
                  .reduceByKey(lambda a, b: a + b)

    counts.pprint()
    ssc.start()
    ssc.awaitTermination()
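
For comparison, the receiver-based KafkaUtils.createStream in the 0-8 connector takes a ZooKeeper quorum as its second argument rather than a Kafka broker list. A minimal sketch of that signature, assuming a local ZooKeeper on the default port 2181 (the host/port, topic name, and app name below are illustrative assumptions, not taken from this setup):

from pyspark import SparkContext
from pyspark.streaming import StreamingContext
from pyspark.streaming.kafka import KafkaUtils

if __name__ == "__main__":
    sc = SparkContext(appName="CreateStreamSignatureSketch")
    ssc = StreamingContext(sc, 2)  # 2 second batch interval

    # createStream(ssc, zkQuorum, groupId, topics): topics maps each topic
    # name to the number of receiver threads to run for it.
    kvs = KafkaUtils.createStream(ssc,
                                  "localhost:2181",
                                  "raw-event-streaming-consumer",
                                  {"test": 1})
    kvs.pprint()

    ssc.start()
    ssc.awaitTermination()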


Why is the following error displayed?
2018-09-17 11:49:19 ERROR ReceiverTracker:91 - Receiver has been stopped. Try to restart it.
org.apache.spark.SparkException: Job aborted due to stage failure: Task 0 in stage 81.0 failed 1 times, most recent failure: Lost task 0.0 in stage 81.0 (TID 81, localhost, executor driver): java.lang.AbstractMethodError
	at org.apache.spark.internal.Logging$class.initializeLogIfNecessary(Logging.scala:99)



