Posted to dev@atlas.apache.org by "milad (JIRA)" <ji...@apache.org> on 2016/11/02 21:56:58 UTC

[jira] [Commented] (ATLAS-1265) Help with spark 1.5.2 Error

    [ https://issues.apache.org/jira/browse/ATLAS-1265?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=15630663#comment-15630663 ] 

milad commented on ATLAS-1265:
------------------------------

2016-11-02 17:44:40,469 - ott_ml_recommendation_engine - ERROR - Error while running ML process
Traceback (most recent call last):
  File "/home/mf547m/github/AttQuickplay/etl/ott_etl/src/main/resources/ott_ml/ott_ml_base.py", line 109, in run
    self.run_model()
  File "/home/mf547m/github/AttQuickplay/etl/ott_etl/src/main/resources/ott_ml/ott_ml_recommendation_engine.py", line 140, in run_model
    self.recommend(sc, client_events, rank = 12, numIterations = 25, alpha = 0.98, lambda_ = 0.1)
  File "/home/mf547m/github/AttQuickplay/etl/ott_etl/src/main/resources/ott_ml/ott_ml_recommendation_engine.py", line 234, in recommend
    self.logger.info(tester.take(5))
  File "/opt/app/hdp/2.3.4.29-1/spark/python/lib/pyspark.zip/pyspark/rdd.py", line 1299, in take
    res = self.context.runJob(self, takeUpToNumLeft, p)
  File "/opt/app/hdp/2.3.4.29-1/spark/python/lib/pyspark.zip/pyspark/context.py", line 916, in runJob
    port = self._jvm.PythonRDD.runJob(self._jsc.sc(), mappedRDD._jrdd, partitions)
  File "/opt/app/hdp/2.3.4.29-1/spark/python/lib/pyspark.zip/pyspark/rdd.py", line 2388, in _jrdd
    pickled_cmd, bvars, env, includes = _prepare_for_python_RDD(self.ctx, command, self)
  File "/opt/app/hdp/2.3.4.29-1/spark/python/lib/pyspark.zip/pyspark/rdd.py", line 2308, in _prepare_for_python_RDD
    pickled_command = ser.dumps(command)
  File "/opt/app/hdp/2.3.4.29-1/spark/python/lib/pyspark.zip/pyspark/serializers.py", line 428, in dumps
    return cloudpickle.dumps(obj, 2)
  File "/opt/app/hdp/2.3.4.29-1/spark/python/lib/pyspark.zip/pyspark/cloudpickle.py", line 646, in dumps
    cp.dump(obj)
  File "/opt/app/hdp/2.3.4.29-1/spark/python/lib/pyspark.zip/pyspark/cloudpickle.py", line 107, in dump
    return Pickler.dump(self, obj)
  File "/opt/app/python2.7/lib/python2.7/pickle.py", line 224, in dump
    self.save(obj)
  File "/opt/app/python2.7/lib/python2.7/pickle.py", line 286, in save
    f(self, obj) # Call unbound method with explicit self
  File "/opt/app/python2.7/lib/python2.7/pickle.py", line 568, in save_tuple
    save(element)
  File "/opt/app/python2.7/lib/python2.7/pickle.py", line 286, in save
    f(self, obj) # Call unbound method with explicit self
  File "/opt/app/hdp/2.3.4.29-1/spark/python/lib/pyspark.zip/pyspark/cloudpickle.py", line 199, in save_function
    self.save_function_tuple(obj)
  File "/opt/app/hdp/2.3.4.29-1/spark/python/lib/pyspark.zip/pyspark/cloudpickle.py", line 236, in save_function_tuple
    save((code, closure, base_globals))
  File "/opt/app/python2.7/lib/python2.7/pickle.py", line 286, in save
    f(self, obj) # Call unbound method with explicit self
  File "/opt/app/python2.7/lib/python2.7/pickle.py", line 554, in save_tuple
    save(element)
  File "/opt/app/python2.7/lib/python2.7/pickle.py", line 286, in save
    f(self, obj) # Call unbound method with explicit self
  File "/opt/app/python2.7/lib/python2.7/pickle.py", line 606, in save_list
    self._batch_appends(iter(obj))
  File "/opt/app/python2.7/lib/python2.7/pickle.py", line 639, in _batch_appends
    save(x)
  File "/opt/app/python2.7/lib/python2.7/pickle.py", line 286, in save
    f(self, obj) # Call unbound method with explicit self
  File "/opt/app/hdp/2.3.4.29-1/spark/python/lib/pyspark.zip/pyspark/cloudpickle.py", line 199, in save_function
    self.save_function_tuple(obj)
  File "/opt/app/hdp/2.3.4.29-1/spark/python/lib/pyspark.zip/pyspark/cloudpickle.py", line 236, in save_function_tuple
    save((code, closure, base_globals))
  File "/opt/app/python2.7/lib/python2.7/pickle.py", line 286, in save
    f(self, obj) # Call unbound method with explicit self
  File "/opt/app/python2.7/lib/python2.7/pickle.py", line 554, in save_tuple
    save(element)
  File "/opt/app/python2.7/lib/python2.7/pickle.py", line 286, in save
    f(self, obj) # Call unbound method with explicit self
  File "/opt/app/python2.7/lib/python2.7/pickle.py", line 606, in save_list
    self._batch_appends(iter(obj))
  File "/opt/app/python2.7/lib/python2.7/pickle.py", line 642, in _batch_appends
    save(tmp[0])
  File "/opt/app/python2.7/lib/python2.7/pickle.py", line 286, in save
    f(self, obj) # Call unbound method with explicit self
  File "/opt/app/hdp/2.3.4.29-1/spark/python/lib/pyspark.zip/pyspark/cloudpickle.py", line 390, in save_instancemethod
    obj=obj)
  File "/opt/app/hdp/2.3.4.29-1/spark/python/lib/pyspark.zip/pyspark/cloudpickle.py", line 524, in save_reduce
    save(args)
  File "/opt/app/python2.7/lib/python2.7/pickle.py", line 286, in save
    f(self, obj) # Call unbound method with explicit self
  File "/opt/app/python2.7/lib/python2.7/pickle.py", line 554, in save_tuple
    save(element)
  File "/opt/app/python2.7/lib/python2.7/pickle.py", line 331, in save
    self.save_reduce(obj=obj, *rv)
  File "/opt/app/hdp/2.3.4.29-1/spark/python/lib/pyspark.zip/pyspark/cloudpickle.py", line 542, in save_reduce
    save(state)
  File "/opt/app/python2.7/lib/python2.7/pickle.py", line 286, in save
    f(self, obj) # Call unbound method with explicit self
  File "/opt/app/python2.7/lib/python2.7/pickle.py", line 655, in save_dict
    self._batch_setitems(obj.iteritems())
  File "/opt/app/python2.7/lib/python2.7/pickle.py", line 687, in _batch_setitems
    save(v)
  File "/opt/app/python2.7/lib/python2.7/pickle.py", line 286, in save
    f(self, obj) # Call unbound method with explicit self
  File "/opt/app/python2.7/lib/python2.7/pickle.py", line 655, in save_dict
    self._batch_setitems(obj.iteritems())
  File "/opt/app/python2.7/lib/python2.7/pickle.py", line 687, in _batch_setitems
    save(v)
  File "/opt/app/python2.7/lib/python2.7/pickle.py", line 306, in save
    rv = reduce(self.proto)
  File "/opt/app/hdp/2.3.4.29-1/spark/python/lib/pyspark.zip/pyspark/context.py", line 257, in __getnewargs__
    "It appears that you are attempting to reference SparkContext from a broadcast "
Exception: It appears that you are attempting to reference SparkContext from a broadcast variable, action, or transformation. SparkContext can only be used on the driver, not in code that it run on workers. For more information, see SPARK-5063.
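
This is the generic SPARK-5063 failure: during events.take()/collect(), cloudpickle tries to serialize the closure passed to a transformation, and the frames above (save_instancemethod, then the instance's state dict) show it pulling in a whole object whose attributes include the SparkContext. Below is a minimal sketch of how that typically happens and one common workaround; the RecommendationEngine class, attribute names, and lambdas are illustrative assumptions, not taken from ott_ml_recommendation_engine.py.

from pyspark import SparkContext


class RecommendationEngine(object):
    def __init__(self, sc, multiplier):
        self.sc = sc                  # SparkContext kept on the instance
        self.multiplier = multiplier  # plain value used inside closures

    def broken_recommend(self, events):
        # BROKEN (illustrative): the lambda closes over `self`, so cloudpickle
        # must serialize the whole instance, including self.sc, which raises
        # the SPARK-5063 exception seen in the traceback above.
        return events.map(lambda e: (e[0], e[1] * self.multiplier))

    def working_recommend(self, events):
        # WORKS: copy the needed attribute into a local variable first, so the
        # closure captures only a small, picklable value rather than `self`.
        multiplier = self.multiplier
        return events.map(lambda e: (e[0], e[1] * multiplier))


if __name__ == "__main__":
    sc = SparkContext(appName="spark-5063-sketch")
    events = sc.parallelize([("a", 1), ("b", 2)])
    engine = RecommendationEngine(sc, multiplier=2)
    print(engine.working_recommend(events).take(5))    # fine on the driver
    # print(engine.broken_recommend(events).take(5))   # raises the exception
    sc.stop()

The same applies to calling bound methods or self.logger inside map/filter lambdas: anything the closure references must be picklable without dragging the SparkContext along.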

> Help with spark 1.5.2 Error
> ---------------------------
>
>                 Key: ATLAS-1265
>                 URL: https://issues.apache.org/jira/browse/ATLAS-1265
>             Project: Atlas
>          Issue Type: Bug
>            Reporter: milad
>

--
This message was sent by Atlassian JIRA
(v6.3.4#6332)