Posted to issues@spark.apache.org by "Diana Carroll (JIRA)" <ji...@apache.org> on 2015/07/02 17:06:04 UTC

[jira] [Created] (SPARK-8793) error/warning with pyspark WholeTextFiles.first

Diana Carroll created SPARK-8793:
------------------------------------

             Summary: error/warning with pyspark WholeTextFiles.first
                 Key: SPARK-8793
                 URL: https://issues.apache.org/jira/browse/SPARK-8793
             Project: Spark
          Issue Type: Bug
          Components: PySpark
    Affects Versions: 1.3.0
            Reporter: Diana Carroll
            Priority: Minor
         Attachments: wholefilesbug.txt

In Spark 1.3.0, calling first() on sc.wholeTextFiles() does not work correctly in PySpark. The same call works fine in Scala.

I created a directory with two tiny, simple text files.
This works:
{code}sc.wholeTextFiles("testdata").collect(){code}
This doesn't:
{code}sc.wholeTextFiles("testdata").first(){code}
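
For reference, here is a minimal, self-contained reproduction sketch (the script name and the contents of the testdata directory are just assumptions; any directory with two small text files should do):
{code}
# repro_wholetextfiles_first.py -- hypothetical standalone reproduction script.
# Assumes a local directory "testdata" containing two small plain text files.
from pyspark import SparkContext

sc = SparkContext(appName="WholeTextFilesFirstRepro")
rdd = sc.wholeTextFiles("testdata")

# Works: returns a list of (filename, contents) pairs.
print(rdd.collect())

# Fails in PySpark 1.3.0 with "ImportError: No module named iter".
print(rdd.first())

sc.stop()
{code}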

The main error message is:
{code}15/07/02 08:01:38 ERROR executor.Executor: Exception in task 0.0 in stage 12.0 (TID 12)
org.apache.spark.api.python.PythonException: Traceback (most recent call last):
  File "/usr/lib/spark/python/pyspark/worker.py", line 101, in main
    process()
  File "/usr/lib/spark/python/pyspark/worker.py", line 96, in process
    serializer.dump_stream(func(split_index, iterator), outfile)
  File "/usr/lib/spark/python/pyspark/serializers.py", line 236, in dump_stream
    vs = list(itertools.islice(iterator, batch))
  File "/usr/lib/spark/python/pyspark/rdd.py", line 1220, in takeUpToNumLeft
    while taken < left:
ImportError: No module named iter
{code}
I will attach the full stack trace to the JIRA.

I'm using CentOS 6.6 with CDH 5.4.3 (Spark 1.3.0). Tested with both Python 2.6 and Python 2.7, with the same results.
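
Since collect() succeeds on the same RDD, one possible interim workaround (just a sketch, not a fix, and only reasonable because the test files are tiny) is to avoid first()/take() and index into the collected result instead:
{code}
# Workaround sketch: collect the (filename, contents) pairs and take the first one locally.
pairs = sc.wholeTextFiles("testdata").collect()
if pairs:
    filename, contents = pairs[0]
    print(filename)
{code}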


