Posted to issues@spark.apache.org by "Oleh Koval (JIRA)" <ji...@apache.org> on 2016/11/02 14:08:58 UTC

[jira] [Commented] (SPARK-17775) pyspark: take(num) failed, but collect() worked for big dataset

    [ https://issues.apache.org/jira/browse/SPARK-17775?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=15629076#comment-15629076 ] 

Oleh Koval commented on SPARK-17775:
------------------------------------

This seems to be the same issue as [SPARK-12261].
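
A possible workaround, sketched below under stated assumptions: take(num) evaluates only the first partition(s) rather than the whole RDD, and with the default parallelism a 39,501-element local list lands in just a few large partitions. The reports on SPARK-12261 point at a large write over the local JVM-to-Python socket failing on Windows, so splitting the data across more, smaller partitions has been reported to help. This is a sketch, not a confirmed fix; "users" is the fetched list from the report below, and 100 slices is an arbitrary choice.

    # Hypothetical workaround: spread the rows across many small partitions so
    # no single partition is streamed over the local socket in one large write.
    users_rdd = sc.parallelize(users, numSlices=100)  # numSlices is pyspark's partition-count argument
    users_rdd.take(1)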

> pyspark: take(num) failed, but collect() worked for big dataset
> ---------------------------------------------------------------
>
>                 Key: SPARK-17775
>                 URL: https://issues.apache.org/jira/browse/SPARK-17775
>             Project: Spark
>          Issue Type: Bug
>         Environment: Spark 1.6.1
> Python 2.7.12 :: Anaconda 4.1.1 (64-bit)
> Windows 7
> One machine
>            Reporter: Rick Lin
>
> Hi all,
> I ran a dataset of 39,501 rows drawn from a PostgreSQL table in pyspark.
> The code was:
> # cur1 is a cursor on an open PostgreSQL connection (the driver is not named in the report)
> cur1.execute("select id from users")
> users = cur1.fetchall()            # 39,501 single-column rows
> users_rdd = sc.parallelize(users)  # sc is the SparkContext from the pyspark shell
> users_rdd.take(1)                  # fails with the error below
> The error message was:
> Py4JJavaError: An error occurred while calling z:org.apache.spark.api.python.PythonRDD.runJob.
> : org.apache.spark.SparkException: Job aborted due to stage failure: Task 0 in stage 0.0 failed 1 times, most recent failure: Lost task 0.0 in stage 0.0 (TID 0, localhost): java.net.SocketException: Connection reset by peer: socket write error
> However, when I changed take(1) to collect(), it worked:
> [[25],
>  [1439],
> ...
> ]
> When I ran the same code on a small dataset, both take(1) and collect() worked.
> I don't know why this happens for a big dataset or how to fix it.
> Could you help me with this problem?
> Thanks


