Posted to issues@spark.apache.org by "Hyukjin Kwon (JIRA)" <ji...@apache.org> on 2018/11/16 02:55:00 UTC

[jira] [Commented] (SPARK-26031) dataframe can't load correctly after saving to local disk in cluster mode

    [ https://issues.apache.org/jira/browse/SPARK-26031?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16688939#comment-16688939 ] 

Hyukjin Kwon commented on SPARK-26031:
--------------------------------------

That's because you're using a file://... path in a cluster: each executor writes its task output to its own node's local disk, so no single machine ends up with the complete result. The output path should point to a distributed file system (such as HDFS) that all nodes can access.
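
For example, a minimal sketch of that fix in pyspark (the HDFS namenode address and target path are hypothetical, not taken from this issue):

    # Write to a distributed file system so that the driver and every
    # executor see the same directory; an HDFS path is one option.
    df.write \
        .format('json') \
        .save('hdfs://namenode:8020/user/spark/bughunter/', mode='overwrite')

    # Reading back now works, because the complete output -- not just
    # the _SUCCESS marker -- is visible from every node.
    df2 = spark.read.format('json').load('hdfs://namenode:8020/user/spark/bughunter/')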

> dataframe can't load correctly after saving to local disk in cluster mode
> -------------------------------------------------------------------------
>
>                 Key: SPARK-26031
>                 URL: https://issues.apache.org/jira/browse/SPARK-26031
>             Project: Spark
>          Issue Type: Bug
>          Components: Structured Streaming
>    Affects Versions: 2.3.1
>         Environment: 1 spark master
> 3 spark slaves
>  
>            Reporter: Bihui Jin
>            Priority: Major
>
> First, I saved a Spark DataFrame to local disk in cluster mode with
> df.write \
> .format('json') \
> .save('file:///root/bughunter/', mode='overwrite')
> (using the interface provided by pyspark).
> Then I loaded it with
> spark.read.format('json').load('file:///root/bughunter/')
> but it failed with "org.apache.spark.sql.AnalysisException: Unable to infer schema for JSON. It must be specified manually."
> Then I checked every node's disk:
> On the master, only a file named "_SUCCESS" exists in /root/bughunter/;
> on each slave, a folder named "_temporary" exists in /root/bughunter/.
>  
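
For reference, a self-contained sketch of the repro above (the SparkSession setup and sample data are assumed, since the report does not show them):

    from pyspark.sql import SparkSession

    spark = SparkSession.builder.appName('bughunter').getOrCreate()
    df = spark.range(10).selectExpr('id', 'id * 2 AS doubled')

    # In cluster mode, each executor writes its partitions to its own
    # node's local disk under _temporary, so no single node holds the
    # full output; the driver's directory gets only _SUCCESS.
    df.write.format('json').save('file:///root/bughunter/', mode='overwrite')

    # Fails with AnalysisException: "Unable to infer schema for JSON.
    # It must be specified manually."
    spark.read.format('json').load('file:///root/bughunter/')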


