Posted to issues@spark.apache.org by "Sean Owen (JIRA)" <ji...@apache.org> on 2016/06/02 16:09:59 UTC

[jira] [Commented] (SPARK-15729) Clarify that saveAs*File doesn't make sense with local FS in cluster context

    [ https://issues.apache.org/jira/browse/SPARK-15729?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=15312548#comment-15312548 ] 

Sean Owen commented on SPARK-15729:
-----------------------------------

If it's the same data at the same path, then yes, you could read it normally from file:///. What you do on the driver is local to the driver, so writing to its local FS makes sense.
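
To make the driver-local case concrete, here is a minimal sketch, runnable in spark-shell (where sc is predefined; the output path is a hypothetical example). It collects the RDD to the driver and writes it with ordinary Java I/O, so a single file lands on the driver's local filesystem. This only works when the data fits in driver memory.

    import java.io.PrintWriter

    // Bring all elements to the driver, then write them locally.
    // The resulting file exists only on the driver machine.
    val pw = new PrintWriter("/mnt/volume/driver-output.txt")
    try {
      sc.parallelize(1 to 100).collect().foreach(n => pw.println(n))
    } finally {
      pw.close()
    }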

> Clarify that saveAs*File doesn't make sense with local FS in cluster context
> ----------------------------------------------------------------------------
>
>                 Key: SPARK-15729
>                 URL: https://issues.apache.org/jira/browse/SPARK-15729
>             Project: Spark
>          Issue Type: Improvement
>          Components: Documentation
>    Affects Versions: 1.6.1
>            Reporter: Marco Capuccini
>            Priority: Minor
>
> I set up a standalone Spark cluster. I don't need HDFS, so I just want to save the files on the regular file system in a distributed manner. For testing purposes, I opened a Spark shell and ran the following code.
> sc.parallelize(1 to 100).saveAsTextFile("file:///mnt/volume/test.txt")
> I got no error from this, but if I inspect the /mnt/volume/test.txt folder on each node, this is what I see:
> On the master (where I launched the spark shell):
> /mnt/volume/test.txt/_SUCCESS
> On the workers:
> /mnt/volume/test.txt/_temporary
> It seems like some failure occurred, but I didn't get any error. Is this a bug, or am I missing something?
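
The behavior reported above is consistent with a node-local path: with a file:/// URI, each executor writes its partitions to its own local disk, so the output only assembles into one complete directory if the path is a filesystem shared by all nodes. As a hedged sketch, assuming either a shared mount visible at the same path everywhere (e.g. NFS) or an HDFS deployment, with both paths being hypothetical examples:

    // Shared mount (e.g. NFS) visible at the same path on every node:
    sc.parallelize(1 to 100).saveAsTextFile("file:///mnt/shared/test-output")

    // Or, if HDFS is available:
    sc.parallelize(1 to 100).saveAsTextFile("hdfs:///user/spark/test-output")

Either way, saveAsTextFile still produces a directory of part-* files plus _SUCCESS, but on a shared filesystem all of it is visible from every node.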



--
This message was sent by Atlassian JIRA
(v6.3.4#6332)

---------------------------------------------------------------------
To unsubscribe, e-mail: issues-unsubscribe@spark.apache.org
For additional commands, e-mail: issues-help@spark.apache.org