Posted to issues@spark.apache.org by "Liang-Chi Hsieh (JIRA)" <ji...@apache.org> on 2015/04/25 17:53:39 UTC

[jira] [Commented] (SPARK-7141) saveAsTextFile() on S3 first creates empty prefix

    [ https://issues.apache.org/jira/browse/SPARK-7141?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=14512573#comment-14512573 ] 

Liang-Chi Hsieh commented on SPARK-7141:
----------------------------------------

The double slash issue is caused by the Jets3tFileSystemStore implementation in Hadoop.
You can refer to [HADOOP-11444|https://issues.apache.org/jira/browse/HADOOP-11444] and [the discussion on spark-user|https://mail-archives.apache.org/mod_mbox/spark-user/201412.mbox/%3CCAE50=drwWG=eMDM=LsuF-PUzopxFNJ-+7K3Vx_M5mmJfaL2KtA@mail.gmail.com%3E].
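The class of bug described in HADOOP-11444 can be sketched in a few lines: naively prefixing an already-absolute path with the bucket root produces the doubled slash. The function names below are hypothetical illustrations, not the actual Jets3tFileSystemStore code.

```python
def derive_key_buggy(root: str, path: str) -> str:
    # Naive concatenation: "/" + "/prefix" yields "//prefix",
    # which S3 treats as an empty prefix followed by "prefix".
    return root + path

def derive_key_fixed(root: str, path: str) -> str:
    # Normalize both sides before joining so at most one slash separates them.
    return root.rstrip("/") + "/" + path.lstrip("/")

print(derive_key_buggy("/", "/prefix"))   # //prefix
print(derive_key_fixed("/", "/prefix"))   # /prefix
```

This is why the data remains readable via the original path (S3 keys are opaque strings, so reads using the same doubled key succeed) while the AWS console, which renders slashes as folder separators, shows an empty top-level "folder".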

> saveAsTextFile() on S3 first creates empty prefix
> -------------------------------------------------
>
>                 Key: SPARK-7141
>                 URL: https://issues.apache.org/jira/browse/SPARK-7141
>             Project: Spark
>          Issue Type: Bug
>          Components: PySpark
>    Affects Versions: 1.3.1
>         Environment: OS X 10.10
>            Reporter: Eric O. LEBIGOT (EOL)
>
> Using {{saveAsTextFile("s3://bucket/prefix")}} actually adds an empty prefix, i.e. it writes to {{s3://bucket//prefix}} (note the double slash).
> Example code (in a {{pyspark}} shell):
> {{rdd = sc.parallelize("abcd")}}
> {{rdd.saveAsTextFile("s3://bucket/prefix")}}
> This is quite annoying, as the files cannot be saved in the intended location (they can still be read with the original path, {{sc.textFile("s3://bucket/prefix")}}, but the AWS console does not show them in the right place).
> Also, many {{block_*}} files are created directly in the bucket: shouldn't they be deleted? (This may be a separate issue, but maybe it is a path issue as well.)



--
This message was sent by Atlassian JIRA
(v6.3.4#6332)

---------------------------------------------------------------------
To unsubscribe, e-mail: issues-unsubscribe@spark.apache.org
For additional commands, e-mail: issues-help@spark.apache.org