Posted to issues@spark.apache.org by "Eric O. LEBIGOT (EOL) (JIRA)" <ji...@apache.org> on 2015/05/02 10:10:06 UTC
[jira] [Commented] (SPARK-7141) saveAsTextFile() on S3 first creates empty prefix
[ https://issues.apache.org/jira/browse/SPARK-7141?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=14525120#comment-14525120 ]
Eric O. LEBIGOT (EOL) commented on SPARK-7141:
----------------------------------------------
Alright: switching to s3n:// fixed the issue.
I am not clear on the reason (the AWS documentation sometimes recommends s3://, sometimes s3n://, and the AWS CLI's support for s3n varies), but I would be interested to know why.
Also, if this workaround applies in all situations, I guess this could be downgraded to a more minor bug?
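For reference, here is a minimal, illustrative sketch (plain Python, not Spark's or Hadoop's actual code path) of how a double slash like s3://bucket//prefix can arise: if the key is taken from the URI path with its leading slash intact and then joined onto the bucket root, the result contains an empty first path component, which S3 treats as an empty prefix.

```python
# Illustrative only: shows how a leading "/" in the key yields an
# empty prefix component when naively joined onto the bucket root.
from urllib.parse import urlparse

uri = urlparse("s3://bucket/prefix")
bucket = uri.netloc   # "bucket"
key = uri.path        # "/prefix" -- note the leading slash

# Naive join keeps the leading slash, producing "bucket" / "" / "prefix":
naive = "s3://" + bucket + "/" + key
print(naive)          # s3://bucket//prefix

# Stripping the leading slash before joining avoids the empty prefix:
fixed = "s3://" + bucket + "/" + key.lstrip("/")
print(fixed)          # s3://bucket/prefix
```

This does not explain why s3:// and s3n:// behave differently here, only how the observed double-slash path could come about from path handling.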
> saveAsTextFile() on S3 first creates empty prefix
> -------------------------------------------------
>
> Key: SPARK-7141
> URL: https://issues.apache.org/jira/browse/SPARK-7141
> Project: Spark
> Issue Type: Bug
> Components: PySpark
> Affects Versions: 1.3.1
> Environment: OS X 10.10
> Reporter: Eric O. LEBIGOT (EOL)
>
> Using {{saveAsTextFile("s3://bucket/prefix")}} actually adds an empty prefix, i.e. it writes to {{s3://bucket//prefix}} (note the double slash).
> Example code (in a {{pyspark}} shell):
> {{rdd = sc.parallelize("abcd")}}
> {{rdd.saveAsTextFile("s3://public_key:private_key@bucket/prefix")}}
> This is quite annoying, as the files cannot be saved in the intended location (they can be read, though, with the original path: {{sc.textFile("s3://bucket/prefix")}}, but the AWS console does not show them in the right place).
> Also, many {{block_*}} files are created directly in the bucket: shouldn't they be deleted? (This may be a separate issue, but maybe it is a path issue as well.)
--
This message was sent by Atlassian JIRA
(v6.3.4#6332)
---------------------------------------------------------------------
To unsubscribe, e-mail: issues-unsubscribe@spark.apache.org
For additional commands, e-mail: issues-help@spark.apache.org