Posted to common-dev@hadoop.apache.org by "Hadoop QA (JIRA)" <ji...@apache.org> on 2007/01/10 22:16:27 UTC
[jira] Commented: (HADOOP-880) Recursive delete for an S3 directory does not actually delete files or subdirectories
[ https://issues.apache.org/jira/browse/HADOOP-880?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel#action_12463722 ]
Hadoop QA commented on HADOOP-880:
----------------------------------
+1, because http://issues.apache.org/jira/secure/attachment/12348686/hadoop-880.patch applied and successfully tested against trunk revision r494905.
> Recursive delete for an S3 directory does not actually delete files or subdirectories
> -------------------------------------------------------------------------------------
>
> Key: HADOOP-880
> URL: https://issues.apache.org/jira/browse/HADOOP-880
> Project: Hadoop
> Issue Type: Bug
> Components: fs
> Affects Versions: 0.10.0
> Reporter: Tom White
> Assigned To: Tom White
> Attachments: hadoop-880.patch
>
>
> Here is the bug report from Michael Stack:
> Here I'm listing a BUCKET directory that was copied up using 'hadoop
> fs', then rmr'ing it and then listing again:
> stack@bregeon:~/checkouts/hadoop$ ./bin/hadoop fs -fs
> s3://ID:SECRET@BUCKET -ls /fromfile
> Found 2 items
> /fromfile/diff.txt <r 1> 591
> /fromfile/x.js <r 1> 2477
> stack@bregeon:~/checkouts/hadoop$ ./bin/hadoop fs -fs
> s3://ID:SECRET@BUCKET -rmr /fromfile
> Deleted /fromfile
> stack@bregeon:~/checkouts/hadoop$ ./bin/hadoop fs -fs
> s3://ID:SECRET@BUCKET -ls /fromfile
> Found 0 items
> The '0 items' is odd because, now, listing my BUCKET using a tool other
> than 'hadoop fs' (i.e. hanzo webs python scripts):
> stack@bregeon:~/checkouts/hadoop.trunk$ s3ls BUCKET
> %2F
> %2Ffromfile%2F.diff.txt.crc
> %2Ffromfile%2F.x.js.crc
> %2Ffromfile%2Fdiff.txt
> %2Ffromfile%2Fx.js
> block_-5013142890590722396
> block_5832002498000415319
> block_6889488315428893905
> block_9120115089645350905
> It's all still there. I can subsequently do the likes of the
> following:
> stack@bregeon:~/checkouts/hadoop$ ./bin/hadoop fs -fs
> s3://ID:SECRET@BUCKET -rmr /fromfile/diff.txt
> ... and the delete will succeed, and looking at the bucket with alternate
> tools shows that it has actually been removed, and so on up the hierarchy.
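The report above hinges on S3 having no real directories: as the s3ls output shows (keys are URL-encoded, so %2Ffromfile%2Fdiff.txt is /fromfile/diff.txt), "/fromfile" and "/fromfile/diff.txt" are independent keys in a flat namespace, so removing only the directory key leaves every child key behind. A minimal sketch of the failure mode and the fix, using a hypothetical FakeBucket class as a stand-in for the S3 key store (these names are illustrative, not from the Hadoop codebase; assumes Java 8+):

```java
import java.util.ArrayList;
import java.util.List;
import java.util.TreeMap;

// Hypothetical stand-in for an S3 bucket: a flat map of keys to blobs.
// There are no true directories, so a "directory" is just a key whose
// children happen to share its prefix.
class FakeBucket {
    private final TreeMap<String, byte[]> keys = new TreeMap<>();

    void put(String key, byte[] data) {
        keys.put(key, data);
    }

    // Buggy delete, mirroring the reported behaviour: removes only the
    // directory key itself, leaving all child keys in the bucket.
    void deleteKeyOnly(String path) {
        keys.remove(path);
    }

    // Recursive delete: enumerate every key under the directory prefix
    // and remove each one, along with the directory key itself.
    void deleteRecursive(String path) {
        String prefix = path.endsWith("/") ? path : path + "/";
        keys.keySet().removeIf(k -> k.equals(path) || k.startsWith(prefix));
    }

    // List child keys under a directory prefix.
    List<String> list(String path) {
        String prefix = path.endsWith("/") ? path : path + "/";
        List<String> out = new ArrayList<>();
        for (String k : keys.keySet()) {
            if (k.startsWith(prefix)) {
                out.add(k);
            }
        }
        return out;
    }
}
```

With deleteKeyOnly, a subsequent ls of the prefix still finds the children, which is exactly why an alternate tool still saw /fromfile/diff.txt and /fromfile/x.js after the rmr.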
--
This message is automatically generated by JIRA.
For more information on JIRA, see: http://www.atlassian.com/software/jira