Posted to common-dev@hadoop.apache.org by "Stephen Montgomery (JIRA)" <ji...@apache.org> on 2016/02/03 17:50:39 UTC

[jira] [Created] (HADOOP-12763) S3AFileSystem And Hadoop FsShell Operations

Stephen Montgomery created HADOOP-12763:
-------------------------------------------

             Summary: S3AFileSystem And Hadoop FsShell Operations
                 Key: HADOOP-12763
                 URL: https://issues.apache.org/jira/browse/HADOOP-12763
             Project: Hadoop Common
          Issue Type: Bug
          Components: fs/s3
    Affects Versions: 2.7.1
            Reporter: Stephen Montgomery


Hi,
I'm looking at the Hadoop S3A filesystem and the FsShell commands (specifically -ls and -copyFromLocal/-put).

1. Create an S3 bucket, e.g. test-s3a-bucket.
2. List the bucket contents using S3A and get an error (a programmatic reproduction of this is sketched after step 5):

$ hadoop fs -Dfs.s3n.awsAccessKeyId=... -Dfs.s3n.awsSecretAccessKey=... -Dfs.s3a.access.key=... -Dfs.s3a.secret.key=... -ls s3a://test-s3a-bucket/
16/02/03 16:31:13 WARN util.NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
ls: `s3a://test-s3a-bucket/': No such file or directory

3. List bucket contents using S3N and get no results (fair enough):

$ hadoop fs -Dfs.s3n.awsAccessKeyId=... -Dfs.s3n.awsSecretAccessKey=... -Dfs.s3a.access.key=... -Dfs.s3a.secret.key=... -ls s3n://test-s3a-bucket/
16/02/03 16:32:41 WARN util.NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable

4. Attempt to copy a file from the local fs to S3A and get an error (with or without the trailing slash):

$ hadoop fs -Dfs.s3n.awsAccessKeyId=... -Dfs.s3n.awsSecretAccessKey=... -Dfs.s3a.access.key=... -Dfs.s3a.secret.key=... -copyFromLocal /tmp/zz s3a://test-s3a-bucket/
16/02/03 16:35:02 WARN util.NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
copyFromLocal: `s3a://test-s3a-bucket/': No such file or directory

5. Attempt to copy a file from the local fs to S3N, and it works:

$ hadoop fs -Dfs.s3n.awsAccessKeyId=... -Dfs.s3n.awsSecretAccessKey=... -Dfs.s3a.access.key=... -Dfs.s3a.secret.key=... -copyFromLocal /tmp/zz s3n://test-s3a-bucket/
16/02/03 16:36:17 WARN util.NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
16/02/03 16:36:18 INFO s3native.NativeS3FileSystem: OutputStream for key 'zz._COPYING_' writing to tempfile '/tmp/hadoop-monty/s3/output-9212095517127973121.tmp'
16/02/03 16:36:18 INFO s3native.NativeS3FileSystem: OutputStream for key 'zz._COPYING_' closed. Now beginning upload
16/02/03 16:36:18 INFO s3native.NativeS3FileSystem: OutputStream for key 'zz._COPYING_' upload complete

$ hadoop fs -Dfs.s3n.awsAccessKeyId=... -Dfs.s3n.awsSecretAccessKey=... -Dfs.s3a.access.key=... -Dfs.s3a.secret.key=... -ls s3a://test-s3a-bucket/
16/02/03 16:36:44 WARN util.NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
Found 1 items
-rw-rw-rw-   1        200 2016-02-03 16:36 s3a://test-s3a-bucket/zz
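For reference, here is a minimal Java sketch of how I understand the shell resolves the path: FsShell's -ls ultimately goes through FileSystem#getFileStatus()/#listStatus(), so the same failure should be reproducible without the shell. The class name is mine and the bucket/credentials are placeholders; this is not code from Hadoop itself.

import java.net.URI;
import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.FileStatus;
import org.apache.hadoop.fs.FileSystem;
import org.apache.hadoop.fs.Path;

public class S3AListRepro {
  public static void main(String[] args) throws Exception {
    Configuration conf = new Configuration();
    conf.set("fs.s3a.access.key", "...");  // placeholder
    conf.set("fs.s3a.secret.key", "...");  // placeholder
    FileSystem fs = FileSystem.get(URI.create("s3a://test-s3a-bucket/"), conf);
    // If my reading of the shell behaviour is right, this getFileStatus()
    // call on the bucket root is what throws FileNotFoundException for an
    // empty bucket, producing the "No such file or directory" message.
    FileStatus root = fs.getFileStatus(new Path("/"));
    System.out.println("root: " + root.getPath() + " isDirectory=" + root.isDirectory());
    for (FileStatus st : fs.listStatus(new Path("/"))) {
      System.out.println(st.getPath());
    }
  }
}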

It seems that basic filesystem operations can't be performed against an empty/new bucket via S3A. I have been able to populate buckets with distcp, but I wonder if that is because I was copying directories rather than individual files (an example invocation is sketched below).
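The distcp runs that did work were of this general form (the source path here is illustrative, not the exact one I used):

$ hadoop distcp -Dfs.s3a.access.key=... -Dfs.s3a.secret.key=... hdfs:///tmp/testdir s3a://test-s3a-bucket/testdir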

I know that S3A uses the AWS SDK's AmazonS3 client while S3N uses JetS3t, so the underlying implementations differ and the behaviour potentially does too, but I mainly used S3N here for illustration (and it looks like it is working as expected).
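To rule out the bucket itself, a quick probe with the bare AWS SDK v1 client (the same client family S3A wraps; credentials and class name are again placeholders) could confirm the bucket is reachable and simply empty. If it lists fine here, the "No such file or directory" presumably comes from S3A's path/status logic rather than from S3:

import com.amazonaws.auth.BasicAWSCredentials;
import com.amazonaws.services.s3.AmazonS3;
import com.amazonaws.services.s3.AmazonS3Client;
import com.amazonaws.services.s3.model.ObjectListing;

public class BucketProbe {
  public static void main(String[] args) {
    // Credentials and bucket name are placeholders.
    AmazonS3 s3 = new AmazonS3Client(new BasicAWSCredentials("...", "..."));
    System.out.println("bucket exists: " + s3.doesBucketExist("test-s3a-bucket"));
    // An empty but reachable bucket should list zero keys rather than error.
    ObjectListing listing = s3.listObjects("test-s3a-bucket");
    System.out.println("key count: " + listing.getObjectSummaries().size());
  }
}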

Can someone confirm this behaviour? Is it expected?

Thanks,
Stephen


