Posted to user@hadoop.apache.org by Subroto <ss...@datameer.com> on 2013/03/05 15:22:40 UTC
S3N copy creating recursive folders
Hi,
I am using Hadoop 1.0.3 and trying to execute:
hadoop fs -cp s3n://accessKey:accessSecret@bucket.name/srcData /test/srcData
This ends up with:
cp: java.io.IOException: mkdirs: Pathname too long. Limit 8000 characters, 1000 levels.
When I list /test/srcData recursively, it shows 998 nested folders like:
drwxr-xr-x - root supergroup 0 2013-03-05 08:49 /test/srcData/srcData
drwxr-xr-x - root supergroup 0 2013-03-05 08:49 /test/srcData/srcData/srcData
drwxr-xr-x - root supergroup 0 2013-03-05 08:49 /test/srcData/srcData/srcData/srcData
drwxr-xr-x - root supergroup 0 2013-03-05 08:49 /test/srcData/srcData/srcData/srcData/srcData
drwxr-xr-x - root supergroup 0 2013-03-05 08:49 /test/srcData/srcData/srcData/srcData/srcData/srcData
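The nesting above is what you get when a recursive copy's destination directory is created inside the tree being walked, so each pass over the source listing rediscovers the partially written copy and copies it into itself. A toy simulation of that symptom (plain Python, not Hadoop code; the `naive_copy` helper and all paths are hypothetical, and a depth limit stands in for the 1000-level cap in the error message):

```python
import os
import tempfile

def naive_copy(src, dst, depth_limit=4):
    """Copy the directory src into dst, re-listing src at each level.

    Because dst is created inside src before src is listed, the listing
    picks up the half-finished copy and the recursion nests it again,
    mimicking the srcData/srcData/srcData/... folders in the report.
    """
    os.makedirs(dst, exist_ok=True)
    if depth_limit == 0:
        return
    for name in sorted(os.listdir(src)):
        path = os.path.join(src, name)
        if os.path.isdir(path):
            naive_copy(path, os.path.join(dst, name), depth_limit - 1)

root = tempfile.mkdtemp()
src = os.path.join(root, "srcData")
os.makedirs(src)

# Destination sits inside the source tree: the copy chases its own output.
naive_copy(src, os.path.join(src, "srcData"))
```

After this runs, `src` contains `srcData/srcData/srcData/...` down to the depth limit, even though `src` started out empty.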
Is there a problem with the s3n filesystem?
Cheers,
Subroto Sanyal