Posted to user@hadoop.apache.org by akf1990 akf1990 <ak...@gmail.com> on 2015/09/08 07:32:49 UTC

Fwd: Retry scp over FUSE-mounted HDFS directory

Hi

1. I tried to scp a file from my localhost to a remote machine. The
destination was on HDFS, mounted using FUSE.
Destination: /mount/hdfs/user/biadmin/input/GLCC/gaana
The first scp was successful. But when I scp the same file, with the same
name, to the same destination (without removing the already-copied file),
the scp fails with the following error:

"scp: /mount/hdfs/user/biadmin/input/: Input/output error".
Here is the error printed in the FUSE debug logs:

open flags: 0x8001 /user/biadmin/input/GLCC/gaana
hdfsOpenFile(/user/biadmin/input/GLCC/gaana):
FileSystem#append((Lorg/apache/hadoop/fs/Path;)Lorg/apache/hadoop/fs/FSDataOutputStream;)
error:
org.apache.hadoop.ipc.RemoteException(org.apache.hadoop.hdfs.protocol.AlreadyBeingCreatedException):
failed to create file /user/biadmin/input/GLCC/gaana for
DFSClient_NONMAPREDUCE_2007745602_24 for client 10.4.0.10 because current
leaseholder is trying to recreate file.
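
If I am reading the log right, the open flags 0x8001 decode (on
Linux/x86) to O_WRONLY|O_LARGEFILE, so fuse-dfs turns scp's write-open of
the already-existing file into an HDFS append, which the NameNode rejects
because the file's lease apparently still looks held. To check whether the
first scp left the file open for write, I was planning to run something
like this on the cluster (my guess at the right diagnostic; the path is
the one from the logs):

    # list open-for-write status of the file on the cluster side
    hdfs fsck /user/biadmin/input/GLCC/gaana -files -blocks -openforwrite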

The attached file contains the full error.
Can anybody help me understand the problem here? Am I missing something?
Is there any workaround? (Besides using other programs like sftp, etc., as
it is a requirement to use scp in my case.)
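
One workaround I was considering is to delete the existing file before
re-copying, either via the mount or through HDFS directly (the hostname is
a placeholder):

    # delete via the FUSE mount, then copy again
    ssh remotehost rm -f /mount/hdfs/user/biadmin/input/GLCC/gaana
    scp gaana remotehost:/mount/hdfs/user/biadmin/input/GLCC/gaana

    # or delete on the cluster side through HDFS
    hdfs dfs -rm /user/biadmin/input/GLCC/gaana

Would that be safe here, or is there a cleaner fix on the fuse-dfs side?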