Posted to user@hadoop.apache.org by "Muthupandiyan, Kamaraj" <ka...@hpe.com> on 2020/08/21 14:10:02 UTC

Error while running "hdfs fs" commands

Hi Team,

Whenever I execute an hdfs fs command I get the error below; please help me with this.

[hdfs@hdfsnn1 ~]$ hdfs dfs -ls /user
-ls: java.net.UnknownHostException: hdfs


[hdfs@sym-hdfsnn1 ~]$ hadoop fs -ls / file:///|hdfs:hdfsnn1:8020/
-bash: hdfs:sym-hdfsnn1.lvs.nimblestorage.com:8020/: No such file or directory
-ls: java.net.UnknownHostException: hdfs
Usage: hadoop fs [generic options]
        [-appendToFile <localsrc> ... <dst>]
        [-cat [-ignoreCrc] <src> ...]
        [-checksum <src> ...]
        [-chgrp [-R] GROUP PATH...]
        [-chmod [-R] <MODE[,MODE]... | OCTALMODE> PATH...]
        [-chown [-R] [OWNER][:[GROUP]] PATH...]
        [-copyFromLocal [-f] [-p] [-l] [-d] <localsrc> ... <dst>]
        [-copyToLocal [-f] [-p] [-ignoreCrc] [-crc] <src> ... <localdst>]
        [-count [-q] [-h] [-v] [-t [<storage type>]] [-u] [-x] <path> ...]
        [-cp [-f] [-p | -p[topax]] [-d] <src> ... <dst>]
        [-createSnapshot <snapshotDir> [<snapshotName>]]
        [-deleteSnapshot <snapshotDir> <snapshotName>]
        [-df [-h] [<path> ...]]
        [-du [-s] [-h] [-x] <path> ...]
        [-expunge]
        [-find <path> ... <expression> ...]
        [-get [-f] [-p] [-ignoreCrc] [-crc] <src> ... <localdst>]
        [-getfacl [-R] <path>]
        [-getfattr [-R] {-n name | -d} [-e en] <path>]
        [-getmerge [-nl] [-skip-empty-file] <src> <localdst>]
        [-help [cmd ...]]
        [-ls [-C] [-d] [-h] [-q] [-R] [-t] [-S] [-r] [-u] [<path> ...]]
        [-mkdir [-p] <path> ...]
        [-moveFromLocal <localsrc> ... <dst>]



Regards,
Kamaraj
InfoSight - SRE


Re: Error while running "hdfs fs" commands

Posted by "संजीव (Sanjeev Tripurari)" <sa...@gmail.com>.
Hi,

You will always get a response from the active namenode only, never from the standby, so you may be hitting a different problem. Can you share screenshots of the namenode UI pages from both nodes?
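
For reference, a minimal way to check which node is currently active, assuming the NameNode IDs configured under dfs.ha.namenodes are nn1 and nn2 (the IDs here are placeholders):

[hdfs@sym-hdfsnn1 ~]$ hdfs haadmin -getServiceState nn1
active
[hdfs@sym-hdfsnn1 ~]$ hdfs haadmin -getServiceState nn2
standby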

regards
-Sanjeev





RE: Error while running "hdfs fs" commands

Posted by "Muthupandiyan, Kamaraj" <ka...@hpe.com>.
Hi Sanjeev,

Thank you, it worked for me.

Another question: how do I execute hdfs commands on the standby node?

[hdfs@sym-hdfsnn2 zookeeper-3.5.8]$ hadoop fs -ls /
ls: Operation category READ is not supported in state standby. Visit https://s.apache.org/sbnn-error

Regards,
Kamaraj
InfoSight - SRE





Re: Error while running "hdfs fs" commands

Posted by "संजीव (Sanjeev Tripurari)" <sa...@gmail.com>.
Can you add the entry below and check once?

-bash-4.4$ cat /etc/hosts
127.0.0.1       hdfs
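
Once that entry is in place, a quick way to confirm the name resolves (getent is assumed to be available here, as on most Linux hosts):

-bash-4.4$ getent hosts hdfs
127.0.0.1       hdfs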



RE: Error while running "hdfs fs" commands

Posted by "Muthupandiyan, Kamaraj" <ka...@hpe.com>.
Hi Sanjeev,

Below is the hosts entry on one of the namenodes:

[hdfs@sym-hdfsnn1 ~]$ cat /etc/hosts
127.0.0.1   localhost localhost.localdomain localhost4 localhost4.localdomain4

10.175.6.10    sym-hdfsnn1.lvs.nimblestorage.com sym-hdfsnn1

Regards,
Kamaraj





Re: Error while running "hdfs fs" commands

Posted by "संजीव (Sanjeev Tripurari)" <sa...@gmail.com>.
Hi,

It looks like your command is resolving the hostname *hdfs*.
Can you share your /etc/hosts file? You will have to enter the IP and
hostname there, as it looks like your command is doing a host lookup.
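
A quick way to check, with the outputs one would expect on this host (hostname and getent are standard Linux tools):

[hdfs@sym-hdfsnn1 ~]$ hostname
sym-hdfsnn1
[hdfs@sym-hdfsnn1 ~]$ getent hosts hdfs

No output from the second command would mean "hdfs" does not resolve, which matches the UnknownHostException.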

Regards
-Sanjeev


RE: Error while running "hdfs fs" commands

Posted by "Muthupandiyan, Kamaraj" <ka...@hpe.com>.
Hi Ayush,

Yes, I am able to access the namenode UI.

Below is the nameservice value:

<property>
    <name>dfs.nameservices</name>
    <value>hdfs</value>
</property>

And below is the defaultFS value:

<property>
    <name>fs.defaultFS</name>
    <value>hdfs://hdfs</value>
</property>

Regards,
Kamaraj
InfoSight - SRE





Re: Error while running "hdfs fs" commands

Posted by Ayush Saxena <ay...@gmail.com>.
Are you able to access the namenode UI? If so, check the value shown for
"Namespace:"; say it is mycluster, then configure fs.defaultFS as
hdfs://mycluster.
The value might be there in hdfs-site.xml as well, under dfs.nameservices.
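
The same values can be cross-checked from the shell with a stock Hadoop client (hdfs getconf prints the effective configuration; the outputs below assume the values already posted in this thread):

$ hdfs getconf -confKey dfs.nameservices
hdfs
$ hdfs getconf -confKey fs.defaultFS
hdfs://hdfs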

-Ayush




RE: Error while running "hdfs fs" commands

Posted by "Muthupandiyan, Kamaraj" <ka...@hpe.com>.
Hi Ayush,

Thanks for your reply. Yes, we followed the same document and applied the config. For our setup we have configured HA and it's working fine, but we are getting the following errors while executing the commands:

-ls: java.net.UnknownHostException: hdfs
-mkdir: java.net.UnknownHostException: hdfs


Regards,
Kamaraj
InfoSight - SRE




Re: Error while running "hdfs fs" commands

Posted by Ayush Saxena <ay...@gmail.com>.
Hi Kamaraj,
If you are trying to set up an HA cluster, you need a couple more configs
as well. You can follow this document; it should answer all your doubts:
https://hadoop.apache.org/docs/r3.3.0/hadoop-project-dist/hadoop-hdfs/HDFSHighAvailabilityWithQJM.html#Deployment
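
As a rough sketch of the client-side settings that document describes, using the nameservice hdfs from your config, with nn1/nn2 and the host:port values as placeholders (hdfs-site.xml):

<property>
    <name>dfs.ha.namenodes.hdfs</name>
    <value>nn1,nn2</value>
</property>
<property>
    <name>dfs.namenode.rpc-address.hdfs.nn1</name>
    <value>sym-hdfsnn1.lvs.nimblestorage.com:8020</value>
</property>
<property>
    <name>dfs.namenode.rpc-address.hdfs.nn2</name>
    <value>sym-hdfsnn2.lvs.nimblestorage.com:8020</value>
</property>
<property>
    <name>dfs.client.failover.proxy.provider.hdfs</name>
    <value>org.apache.hadoop.hdfs.server.namenode.ha.ConfiguredFailoverProxyProvider</value>
</property>

Without the failover proxy provider setting, a client given fs.defaultFS hdfs://hdfs treats "hdfs" as a literal hostname, which is exactly the UnknownHostException in this thread.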

-Ayush


RE: Error while running "hdfs fs" commands

Posted by "Muthupandiyan, Kamaraj" <ka...@hpe.com>.
Hi Mingliang Liu,

Thank you for your reply.

As I am completely new to setting up a Hadoop cluster, I am not sure about the value of fs.defaultFS. I have set the value below; could you please help me with the correct one?

<property>
    <name>fs.defaultFS</name>
    <value>hdfs://hdfs</value>
</property>

As additional information, we have two namenodes in our cluster.

Regards,
Kamaraj
InfoSight - SRE



Re: Error while running "hdfs fs" commands

Posted by Mingliang Liu <li...@gmail.com>.
It seems that you have a wrong fs.defaultFS configuration in your core-site.xml.

For the second command you should follow the user manual. The input is
simply not valid. Try something like:

hadoop fs -ls hdfs://sym-hdfsnn1.lvs.nimblestorage.com/user
or
hadoop fs -ls file:///tmp
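
For a client pointed at a single NameNode rather than an HA nameservice, fs.defaultFS in core-site.xml would typically look roughly like this (the host and port are taken from this thread and are only illustrative):

<property>
    <name>fs.defaultFS</name>
    <value>hdfs://sym-hdfsnn1.lvs.nimblestorage.com:8020</value>
</property>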



-- 
L