Posted to mapreduce-user@hadoop.apache.org by kumar r <ku...@gmail.com> on 2015/07/15 09:29:30 UTC

hadoop setfacl --set not working

I am a Windows user and have configured Hadoop 2.6.0 secured with Kerberos. I am trying to
set an ACL for a directory using the command below:


hadoop fs -setfacl --set user::rwx,user:user1:---,group::rwx,other::rwx /test1

It gives


-setfacl: Too many arguments
Usage: hadoop fs [generic options] -setfacl [-R] [{-b|-k} {-m|-x <acl_spec>} <path>]|[--set <acl_spec> <path>]


I have posted the question on Stack Overflow and the link is:


http://stackoverflow.com/questions/31422810/hadoop-setfacl-set-not-working

Re: hadoop setfacl --set not working

Posted by Chris Nauroth <cn...@hortonworks.com>.
Hi Kumar,

This is the correct syntax for the HDFS setfacl command.  If you're running from Windows cmd.exe, you may need to wrap command-line parameters in quotes if they contain any of the cmd.exe parameter delimiters: space, comma, semicolon and equal sign.  The syntax for an ACL spec contains commas, so the spec needs to be wrapped in quotes.  Otherwise, cmd.exe splits it into multiple arguments before invoking the Hadoop code, which is why you see the "Too many arguments" error.  When I ran this on Windows, it worked:

hadoop fs -setfacl --set "user::rwx,user:user1:---,group::rwx,other::rwx" /test1

--Chris Nauroth
