Posted to common-issues@hadoop.apache.org by "Guram Savinov (Jira)" <ji...@apache.org> on 2020/02/04 07:29:00 UTC
[jira] [Created] (HADOOP-16837) Spark-SQL test running on Windows: hadoop chgrp warnings
Guram Savinov created HADOOP-16837:
--------------------------------------
Summary: Spark-SQL test running on Windows: hadoop chgrp warnings
Key: HADOOP-16837
URL: https://issues.apache.org/jira/browse/HADOOP-16837
Project: Hadoop Common
Issue Type: Bug
Components: common, fs
Affects Versions: 2.6.5
Environment: Windows 10
Winutils 2.7.1: [https://github.com/steveloughran/winutils/tree/master/hadoop-2.7.1]
Oracle JavaSE 8
SparkSQL 2.4.4 / Hadoop 2.6.5
Using: -Dhive.exec.scratchdir=C:\Users\OSUser\hadoop\tmp\hive
Set: winutils chmod -R 777 \Users\OSUser\hadoop\tmp\hive
Reporter: Guram Savinov
Attachments: HadoopGroupTest.java
Running SparkSQL embedded unit tests locally on Windows 10 with winutils produces warnings from 'hadoop chgrp'. See the environment info above.
{code:bash}
-chgrp: 'TEST\Domain users' does not match expected pattern for group
Usage: hadoop fs [generic options] -chgrp [-R] GROUP PATH...
-chgrp: 'TEST\Domain users' does not match expected pattern for group
Usage: hadoop fs [generic options] -chgrp [-R] GROUP PATH...
-chgrp: 'TEST\Domain users' does not match expected pattern for group
Usage: hadoop fs [generic options] -chgrp [-R] GROUP PATH...
{code}
Related info on SO: [https://stackoverflow.com/questions/48605907/error-in-pyspark-when-insert-data-in-hive]
The problem is in hadoop-common-project/hadoop-common/src/main/java/org/apache/hadoop/fs/FsShellPermissions.java:210: the backslash character isn't included in allowedChars, so a Windows domain-qualified group name like 'TEST\Domain users' fails validation. See the attached HadoopGroupTest.java.
Original issue in Spark: https://issues.apache.org/jira/browse/SPARK-30701
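The rejection can be reproduced standalone. The sketch below is an assumption about the shape of the check in FsShellPermissions (the class name GroupPatternSketch and the exact allowedChars character class are illustrative, not copied from Hadoop source): a group name containing a backslash fails to match, which triggers the usage message shown above.

```java
import java.util.regex.Pattern;

public class GroupPatternSketch {
    // Hypothetical approximation of the Windows-side allowedChars set in
    // FsShellPermissions; note there is no backslash in the character class.
    static final String ALLOWED_CHARS = "[-_./@a-zA-Z0-9 ]";
    static final Pattern GROUP = Pattern.compile(ALLOWED_CHARS + "+");

    static boolean isValidGroup(String group) {
        // Group is accepted only if every character is in the allowed set
        return GROUP.matcher(group).matches();
    }

    public static void main(String[] args) {
        // Domain-qualified Windows group: the '\' is rejected
        System.out.println(isValidGroup("TEST\\Domain users"));
        // Same group without the domain prefix passes
        System.out.println(isValidGroup("Domain users"));
    }
}
```

Under this assumption, the fix would be to add the backslash to the character class so domain-qualified Windows group names validate.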
--
This message was sent by Atlassian Jira
(v8.3.4#803005)
---------------------------------------------------------------------
To unsubscribe, e-mail: common-issues-unsubscribe@hadoop.apache.org
For additional commands, e-mail: common-issues-help@hadoop.apache.org