Posted to issues@spark.apache.org by "shane knapp (JIRA)" <ji...@apache.org> on 2019/06/23 19:16:00 UTC

[jira] [Comment Edited] (SPARK-27177) Update jenkins locale to en_US.UTF-8

    [ https://issues.apache.org/jira/browse/SPARK-27177?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16870628#comment-16870628 ] 

shane knapp edited comment on SPARK-27177 at 6/23/19 7:15 PM:
--------------------------------------------------------------

I guess we need this now for Maven builds? See: https://issues.apache.org/jira/browse/SPARK-28114?focusedCommentId=16870381&page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel#comment-16870381


was (Author: shaneknapp):
i guess we're needing this now for maven builds?  see:  https://issues.apache.org/jira/browse/SPARK-28114?focusedCommentId=16870381

> Update jenkins locale to en_US.UTF-8
> ------------------------------------
>
>                 Key: SPARK-27177
>                 URL: https://issues.apache.org/jira/browse/SPARK-27177
>             Project: Spark
>          Issue Type: Sub-task
>          Components: Build, jenkins
>    Affects Versions: 3.0.0
>            Reporter: Yuming Wang
>            Assignee: shane knapp
>            Priority: Major
>
> Two test cases fail on our Jenkins since HADOOP-12045 (Hadoop 2.8.0). I'd like to update our Jenkins locale to en_US.UTF-8 to work around this issue.
>  How to reproduce:
> {code:bash}
> export LANG=
> git clone https://github.com/apache/spark.git && cd spark && git checkout v2.4.0
> build/sbt "hive/testOnly *.HiveDDLSuite" -Phive -Phadoop-2.7 -Dhadoop.version=2.8.0
> {code}
> Stack trace:
> {noformat}
> Caused by: sbt.ForkMain$ForkError: java.nio.file.InvalidPathException: Malformed input or input contains unmappable characters: /home/jenkins/workspace/SparkPullRequestBuilder@2/target/tmp/warehouse-15474fdf-0808-40ab-946d-1309fb05bf26/DaTaBaSe_I.db/tab_ı
> 	at sun.nio.fs.UnixPath.encode(UnixPath.java:147)
> 	at sun.nio.fs.UnixPath.<init>(UnixPath.java:71)
> 	at sun.nio.fs.UnixFileSystem.getPath(UnixFileSystem.java:281)
> 	at java.io.File.toPath(File.java:2234)
> 	at org.apache.hadoop.fs.RawLocalFileSystem$DeprecatedRawLocalFileStatus.getLastAccessTime(RawLocalFileSystem.java:683)
> 	at org.apache.hadoop.fs.RawLocalFileSystem$DeprecatedRawLocalFileStatus.<init>(RawLocalFileSystem.java:694)
> 	at org.apache.hadoop.fs.RawLocalFileSystem.deprecatedGetFileStatus(RawLocalFileSystem.java:664)
> 	at org.apache.hadoop.fs.RawLocalFileSystem.getFileLinkStatusInternal(RawLocalFileSystem.java:987)
> 	at org.apache.hadoop.fs.RawLocalFileSystem.getFileStatus(RawLocalFileSystem.java:656)
> 	at org.apache.hadoop.fs.FilterFileSystem.getFileStatus(FilterFileSystem.java:454)
> 	at org.apache.hadoop.hive.metastore.Warehouse.isDir(Warehouse.java:520)
> 	at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.create_table_core(HiveMetaStore.java:1436)
> 	at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.create_table_with_environment_context(HiveMetaStore.java:1503)
> {noformat}
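> The root cause is that with LANG unset the JVM falls back to an ASCII-only sun.jnu.encoding, so sun.nio.fs.UnixPath cannot encode the non-ASCII table name (tab_ı). The following standalone Java sketch (not part of the original report; the class name and paths are illustrative) reproduces the same InvalidPathException when run with LANG unset:
> {code:java}
> // Minimal sketch of the failure mode: File.toPath() delegates to
> // sun.nio.fs.UnixPath.encode(), which uses sun.jnu.encoding to turn the
> // path string into bytes and throws InvalidPathException on characters
> // it cannot map. Class name and path below are illustrative only.
> import java.io.File;
> import java.nio.file.InvalidPathException;
>
> public class LocalePathCheck {
>     public static void main(String[] args) {
>         // Both properties are derived from LANG/LC_* when the JVM starts.
>         System.out.println("file.encoding    = " + System.getProperty("file.encoding"));
>         System.out.println("sun.jnu.encoding = " + System.getProperty("sun.jnu.encoding"));
>
>         // "tab_\u0131" contains the dotless i used by the failing HiveDDLSuite test.
>         File f = new File("/tmp/warehouse-demo/DaTaBaSe_I.db/tab_\u0131");
>         try {
>             System.out.println("Encoded path: " + f.toPath());
>         } catch (InvalidPathException e) {
>             // With LANG unset (ANSI_X3.4-1968) this matches the Jenkins failure above.
>             System.out.println("InvalidPathException: " + e.getMessage());
>         }
>     }
> }
> {code}
> Running the same sketch with LANG=en_US.UTF-8 exported prints UTF-8 for both properties and the path encodes without error, which is what the workaround below relies on.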
> Workaround:
> {code:bash}
> export LANG=en_US.UTF-8
> build/sbt "hive/testOnly *.HiveDDLSuite" -Phive -Phadoop-2.7 -Dhadoop.version=2.8.0
> {code}
> More details: 
> https://issues.apache.org/jira/browse/HADOOP-16180
> https://github.com/apache/spark/pull/24044/commits/4c1ec25d3bc64bf358edf1380a7c863596722362


