Posted to issues-all@impala.apache.org by "Joe McDonnell (Jira)" <ji...@apache.org> on 2020/06/06 01:23:00 UTC

[jira] [Resolved] (IMPALA-8582) HDFS Datanodes fail to start with USE_CDP_HIVE=true on Centos 6

     [ https://issues.apache.org/jira/browse/IMPALA-8582?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]

Joe McDonnell resolved IMPALA-8582.
-----------------------------------
    Resolution: Won't Fix

Impala 4 has deprecated Centos 6 support.

> HDFS Datanodes fail to start with USE_CDP_HIVE=true on Centos 6
> ---------------------------------------------------------------
>
>                 Key: IMPALA-8582
>                 URL: https://issues.apache.org/jira/browse/IMPALA-8582
>             Project: IMPALA
>          Issue Type: Bug
>          Components: Infrastructure
>    Affects Versions: Impala 3.3.0
>            Reporter: Joe McDonnell
>            Priority: Critical
>              Labels: broken-build
>
> On Centos 6, the HDFS Datanode won't start up with this error:
> {noformat}
> 2019-05-22 22:35:49,852 WARN org.apache.hadoop.util.NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
> ...
> 2019-05-22 22:35:52,497 ERROR org.apache.hadoop.hdfs.server.datanode.DataNode: Exception in secureMain
> java.lang.RuntimeException: Cannot start datanode because the configured max locked memory size (dfs.datanode.max.locked.memory) is greater than zero and native code is not available.
> at org.apache.hadoop.hdfs.server.datanode.DataNode.startDataNode(DataNode.java:1379)
> at org.apache.hadoop.hdfs.server.datanode.DataNode.<init>(DataNode.java:500)
> at org.apache.hadoop.hdfs.server.datanode.DataNode.makeInstance(DataNode.java:2782)
> at org.apache.hadoop.hdfs.server.datanode.DataNode.instantiateDataNode(DataNode.java:2690)
> at org.apache.hadoop.hdfs.server.datanode.DataNode.createDataNode(DataNode.java:2732)
> at org.apache.hadoop.hdfs.server.datanode.DataNode.secureMain(DataNode.java:2876)
> at org.apache.hadoop.hdfs.server.datanode.DataNode.main(DataNode.java:2900)
> 2019-05-22 22:35:52,506 INFO org.apache.hadoop.util.ExitUtil: Exiting with status 1: java.lang.RuntimeException: Cannot start datanode because the configured max locked memory size (dfs.datanode.max.locked.memory) is greater than zero and native code is not available.{noformat}
> There must be something about the CDP Hadoop binaries that triggers this: as far as I know, they are built on Centos 7, so the bundled native-hadoop library may not load on Centos 6 (note the NativeCodeLoader warning above). This is likely to be fixed by getting binaries built for the appropriate platform. Anecdotally, Ubuntu 16.04 and Centos 7 are fine.
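
Editor's note: for anyone reproducing this, a quick way to confirm whether the bundled native Hadoop library is actually loadable on the host is Hadoop's standard checknative tool. This is only a diagnostic sketch; the HADOOP_HOME path below is illustrative and should be adjusted to the local dev environment.

{noformat}
# Report whether libhadoop.so and the optional compression libraries can be
# loaded on this machine. On an affected Centos 6 host the "hadoop:" line is
# expected to show "false", matching the NativeCodeLoader warning in the log.
$HADOOP_HOME/bin/hadoop checknative -a
{noformat}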
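The startup check in DataNode.startDataNode() only aborts when dfs.datanode.max.locked.memory is greater than zero and native code is unavailable, so one possible stopgap (not verified on this setup, and it disables locked-memory HDFS caching, which the Impala minicluster may rely on for caching tests) would be to set the limit back to the Hadoop default of 0 in the datanode's hdfs-site.xml:

{noformat}
<!-- hdfs-site.xml: set the locked-memory limit to 0 (the Hadoop default) so
     the DataNode can start without libhadoop.so. Note this turns off HDFS
     cache pinning on the datanode. -->
<property>
  <name>dfs.datanode.max.locked.memory</name>
  <value>0</value>
</property>
{noformat}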



--
This message was sent by Atlassian Jira
(v8.3.4#803005)

---------------------------------------------------------------------
To unsubscribe, e-mail: issues-all-unsubscribe@impala.apache.org
For additional commands, e-mail: issues-all-help@impala.apache.org