Posted to issues@hive.apache.org by "Attila Csaba Marosi (JIRA)" <ji...@apache.org> on 2019/01/29 19:16:00 UTC

[jira] [Updated] (HIVE-21181) Hive pre-upgrade tool not working with HDFS HA, tries connecting to nameservice as if it were a NameNode

     [ https://issues.apache.org/jira/browse/HIVE-21181?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]

Attila Csaba Marosi updated HIVE-21181:
---------------------------------------
    Description: 
While preparing production clusters for the HDP-2.6.5 -> HDP-3.1 upgrade, we noticed issues with the hive-pre-upgrade tool: when we tried running it, we got the following exception:

{{Found Acid table: default.hello_acid
2019-01-28 15:54:20,331 ERROR [main] acid.PreUpgradeTool (PreUpgradeTool.java:main(152)) - PreUpgradeTool failed
java.lang.IllegalArgumentException: java.net.UnknownHostException: mytestcluster
        at org.apache.hadoop.security.SecurityUtil.buildTokenService(SecurityUtil.java:439)
        at org.apache.hadoop.hdfs.NameNodeProxies.createNonHAProxy(NameNodeProxies.java:321)
        at org.apache.hadoop.hdfs.NameNodeProxies.createProxy(NameNodeProxies.java:176)
        at org.apache.hadoop.hdfs.DFSClient.<init>(DFSClient.java:696)
        at org.apache.hadoop.hdfs.DFSClient.<init>(DFSClient.java:636)
        at org.apache.hadoop.hdfs.DistributedFileSystem.initialize(DistributedFileSystem.java:160)
        at org.apache.hadoop.fs.FileSystem.createFileSystem(FileSystem.java:2796)
        at org.apache.hadoop.fs.FileSystem.access$200(FileSystem.java:99)
        at org.apache.hadoop.fs.FileSystem$Cache.getInternal(FileSystem.java:2830)
        at org.apache.hadoop.fs.FileSystem$Cache.get(FileSystem.java:2812)
        at org.apache.hadoop.fs.FileSystem.get(FileSystem.java:390)
        at org.apache.hadoop.fs.Path.getFileSystem(Path.java:295)
        at org.apache.hadoop.hive.upgrade.acid.PreUpgradeTool.needsCompaction(PreUpgradeTool.java:417)
        at org.apache.hadoop.hive.upgrade.acid.PreUpgradeTool.getCompactionCommands(PreUpgradeTool.java:384)
        at org.apache.hadoop.hive.upgrade.acid.PreUpgradeTool.getCompactionCommands(PreUpgradeTool.java:374)
        at org.apache.hadoop.hive.upgrade.acid.PreUpgradeTool.prepareAcidUpgradeInternal(PreUpgradeTool.java:235)
        at org.apache.hadoop.hive.upgrade.acid.PreUpgradeTool.main(PreUpgradeTool.java:149)
Caused by: java.net.UnknownHostException: mytestcluster
        ... 17 more}}
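
From the trace, DFSClient goes through NameNodeProxies.createNonHAProxy(), i.e. it treats the nameservice name "mytestcluster" as a single NameNode host instead of resolving it through the HA failover proxy provider, which suggests the Configuration used by the tool does not contain the HDFS HA client settings. A minimal sketch of the same resolution path, assuming the Hadoop 2.7 client jars are on the classpath (the warehouse path below is only a placeholder, not taken from our cluster):

{code:java}
import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.FileSystem;
import org.apache.hadoop.fs.Path;

public class NameserviceResolutionCheck {
    public static void main(String[] args) throws Exception {
        // Empty Configuration, i.e. what a client effectively sees when the
        // hdfs-site.xml carrying the HA settings is not on its classpath.
        Configuration conf = new Configuration();

        // Path under the HA nameservice from the exception; the table
        // location itself is a placeholder.
        Path tableDir = new Path("hdfs://mytestcluster/apps/hive/warehouse/hello_acid");

        // Same call made by PreUpgradeTool.needsCompaction(). Without
        // dfs.nameservices, dfs.ha.namenodes.mytestcluster and
        // dfs.client.failover.proxy.provider.mytestcluster in conf, DFSClient
        // builds a non-HA proxy and fails with
        // java.lang.IllegalArgumentException: java.net.UnknownHostException: mytestcluster
        FileSystem fs = tableDir.getFileSystem(conf);
        System.out.println("Resolved filesystem: " + fs.getUri());
    }
}
{code}

With the HA properties present in the Configuration, the same call returns a DistributedFileSystem bound to the active NameNode of the nameservice.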


We tried running it on a kerberized test cluster built from the same blueprint as the production clusters, with HDP-2.6.5.0-292, Hive 1.2.1000, HDFS 2.7.3, HDFS HA enabled and no Hive HA.
We enabled Hive ACID and created the same example ACID table as shown in https://hortonworks.com/tutorial/using-hive-acid-transactions-to-insert-update-and-delete-data/
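
For reference, the example table from that tutorial is approximately the following (reproduced from the tutorial, not copied from our cluster):

{code:sql}
-- Example ACID table from the linked Hortonworks tutorial (approximate DDL).
CREATE TABLE hello_acid (key int, value int)
PARTITIONED BY (load_date date)
CLUSTERED BY (key) INTO 3 BUCKETS
STORED AS ORC
TBLPROPERTIES ('transactional' = 'true');
{code}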

We followed the steps described at https://docs.hortonworks.com/HDPDocuments/Ambari-2.7.3.0/bk_ambari-upgrade-major/content/prepare_hive_for_upgrade.html , kinit-ed, and used the "-Djavax.security.auth.useSubjectCredsOnly=false" parameter.
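
The invocation was along these lines (principal, keytab and classpath are placeholders; the exact classpath is the one given in the linked upgrade guide):

{code:bash}
# Kerberos ticket for the Hive service user (principal and keytab are placeholders).
kinit -kt /etc/security/keytabs/hive.service.keytab hive/<host_fqdn>@<REALM>

# Run the pre-upgrade tool; <pre-upgrade-classpath> stands for the long
# classpath from the upgrade guide (Hive and Hadoop client jars, the
# pre-upgrade jar, and the Hadoop/Hive configuration directories).
java -Djavax.security.auth.useSubjectCredsOnly=false \
     -cp <pre-upgrade-classpath> \
     org.apache.hadoop.hive.upgrade.acid.PreUpgradeTool
{code}

Note that the Hadoop client configuration directory (the one containing hdfs-site.xml) needs to be on that classpath for Configuration to pick up the HA settings.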

Without the ACID table there is no issue.
I'm attaching the hdfs-site.xml and core-site.xml.
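
For DFSClient to treat "mytestcluster" as an HA nameservice rather than a host name, the Configuration built by the tool has to see the usual HA client properties. A sketch of what we expect it to pick up from the attached hdfs-site.xml (the nn1/nn2 host names below are placeholders, not the real cluster hosts):

{code:xml}
<!-- HA client settings for the "mytestcluster" nameservice;
     nn1.example.com / nn2.example.com are placeholders. -->
<property>
  <name>dfs.nameservices</name>
  <value>mytestcluster</value>
</property>
<property>
  <name>dfs.ha.namenodes.mytestcluster</name>
  <value>nn1,nn2</value>
</property>
<property>
  <name>dfs.namenode.rpc-address.mytestcluster.nn1</name>
  <value>nn1.example.com:8020</value>
</property>
<property>
  <name>dfs.namenode.rpc-address.mytestcluster.nn2</name>
  <value>nn2.example.com:8020</value>
</property>
<property>
  <name>dfs.client.failover.proxy.provider.mytestcluster</name>
  <value>org.apache.hadoop.hdfs.server.namenode.ha.ConfiguredFailoverProxyProvider</value>
</property>
{code}

The stack trace above shows createNonHAProxy() being used, which is what happens when these properties are not visible to the client.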



> Hive pre-upgrade tool not working with HDFS HA, tries connecting to nameservice as if it were a NameNode
> ----------------------------------------------------------------------------------------------------
>
>                 Key: HIVE-21181
>                 URL: https://issues.apache.org/jira/browse/HIVE-21181
>             Project: Hive
>          Issue Type: Bug
>          Components: Hive
>    Affects Versions: 1.2.1
>         Environment: Centos 7.4.1708
> kernel 3.10.0-693.11.6.el7.x86_64
> Ambari 2.6.2.2
> HDP-2.6.5.0-292
> Hive 1.2.1000
> HDFS 2.7.3
>            Reporter: Attila Csaba Marosi
>            Priority: Major
>



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)