Posted to issues@hive.apache.org by "Yu Wang (JIRA)" <ji...@apache.org> on 2019/04/11 07:46:00 UTC

[jira] [Updated] (HIVE-19836) Import to hive as parquet data format failed Could not initialize class org.apache.derby.jdbc.AutoloadedDriver40

     [ https://issues.apache.org/jira/browse/HIVE-19836?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]

Yu Wang updated HIVE-19836:
---------------------------
    Affects Version/s: 2.3.4

> Import to hive as parquet data format failed Could not initialize class org.apache.derby.jdbc.AutoloadedDriver40
> ----------------------------------------------------------------------------------------------------------------
>
>                 Key: HIVE-19836
>                 URL: https://issues.apache.org/jira/browse/HIVE-19836
>             Project: Hive
>          Issue Type: Bug
>    Affects Versions: 2.3.3, 2.3.4
>            Reporter: Mykhailo Kysliuk
>            Priority: Major
>
> Components:
> hive-2.3.3
> hadoop-2.7.0
> sqoop-1.4.7
> I am trying to run the following sqoop import command (the first run succeeds, but running it a second time fails):
> {code:java}
> sqoop import-all-tables --connect jdbc:mysql://127.0.0.1:3306/db1 --username root --password 123456 --as-parquetfile --hive-import --hive-database test --hive-overwrite -m 1
> {code}
> But I receive the following error:
> {code:java}
> 18/06/08 22:03:40 ERROR tool.ImportAllTablesTool: Error during import: Import job failed!
> {code}
> Logs:
> {code:java}
> vim /opt/hadoop/logs/userlogs/application_1528483696847_0003/container_1528483696847_0003_01_000004/syslog
> {code}
> {code:java}
> 2018-06-08 22:03:33,357 FATAL [main] org.apache.hadoop.mapred.YarnChild: Error running child : java.lang.NoClassDefFoundError: Could not initialize class org.apache.derby.jdbc.AutoloadedDriver40
>         at java.lang.Class.forName0(Native Method)
>         at java.lang.Class.forName(Class.java:348)
>         at java.sql.DriverManager.isDriverAllowed(DriverManager.java:556)
>         at java.sql.DriverManager.getConnection(DriverManager.java:661)
>         at java.sql.DriverManager.getConnection(DriverManager.java:247)
>         at org.apache.sqoop.mapreduce.db.DBConfiguration.getConnection(DBConfiguration.java:302)
>         at org.apache.sqoop.mapreduce.db.DBInputFormat.getConnection(DBInputFormat.java:216)
>         at org.apache.sqoop.mapreduce.db.DBInputFormat.setDbConf(DBInputFormat.java:168)
>         at org.apache.sqoop.mapreduce.db.DBInputFormat.setConf(DBInputFormat.java:161)
>         at org.apache.hadoop.util.ReflectionUtils.setConf(ReflectionUtils.java:76)
>         at org.apache.hadoop.util.ReflectionUtils.newInstance(ReflectionUtils.java:136)
>         at org.apache.hadoop.mapred.MapTask.runNewMapper(MapTask.java:749)
>         at org.apache.hadoop.mapred.MapTask.run(MapTask.java:341)
>         at org.apache.hadoop.mapred.YarnChild$2.run(YarnChild.java:163)
>         at java.security.AccessController.doPrivileged(Native Method)
>         at javax.security.auth.Subject.doAs(Subject.java:422)
>         at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1657)
>         at org.apache.hadoop.mapred.YarnChild.main(YarnChild.java:158)
> {code}
> My HADOOP_CLASSPATH:
> {code}
> [hadoop@hadoop bin]$ echo $HADOOP_CLASSPATH
> .:/opt/jdk1.8.0_171/jre/lib:/opt/jdk1.8.0_171/lib:/opt/jdk1.8.0_171/lib/tools.jar:/opt/hadoop/lib/*:/opt/hive/lib/*
> {code}
> I noticed that there are conflicting versions of the Derby jars in the Hive lib directory:
> {code}
> [hadoop@hadoop lib]$ ll lib | grep derby
> -rw-r--r--. 1 hadoop root    2838580 Jan 16  2016 derby-10.10.2.0.jar
> -rw-r--r--. 1 hadoop root     583719 Dec 19  2016 derbyclient-10.11.1.1.jar
> -rw-r--r--. 1 hadoop root     266316 Jan 17  2016 derbynet-10.11.1.1.jar
> {code}
> The error disappears if I remove the derbyclient-10.11.1.1.jar and derbynet-10.11.1.1.jar jars.
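The workaround above can be scripted; this is a minimal sketch, assuming the jar names from this report and a Hive lib path passed by the caller (it moves the jars aside rather than deleting them, so they can be restored):

```shell
#!/bin/sh
# Sketch of the workaround described above: move (rather than delete)
# the conflicting Derby client/net jars out of the Hive lib directory.
# Jar names are taken from this report; move_conflicting_jars is a
# hypothetical helper, not part of Hive or Sqoop.
move_conflicting_jars() {
    libdir="$1"
    backup=$(mktemp -d)
    for jar in derbyclient-10.11.1.1.jar derbynet-10.11.1.1.jar; do
        if [ -f "$libdir/$jar" ]; then
            mv "$libdir/$jar" "$backup/"
        fi
    done
    # Print the backup location so the jars can be restored later.
    echo "$backup"
}

# On a real node this would be invoked as:
#   move_conflicting_jars /opt/hive/lib
```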
> There are quite a few artifacts that are present in more than one version:
> {code}
> -rw-r--r--. 1 hadoop root    1034049 Jan 16  2016 ant-1.6.5.jar
> -rw-r--r--. 1 hadoop root    1997485 Jan 16  2016 ant-1.9.1.jar
> -rw-r--r--. 1 hadoop root     539912 Jan 16  2016 jetty-6.1.26.jar
> -rw-r--r--. 1 hadoop root    1681148 Jan 16  2016 jetty-all-7.6.0.v20120127.jar
> -rw-r--r--. 1 hadoop root     228092 Dec 19  2016 jetty-client-9.2.5.v20141112.jar
> -rw-r--r--. 1 hadoop root      15989 Dec 19  2016 jetty-continuation-9.2.5.v20141112.jar
> -rw-r--r--. 1 hadoop root     105258 Dec 19  2016 jetty-http-9.2.5.v20141112.jar
> -rw-r--r--. 1 hadoop root     105419 Dec 19  2016 jetty-io-9.2.5.v20141112.jar
> -rw-r--r--. 1 hadoop root      50644 Dec 19  2016 jetty-proxy-9.2.5.v20141112.jar
> -rw-r--r--. 1 hadoop root      95924 Dec 19  2016 jetty-security-9.2.5.v20141112.jar
> -rw-r--r--. 1 hadoop root     417278 Dec 19  2016 jetty-server-9.2.5.v20141112.jar
> -rw-r--r--. 1 hadoop root     115473 Dec 19  2016 jetty-servlet-9.2.5.v20141112.jar
> -rw-r--r--. 1 hadoop root     121692 Dec 19  2016 jetty-servlets-9.2.5.v20141112.jar
> -rw-r--r--. 1 hadoop root      18891 Jan 16  2016 jetty-sslengine-6.1.26.jar
> -rw-r--r--. 1 hadoop root     177131 Jan 16  2016 jetty-util-6.1.26.jar
> -rw-r--r--. 1 hadoop root     347382 Dec 19  2016 jetty-util-9.2.5.v20141112.jar
> -rw-r--r--. 1 hadoop root    1024680 Jan 16  2016 jsp-2.1-6.1.14.jar
> -rw-r--r--. 1 hadoop root      50493 Jan 16  2016 jsp-api-2.0.jar
> -rw-r--r--. 1 hadoop root     134910 Jan 16  2016 jsp-api-2.1-6.1.14.jar
> -rw-r--r--. 1 hadoop root     100636 Jan 16  2016 jsp-api-2.1.jar
> -rw-r--r--. 1 hadoop root      82123 Jan 16  2016 metrics-core-2.2.0.jar
> -rw-r--r--. 1 hadoop root     111908 Jan 16  2016 metrics-core-3.1.0.jar
> -rw-r--r--. 1 hadoop root    1199572 Jan 16  2016 netty-3.6.2.Final.jar
> -rw-r--r--. 1 hadoop root    2275047 Oct  3  2017 netty-all-4.0.52.Final.jar
> -rw-r--r--. 1 hadoop root      97693 Jan 16  2016 servlet-api-2.4.jar
> -rw-r--r--. 1 hadoop root     132368 Jan 16  2016 servlet-api-2.5-6.1.14.jar
> {code}
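A listing like the one above can be reduced mechanically to find artifacts shipped in more than one version. This is a rough sketch with a hypothetical helper; the version-stripping pattern is a heuristic and will miss some jar naming schemes:

```shell
#!/bin/sh
# Heuristic sketch: list artifact base names that appear in more than
# one version in a lib directory, by stripping "-<version>.jar"
# suffixes and reporting duplicated names. duplicate_artifacts is a
# hypothetical helper, not part of Hive or Sqoop.
duplicate_artifacts() {
    ls "$1"/*.jar 2>/dev/null \
        | sed 's|.*/||; s/-[0-9][0-9A-Za-z.]*\.jar$//' \
        | sort | uniq -d
}

# On a real node this would be invoked as:
#   duplicate_artifacts /opt/hive/lib
# and, for the listing above, would report names such as jetty-util.
```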
> Could these ambiguous Maven dependencies in Hive be the underlying problem? I suspect Derby is not the only artifact affected.
>  Should we remove the duplicates? But then there would be many places in the poms where a library such as Jetty has to be excluded, for example. 
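For context, excluding a library such as Jetty in a pom looks like the following config fragment. This is purely illustrative: the dependency shown is a placeholder, not an actual Hive pom entry, and each module pulling in Jetty transitively would need a similar exclusion.

```xml
<!-- Illustrative only: excluding a transitive Jetty artifact from a
     placeholder dependency. Real Hive poms would need this in many
     modules, which is the maintenance burden the question refers to. -->
<dependency>
  <groupId>org.example</groupId>
  <artifactId>some-hive-dependency</artifactId>
  <version>1.0</version>
  <exclusions>
    <exclusion>
      <groupId>org.mortbay.jetty</groupId>
      <artifactId>jetty</artifactId>
    </exclusion>
  </exclusions>
</dependency>
```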



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)