Posted to issues@hive.apache.org by "Jesus Camacho Rodriguez (Jira)" <ji...@apache.org> on 2020/01/28 17:29:00 UTC

[jira] [Resolved] (HIVE-19502) Unable to insert values into table stored by JdbcStorageHandler

     [ https://issues.apache.org/jira/browse/HIVE-19502?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]

Jesus Camacho Rodriguez resolved HIVE-19502.
--------------------------------------------
    Resolution: Duplicate

> Unable to insert values into table stored by JdbcStorageHandler
> ---------------------------------------------------------------
>
>                 Key: HIVE-19502
>                 URL: https://issues.apache.org/jira/browse/HIVE-19502
>             Project: Hive
>          Issue Type: Bug
>          Components: StorageHandler
>    Affects Versions: 2.3.3
>            Reporter: Alexey Vakulenchuk
>            Priority: Blocker
>
> *General Info*
> Hive version: 2.3.3
> {code:java}
> commit 3f7dde31aed44b5440563d3f9d8a8887beccf0be
> Author: Daniel Dai <da...@hortonworks.com>
> Date:   Wed Mar 28 16:46:29 2018 -0700
>     Preparing for 2.3.3 release
> {code}
> Hadoop version: 2.7.2.
> Execution engine:
> {code:java}
> hive> set hive.execution.engine;
> hive.execution.engine=mr
> {code}
> *Step 1. Create table in mysql*
> {code:java}
> mysql> CREATE TABLE books (book_id INT, book_name VARCHAR(100), author_name  VARCHAR(100), book_isbn  VARCHAR(100));
> {code}
> *Step 2. Create table in hive*
> {code:java}
> CREATE EXTERNAL TABLE books (
> book_id INT, 
> book_name STRING, 
> author_name  STRING, 
> book_isbn  STRING
> ) STORED BY "org.apache.hive.storage.jdbc.JdbcStorageHandler" 
> TBLPROPERTIES ( 
> "hive.sql.database.type" = "MYSQL",
> "hive.sql.jdbc.driver" = "com.mysql.jdbc.Driver", 
> "hive.sql.jdbc.url" = "jdbc:mysql://node1:3306/mysql?user=root&password=123456", 
> "hive.sql.query" = "SELECT book_id, book_name, author_name, book_isbn FROM books",
> "hive.sql.column.mapping" = "book_id=book_id, book_name=book_name, author_name=author_name, book_isbn=book_isbn",
> "hive.sql.jdbc.input.table.name" = "books"
> );
> {code}
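> Reads through this definition work (see also the MySQL-side note after the error traces below); a minimal sanity-check sketch:
> {code:java}
> -- Sanity-check sketch: reads through the JDBC-backed table succeed, so the
> -- driver, URL, and column mapping above are fine; only writes fail (below).
> SELECT book_id, book_name, author_name, book_isbn FROM books;
> {code}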
> *Step 3. Insert values into hive table*
> {code:java}
> insert into books values (1,'holybible','Jesus', '01');
> {code}
> *Actual result:*
> {code:java}
> Launching Job 1 out of 1
> Number of reduce tasks is set to 0 since there's no reduce operator
> Starting Job = job_1526038512481_0002, Tracking URL = http://c74apache.com:8088/proxy/application_1526038512481_0002/
> Kill Command = /opt/hadoop/bin/hadoop job  -kill job_1526038512481_0002
> Hadoop job information for Stage-3: number of mappers: 1; number of reducers: 0
> 2018-05-11 07:40:27,312 Stage-3 map = 0%,  reduce = 0%
> 2018-05-11 07:40:40,947 Stage-3 map = 100%,  reduce = 0%
> Ended Job = job_1526038512481_0002 with errors
> Error during job, obtaining debugging information...
> Examining task ID: task_1526038512481_0002_m_000000 (and more) from job job_1526038512481_0002
> Task with the most failures(4): 
> -----
> Task ID:
>   task_1526038512481_0002_m_000000
> URL:
>   http://0.0.0.0:8088/taskdetails.jsp?jobid=job_1526038512481_0002&tipid=task_1526038512481_0002_m_000000
> -----
> Diagnostic Messages for this Task:
> Error: java.lang.RuntimeException: Failed to load plan: hdfs://localhost:9000/tmp/hive/hadoop/b943d5b2-2de9-424d-b7bb-6d9ccb1e6465/hive_2018-05-11_07-40-19_643_183408830372672971-1/-mr-10002/a1bd8dbb-0970-41bc-9e95-d5fd2aeea47c/map.xml
>     at org.apache.hadoop.hive.ql.exec.Utilities.getBaseWork(Utilities.java:481)
>     at org.apache.hadoop.hive.ql.exec.Utilities.getMapWork(Utilities.java:313)
>     at org.apache.hadoop.hive.ql.io.HiveInputFormat.init(HiveInputFormat.java:394)
>     at org.apache.hadoop.hive.ql.io.HiveInputFormat.pushProjectionsAndFilters(HiveInputFormat.java:665)
>     at org.apache.hadoop.hive.ql.io.HiveInputFormat.pushProjectionsAndFilters(HiveInputFormat.java:658)
>     at org.apache.hadoop.hive.ql.io.CombineHiveInputFormat.getRecordReader(CombineHiveInputFormat.java:692)
>     at org.apache.hadoop.mapred.MapTask$TrackedRecordReader.<init>(MapTask.java:169)
>     at org.apache.hadoop.mapred.MapTask.runOldMapper(MapTask.java:432)
>     at org.apache.hadoop.mapred.MapTask.run(MapTask.java:343)
>     at org.apache.hadoop.mapred.YarnChild$2.run(YarnChild.java:164)
>     at java.security.AccessController.doPrivileged(Native Method)
>     at javax.security.auth.Subject.doAs(Subject.java:422)
>     at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1657)
>     at org.apache.hadoop.mapred.YarnChild.main(YarnChild.java:158)
> Caused by: org.apache.hive.com.esotericsoftware.kryo.KryoException: Unable to find class: org.apache.hive.storage.jdbc.JdbcInputFormat
> Serialization trace:
> inputFileFormatClass (org.apache.hadoop.hive.ql.plan.TableDesc)
> tableInfo (org.apache.hadoop.hive.ql.plan.FileSinkDesc)
> conf (org.apache.hadoop.hive.ql.exec.FileSinkOperator)
> childOperators (org.apache.hadoop.hive.ql.exec.SelectOperator)
> childOperators (org.apache.hadoop.hive.ql.exec.TableScanOperator)
> aliasToWork (org.apache.hadoop.hive.ql.plan.MapWork)
>     at org.apache.hive.com.esotericsoftware.kryo.util.DefaultClassResolver.readName(DefaultClassResolver.java:156)
>     at org.apache.hive.com.esotericsoftware.kryo.util.DefaultClassResolver.readClass(DefaultClassResolver.java:133)
>     at org.apache.hive.com.esotericsoftware.kryo.Kryo.readClass(Kryo.java:670)
>     at org.apache.hadoop.hive.ql.exec.SerializationUtilities$KryoWithHooks.readClass(SerializationUtilities.java:181)
>     at org.apache.hive.com.esotericsoftware.kryo.serializers.DefaultSerializers$ClassSerializer.read(DefaultSerializers.java:326)
>     at org.apache.hive.com.esotericsoftware.kryo.serializers.DefaultSerializers$ClassSerializer.read(DefaultSerializers.java:314)
>     at org.apache.hive.com.esotericsoftware.kryo.Kryo.readObjectOrNull(Kryo.java:759)
>     at org.apache.hadoop.hive.ql.exec.SerializationUtilities$KryoWithHooks.readObjectOrNull(SerializationUtilities.java:199)
>     at org.apache.hive.com.esotericsoftware.kryo.serializers.ObjectField.read(ObjectField.java:132)
>     at org.apache.hive.com.esotericsoftware.kryo.serializers.FieldSerializer.read(FieldSerializer.java:551)
>     at org.apache.hive.com.esotericsoftware.kryo.Kryo.readObject(Kryo.java:708)
>     at org.apache.hadoop.hive.ql.exec.SerializationUtilities$KryoWithHooks.readObject(SerializationUtilities.java:214)
>     at org.apache.hive.com.esotericsoftware.kryo.serializers.ObjectField.read(ObjectField.java:125)
>     at org.apache.hive.com.esotericsoftware.kryo.serializers.FieldSerializer.read(FieldSerializer.java:551)
>     at org.apache.hive.com.esotericsoftware.kryo.Kryo.readObject(Kryo.java:708)
>     at org.apache.hadoop.hive.ql.exec.SerializationUtilities$KryoWithHooks.readObject(SerializationUtilities.java:214)
>     at org.apache.hive.com.esotericsoftware.kryo.serializers.ObjectField.read(ObjectField.java:125)
>     at org.apache.hive.com.esotericsoftware.kryo.serializers.FieldSerializer.read(FieldSerializer.java:551)
>     at org.apache.hive.com.esotericsoftware.kryo.Kryo.readClassAndObject(Kryo.java:790)
>     at org.apache.hadoop.hive.ql.exec.SerializationUtilities$KryoWithHooks.readClassAndObject(SerializationUtilities.java:176)
>     at org.apache.hive.com.esotericsoftware.kryo.serializers.CollectionSerializer.read(CollectionSerializer.java:134)
>     at org.apache.hive.com.esotericsoftware.kryo.serializers.CollectionSerializer.read(CollectionSerializer.java:40)
>     at org.apache.hive.com.esotericsoftware.kryo.Kryo.readObject(Kryo.java:708)
>     at org.apache.hadoop.hive.ql.exec.SerializationUtilities$KryoWithHooks.readObject(SerializationUtilities.java:214)
>     at org.apache.hive.com.esotericsoftware.kryo.serializers.ObjectField.read(ObjectField.java:125)
>     at org.apache.hive.com.esotericsoftware.kryo.serializers.FieldSerializer.read(FieldSerializer.java:551)
>     at org.apache.hive.com.esotericsoftware.kryo.Kryo.readClassAndObject(Kryo.java:790)
>     at org.apache.hadoop.hive.ql.exec.SerializationUtilities$KryoWithHooks.readClassAndObject(SerializationUtilities.java:176)
>     at org.apache.hive.com.esotericsoftware.kryo.serializers.CollectionSerializer.read(CollectionSerializer.java:134)
>     at org.apache.hive.com.esotericsoftware.kryo.serializers.CollectionSerializer.read(CollectionSerializer.java:40)
>     at org.apache.hive.com.esotericsoftware.kryo.Kryo.readObject(Kryo.java:708)
>     at org.apache.hadoop.hive.ql.exec.SerializationUtilities$KryoWithHooks.readObject(SerializationUtilities.java:214)
>     at org.apache.hive.com.esotericsoftware.kryo.serializers.ObjectField.read(ObjectField.java:125)
>     at org.apache.hive.com.esotericsoftware.kryo.serializers.FieldSerializer.read(FieldSerializer.java:551)
>     at org.apache.hive.com.esotericsoftware.kryo.Kryo.readClassAndObject(Kryo.java:790)
>     at org.apache.hadoop.hive.ql.exec.SerializationUtilities$KryoWithHooks.readClassAndObject(SerializationUtilities.java:176)
>     at org.apache.hive.com.esotericsoftware.kryo.serializers.MapSerializer.read(MapSerializer.java:161)
>     at org.apache.hive.com.esotericsoftware.kryo.serializers.MapSerializer.read(MapSerializer.java:39)
>     at org.apache.hive.com.esotericsoftware.kryo.Kryo.readObject(Kryo.java:708)
>     at org.apache.hadoop.hive.ql.exec.SerializationUtilities$KryoWithHooks.readObject(SerializationUtilities.java:214)
>     at org.apache.hive.com.esotericsoftware.kryo.serializers.ObjectField.read(ObjectField.java:125)
>     at org.apache.hive.com.esotericsoftware.kryo.serializers.FieldSerializer.read(FieldSerializer.java:551)
>     at org.apache.hive.com.esotericsoftware.kryo.Kryo.readObject(Kryo.java:686)
>     at org.apache.hadoop.hive.ql.exec.SerializationUtilities$KryoWithHooks.readObject(SerializationUtilities.java:206)
>     at org.apache.hadoop.hive.ql.exec.SerializationUtilities.deserializeObjectByKryo(SerializationUtilities.java:607)
>     at org.apache.hadoop.hive.ql.exec.SerializationUtilities.deserializePlan(SerializationUtilities.java:494)
>     at org.apache.hadoop.hive.ql.exec.SerializationUtilities.deserializePlan(SerializationUtilities.java:471)
>     at org.apache.hadoop.hive.ql.exec.Utilities.getBaseWork(Utilities.java:440)
>     ... 13 more
> Caused by: java.lang.ClassNotFoundException: org.apache.hive.storage.jdbc.JdbcInputFormat
>     at java.net.URLClassLoader.findClass(URLClassLoader.java:381)
>     at java.lang.ClassLoader.loadClass(ClassLoader.java:424)
>     at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:338)
>     at java.lang.ClassLoader.loadClass(ClassLoader.java:357)
>     at java.lang.Class.forName0(Native Method)
>     at java.lang.Class.forName(Class.java:348)
>     at org.apache.hive.com.esotericsoftware.kryo.util.DefaultClassResolver.readName(DefaultClassResolver.java:154)
>     ... 60 more
> FAILED: Execution Error, return code 2 from org.apache.hadoop.hive.ql.exec.mr.MapRedTask
> MapReduce Jobs Launched: 
> Stage-Stage-3: Map: 1   HDFS Read: 0 HDFS Write: 0 FAIL
> Total MapReduce CPU Time Spent: 0 msec
> {code}
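> The failure in this trace is a ClassNotFoundException for org.apache.hive.storage.jdbc.JdbcInputFormat inside the map task, i.e. the handler jar is not on the task classpath. Besides the per-session ADD JAR attempt shown further below, the other common route is hive.aux.jars.path (a sketch under the assumption that the jar paths below are correct; this property usually has to be set before the CLI/HiveServer2 starts):
> {code:java}
> -- Sketch (assumption, not verified here): ship the handler and the MySQL
> -- driver to the MR tasks via the auxiliary jars path.
> SET hive.aux.jars.path=file:///opt/apache/hive/hive-2.3.3/lib/hive-jdbc-handler-2.3.3.jar,file:///home/hadoop/jars/mysql-connector-java-5.1.46/mysql-connector-java-5.1.46.jar;
> {code}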
> ===========================
> BTW, if we insert data directly into the table on the MySQL side, the insert succeeds and the row is visible through both tables. For example (reusing the values from the failed Hive INSERT above):
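> {code:java}
> -- Sketch: the same row that failed via Hive, inserted on the MySQL side;
> -- it then shows up both here and through the Hive external table.
> mysql> INSERT INTO books VALUES (1, 'holybible', 'Jesus', '01');
> {code}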
> We also tried adding the required jars per session in Hive:
> {code:java}
> hive> add jar /opt/apache/hive/hive-2.3.3/lib/hive-jdbc-handler-2.3.3.jar;
> hive> add jar /opt/apache/hive/hive-2.3.3/lib/commons-dbcp-1.4.jar;
> hive> add jar /opt/apache/hive/hive-2.3.3/lib/commons-pool-1.5.4.jar;
> hive> add jar /home/hadoop/jars/mysql-connector-java-5.1.46/mysql-connector-java-5.1.46.jar;
> {code}
> This produced a different error:
> {code:java}
> Diagnostic Messages for this Task:
> Error: java.lang.RuntimeException: org.apache.hadoop.hive.ql.metadata.HiveException: Hive Runtime Error while processing row {"tmp_values_col1":"1","tmp_values_col2":"holybible","tmp_values_col3":"Jesus","tmp_values_col4":"01"}
>     at org.apache.hadoop.hive.ql.exec.mr.ExecMapper.map(ExecMapper.java:169)
>     at org.apache.hadoop.mapred.MapRunner.run(MapRunner.java:54)
>     at org.apache.hadoop.mapred.MapTask.runOldMapper(MapTask.java:453)
>     at org.apache.hadoop.mapred.MapTask.run(MapTask.java:343)
>     at org.apache.hadoop.mapred.YarnChild$2.run(YarnChild.java:164)
>     at java.security.AccessController.doPrivileged(Native Method)
>     at javax.security.auth.Subject.doAs(Subject.java:422)
>     at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1657)
>     at org.apache.hadoop.mapred.YarnChild.main(YarnChild.java:158)
> Caused by: org.apache.hadoop.hive.ql.metadata.HiveException: Hive Runtime Error while processing row {"tmp_values_col1":"1","tmp_values_col2":"holybible","tmp_values_col3":"Jesus","tmp_values_col4":"01"}
>     at org.apache.hadoop.hive.ql.exec.MapOperator.process(MapOperator.java:562)
>     at org.apache.hadoop.hive.ql.exec.mr.ExecMapper.map(ExecMapper.java:160)
>     ... 8 more
> Caused by: org.apache.hadoop.hive.ql.metadata.HiveException: org.apache.hadoop.hive.ql.metadata.HiveException: java.lang.UnsupportedOperationException: Write operations are not allowed.
>     at org.apache.hadoop.hive.ql.exec.FileSinkOperator.createBucketFiles(FileSinkOperator.java:574)
>     at org.apache.hadoop.hive.ql.exec.FileSinkOperator.process(FileSinkOperator.java:674)
>     at org.apache.hadoop.hive.ql.exec.Operator.forward(Operator.java:897)
>     at org.apache.hadoop.hive.ql.exec.SelectOperator.process(SelectOperator.java:95)
>     at org.apache.hadoop.hive.ql.exec.Operator.forward(Operator.java:897)
>     at org.apache.hadoop.hive.ql.exec.TableScanOperator.process(TableScanOperator.java:130)
>     at org.apache.hadoop.hive.ql.exec.MapOperator$MapOpCtx.forward(MapOperator.java:148)
>     at org.apache.hadoop.hive.ql.exec.MapOperator.process(MapOperator.java:547)
>     ... 9 more
> Caused by: org.apache.hadoop.hive.ql.metadata.HiveException: java.lang.UnsupportedOperationException: Write operations are not allowed.
>     at org.apache.hadoop.hive.ql.io.HiveFileFormatUtils.getHiveRecordWriter(HiveFileFormatUtils.java:274)
>     at org.apache.hadoop.hive.ql.exec.FileSinkOperator.createBucketForFileIdx(FileSinkOperator.java:619)
>     at org.apache.hadoop.hive.ql.exec.FileSinkOperator.createBucketFiles(FileSinkOperator.java:563)
>     ... 16 more
> Caused by: java.lang.UnsupportedOperationException: Write operations are not allowed.
>     at org.apache.hive.storage.jdbc.JdbcOutputFormat.getHiveRecordWriter(JdbcOutputFormat.java:44)
>     at org.apache.hadoop.hive.ql.io.HiveFileFormatUtils.getRecordWriter(HiveFileFormatUtils.java:286)
>     at org.apache.hadoop.hive.ql.io.HiveFileFormatUtils.getHiveRecordWriter(HiveFileFormatUtils.java:271)
>     ... 18 more
> FAILED: Execution Error, return code 2 from org.apache.hadoop.hive.ql.exec.mr.MapRedTask
> {code}
> Note that this trace ends in org.apache.hive.storage.jdbc.JdbcOutputFormat.getHiveRecordWriter (JdbcOutputFormat.java:44), which rejects the write outright: the jars are now found, but the handler itself refuses write operations. So, is it possible at all to insert values into a table stored by the JdbcStorageHandler via Hive?


