Posted to user@hive.apache.org by Gabriel Eisbruch <ga...@gmail.com> on 2013/08/23 01:12:59 UTC
[HIVE2] Error on insert into table dejavu2 partition (ds) select from
jdbc driver
Hi guys,
I am getting the following error when I run an insert query through the
HiveServer2 JDBC driver:
java.sql.SQLException: Error running query: null
at org.apache.hive.jdbc.Utils.verifySuccess(Utils.java:162)
at org.apache.hive.jdbc.Utils.verifySuccessWithInfo(Utils.java:150)
at org.apache.hive.jdbc.HiveStatement.execute(HiveStatement.java:196)
at org.apache.hive.jdbc.HiveStatement.executeUpdate(HiveStatement.java:275)
at com.ml.dejavu.hadoop.historymigrator.MigrationTask$2.call(MigrationTask.java:143)
Before running the query, my code sets the following parameters:
add jar hdfs://namenode.melidoop.com:8020/libs/MapUDFS-0.0.1-SNAPSHOT.jar
create temporary function void_map as "com.ml.VoidMap"
create temporary function rename_keys as "com.ml.RenameKeys"
create temporary function filter_map as "com.ml.FilterMap"
create temporary function keys_to_lower as "com.ml.KeysToLower"
create temporary function my_nvl_str_to_map as "com.ml.NvlStrToMap"
set hive.exec.dynamic.partition=true
set hive.exec.dynamic.partition.mode=nonstrict
set hive.exec.max.dynamic.partitions=10000
set hive.exec.max.dynamic.partitions.pernode=10000
set mapred.output.compress=true
set hive.exec.compress.output=true
set mapred.output.compression.codec=org.apache.hadoop.io.compress.GzipCodec
set io.compression.codecs=org.apache.hadoop.io.compress.GzipCodec
set io.seqfile.compression.type=BLOCK
set mapred.job.name=##JOB_NAME##
Could you help me with this error? I can't find anything similar.
The snippet I use to run the query is:
for (String subQuery : configureQuery.split("\n")) {
    log.debug("Config query [" + subQuery + "]");
    stmt.execute(subQuery);
}
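As an aside, splitting the setup script on "\n" alone will also submit blank or
whitespace-only lines as statements, which the driver may reject. A minimal
sketch of a more defensive split (the splitScript helper and class name are my
own, not part of any Hive API):

```java
import java.util.ArrayList;
import java.util.List;

public class ScriptSplitter {
    // Split a multi-line setup script into individual statements,
    // trimming whitespace and dropping empty lines so they are never
    // sent to the server as empty queries.
    public static List<String> splitScript(String script) {
        List<String> statements = new ArrayList<>();
        for (String line : script.split("\n")) {
            String trimmed = line.trim();
            if (!trimmed.isEmpty()) {
                statements.add(trimmed);
            }
        }
        return statements;
    }
}
```

Each entry of the returned list can then be passed to stmt.execute() exactly as
in the loop above.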
log.debug("Running Query ["+query+"]");
stmt = con.createStatement();
stmt.executeUpdate(query);
I can see that everything runs OK up to the executeUpdate call.
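Since the top-level message is just "Error running query: null", the
server-side detail may be hidden in the exception chain. One way to log
everything client-side is to walk the chained SQLExceptions (this is plain
JDBC, nothing Hive-specific; the helper name is my own):

```java
import java.sql.SQLException;
import java.util.ArrayList;
import java.util.List;

public class JdbcDiagnostics {
    // Collect the SQLState and message of every exception in a chained
    // SQLException, so the full detail (when present) reaches the logs
    // instead of only the top-level "Error running query: null".
    public static List<String> chainMessages(SQLException root) {
        List<String> messages = new ArrayList<>();
        for (SQLException e = root; e != null; e = e.getNextException()) {
            messages.add(e.getSQLState() + ": " + e.getMessage());
        }
        return messages;
    }
}
```

Wrapping the executeUpdate call in a try/catch and logging each entry from
chainMessages (plus checking the HiveServer2 logs directly) may reveal the
real cause.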
I am using hive-jdbc-0.10.0-cdh4.3.0.jar
Thanks,
Gabriel.