Posted to user@hive.apache.org by Mahesh Sankaran <sa...@gmail.com> on 2015/03/10 07:59:47 UTC

Hive 1.2.0 snapshot shows errors when inserting into table

Hi,
    I am working with a hive-1.2.0-snapshot configuration. I have configured the
metastore with MySQL. I can create a database and a table, but when I try to
insert data into the table, I am facing the following error.
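
For context, the statements leading up to the failure are roughly the following sketch (the database name is taken from the warehouse path in the log below; the single int column is only assumed from the values clause, so the actual DDL may differ):

hive> create database mahesh;
hive> use mahesh;
hive> create table test (id int);
hive> insert into table test values (1);

Creating the database and the table succeeds; only the insert fails, producing the output below.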


hive> insert into table test values (1);
Query ID = hadoop2_20150310122424_8c7b0c91-384e-46b4-9621-d2d335656d75
Total jobs = 3
Launching Job 1 out of 3
Number of reduce tasks is set to 0 since there's no reduce operator
Starting Job = job_1425963412408_0005, Tracking URL = http://nn01:8088/proxy/application_1425963412408_0005/
Kill Command = /opt/hadoop-2.4.1/bin/hadoop job  -kill job_1425963412408_0005
Hadoop job information for Stage-1: number of mappers: 1; number of reducers: 0
2015-03-10 12:25:17,150 Stage-1 map = 0%,  reduce = 0%
2015-03-10 12:26:05,359 Stage-1 map = 100%,  reduce = 0%, Cumulative CPU 2.21 sec
MapReduce Total cumulative CPU time: 2 seconds 210 msec
Ended Job = job_1425963412408_0005
Stage-4 is selected by condition resolver.
Stage-3 is filtered out by condition resolver.
Stage-5 is filtered out by condition resolver.
Moving data to: hdfs://hacluster/user/hive/warehouse1/mahesh.db/test/.hive-staging_hive_2015-03-10_12-24-53_478_6968131894660522035-1/-ext-10000
java.lang.NoSuchMethodError: org.apache.hadoop.hdfs.DFSClient.getKeyProvider()Lorg/apache/hadoop/crypto/key/KeyProvider;
at org.apache.hadoop.hive.shims.Hadoop23Shims$HdfsEncryptionShim.<init>(Hadoop23Shims.java:1155)
at org.apache.hadoop.hive.shims.Hadoop23Shims.createHdfsEncryptionShim(Hadoop23Shims.java:1282)
at org.apache.hadoop.hive.ql.session.SessionState.getHdfsEncryptionShim(SessionState.java:410)
at org.apache.hadoop.hive.ql.metadata.Hive.moveFile(Hive.java:2542)
at org.apache.hadoop.hive.ql.exec.MoveTask.moveFile(MoveTask.java:105)
at org.apache.hadoop.hive.ql.exec.MoveTask.execute(MoveTask.java:222)
at org.apache.hadoop.hive.ql.exec.Task.executeTask(Task.java:160)
at org.apache.hadoop.hive.ql.exec.TaskRunner.runSequential(TaskRunner.java:88)
at org.apache.hadoop.hive.ql.Driver.launchTask(Driver.java:1642)
at org.apache.hadoop.hive.ql.Driver.execute(Driver.java:1402)
at org.apache.hadoop.hive.ql.Driver.runInternal(Driver.java:1187)
at org.apache.hadoop.hive.ql.Driver.run(Driver.java:1053)
at org.apache.hadoop.hive.ql.Driver.run(Driver.java:1043)
at org.apache.hadoop.hive.cli.CliDriver.processLocalCmd(CliDriver.java:207)
at org.apache.hadoop.hive.cli.CliDriver.processCmd(CliDriver.java:159)
at org.apache.hadoop.hive.cli.CliDriver.processLine(CliDriver.java:370)
at org.apache.hadoop.hive.cli.CliDriver.executeDriver(CliDriver.java:754)
at org.apache.hadoop.hive.cli.CliDriver.run(CliDriver.java:675)
at org.apache.hadoop.hive.cli.CliDriver.main(CliDriver.java:615)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:606)
at org.apache.hadoop.util.RunJar.main(RunJar.java:212)
FAILED: Execution Error, return code -101 from org.apache.hadoop.hive.ql.exec.MoveTask. org.apache.hadoop.hdfs.DFSClient.getKeyProvider()Lorg/apache/hadoop/crypto/key/KeyProvider;
MapReduce Jobs Launched:
Stage-Stage-1: Map: 1   Cumulative CPU: 2.21 sec   HDFS Read: 3225 HDFS Write: 69 SUCCESS
Total MapReduce CPU Time Spent: 2 seconds 210 msec

My working environment is:
hadoop-2.4.1 with HA
hive-1.2.0 snapshot
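
For what it is worth, a NoSuchMethodError at this point usually indicates a mismatch between the Hadoop version this Hive snapshot was built against and the Hadoop jars actually on the runtime classpath. The classpath Hive loads can be inspected from the CLI (a diagnostic sketch, not a confirmed cause):

hive> set system:java.class.path;

The hadoop-hdfs jar version listed there shows which DFSClient class Hive is picking up.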

Kindly help me to solve this error.

Thanks
Mahesh.S

Re: Hive 1.2.0 snapshot shows errors when inserting into table

Posted by Amith sha <am...@gmail.com>.
I am seeing the same error while inserting into a table.



Thanks & Regards
Amithsha

