Posted to user@hive.apache.org by Ajit Kumar Shreevastava <Aj...@hcl.com> on 2012/10/03 09:32:43 UTC

hive query fail

Hi All,

I am using Oracle as a remote metastore for Hive.

Whenever I fire an insert or a plain select command, it runs successfully.
But when I run select count(1) from pokes; it fails.

[hadoop@NHCLT-PC44-2 ~]$ hive
Logging initialized using configuration in file:/home/hadoop/Hive/conf/hive-log4j.properties
Hive history file=/home/hadoop/tmp/hadoop/hive_job_log_hadoop_201210031257_2024792684.txt
hive> select count(1) from pokes;
Total MapReduce jobs = 1
Launching Job 1 out of 1
Number of reduce tasks determined at compile time: 1
In order to change the average load for a reducer (in bytes):
  set hive.exec.reducers.bytes.per.reducer=<number>
In order to limit the maximum number of reducers:
  set hive.exec.reducers.max=<number>
In order to set a constant number of reducers:
  set mapred.reduce.tasks=<number>
Starting Job = job_201210031252_0003, Tracking URL = http://NHCLT-PC44-2:50030/jobdetails.jsp?jobid=job_201210031252_0003
Kill Command = /home/hadoop/hadoop-1.0.3/bin/hadoop job  -kill job_201210031252_0003
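For reference, the three knobs listed in the output above are session-settable from the hive> prompt before running the query; the values below are purely illustrative, not recommendations:

```sql
-- Illustrative values only; issue these at the hive> prompt.
set hive.exec.reducers.bytes.per.reducer=1000000000;  -- average bytes per reducer
set hive.exec.reducers.max=999;                       -- upper bound on reducers
set mapred.reduce.tasks=1;                            -- force a fixed reducer count
select count(1) from pokes;
```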

[hadoop@NHCLT-PC44-2 hadoop]$ cat hive.log
2012-10-03 12:57:41,327 WARN  conf.Configuration (Configuration.java:loadResource(1245)) - file:/home/hadoop/Hive/conf/hive-site.xml:a attempt to override final parameter: mapred.job.tracker;  Ignoring.
2012-10-03 12:57:41,726 WARN  conf.Configuration (Configuration.java:loadResource(1245)) - file:/home/hadoop/Hive/conf/hive-site.xml:a attempt to override final parameter: mapred.job.tracker;  Ignoring.
2012-10-03 12:57:41,738 WARN  conf.Configuration (Configuration.java:loadResource(1245)) - file:/home/hadoop/Hive/conf/hive-site.xml:a attempt to override final parameter: mapred.job.tracker;  Ignoring.
2012-10-03 12:57:45,629 ERROR DataNucleus.Plugin (Log4JLogger.java:error(115)) - Bundle "org.eclipse.jdt.core" requires "org.eclipse.core.resources" but it cannot be resolved.
2012-10-03 12:57:45,629 ERROR DataNucleus.Plugin (Log4JLogger.java:error(115)) - Bundle "org.eclipse.jdt.core" requires "org.eclipse.core.resources" but it cannot be resolved.
2012-10-03 12:57:45,629 ERROR DataNucleus.Plugin (Log4JLogger.java:error(115)) - Bundle "org.eclipse.jdt.core" requires "org.eclipse.core.runtime" but it cannot be resolved.
2012-10-03 12:57:45,629 ERROR DataNucleus.Plugin (Log4JLogger.java:error(115)) - Bundle "org.eclipse.jdt.core" requires "org.eclipse.core.runtime" but it cannot be resolved.
2012-10-03 12:57:45,630 ERROR DataNucleus.Plugin (Log4JLogger.java:error(115)) - Bundle "org.eclipse.jdt.core" requires "org.eclipse.text" but it cannot be resolved.
2012-10-03 12:57:45,630 ERROR DataNucleus.Plugin (Log4JLogger.java:error(115)) - Bundle "org.eclipse.jdt.core" requires "org.eclipse.text" but it cannot be resolved.
2012-10-03 12:57:46,321 WARN  conf.Configuration (Configuration.java:loadResource(1245)) - file:/home/hadoop/Hive/conf/hive-site.xml:a attempt to override final parameter: mapred.job.tracker;  Ignoring.
2012-10-03 12:57:50,024 WARN  conf.Configuration (Configuration.java:loadResource(1245)) - file:/tmp/hive-default-4124574712561576117.xml:a attempt to override final parameter: fs.checkpoint.dir;  Ignoring.
2012-10-03 12:57:50,025 WARN  conf.Configuration (Configuration.java:loadResource(1245)) - file:/home/hadoop/Hive/conf/hive-site.xml:a attempt to override final parameter: mapred.job.tracker;  Ignoring.
2012-10-03 12:57:50,570 WARN  mapred.JobClient (JobClient.java:copyAndConfigureFiles(667)) - Use GenericOptionsParser for parsing the arguments. Applications should implement Tool for the same.
2012-10-03 12:57:50,748 WARN  snappy.LoadSnappy (LoadSnappy.java:<clinit>(46)) - Snappy native library not loaded
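The repeated "attempt to override final parameter: mapred.job.tracker; Ignoring" warnings mean some Hadoop configuration resource declares that property with <final>true</final>, so the value in hive-site.xml is silently ignored. A self-contained sketch of the mechanism (the /tmp/conf-demo path and file are made up for illustration; on a real node you would grep your actual conf directories):

```shell
# Minimal illustration of Hadoop "final" parameters: a property marked
# final in one config resource cannot be overridden by later resources.
# Create a tiny sample config and locate the final declaration.
mkdir -p /tmp/conf-demo
cat > /tmp/conf-demo/mapred-site.xml <<'EOF'
<configuration>
  <property>
    <name>mapred.job.tracker</name>
    <value>localhost:8021</value>
    <final>true</final>
  </property>
</configuration>
EOF
# List the file(s) that pin the property as final.
grep -l "<final>true</final>" /tmp/conf-demo/*.xml
# -> /tmp/conf-demo/mapred-site.xml
```

Running the same grep over /home/hadoop/hadoop-1.0.3/conf would show which file is winning over hive-site.xml.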

My hive-site.xml is:

[hadoop@NHCLT-PC44-2 conf]$ cat hive-site.xml
<?xml version="1.0"?>
<?xml-stylesheet type="text/xsl" href="configuration.xsl"?>
<!--
   Licensed to the Apache Software Foundation (ASF) under one or more
   contributor license agreements.  See the NOTICE file distributed with
   this work for additional information regarding copyright ownership.
   The ASF licenses this file to You under the Apache License, Version 2.0
   (the "License"); you may not use this file except in compliance with
   the License.  You may obtain a copy of the License at

       http://www.apache.org/licenses/LICENSE-2.0

   Unless required by applicable law or agreed to in writing, software
   distributed under the License is distributed on an "AS IS" BASIS,
   WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
   See the License for the specific language governing permissions and
   limitations under the License.
-->

<configuration>
  <property>
    <name>hive.exec.scratchdir</name>
    <value>/home/hadoop/tmp/hive-${user.name}</value>
    <description>Scratch space for Hive jobs</description>
  </property>
  <property>
    <name>hive.metastore.warehouse.dir</name>
    <value>/home/hadoop/user/hive/warehouse</value>
    <description>Location of the default database for the warehouse</description>
  </property>
  <property>
    <name>hive.querylog.location</name>
    <value>/home/hadoop/tmp/${user.name}</value>
    <description>Directory in which the session log is created</description>
  </property>
  <!-- Hive configuration variables used to interact with Hadoop -->
  <property>
    <name>hadoop.bin.path</name>
    <value>/home/hadoop/hadoop-1.0.3/bin/hadoop</value>
    <description>The location of the hadoop script used to submit jobs when submitting through a separate JVM</description>
  </property>
  <property>
    <name>mapred.job.tracker</name>
    <value>hdfs://localhost:8021/</value>
    <description>Datanode1</description>
  </property>
  <property>
    <name>hadoop.config.dir</name>
    <value>/home/hadoop/hadoop-1.0.3/conf</value>
    <description>The location of the configuration directory of the Hadoop installation</description>
  </property>
  <property>
    <name>javax.jdo.option.ConnectionURL</name>
    <value>jdbc:oracle:thin:@10.99.42.11:1521:clouddb</value>
  </property>
  <property>
    <name>javax.jdo.option.ConnectionDriverName</name>
    <value>oracle.jdbc.driver.OracleDriver</value>
  </property>
  <property>
    <name>javax.jdo.option.ConnectionUserName</name>
    <value>hiveuser</value>
  </property>
  <property>
    <name>javax.jdo.option.ConnectionPassword</name>
    <value>hiveuser</value>
  </property>
</configuration>
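Since the metastore is a remote Oracle database, one quick sanity check is whether the host/port in the ConnectionURL is reachable from the Hive node at all. A minimal sketch (host and port are the ones in the config above; check_port is a throwaway helper defined here, not a Hive or Hadoop tool):

```shell
# Quick TCP reachability check for the metastore database.
check_port() {
  # Succeeds if a TCP connection to $1:$2 opens within 3 seconds.
  timeout 3 bash -c "echo > /dev/tcp/$1/$2" 2>/dev/null
}
if check_port 10.99.42.11 1521; then
  echo "metastore port reachable"
else
  echo "metastore port NOT reachable"
fi
```

If the port is unreachable only from some nodes, that points at a network or proxy issue rather than a Hive misconfiguration.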

Thanks and Regards
Ajit Kumar Shreevastava
ADCOE (App Development Center Of Excellence )
Mobile: 9717775634



::DISCLAIMER::
----------------------------------------------------------------------------------------------------------------------------------------------------

The contents of this e-mail and any attachment(s) are confidential and intended for the named recipient(s) only.
E-mail transmission is not guaranteed to be secure or error-free as information could be intercepted, corrupted,
lost, destroyed, arrive late or incomplete, or may contain viruses in transmission. The e mail and its contents
(with or without referred errors) shall therefore not attach any liability on the originator or HCL or its affiliates.
Views or opinions, if any, presented in this email are solely those of the author and may not necessarily reflect the
views or opinions of HCL or its affiliates. Any form of reproduction, dissemination, copying, disclosure, modification,
distribution and / or publication of this message without the prior written consent of authorized representative of
HCL is strictly prohibited. If you have received this email in error please delete it and notify the sender immediately.
Before opening any email and/or attachments, please check them for viruses and other defects.

----------------------------------------------------------------------------------------------------------------------------------------------------

RE: hive query fail

Posted by Ajit Kumar Shreevastava <Aj...@hcl.com>.
Hi All,

I am using Oracle as a remote metastore for Hive.

When I run select count(1) from pokes; it fails.

[hadoop@NHCLT-PC44-2 ~]$ hive
Logging initialized using configuration in file:/home/hadoop/Hive/conf/hive-log4j.properties
Hive history file=/home/hadoop/tmp/hadoop/hive_job_log_hadoop_201210031257_2024792684.txt
hive> select count(1) from pokes;
Total MapReduce jobs = 1
Launching Job 1 out of 1
Number of reduce tasks determined at compile time: 1
In order to change the average load for a reducer (in bytes):
  set hive.exec.reducers.bytes.per.reducer=<number>
In order to limit the maximum number of reducers:
  set hive.exec.reducers.max=<number>
In order to set a constant number of reducers:
  set mapred.reduce.tasks=<number>
Starting Job = job_201210031252_0003, Tracking URL = http://NHCLT-PC44-2:50030/jobdetails.jsp?jobid=job_201210031252_0003
Kill Command = /home/hadoop/hadoop-1.0.3/bin/hadoop job  -kill job_201210031252_0003

[hadoop@NHCLT-PC44-2 hadoop]$ cat hive.log
2012-10-03 17:20:31,965 ERROR DataNucleus.Plugin (Log4JLogger.java:error(115)) - Bundle "org.eclipse.jdt.core" requires "org.eclipse.core.resources" but it cannot be resolved.
2012-10-03 17:20:31,965 ERROR DataNucleus.Plugin (Log4JLogger.java:error(115)) - Bundle "org.eclipse.jdt.core" requires "org.eclipse.core.resources" but it cannot be resolved.
2012-10-03 17:20:31,967 ERROR DataNucleus.Plugin (Log4JLogger.java:error(115)) - Bundle "org.eclipse.jdt.core" requires "org.eclipse.core.runtime" but it cannot be resolved.
2012-10-03 17:20:31,967 ERROR DataNucleus.Plugin (Log4JLogger.java:error(115)) - Bundle "org.eclipse.jdt.core" requires "org.eclipse.core.runtime" but it cannot be resolved.
2012-10-03 17:20:31,967 ERROR DataNucleus.Plugin (Log4JLogger.java:error(115)) - Bundle "org.eclipse.jdt.core" requires "org.eclipse.text" but it cannot be resolved.
2012-10-03 17:20:31,967 ERROR DataNucleus.Plugin (Log4JLogger.java:error(115)) - Bundle "org.eclipse.jdt.core" requires "org.eclipse.text" but it cannot be resolved.
2012-10-03 17:20:36,732 WARN  mapred.JobClient (JobClient.java:copyAndConfigureFiles(667)) - Use GenericOptionsParser for parsing the arguments. Applications should implement Tool for the same.
2012-10-03 17:20:36,856 WARN  snappy.LoadSnappy (LoadSnappy.java:<clinit>(46)) - Snappy native library not loaded


My hive-site.xml is:

[hadoop@NHCLT-PC44-2 conf]$ cat hive-site.xml
<?xml version="1.0"?>
<?xml-stylesheet type="text/xsl" href="configuration.xsl"?>
<!--
   Licensed to the Apache Software Foundation (ASF) under one or more
   contributor license agreements.  See the NOTICE file distributed with
   this work for additional information regarding copyright ownership.
   The ASF licenses this file to You under the Apache License, Version 2.0
   (the "License"); you may not use this file except in compliance with
   the License.  You may obtain a copy of the License at

       http://www.apache.org/licenses/LICENSE-2.0

   Unless required by applicable law or agreed to in writing, software
   distributed under the License is distributed on an "AS IS" BASIS,
   WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
   See the License for the specific language governing permissions and
   limitations under the License.
-->

<configuration>
  <property>
    <name>hive.exec.scratchdir</name>
    <value>/home/hadoop/tmp/hive-${user.name}</value>
    <description>Scratch space for Hive jobs</description>
  </property>
  <property>
    <name>hive.metastore.warehouse.dir</name>
    <value>/home/hadoop/user/hive/warehouse</value>
    <description>Location of the default database for the warehouse</description>
  </property>
  <property>
    <name>hive.querylog.location</name>
    <value>/home/hadoop/tmp/${user.name}</value>
    <description>Directory in which the session log is created</description>
  </property>
  <!-- Hive configuration variables used to interact with Hadoop -->
  <property>
    <name>hadoop.bin.path</name>
    <value>/home/hadoop/hadoop-1.0.3/bin/hadoop</value>
    <description>The location of the hadoop script used to submit jobs when submitting through a separate JVM</description>
  </property>
  <property>
    <name>mapred.job.tracker</name>
    <value>hdfs://localhost:8021/</value>
    <description>Datanode1</description>
  </property>
  <property>
    <name>hadoop.config.dir</name>
    <value>/home/hadoop/hadoop-1.0.3/conf</value>
    <description>The location of the configuration directory of the Hadoop installation</description>
  </property>
  <property>
    <name>javax.jdo.option.ConnectionURL</name>
    <value>jdbc:oracle:thin:@10.99.42.11:1521:clouddb</value>
  </property>
  <property>
    <name>javax.jdo.option.ConnectionDriverName</name>
    <value>oracle.jdbc.driver.OracleDriver</value>
  </property>
  <property>
    <name>javax.jdo.option.ConnectionUserName</name>
    <value>hiveuser</value>
  </property>
  <property>
    <name>javax.jdo.option.ConnectionPassword</name>
    <value>hiveuser</value>
  </property>
</configuration>

Thanks and Regards
Ajit Kumar Shreevastava
ADCOE (App Development Center Of Excellence )
Mobile: 9717775634




Please do not print this email unless it is absolutely necessary.

The information contained in this electronic message and any attachments to this message are intended for the exclusive use of the addressee(s) and may contain proprietary, confidential or privileged information. If you are not the intended recipient, you should not disseminate, distribute or copy this e-mail. Please notify the sender immediately and destroy all copies of this message and any attachments.

WARNING: Computer viruses can be transmitted via email. The recipient should check this email and any attachments for the presence of viruses. The company accepts no liability for any damage caused by any virus transmitted by this email.

www.wipro.com

RE: hive query fail

Posted by yo...@wipro.com.
Hi Ajit,

As I suggested before, a plain select command does not initiate a MapReduce job; it just dumps the data, which is why your plain selects succeed while count(1) fails.
Check the network proxy settings on your node, and try adding a proxy bypass for that address.
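A proxy bypass for the cluster hosts might look like the sketch below (hostnames are the ones appearing in this thread; adjust to your environment). Note that shell tools honour no_proxy, while JVM processes such as the Hadoop daemons and the Hive CLI use the http.nonProxyHosts system property instead:

```shell
# Sketch: exclude the local cluster hosts from any HTTP proxy.
# For shell tools (curl, wget, ...):
export no_proxy="localhost,127.0.0.1,NHCLT-PC44-2"
# For JVM processes launched via the hadoop script:
export HADOOP_OPTS="$HADOOP_OPTS -Dhttp.nonProxyHosts=localhost|127.0.0.1|NHCLT-PC44-2"
echo "$no_proxy"
```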



Regards

Yogesh Kumar Dhari

________________________________
From: Ajit Kumar Shreevastava [Ajit.Shreevastava@hcl.com]
Sent: Wednesday, October 03, 2012 1:02 PM
To: user@hive.apache.org
Subject: hive query fail
