Posted to user@hadoop.apache.org by Alexander Pivovarov <ap...@gmail.com> on 2015/05/15 23:26:28 UTC
query uses WITH blocks and throws exception if run as Oozie hive
action (hive-0.13.1)
Hi Everyone
I'm using hive-0.13.1 (HDP-2.1.5) and getting the following stacktrace when I
run my query (which has a WITH block) via Oozie. (BTW, the query works fine
in the CLI.)
I can't post the exact query, but the structure is similar to this:
create table my_consumer
as
with sacusaloan as (select distinct e,f,g from E)
select A.a, A.b, A.c,
if(sacusaloan.id is null, 0, 1) as sacusaloan_status
from (select a,b,c from A) A
left join sacusaloan on (...)
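
For anyone hitting the same error on 0.13.1, one workaround is to inline the WITH block as a derived table, so the semantic analyzer never tries to look up the CTE name in the metastore. This is a sketch against the hypothetical query above; the column names and the elided join condition are placeholders from that example, not a real schema:

```sql
-- Same hypothetical query with the WITH block inlined as a derived table,
-- so "sacusaloan" is resolved as a subquery alias instead of a metastore table.
create table my_consumer
as
select A.a, A.b, A.c,
       if(sacusaloan.id is null, 0, 1) as sacusaloan_status
from (select a,b,c from A) A
left join (select distinct e,f,g from E) sacusaloan
  on (...)
```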
8799 [main] INFO hive.ql.parse.ParseDriver - Parse Completed
8799 [main] INFO org.apache.hadoop.hive.ql.log.PerfLogger -
</PERFLOG method=parse start=1431723485500 end=1431723485602
duration=102 from=org.apache.hadoop.hive.ql.Driver>
8799 [main] INFO org.apache.hadoop.hive.ql.log.PerfLogger - <PERFLOG
method=semanticAnalyze from=org.apache.hadoop.hive.ql.Driver>
8834 [main] INFO org.apache.hadoop.hive.ql.parse.SemanticAnalyzer -
Starting Semantic Analysis
8837 [main] INFO org.apache.hadoop.hive.ql.parse.SemanticAnalyzer -
Creating table wk_qualified_outsource_loan_consumer position=13
8861 [main] INFO org.apache.hadoop.hive.ql.parse.SemanticAnalyzer -
Completed phase 1 of Semantic Analysis
8861 [main] INFO org.apache.hadoop.hive.ql.parse.SemanticAnalyzer -
Get metadata for source tables
8865 [main] ERROR hive.ql.metadata.Hive -
NoSuchObjectException(message:default.sacusaloan table not found)
at org.apache.hadoop.hive.metastore.api.ThriftHiveMetastore$get_table_result$get_table_resultStandardScheme.read(ThriftHiveMetastore.java:29338)
at org.apache.hadoop.hive.metastore.api.ThriftHiveMetastore$get_table_result$get_table_resultStandardScheme.read(ThriftHiveMetastore.java:29306)
at org.apache.hadoop.hive.metastore.api.ThriftHiveMetastore$get_table_result.read(ThriftHiveMetastore.java:29237)
at org.apache.thrift.TServiceClient.receiveBase(TServiceClient.java:78)
at org.apache.hadoop.hive.metastore.api.ThriftHiveMetastore$Client.recv_get_table(ThriftHiveMetastore.java:1036)
at org.apache.hadoop.hive.metastore.api.ThriftHiveMetastore$Client.get_table(ThriftHiveMetastore.java:1022)
at org.apache.hadoop.hive.metastore.HiveMetaStoreClient.getTable(HiveMetaStoreClient.java:997)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:606)
at org.apache.hadoop.hive.metastore.RetryingMetaStoreClient.invoke(RetryingMetaStoreClient.java:89)
at com.sun.proxy.$Proxy18.getTable(Unknown Source)
at org.apache.hadoop.hive.ql.metadata.Hive.getTable(Hive.java:976)
at org.apache.hadoop.hive.ql.metadata.Hive.getTable(Hive.java:918)
at org.apache.hadoop.hive.ql.parse.SemanticAnalyzer.getMetaData(SemanticAnalyzer.java:1263)
at org.apache.hadoop.hive.ql.parse.SemanticAnalyzer.getMetaData(SemanticAnalyzer.java:1232)
at org.apache.hadoop.hive.ql.parse.SemanticAnalyzer.analyzeInternal(SemanticAnalyzer.java:9252)
at org.apache.hadoop.hive.ql.parse.BaseSemanticAnalyzer.analyze(BaseSemanticAnalyzer.java:327)
at org.apache.hadoop.hive.ql.Driver.compile(Driver.java:427)
at org.apache.hadoop.hive.ql.Driver.compile(Driver.java:323)
at org.apache.hadoop.hive.ql.Driver.compileInternal(Driver.java:980)
at org.apache.hadoop.hive.ql.Driver.runInternal(Driver.java:1045)
at org.apache.hadoop.hive.ql.Driver.run(Driver.java:916)
at org.apache.hadoop.hive.ql.Driver.run(Driver.java:906)
at org.apache.hadoop.hive.cli.CliDriver.processLocalCmd(CliDriver.java:268)
at org.apache.hadoop.hive.cli.CliDriver.processCmd(CliDriver.java:220)
at org.apache.hadoop.hive.cli.CliDriver.processLine(CliDriver.java:423)
at org.apache.hadoop.hive.cli.CliDriver.processLine(CliDriver.java:359)
at org.apache.hadoop.hive.cli.CliDriver.processReader(CliDriver.java:456)
at org.apache.hadoop.hive.cli.CliDriver.processFile(CliDriver.java:466)
at org.apache.hadoop.hive.cli.CliDriver.executeDriver(CliDriver.java:749)
at org.apache.hadoop.hive.cli.CliDriver.run(CliDriver.java:686)
at org.apache.hadoop.hive.cli.CliDriver.main(CliDriver.java:625)
at org.apache.oozie.action.hadoop.HiveMain.runHive(HiveMain.java:316)
at org.apache.oozie.action.hadoop.HiveMain.run(HiveMain.java:277)
at org.apache.oozie.action.hadoop.LauncherMain.run(LauncherMain.java:38)
at org.apache.oozie.action.hadoop.HiveMain.main(HiveMain.java:66)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:606)
at org.apache.oozie.action.hadoop.LauncherMapper.map(LauncherMapper.java:225)
at org.apache.hadoop.mapred.MapRunner.run(MapRunner.java:54)
at org.apache.hadoop.mapred.MapTask.runOldMapper(MapTask.java:430)
at org.apache.hadoop.mapred.MapTask.run(MapTask.java:342)
at org.apache.hadoop.mapred.YarnChild$2.run(YarnChild.java:168)
at java.security.AccessController.doPrivileged(Native Method)
at javax.security.auth.Subject.doAs(Subject.java:415)
at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1594)
at org.apache.hadoop.mapred.YarnChild.main(YarnChild.java:163)
8872 [main] ERROR hive.ql.metadata.Hive -
NoSuchObjectException(message:default.recallloan table not found)
    (stack trace identical to the sacusaloan trace above)
9700 [main] ERROR hive.ql.metadata.Hive -
NoSuchObjectException(message:default.loanmob table not found)
    (stack trace identical to the sacusaloan trace above)
9708 [main] INFO org.apache.hadoop.hive.ql.parse.SemanticAnalyzer -
Get metadata for subqueries
9708 [main] INFO org.apache.hadoop.hive.ql.parse.SemanticAnalyzer -
Get metadata for source tables
9798 [main] INFO org.apache.hadoop.hive.ql.parse.SemanticAnalyzer -
Get metadata for subqueries
9798 [main] INFO org.apache.hadoop.hive.ql.parse.SemanticAnalyzer -
Get metadata for destination tables
9798 [main] INFO org.apache.hadoop.hive.ql.parse.SemanticAnalyzer -
Get metadata for source tables
9815 [main] INFO org.apache.hadoop.hive.ql.parse.SemanticAnalyzer -
Get metadata for subqueries
9815 [main] INFO org.apache.hadoop.hive.ql.parse.SemanticAnalyzer -
Get metadata for destination tables
9815 [main] INFO org.apache.hadoop.hive.ql.parse.SemanticAnalyzer -
Get metadata for source tables
9815 [main] INFO org.apache.hadoop.hive.ql.parse.SemanticAnalyzer -
Get metadata for subqueries
9815 [main] INFO org.apache.hadoop.hive.ql.parse.SemanticAnalyzer -
Get metadata for source tables
9827 [main] INFO org.apache.hadoop.hive.ql.parse.SemanticAnalyzer -
Get metadata for subqueries
9827 [main] INFO org.apache.hadoop.hive.ql.parse.SemanticAnalyzer -
Get metadata for destination tables
9827 [main] INFO org.apache.hadoop.hive.ql.parse.SemanticAnalyzer -
Get metadata for destination tables
9827 [main] INFO org.apache.hadoop.hive.ql.parse.SemanticAnalyzer -
Get metadata for source tables
9852 [main] INFO org.apache.hadoop.hive.ql.parse.SemanticAnalyzer -
Get metadata for subqueries
9852 [main] INFO org.apache.hadoop.hive.ql.parse.SemanticAnalyzer -
Get metadata for destination tables
9852 [main] INFO org.apache.hadoop.hive.ql.parse.SemanticAnalyzer -
Get metadata for destination tables
9876 [main] INFO org.apache.hadoop.hive.ql.exec.Utilities - Create
dirs hdfs://hadev/tmp/hive-svc-yarn/hive_2015-05-15_13-58-05_500_5122268870471366216-1
with permission rwxrwxrwx recursive false
9894 [main] INFO org.apache.hadoop.hive.ql.parse.SemanticAnalyzer -
Completed getting MetaData in Semantic Analysis
10277 [main] INFO org.apache.hadoop.hive.ql.parse.SemanticAnalyzer -
Get metadata for source tables
10289 [main] INFO org.apache.hadoop.hive.ql.parse.SemanticAnalyzer -
Get metadata for subqueries
10290 [main] INFO org.apache.hadoop.hive.ql.parse.SemanticAnalyzer -
Get metadata for destination tables
10294 [main] INFO org.apache.hadoop.hive.ql.parse.SemanticAnalyzer -
Get metadata for source tables
10294 [main] INFO org.apache.hadoop.hive.ql.parse.SemanticAnalyzer -
Get metadata for subqueries
10294 [main] INFO org.apache.hadoop.hive.ql.parse.SemanticAnalyzer -
Get metadata for source tables
10320 [main] INFO org.apache.hadoop.hive.ql.parse.SemanticAnalyzer -
Get metadata for subqueries
10321 [main] INFO org.apache.hadoop.hive.ql.parse.SemanticAnalyzer -
Get metadata for destination tables
10321 [main] INFO org.apache.hadoop.hive.ql.parse.SemanticAnalyzer -
Get metadata for destination tables
10816 [main] INFO org.apache.hadoop.hive.ql.parse.SemanticAnalyzer -
Set stats collection dir :
hdfs://hadev/tmp/hive-svc-yarn/hive_2015-05-15_13-58-05_500_5122268870471366216-1/-ext-10002
Re: query uses WITH blocks and throws exception if run as Oozie hive
action (hive-0.13.1)
Posted by Harsh J <ha...@cloudera.com>.
Your question should be directed to the user@hive.apache.org list.
On Sat, May 16, 2015 at 4:51 AM, Alexander Pivovarov
<ap...@gmail.com> wrote:
> Looks like I found it
> https://issues.apache.org/jira/browse/HIVE-9409
>
> public class UDTFOperator
> ...
>
> - protected final Log LOG = LogFactory.getLog(this.getClass().getName());
> + protected static final Log LOG = LogFactory.getLog(UDTFOperator.class.getName());
>
>
>
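The HIVE-9409 diff quoted above makes the logger static, which matters because a serializer walking the operator graph (Kryo, in Hive's case) serializes instance fields but skips static ones. The sketch below illustrates that distinction with plain Java serialization rather than Kryo; `FakeLog`, `OperatorWithInstanceLog`, and `OperatorWithStaticLog` are illustrative stand-ins, not Hive classes:

```java
import java.io.*;

// Stand-in for a logger type that is not serializable (like the
// SLF4JLocationAwareLog class Kryo could not even find on the Tez side).
class FakeLog {}

// Mirrors the pre-HIVE-9409 UDTFOperator: the logger is an instance field,
// so serializing the operator tries to serialize the logger too.
class OperatorWithInstanceLog implements Serializable {
    protected final FakeLog LOG = new FakeLog();
}

// Mirrors the fix: a static field is not part of instance state,
// so serializers skip it entirely.
class OperatorWithStaticLog implements Serializable {
    protected static final FakeLog LOG = new FakeLog();
}

public class LogSerializationDemo {
    // Returns true if the object survives a serialization round trip.
    static boolean roundTrips(Object o) {
        try (ByteArrayOutputStream bos = new ByteArrayOutputStream();
             ObjectOutputStream oos = new ObjectOutputStream(bos)) {
            oos.writeObject(o);
            return true;
        } catch (IOException e) {          // NotSerializableException lands here
            return false;
        }
    }

    public static void main(String[] args) {
        System.out.println(roundTrips(new OperatorWithInstanceLog())); // false
        System.out.println(roundTrips(new OperatorWithStaticLog()));   // true
    }
}
```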
> On Fri, May 15, 2015 at 4:17 PM, Alexander Pivovarov <ap...@gmail.com>
> wrote:
>>
>> I also noticed another error message in logs
>>
>> 10848 [main] ERROR org.apache.hadoop.hive.ql.exec.tez.TezJobMonitor -
>> Status: Failed
>> 10849 [main] ERROR org.apache.hadoop.hive.ql.exec.tez.TezJobMonitor -
>> Vertex failed, vertexName=Map 32, vertexId=vertex_1431616132488_6430_1_24,
>> diagnostics=[Vertex Input: dual initializer failed.,
>> org.apache.hive.com.esotericsoftware.kryo.KryoException: Unable to find
>> class: org.apache.commons.logging.impl.SLF4JLocationAwareLog
>> Serialization trace:
>> LOG (org.apache.hadoop.hive.ql.exec.UDTFOperator)
>> childOperators (org.apache.hadoop.hive.ql.exec.SelectOperator)
>> childOperators (org.apache.hadoop.hive.ql.exec.TableScanOperator)
>> aliasToWork (org.apache.hadoop.hive.ql.plan.MapWork)]
>>
>> one of the WITH blocks had an explode() UDTF.
>> I replaced it with "select ... union all select ... union all select ..."
>> and the query works fine now.
>>
>> Do you know anything about UDTF and Kryo issues fixed after 0.13.1?
>>
>>
>> On Fri, May 15, 2015 at 3:20 PM, Alexander Pivovarov
>> <ap...@gmail.com> wrote:
>>>
>>> Looks like it was fixed in hive-0.14
>>> https://issues.apache.org/jira/browse/HIVE-7079
>>>
>>> On Fri, May 15, 2015 at 2:26 PM, Alexander Pivovarov
>>> <ap...@gmail.com> wrote:
>>>> [original message quoted in full; trimmed -- see the top of the thread]
>>>> 9815 [main] INFO org.apache.hadoop.hive.ql.parse.SemanticAnalyzer -
>>>> Get metadata for source tables
>>>> 9827 [main] INFO org.apache.hadoop.hive.ql.parse.SemanticAnalyzer -
>>>> Get metadata for subqueries
>>>> 9827 [main] INFO org.apache.hadoop.hive.ql.parse.SemanticAnalyzer -
>>>> Get metadata for destination tables
>>>> 9827 [main] INFO org.apache.hadoop.hive.ql.parse.SemanticAnalyzer -
>>>> Get metadata for destination tables
>>>> 9827 [main] INFO org.apache.hadoop.hive.ql.parse.SemanticAnalyzer -
>>>> Get metadata for source tables
>>>> 9852 [main] INFO org.apache.hadoop.hive.ql.parse.SemanticAnalyzer -
>>>> Get metadata for subqueries
>>>> 9852 [main] INFO org.apache.hadoop.hive.ql.parse.SemanticAnalyzer -
>>>> Get metadata for destination tables
>>>> 9852 [main] INFO org.apache.hadoop.hive.ql.parse.SemanticAnalyzer -
>>>> Get metadata for destination tables
>>>> 9876 [main] INFO org.apache.hadoop.hive.ql.exec.Utilities - Create
>>>> dirs
>>>> hdfs://hadev/tmp/hive-svc-yarn/hive_2015-05-15_13-58-05_500_5122268870471366216-1
>>>> with permission rwxrwxrwx recursive false
>>>> 9894 [main] INFO org.apache.hadoop.hive.ql.parse.SemanticAnalyzer -
>>>> Completed getting MetaData in Semantic Analysis
>>>> 10277 [main] INFO org.apache.hadoop.hive.ql.parse.SemanticAnalyzer -
>>>> Get metadata for source tables
>>>> 10289 [main] INFO org.apache.hadoop.hive.ql.parse.SemanticAnalyzer -
>>>> Get metadata for subqueries
>>>> 10290 [main] INFO org.apache.hadoop.hive.ql.parse.SemanticAnalyzer -
>>>> Get metadata for destination tables
>>>> 10294 [main] INFO org.apache.hadoop.hive.ql.parse.SemanticAnalyzer -
>>>> Get metadata for source tables
>>>> 10294 [main] INFO org.apache.hadoop.hive.ql.parse.SemanticAnalyzer -
>>>> Get metadata for subqueries
>>>> 10294 [main] INFO org.apache.hadoop.hive.ql.parse.SemanticAnalyzer -
>>>> Get metadata for source tables
>>>> 10320 [main] INFO org.apache.hadoop.hive.ql.parse.SemanticAnalyzer -
>>>> Get metadata for subqueries
>>>> 10321 [main] INFO org.apache.hadoop.hive.ql.parse.SemanticAnalyzer -
>>>> Get metadata for destination tables
>>>> 10321 [main] INFO org.apache.hadoop.hive.ql.parse.SemanticAnalyzer -
>>>> Get metadata for destination tables
>>>> 10816 [main] INFO org.apache.hadoop.hive.ql.parse.SemanticAnalyzer -
>>>> Set stats collection dir :
>>>> hdfs://hadev/tmp/hive-svc-yarn/hive_2015-05-15_13-58-05_500_5122268870471366216-1/-ext-10002
>>>>
>>>>
>>>
>>
>
--
Harsh J
Re: query uses WITH blocks and throws exception if run as Oozie hive
action (hive-0.13.1)
Posted by Harsh J <ha...@cloudera.com>.
Your question should be directed to the user@hive.apache.org lists.
On Sat, May 16, 2015 at 4:51 AM, Alexander Pivovarov
<ap...@gmail.com> wrote:
> Looks like I found it
> https://issues.apache.org/jira/browse/HIVE-9409
>
> public class UDTFOperator
> ...
>
> - protected final Log LOG = LogFactory.getLog(this.getClass().getName());
> + protected static final Log LOG =
> LogFactory.getLog(UDTFOperator.class.getName());
>
>
>
> On Fri, May 15, 2015 at 4:17 PM, Alexander Pivovarov <ap...@gmail.com>
> wrote:
>>
>> I also noticed another error message in the logs:
>>
>> 10848 [main] ERROR org.apache.hadoop.hive.ql.exec.tez.TezJobMonitor -
>> Status: Failed
>> 10849 [main] ERROR org.apache.hadoop.hive.ql.exec.tez.TezJobMonitor -
>> Vertex failed, vertexName=Map 32, vertexId=vertex_1431616132488_6430_1_24,
>> diagnostics=[Vertex Input: dual initializer failed.,
>> org.apache.hive.com.esotericsoftware.kryo.KryoException: Unable to find
>> class: org.apache.commons.logging.impl.SLF4JLocationAwareLog
>> Serialization trace:
>> LOG (org.apache.hadoop.hive.ql.exec.UDTFOperator)
>> childOperators (org.apache.hadoop.hive.ql.exec.SelectOperator)
>> childOperators (org.apache.hadoop.hive.ql.exec.TableScanOperator)
>> aliasToWork (org.apache.hadoop.hive.ql.plan.MapWork)]
>>
>> One of the WITH blocks used the explode() UDTF.
>> I replaced it with "select ... union all select ... union all select ..."
>> and the query works fine now.
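[Editor's note] A hypothetical sketch of the rewrite described above. It assumes a one-row `dual` table exists (the Tez diagnostics in this thread mention one) and uses made-up literal values; the lateral view form goes through UDTFOperator (which triggered the Kryo failure on Tez in hive-0.13.1), while the union-all form avoids the UDTF entirely:

```sql
-- explode()-based row generator: the plan contains a UDTFOperator,
-- which hit the Kryo serialization failure under Tez in hive-0.13.1
select tf.code
from dual
lateral view explode(array('a', 'b', 'c')) tf as code;

-- equivalent union-all form with no UDTF in the plan
-- (in hive-0.13.x, UNION ALL must appear inside a FROM subquery)
select t.code
from (
  select 'a' as code from dual
  union all
  select 'b' as code from dual
  union all
  select 'c' as code from dual
) t;
```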
>>
>> Do you know anything about UDTF and Kryo issues fixed after 0.13.1?
>>
>>
>> On Fri, May 15, 2015 at 3:20 PM, Alexander Pivovarov
>> <ap...@gmail.com> wrote:
>>>
>>> Looks like it was fixed in hive-0.14
>>> https://issues.apache.org/jira/browse/HIVE-7079
>>>
>>> On Fri, May 15, 2015 at 2:26 PM, Alexander Pivovarov
>>> <ap...@gmail.com> wrote:
>>>>
>>>> Hi Everyone
>>>>
>>>> I'm using hive-0.13.1 (HDP-2.1.5) and getting the following stack trace
>>>> when I run my query (which has a WITH block) via Oozie. (BTW, the query
>>>> works fine in the CLI.)
>>>>
>>>> I can't share the exact query, but the structure is similar to:
>>>>
>>>> create table my_consumer
>>>> as
>>>> with sacusaloan as (select distinct e,f,g from E)
>>>>
>>>> select A.a, A.b, A.c,
>>>> if(sacusaloan.id is null, 0, 1) as sacusaloan_status
>>>> from (select a,b,c from A) A
>>>> left join sacusaloan on (...)
>>>>
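[Editor's note] Until the HIVE-7079 fix (Hive 0.14) is available, a common workaround for WITH-block failures like the one above is to inline the CTE body as a derived table, so the semantic analyzer never has to resolve the CTE name against the metastore. A hypothetical sketch of the same query shape, with the elided join condition left as-is:

```sql
-- Same query with the WITH block inlined as a derived table,
-- avoiding CTE name resolution (HIVE-7079) in hive-0.13.1
create table my_consumer as
select A.a, A.b, A.c,
       if(sacusaloan.id is null, 0, 1) as sacusaloan_status
from (select a, b, c from A) A
left join (select distinct e, f, g from E) sacusaloan
  on (...)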
>>>> 8799 [main] INFO hive.ql.parse.ParseDriver - Parse Completed
>>>> 8799 [main] INFO org.apache.hadoop.hive.ql.log.PerfLogger - </PERFLOG
>>>> method=parse start=1431723485500 end=1431723485602 duration=102
>>>> from=org.apache.hadoop.hive.ql.Driver>
>>>> 8799 [main] INFO org.apache.hadoop.hive.ql.log.PerfLogger - <PERFLOG
>>>> method=semanticAnalyze from=org.apache.hadoop.hive.ql.Driver>
>>>> 8834 [main] INFO org.apache.hadoop.hive.ql.parse.SemanticAnalyzer -
>>>> Starting Semantic Analysis
>>>> 8837 [main] INFO org.apache.hadoop.hive.ql.parse.SemanticAnalyzer -
>>>> Creating table wk_qualified_outsource_loan_consumer position=13
>>>> 8861 [main] INFO org.apache.hadoop.hive.ql.parse.SemanticAnalyzer -
>>>> Completed phase 1 of Semantic Analysis
>>>> 8861 [main] INFO org.apache.hadoop.hive.ql.parse.SemanticAnalyzer -
>>>> Get metadata for source tables
>>>> 8865 [main] ERROR hive.ql.metadata.Hive -
>>>> NoSuchObjectException(message:default.sacusaloan table not found)
>>>> at
>>>> org.apache.hadoop.hive.metastore.api.ThriftHiveMetastore$get_table_result$get_table_resultStandardScheme.read(ThriftHiveMetastore.java:29338)
>>>> at
>>>> org.apache.hadoop.hive.metastore.api.ThriftHiveMetastore$get_table_result$get_table_resultStandardScheme.read(ThriftHiveMetastore.java:29306)
>>>> at
>>>> org.apache.hadoop.hive.metastore.api.ThriftHiveMetastore$get_table_result.read(ThriftHiveMetastore.java:29237)
>>>> at org.apache.thrift.TServiceClient.receiveBase(TServiceClient.java:78)
>>>> at
>>>> org.apache.hadoop.hive.metastore.api.ThriftHiveMetastore$Client.recv_get_table(ThriftHiveMetastore.java:1036)
>>>> at
>>>> org.apache.hadoop.hive.metastore.api.ThriftHiveMetastore$Client.get_table(ThriftHiveMetastore.java:1022)
>>>> at
>>>> org.apache.hadoop.hive.metastore.HiveMetaStoreClient.getTable(HiveMetaStoreClient.java:997)
>>>> at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
>>>> at
>>>> sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
>>>> at
>>>> sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
>>>> at java.lang.reflect.Method.invoke(Method.java:606)
>>>> at
>>>> org.apache.hadoop.hive.metastore.RetryingMetaStoreClient.invoke(RetryingMetaStoreClient.java:89)
>>>> at com.sun.proxy.$Proxy18.getTable(Unknown Source)
>>>> at org.apache.hadoop.hive.ql.metadata.Hive.getTable(Hive.java:976)
>>>> at org.apache.hadoop.hive.ql.metadata.Hive.getTable(Hive.java:918)
>>>> at
>>>> org.apache.hadoop.hive.ql.parse.SemanticAnalyzer.getMetaData(SemanticAnalyzer.java:1263)
>>>> at
>>>> org.apache.hadoop.hive.ql.parse.SemanticAnalyzer.getMetaData(SemanticAnalyzer.java:1232)
>>>> at
>>>> org.apache.hadoop.hive.ql.parse.SemanticAnalyzer.analyzeInternal(SemanticAnalyzer.java:9252)
>>>> at
>>>> org.apache.hadoop.hive.ql.parse.BaseSemanticAnalyzer.analyze(BaseSemanticAnalyzer.java:327)
>>>> at org.apache.hadoop.hive.ql.Driver.compile(Driver.java:427)
>>>> at org.apache.hadoop.hive.ql.Driver.compile(Driver.java:323)
>>>> at org.apache.hadoop.hive.ql.Driver.compileInternal(Driver.java:980)
>>>> at org.apache.hadoop.hive.ql.Driver.runInternal(Driver.java:1045)
>>>> at org.apache.hadoop.hive.ql.Driver.run(Driver.java:916)
>>>> at org.apache.hadoop.hive.ql.Driver.run(Driver.java:906)
>>>> at
>>>> org.apache.hadoop.hive.cli.CliDriver.processLocalCmd(CliDriver.java:268)
>>>> at org.apache.hadoop.hive.cli.CliDriver.processCmd(CliDriver.java:220)
>>>> at org.apache.hadoop.hive.cli.CliDriver.processLine(CliDriver.java:423)
>>>> at org.apache.hadoop.hive.cli.CliDriver.processLine(CliDriver.java:359)
>>>> at
>>>> org.apache.hadoop.hive.cli.CliDriver.processReader(CliDriver.java:456)
>>>> at org.apache.hadoop.hive.cli.CliDriver.processFile(CliDriver.java:466)
>>>> at
>>>> org.apache.hadoop.hive.cli.CliDriver.executeDriver(CliDriver.java:749)
>>>> at org.apache.hadoop.hive.cli.CliDriver.run(CliDriver.java:686)
>>>> at org.apache.hadoop.hive.cli.CliDriver.main(CliDriver.java:625)
>>>> at org.apache.oozie.action.hadoop.HiveMain.runHive(HiveMain.java:316)
>>>> at org.apache.oozie.action.hadoop.HiveMain.run(HiveMain.java:277)
>>>> at
>>>> org.apache.oozie.action.hadoop.LauncherMain.run(LauncherMain.java:38)
>>>> at org.apache.oozie.action.hadoop.HiveMain.main(HiveMain.java:66)
>>>> at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
>>>> at
>>>> sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
>>>> at
>>>> sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
>>>> at java.lang.reflect.Method.invoke(Method.java:606)
>>>> at
>>>> org.apache.oozie.action.hadoop.LauncherMapper.map(LauncherMapper.java:225)
>>>> at org.apache.hadoop.mapred.MapRunner.run(MapRunner.java:54)
>>>> at org.apache.hadoop.mapred.MapTask.runOldMapper(MapTask.java:430)
>>>> at org.apache.hadoop.mapred.MapTask.run(MapTask.java:342)
>>>> at org.apache.hadoop.mapred.YarnChild$2.run(YarnChild.java:168)
>>>> at java.security.AccessController.doPrivileged(Native Method)
>>>> at javax.security.auth.Subject.doAs(Subject.java:415)
>>>> at
>>>> org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1594)
>>>> at org.apache.hadoop.mapred.YarnChild.main(YarnChild.java:163)
>>>>
>>>> 8872 [main] ERROR hive.ql.metadata.Hive -
>>>> NoSuchObjectException(message:default.recallloan table not found)
>>>> at
>>>> org.apache.hadoop.hive.metastore.api.ThriftHiveMetastore$get_table_result$get_table_resultStandardScheme.read(ThriftHiveMetastore.java:29338)
>>>> at
>>>> org.apache.hadoop.hive.metastore.api.ThriftHiveMetastore$get_table_result$get_table_resultStandardScheme.read(ThriftHiveMetastore.java:29306)
>>>> at
>>>> org.apache.hadoop.hive.metastore.api.ThriftHiveMetastore$get_table_result.read(ThriftHiveMetastore.java:29237)
>>>> at org.apache.thrift.TServiceClient.receiveBase(TServiceClient.java:78)
>>>> at
>>>> org.apache.hadoop.hive.metastore.api.ThriftHiveMetastore$Client.recv_get_table(ThriftHiveMetastore.java:1036)
>>>> at
>>>> org.apache.hadoop.hive.metastore.api.ThriftHiveMetastore$Client.get_table(ThriftHiveMetastore.java:1022)
>>>> at
>>>> org.apache.hadoop.hive.metastore.HiveMetaStoreClient.getTable(HiveMetaStoreClient.java:997)
>>>> at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
>>>> at
>>>> sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
>>>> at
>>>> sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
>>>> at java.lang.reflect.Method.invoke(Method.java:606)
>>>> at
>>>> org.apache.hadoop.hive.metastore.RetryingMetaStoreClient.invoke(RetryingMetaStoreClient.java:89)
>>>> at com.sun.proxy.$Proxy18.getTable(Unknown Source)
>>>> at org.apache.hadoop.hive.ql.metadata.Hive.getTable(Hive.java:976)
>>>> at org.apache.hadoop.hive.ql.metadata.Hive.getTable(Hive.java:918)
>>>> at
>>>> org.apache.hadoop.hive.ql.parse.SemanticAnalyzer.getMetaData(SemanticAnalyzer.java:1263)
>>>> at
>>>> org.apache.hadoop.hive.ql.parse.SemanticAnalyzer.getMetaData(SemanticAnalyzer.java:1232)
>>>> at
>>>> org.apache.hadoop.hive.ql.parse.SemanticAnalyzer.analyzeInternal(SemanticAnalyzer.java:9252)
>>>> at
>>>> org.apache.hadoop.hive.ql.parse.BaseSemanticAnalyzer.analyze(BaseSemanticAnalyzer.java:327)
>>>> at org.apache.hadoop.hive.ql.Driver.compile(Driver.java:427)
>>>> at org.apache.hadoop.hive.ql.Driver.compile(Driver.java:323)
>>>> at org.apache.hadoop.hive.ql.Driver.compileInternal(Driver.java:980)
>>>> at org.apache.hadoop.hive.ql.Driver.runInternal(Driver.java:1045)
>>>> at org.apache.hadoop.hive.ql.Driver.run(Driver.java:916)
>>>> at org.apache.hadoop.hive.ql.Driver.run(Driver.java:906)
>>>> at
>>>> org.apache.hadoop.hive.cli.CliDriver.processLocalCmd(CliDriver.java:268)
>>>> at org.apache.hadoop.hive.cli.CliDriver.processCmd(CliDriver.java:220)
>>>> at org.apache.hadoop.hive.cli.CliDriver.processLine(CliDriver.java:423)
>>>> at org.apache.hadoop.hive.cli.CliDriver.processLine(CliDriver.java:359)
>>>> at
>>>> org.apache.hadoop.hive.cli.CliDriver.processReader(CliDriver.java:456)
>>>> at org.apache.hadoop.hive.cli.CliDriver.processFile(CliDriver.java:466)
>>>> at
>>>> org.apache.hadoop.hive.cli.CliDriver.executeDriver(CliDriver.java:749)
>>>> at org.apache.hadoop.hive.cli.CliDriver.run(CliDriver.java:686)
>>>> at org.apache.hadoop.hive.cli.CliDriver.main(CliDriver.java:625)
>>>> at org.apache.oozie.action.hadoop.HiveMain.runHive(HiveMain.java:316)
>>>> at org.apache.oozie.action.hadoop.HiveMain.run(HiveMain.java:277)
>>>> at
>>>> org.apache.oozie.action.hadoop.LauncherMain.run(LauncherMain.java:38)
>>>> at org.apache.oozie.action.hadoop.HiveMain.main(HiveMain.java:66)
>>>> at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
>>>> at
>>>> sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
>>>> at
>>>> sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
>>>> at java.lang.reflect.Method.invoke(Method.java:606)
>>>> at
>>>> org.apache.oozie.action.hadoop.LauncherMapper.map(LauncherMapper.java:225)
>>>> at org.apache.hadoop.mapred.MapRunner.run(MapRunner.java:54)
>>>> at org.apache.hadoop.mapred.MapTask.runOldMapper(MapTask.java:430)
>>>> at org.apache.hadoop.mapred.MapTask.run(MapTask.java:342)
>>>> at org.apache.hadoop.mapred.YarnChild$2.run(YarnChild.java:168)
>>>> at java.security.AccessController.doPrivileged(Native Method)
>>>> at javax.security.auth.Subject.doAs(Subject.java:415)
>>>> at
>>>> org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1594)
>>>> at org.apache.hadoop.mapred.YarnChild.main(YarnChild.java:163)
>>>>
>>>> 9700 [main] ERROR hive.ql.metadata.Hive -
>>>> NoSuchObjectException(message:default.loanmob table not found)
>>>> at
>>>> org.apache.hadoop.hive.metastore.api.ThriftHiveMetastore$get_table_result$get_table_resultStandardScheme.read(ThriftHiveMetastore.java:29338)
>>>> at
>>>> org.apache.hadoop.hive.metastore.api.ThriftHiveMetastore$get_table_result$get_table_resultStandardScheme.read(ThriftHiveMetastore.java:29306)
>>>> at
>>>> org.apache.hadoop.hive.metastore.api.ThriftHiveMetastore$get_table_result.read(ThriftHiveMetastore.java:29237)
>>>> at org.apache.thrift.TServiceClient.receiveBase(TServiceClient.java:78)
>>>> at
>>>> org.apache.hadoop.hive.metastore.api.ThriftHiveMetastore$Client.recv_get_table(ThriftHiveMetastore.java:1036)
>>>> at
>>>> org.apache.hadoop.hive.metastore.api.ThriftHiveMetastore$Client.get_table(ThriftHiveMetastore.java:1022)
>>>> at
>>>> org.apache.hadoop.hive.metastore.HiveMetaStoreClient.getTable(HiveMetaStoreClient.java:997)
>>>> at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
>>>> at
>>>> sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
>>>> at
>>>> sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
>>>> at java.lang.reflect.Method.invoke(Method.java:606)
>>>> at
>>>> org.apache.hadoop.hive.metastore.RetryingMetaStoreClient.invoke(RetryingMetaStoreClient.java:89)
>>>> at com.sun.proxy.$Proxy18.getTable(Unknown Source)
>>>> at org.apache.hadoop.hive.ql.metadata.Hive.getTable(Hive.java:976)
>>>> at org.apache.hadoop.hive.ql.metadata.Hive.getTable(Hive.java:918)
>>>> at
>>>> org.apache.hadoop.hive.ql.parse.SemanticAnalyzer.getMetaData(SemanticAnalyzer.java:1263)
>>>> at
>>>> org.apache.hadoop.hive.ql.parse.SemanticAnalyzer.getMetaData(SemanticAnalyzer.java:1232)
>>>> at
>>>> org.apache.hadoop.hive.ql.parse.SemanticAnalyzer.analyzeInternal(SemanticAnalyzer.java:9252)
>>>> at
>>>> org.apache.hadoop.hive.ql.parse.BaseSemanticAnalyzer.analyze(BaseSemanticAnalyzer.java:327)
>>>> at org.apache.hadoop.hive.ql.Driver.compile(Driver.java:427)
>>>> at org.apache.hadoop.hive.ql.Driver.compile(Driver.java:323)
>>>> at org.apache.hadoop.hive.ql.Driver.compileInternal(Driver.java:980)
>>>> at org.apache.hadoop.hive.ql.Driver.runInternal(Driver.java:1045)
>>>> at org.apache.hadoop.hive.ql.Driver.run(Driver.java:916)
>>>> at org.apache.hadoop.hive.ql.Driver.run(Driver.java:906)
>>>> at
>>>> org.apache.hadoop.hive.cli.CliDriver.processLocalCmd(CliDriver.java:268)
>>>> at org.apache.hadoop.hive.cli.CliDriver.processCmd(CliDriver.java:220)
>>>> at org.apache.hadoop.hive.cli.CliDriver.processLine(CliDriver.java:423)
>>>> at org.apache.hadoop.hive.cli.CliDriver.processLine(CliDriver.java:359)
>>>> at
>>>> org.apache.hadoop.hive.cli.CliDriver.processReader(CliDriver.java:456)
>>>> at org.apache.hadoop.hive.cli.CliDriver.processFile(CliDriver.java:466)
>>>> at
>>>> org.apache.hadoop.hive.cli.CliDriver.executeDriver(CliDriver.java:749)
>>>> at org.apache.hadoop.hive.cli.CliDriver.run(CliDriver.java:686)
>>>> at org.apache.hadoop.hive.cli.CliDriver.main(CliDriver.java:625)
>>>> at org.apache.oozie.action.hadoop.HiveMain.runHive(HiveMain.java:316)
>>>> at org.apache.oozie.action.hadoop.HiveMain.run(HiveMain.java:277)
>>>> at
>>>> org.apache.oozie.action.hadoop.LauncherMain.run(LauncherMain.java:38)
>>>> at org.apache.oozie.action.hadoop.HiveMain.main(HiveMain.java:66)
>>>> at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
>>>> at
>>>> sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
>>>> at
>>>> sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
>>>> at java.lang.reflect.Method.invoke(Method.java:606)
>>>> at
>>>> org.apache.oozie.action.hadoop.LauncherMapper.map(LauncherMapper.java:225)
>>>> at org.apache.hadoop.mapred.MapRunner.run(MapRunner.java:54)
>>>> at org.apache.hadoop.mapred.MapTask.runOldMapper(MapTask.java:430)
>>>> at org.apache.hadoop.mapred.MapTask.run(MapTask.java:342)
>>>> at org.apache.hadoop.mapred.YarnChild$2.run(YarnChild.java:168)
>>>> at java.security.AccessController.doPrivileged(Native Method)
>>>> at javax.security.auth.Subject.doAs(Subject.java:415)
>>>> at
>>>> org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1594)
>>>> at org.apache.hadoop.mapred.YarnChild.main(YarnChild.java:163)
>>>>
>>>> 9708 [main] INFO org.apache.hadoop.hive.ql.parse.SemanticAnalyzer -
>>>> Get metadata for subqueries
>>>> 9708 [main] INFO org.apache.hadoop.hive.ql.parse.SemanticAnalyzer -
>>>> Get metadata for source tables
>>>> 9798 [main] INFO org.apache.hadoop.hive.ql.parse.SemanticAnalyzer -
>>>> Get metadata for subqueries
>>>> 9798 [main] INFO org.apache.hadoop.hive.ql.parse.SemanticAnalyzer -
>>>> Get metadata for destination tables
>>>> 9798 [main] INFO org.apache.hadoop.hive.ql.parse.SemanticAnalyzer -
>>>> Get metadata for source tables
>>>> 9815 [main] INFO org.apache.hadoop.hive.ql.parse.SemanticAnalyzer -
>>>> Get metadata for subqueries
>>>> 9815 [main] INFO org.apache.hadoop.hive.ql.parse.SemanticAnalyzer -
>>>> Get metadata for destination tables
>>>> 9815 [main] INFO org.apache.hadoop.hive.ql.parse.SemanticAnalyzer -
>>>> Get metadata for source tables
>>>> 9815 [main] INFO org.apache.hadoop.hive.ql.parse.SemanticAnalyzer -
>>>> Get metadata for subqueries
>>>> 9815 [main] INFO org.apache.hadoop.hive.ql.parse.SemanticAnalyzer -
>>>> Get metadata for source tables
>>>> 9827 [main] INFO org.apache.hadoop.hive.ql.parse.SemanticAnalyzer -
>>>> Get metadata for subqueries
>>>> 9827 [main] INFO org.apache.hadoop.hive.ql.parse.SemanticAnalyzer -
>>>> Get metadata for destination tables
>>>> 9827 [main] INFO org.apache.hadoop.hive.ql.parse.SemanticAnalyzer -
>>>> Get metadata for destination tables
>>>> 9827 [main] INFO org.apache.hadoop.hive.ql.parse.SemanticAnalyzer -
>>>> Get metadata for source tables
>>>> 9852 [main] INFO org.apache.hadoop.hive.ql.parse.SemanticAnalyzer -
>>>> Get metadata for subqueries
>>>> 9852 [main] INFO org.apache.hadoop.hive.ql.parse.SemanticAnalyzer -
>>>> Get metadata for destination tables
>>>> 9852 [main] INFO org.apache.hadoop.hive.ql.parse.SemanticAnalyzer -
>>>> Get metadata for destination tables
>>>> 9876 [main] INFO org.apache.hadoop.hive.ql.exec.Utilities - Create
>>>> dirs
>>>> hdfs://hadev/tmp/hive-svc-yarn/hive_2015-05-15_13-58-05_500_5122268870471366216-1
>>>> with permission rwxrwxrwx recursive false
>>>> 9894 [main] INFO org.apache.hadoop.hive.ql.parse.SemanticAnalyzer -
>>>> Completed getting MetaData in Semantic Analysis
>>>> 10277 [main] INFO org.apache.hadoop.hive.ql.parse.SemanticAnalyzer -
>>>> Get metadata for source tables
>>>> 10289 [main] INFO org.apache.hadoop.hive.ql.parse.SemanticAnalyzer -
>>>> Get metadata for subqueries
>>>> 10290 [main] INFO org.apache.hadoop.hive.ql.parse.SemanticAnalyzer -
>>>> Get metadata for destination tables
>>>> 10294 [main] INFO org.apache.hadoop.hive.ql.parse.SemanticAnalyzer -
>>>> Get metadata for source tables
>>>> 10294 [main] INFO org.apache.hadoop.hive.ql.parse.SemanticAnalyzer -
>>>> Get metadata for subqueries
>>>> 10294 [main] INFO org.apache.hadoop.hive.ql.parse.SemanticAnalyzer -
>>>> Get metadata for source tables
>>>> 10320 [main] INFO org.apache.hadoop.hive.ql.parse.SemanticAnalyzer -
>>>> Get metadata for subqueries
>>>> 10321 [main] INFO org.apache.hadoop.hive.ql.parse.SemanticAnalyzer -
>>>> Get metadata for destination tables
>>>> 10321 [main] INFO org.apache.hadoop.hive.ql.parse.SemanticAnalyzer -
>>>> Get metadata for destination tables
>>>> 10816 [main] INFO org.apache.hadoop.hive.ql.parse.SemanticAnalyzer -
>>>> Set stats collection dir :
>>>> hdfs://hadev/tmp/hive-svc-yarn/hive_2015-05-15_13-58-05_500_5122268870471366216-1/-ext-10002
>>>>
>>>>
>>>
>>
>
--
Harsh J
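[Editor's note] The HIVE-9409 diff quoted above exists because a per-instance LOG field becomes part of the serialized operator plan, and the concrete logging class (org.apache.commons.logging.impl.SLF4JLocationAwareLog) may not be resolvable on the executor side; a static field is not instance state, so serializers skip it. A minimal sketch of that principle, using Java's built-in serialization as a stand-in for Kryo (the failure mode differs in detail — Kryo failed with a class-lookup error at deserialization — but the fix is the same) and hypothetical class names:

```java
import java.io.ByteArrayOutputStream;
import java.io.IOException;
import java.io.ObjectOutputStream;
import java.io.Serializable;

// Stand-in for a logger implementation class that the serializer
// cannot handle (here: simply not Serializable).
class FakeLog {}

// Instance field: the logger is captured as part of the object's
// state, so serialization fails.
class BadOperator implements Serializable {
    protected final FakeLog LOG = new FakeLog();
}

// Static field: not part of instance state, so the object
// serializes cleanly (this mirrors the HIVE-9409 change).
class GoodOperator implements Serializable {
    protected static final FakeLog LOG = new FakeLog();
}

public class SerDemo {
    // Returns true if the object can be serialized, false otherwise.
    static boolean serializes(Object o) {
        try (ObjectOutputStream out =
                 new ObjectOutputStream(new ByteArrayOutputStream())) {
            out.writeObject(o);
            return true;
        } catch (IOException e) {
            return false;
        }
    }

    public static void main(String[] args) {
        System.out.println(serializes(new BadOperator()));  // false
        System.out.println(serializes(new GoodOperator())); // true
    }
}
```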
Re: query uses WITH blocks and throws exception if run as Oozie hive
action (hive-0.13.1)
Posted by Harsh J <ha...@cloudera.com>.
Your question should be directed to the user@hive.apache.org lists.
On Sat, May 16, 2015 at 4:51 AM, Alexander Pivovarov
<ap...@gmail.com> wrote:
> Looks like I found it
> https://issues.apache.org/jira/browse/HIVE-9409
>
> public class UDTFOperator
> ...
>
> - protected final Log LOG = LogFactory.getLog(this.getClass().getName());
> + protected static final Log LOG =
> LogFactory.getLog(UDTFOperator.class.getName());
>
>
>
> On Fri, May 15, 2015 at 4:17 PM, Alexander Pivovarov <ap...@gmail.com>
> wrote:
>>
>> I also noticed another error message in logs
>>
>> 10848 [main] ERROR org.apache.hadoop.hive.ql.exec.tez.TezJobMonitor -
>> Status: Failed
>> 10849 [main] ERROR org.apache.hadoop.hive.ql.exec.tez.TezJobMonitor -
>> Vertex failed, vertexName=Map 32, vertexId=vertex_1431616132488_6430_1_24,
>> diagnostics=[Vertex Input: dual initializer failed.,
>> org.apache.hive.com.esotericsoftware.kryo.KryoException: Unable to find
>> class: org.apache.commons.logging.impl.SLF4JLocationAwareLog
>> Serialization trace:
>> LOG (org.apache.hadoop.hive.ql.exec.UDTFOperator)
>> childOperators (org.apache.hadoop.hive.ql.exec.SelectOperator)
>> childOperators (org.apache.hadoop.hive.ql.exec.TableScanOperator)
>> aliasToWork (org.apache.hadoop.hive.ql.plan.MapWork)]
>>
>> one of the WITH blocks had explode() UDTF
>> I replaced it with "select ... union all select ... union all select ..."
>> and query is working fine now.
>>
>> Do you know anything about UDTF and Kryo issues fixed after 0.13.1?
>>
>>
>> On Fri, May 15, 2015 at 3:20 PM, Alexander Pivovarov
>> <ap...@gmail.com> wrote:
>>>
>>> Looks like it was fixed in hive-0.14
>>> https://issues.apache.org/jira/browse/HIVE-7079
>>>
>>> On Fri, May 15, 2015 at 2:26 PM, Alexander Pivovarov
>>> <ap...@gmail.com> wrote:
>>>>
>>>> Hi Everyone
>>>>
>>>> I'm using hive-0.13.1 (HDP-2.1.5) and getting the following stacktrace
>>>> if run my query (which has WITH block) via Oozie. (BTW, the query works
>>>> fine in CLI)
>>>>
>>>> I can't put exact query but the structure is similar to
>>>>
>>>> create table my_consumer
>>>> as
>>>> with sacusaloan as (select distinct e,f,g from E)
>>>>
>>>> select A.a, A.b, A.c,
>>>> if(sacusaloan.id is null, 0, 1) as sacusaloan_status
>>>> from (select a,b,c from A) A
>>>> left join sacusaloan on (...)
>>>>
>>>> 8799 [main] INFO hive.ql.parse.ParseDriver - Parse Completed
>>>> 8799 [main] INFO org.apache.hadoop.hive.ql.log.PerfLogger - </PERFLOG
>>>> method=parse start=1431723485500 end=1431723485602 duration=102
>>>> from=org.apache.hadoop.hive.ql.Driver>
>>>> 8799 [main] INFO org.apache.hadoop.hive.ql.log.PerfLogger - <PERFLOG
>>>> method=semanticAnalyze from=org.apache.hadoop.hive.ql.Driver>
>>>> 8834 [main] INFO org.apache.hadoop.hive.ql.parse.SemanticAnalyzer -
>>>> Starting Semantic Analysis
>>>> 8837 [main] INFO org.apache.hadoop.hive.ql.parse.SemanticAnalyzer -
>>>> Creating table wk_qualified_outsource_loan_consumer position=13
>>>> 8861 [main] INFO org.apache.hadoop.hive.ql.parse.SemanticAnalyzer -
>>>> Completed phase 1 of Semantic Analysis
>>>> 8861 [main] INFO org.apache.hadoop.hive.ql.parse.SemanticAnalyzer -
>>>> Get metadata for source tables
>>>> 8865 [main] ERROR hive.ql.metadata.Hive -
>>>> NoSuchObjectException(message:default.sacusaloan table not found)
>>>> at
>>>> org.apache.hadoop.hive.metastore.api.ThriftHiveMetastore$get_table_result$get_table_resultStandardScheme.read(ThriftHiveMetastore.java:29338)
>>>> at
>>>> org.apache.hadoop.hive.metastore.api.ThriftHiveMetastore$get_table_result$get_table_resultStandardScheme.read(ThriftHiveMetastore.java:29306)
>>>> at
>>>> org.apache.hadoop.hive.metastore.api.ThriftHiveMetastore$get_table_result.read(ThriftHiveMetastore.java:29237)
>>>> at org.apache.thrift.TServiceClient.receiveBase(TServiceClient.java:78)
>>>> at
>>>> org.apache.hadoop.hive.metastore.api.ThriftHiveMetastore$Client.recv_get_table(ThriftHiveMetastore.java:1036)
>>>> at
>>>> org.apache.hadoop.hive.metastore.api.ThriftHiveMetastore$Client.get_table(ThriftHiveMetastore.java:1022)
>>>> at
>>>> org.apache.hadoop.hive.metastore.HiveMetaStoreClient.getTable(HiveMetaStoreClient.java:997)
>>>> at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
>>>> at
>>>> sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
>>>> at
>>>> sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
>>>> at java.lang.reflect.Method.invoke(Method.java:606)
>>>> at
>>>> org.apache.hadoop.hive.metastore.RetryingMetaStoreClient.invoke(RetryingMetaStoreClient.java:89)
>>>> at com.sun.proxy.$Proxy18.getTable(Unknown Source)
>>>> at org.apache.hadoop.hive.ql.metadata.Hive.getTable(Hive.java:976)
>>>> at org.apache.hadoop.hive.ql.metadata.Hive.getTable(Hive.java:918)
>>>> at
>>>> org.apache.hadoop.hive.ql.parse.SemanticAnalyzer.getMetaData(SemanticAnalyzer.java:1263)
>>>> at
>>>> org.apache.hadoop.hive.ql.parse.SemanticAnalyzer.getMetaData(SemanticAnalyzer.java:1232)
>>>> at
>>>> org.apache.hadoop.hive.ql.parse.SemanticAnalyzer.analyzeInternal(SemanticAnalyzer.java:9252)
>>>> at
>>>> org.apache.hadoop.hive.ql.parse.BaseSemanticAnalyzer.analyze(BaseSemanticAnalyzer.java:327)
>>>> at org.apache.hadoop.hive.ql.Driver.compile(Driver.java:427)
>>>> at org.apache.hadoop.hive.ql.Driver.compile(Driver.java:323)
>>>> at org.apache.hadoop.hive.ql.Driver.compileInternal(Driver.java:980)
>>>> at org.apache.hadoop.hive.ql.Driver.runInternal(Driver.java:1045)
>>>> at org.apache.hadoop.hive.ql.Driver.run(Driver.java:916)
>>>> at org.apache.hadoop.hive.ql.Driver.run(Driver.java:906)
>>>> at
>>>> org.apache.hadoop.hive.cli.CliDriver.processLocalCmd(CliDriver.java:268)
>>>> at org.apache.hadoop.hive.cli.CliDriver.processCmd(CliDriver.java:220)
>>>> at org.apache.hadoop.hive.cli.CliDriver.processLine(CliDriver.java:423)
>>>> at org.apache.hadoop.hive.cli.CliDriver.processLine(CliDriver.java:359)
>>>> at
>>>> org.apache.hadoop.hive.cli.CliDriver.processReader(CliDriver.java:456)
>>>> at org.apache.hadoop.hive.cli.CliDriver.processFile(CliDriver.java:466)
>>>> at
>>>> org.apache.hadoop.hive.cli.CliDriver.executeDriver(CliDriver.java:749)
>>>> at org.apache.hadoop.hive.cli.CliDriver.run(CliDriver.java:686)
>>>> at org.apache.hadoop.hive.cli.CliDriver.main(CliDriver.java:625)
>>>> at org.apache.oozie.action.hadoop.HiveMain.runHive(HiveMain.java:316)
>>>> at org.apache.oozie.action.hadoop.HiveMain.run(HiveMain.java:277)
>>>> at
>>>> org.apache.oozie.action.hadoop.LauncherMain.run(LauncherMain.java:38)
>>>> at org.apache.oozie.action.hadoop.HiveMain.main(HiveMain.java:66)
>>>> at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
>>>> at
>>>> sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
>>>> at
>>>> sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
>>>> at java.lang.reflect.Method.invoke(Method.java:606)
>>>> at
>>>> org.apache.oozie.action.hadoop.LauncherMapper.map(LauncherMapper.java:225)
>>>> at org.apache.hadoop.mapred.MapRunner.run(MapRunner.java:54)
>>>> at org.apache.hadoop.mapred.MapTask.runOldMapper(MapTask.java:430)
>>>> at org.apache.hadoop.mapred.MapTask.run(MapTask.java:342)
>>>> at org.apache.hadoop.mapred.YarnChild$2.run(YarnChild.java:168)
>>>> at java.security.AccessController.doPrivileged(Native Method)
>>>> at javax.security.auth.Subject.doAs(Subject.java:415)
>>>> at
>>>> org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1594)
>>>> at org.apache.hadoop.mapred.YarnChild.main(YarnChild.java:163)
>>>>
>>>> [remaining quoted log output trimmed; it repeats the log from the original message above]
>>>
>>
>
--
Harsh J
Re: query uses WITH blocks and throws exception if run as Oozie hive
action (hive-0.13.1)
Posted by Harsh J <ha...@cloudera.com>.
Your question should be directed to the user@hive.apache.org list.
On Sat, May 16, 2015 at 4:51 AM, Alexander Pivovarov
<ap...@gmail.com> wrote:
> Looks like I found it
> https://issues.apache.org/jira/browse/HIVE-9409
>
> public class UDTFOperator
> ...
>
> - protected final Log LOG = LogFactory.getLog(this.getClass().getName());
> + protected static final Log LOG = LogFactory.getLog(UDTFOperator.class.getName());
>
>
>
> On Fri, May 15, 2015 at 4:17 PM, Alexander Pivovarov <ap...@gmail.com>
> wrote:
>>
>> I also noticed another error message in the logs:
>>
>> 10848 [main] ERROR org.apache.hadoop.hive.ql.exec.tez.TezJobMonitor -
>> Status: Failed
>> 10849 [main] ERROR org.apache.hadoop.hive.ql.exec.tez.TezJobMonitor -
>> Vertex failed, vertexName=Map 32, vertexId=vertex_1431616132488_6430_1_24,
>> diagnostics=[Vertex Input: dual initializer failed.,
>> org.apache.hive.com.esotericsoftware.kryo.KryoException: Unable to find
>> class: org.apache.commons.logging.impl.SLF4JLocationAwareLog
>> Serialization trace:
>> LOG (org.apache.hadoop.hive.ql.exec.UDTFOperator)
>> childOperators (org.apache.hadoop.hive.ql.exec.SelectOperator)
>> childOperators (org.apache.hadoop.hive.ql.exec.TableScanOperator)
>> aliasToWork (org.apache.hadoop.hive.ql.plan.MapWork)]
>>
>> One of the WITH blocks used the explode() UDTF.
>> I replaced it with "select ... union all select ... union all select ..."
>> and the query works fine now.
>>
>> Do you know of any UDTF or Kryo issues fixed after 0.13.1?
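The rewrite described above can be sketched as follows. This is a hypothetical illustration only (the thread does not show the real query; the table and values here are made up), in HiveQL:

```sql
-- Before: explode() puts a UDTFOperator into the plan, whose instance
-- LOG field then fails to deserialize under Kryo on the Tez side.
WITH statuses AS (
  SELECT t.code
  FROM dual
  LATERAL VIEW explode(array('new', 'open', 'closed')) t AS code
)
SELECT * FROM statuses;

-- After: the same three rows built with UNION ALL; no UDTF is involved,
-- so the problematic operator never enters the serialized plan.
WITH statuses AS (
  SELECT 'new'    AS code FROM dual
  UNION ALL
  SELECT 'open'   AS code FROM dual
  UNION ALL
  SELECT 'closed' AS code FROM dual
)
SELECT * FROM statuses;
```

The UNION ALL form produces the same rows but keeps UDTFOperator out of the serialized plan, which is what sidesteps the Kryo error quoted above.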
>>
>>
>> On Fri, May 15, 2015 at 3:20 PM, Alexander Pivovarov
>> <ap...@gmail.com> wrote:
>>>
>>> Looks like it was fixed in hive-0.14
>>> https://issues.apache.org/jira/browse/HIVE-7079
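Until that fix is available, one workaround is to materialize the CTE as a work table instead of using WITH. This is only a sketch based on the simplified query from the first message (the work-table name is made up, and the elided join condition is left elided):

```sql
-- Step 1: materialize the CTE body (work-table name is hypothetical)
CREATE TABLE wk_sacusaloan AS
SELECT DISTINCT e, f, g FROM E;

-- Step 2: run the CTAS against the materialized table instead of the CTE
CREATE TABLE my_consumer AS
SELECT A.a, A.b, A.c,
       if(wk_sacusaloan.id IS NULL, 0, 1) AS sacusaloan_status
FROM (SELECT a, b, c FROM A) A
LEFT JOIN wk_sacusaloan ON (...);

-- Step 3: clean up
DROP TABLE wk_sacusaloan;
```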
>>>
>>> On Fri, May 15, 2015 at 2:26 PM, Alexander Pivovarov
>>> <ap...@gmail.com> wrote:
>>>>
>>>> Hi Everyone
>>>>
>>>> I'm using hive-0.13.1 (HDP-2.1.5) and getting the following stack trace
>>>> when I run my query (which has a WITH block) via Oozie. (BTW, the query
>>>> works fine in the CLI.)
>>>>
>>>> I can't post the exact query, but the structure is similar to:
>>>>
>>>> create table my_consumer
>>>> as
>>>> with sacusaloan as (select distinct e,f,g from E)
>>>>
>>>> select A.a, A.b, A.c,
>>>> if(sacusaloan.id is null, 0, 1) as sacusaloan_status
>>>> from (select a,b,c from A) A
>>>> left join sacusaloan on (...)
>>>>
>>>> [quoted log output trimmed; it repeats the log from the original message above]
>>>
>>
>
--
Harsh J
Re: query uses WITH blocks and throws exception if run as Oozie hive
action (hive-0.13.1)
Posted by Alexander Pivovarov <ap...@gmail.com>.
Looks like I found it
https://issues.apache.org/jira/browse/HIVE-9409
public class UDTFOperator
...
- protected final Log LOG = LogFactory.getLog(this.getClass().getName());
+ protected static final Log LOG =
LogFactory.getLog(UDTFOperator.class.getName());
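
[Editorial note: the sketch below is not part of the original mail.] The HIVE-9409 diff above works because Kryo serializes an operator's instance fields when shipping the query plan, and a per-instance `LOG` drags the commons-logging implementation class into the serialized graph; a static field is class state, not instance state, so it is skipped. The same mechanics can be seen with plain Java serialization. This is an analogy only, with hypothetical `FakeLog` and `Operator*` classes standing in for the Hive types:

```java
import java.io.*;

// Stand-in for org.apache.commons.logging.Log: available where the plan is
// built but (in this analogy) not usable on the deserializing side.
class FakeLog { }  // deliberately NOT Serializable

class OperatorWithInstanceLog implements Serializable {
    // Instance field: part of the serialized object graph, so writing the
    // operator fails because FakeLog cannot be serialized.
    protected final FakeLog LOG = new FakeLog();
}

class OperatorWithStaticLog implements Serializable {
    // Static field: never part of instance state, so serialization skips
    // it entirely -- the essence of the HIVE-9409 fix.
    protected static final FakeLog LOG = new FakeLog();
}

public class LogSerializationDemo {
    static boolean roundTrips(Serializable obj) {
        try (ByteArrayOutputStream bos = new ByteArrayOutputStream();
             ObjectOutputStream oos = new ObjectOutputStream(bos)) {
            oos.writeObject(obj);
            return true;
        } catch (NotSerializableException e) {
            return false;  // the logger field poisoned the object graph
        } catch (IOException e) {
            throw new UncheckedIOException(e);
        }
    }

    public static void main(String[] args) {
        System.out.println(roundTrips(new OperatorWithInstanceLog())); // false
        System.out.println(roundTrips(new OperatorWithStaticLog()));   // true
    }
}
```

Kryo fails differently (it reports the missing class name, as in the `SLF4JLocationAwareLog` trace below) but for the same underlying reason: the logger object rides along with every serialized operator unless the field is static.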
On Fri, May 15, 2015 at 4:17 PM, Alexander Pivovarov <ap...@gmail.com>
wrote:
> I also noticed another error message in logs
>
> 10848 [main] ERROR org.apache.hadoop.hive.ql.exec.tez.TezJobMonitor -
> Status: Failed
> 10849 [main] ERROR org.apache.hadoop.hive.ql.exec.tez.TezJobMonitor -
> Vertex failed, vertexName=Map 32, vertexId=vertex_1431616132488_6430_1_24,
> diagnostics=[Vertex Input: dual initializer failed.,
> org.apache.hive.com.esotericsoftware.kryo.KryoException: Unable to find
> class: org.apache.commons.logging.impl.SLF4JLocationAwareLog
> Serialization trace:
> LOG (org.apache.hadoop.hive.ql.exec.UDTFOperator)
> childOperators (org.apache.hadoop.hive.ql.exec.SelectOperator)
> childOperators (org.apache.hadoop.hive.ql.exec.TableScanOperator)
> aliasToWork (org.apache.hadoop.hive.ql.plan.MapWork)]
>
> one of the WITH blocks had explode() UDTF
> I replaced it with "select ... union all select ... union all select ..."
> and the query works fine now.
>
> Do you know anything about UDTF and Kryo issues fixed after 0.13.1?
>
>
> On Fri, May 15, 2015 at 3:20 PM, Alexander Pivovarov <apivovarov@gmail.com
> > wrote:
>
>> Looks like it was fixed in hive-0.14
>> https://issues.apache.org/jira/browse/HIVE-7079
>>
>> On Fri, May 15, 2015 at 2:26 PM, Alexander Pivovarov <
>> apivovarov@gmail.com> wrote:
>>
>>> Hi Everyone
>>>
>>> I'm using hive-0.13.1 (HDP-2.1.5) and getting the following stacktrace
>>> if I run my query (which has a WITH block) via Oozie. (BTW, the query
>>> works fine in the CLI)
>>>
>>> I can't put exact query but the structure is similar to
>>>
>>> create table my_consumer
>>> as
>>> with sacusaloan as (select distinct e,f,g from E)
>>>
>>> select A.a, A.b, A.c,
>>> if(sacusaloan.id is null, 0, 1) as sacusaloan_status
>>> from (select a,b,c from A) A
>>> left join sacusaloan on (...)
>>>
>>> 8799 [main] INFO hive.ql.parse.ParseDriver - Parse Completed
>>> 8799 [main] INFO org.apache.hadoop.hive.ql.log.PerfLogger - </PERFLOG method=parse start=1431723485500 end=1431723485602 duration=102 from=org.apache.hadoop.hive.ql.Driver>
>>> 8799 [main] INFO org.apache.hadoop.hive.ql.log.PerfLogger - <PERFLOG method=semanticAnalyze from=org.apache.hadoop.hive.ql.Driver>
>>> 8834 [main] INFO org.apache.hadoop.hive.ql.parse.SemanticAnalyzer - Starting Semantic Analysis
>>> 8837 [main] INFO org.apache.hadoop.hive.ql.parse.SemanticAnalyzer - Creating table wk_qualified_outsource_loan_consumer position=13
>>> 8861 [main] INFO org.apache.hadoop.hive.ql.parse.SemanticAnalyzer - Completed phase 1 of Semantic Analysis
>>> 8861 [main] INFO org.apache.hadoop.hive.ql.parse.SemanticAnalyzer - Get metadata for source tables
>>> 8865 [main] ERROR hive.ql.metadata.Hive - NoSuchObjectException(message:default.sacusaloan table not found)
>>> at org.apache.hadoop.hive.metastore.api.ThriftHiveMetastore$get_table_result$get_table_resultStandardScheme.read(ThriftHiveMetastore.java:29338)
>>> at org.apache.hadoop.hive.metastore.api.ThriftHiveMetastore$get_table_result$get_table_resultStandardScheme.read(ThriftHiveMetastore.java:29306)
>>> at org.apache.hadoop.hive.metastore.api.ThriftHiveMetastore$get_table_result.read(ThriftHiveMetastore.java:29237)
>>> at org.apache.thrift.TServiceClient.receiveBase(TServiceClient.java:78)
>>> at org.apache.hadoop.hive.metastore.api.ThriftHiveMetastore$Client.recv_get_table(ThriftHiveMetastore.java:1036)
>>> at org.apache.hadoop.hive.metastore.api.ThriftHiveMetastore$Client.get_table(ThriftHiveMetastore.java:1022)
>>> at org.apache.hadoop.hive.metastore.HiveMetaStoreClient.getTable(HiveMetaStoreClient.java:997)
>>> at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
>>> at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
>>> at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
>>> at java.lang.reflect.Method.invoke(Method.java:606)
>>> at org.apache.hadoop.hive.metastore.RetryingMetaStoreClient.invoke(RetryingMetaStoreClient.java:89)
>>> at com.sun.proxy.$Proxy18.getTable(Unknown Source)
>>> at org.apache.hadoop.hive.ql.metadata.Hive.getTable(Hive.java:976)
>>> at org.apache.hadoop.hive.ql.metadata.Hive.getTable(Hive.java:918)
>>> at org.apache.hadoop.hive.ql.parse.SemanticAnalyzer.getMetaData(SemanticAnalyzer.java:1263)
>>> at org.apache.hadoop.hive.ql.parse.SemanticAnalyzer.getMetaData(SemanticAnalyzer.java:1232)
>>> at org.apache.hadoop.hive.ql.parse.SemanticAnalyzer.analyzeInternal(SemanticAnalyzer.java:9252)
>>> at org.apache.hadoop.hive.ql.parse.BaseSemanticAnalyzer.analyze(BaseSemanticAnalyzer.java:327)
>>> at org.apache.hadoop.hive.ql.Driver.compile(Driver.java:427)
>>> at org.apache.hadoop.hive.ql.Driver.compile(Driver.java:323)
>>> at org.apache.hadoop.hive.ql.Driver.compileInternal(Driver.java:980)
>>> at org.apache.hadoop.hive.ql.Driver.runInternal(Driver.java:1045)
>>> at org.apache.hadoop.hive.ql.Driver.run(Driver.java:916)
>>> at org.apache.hadoop.hive.ql.Driver.run(Driver.java:906)
>>> at org.apache.hadoop.hive.cli.CliDriver.processLocalCmd(CliDriver.java:268)
>>> at org.apache.hadoop.hive.cli.CliDriver.processCmd(CliDriver.java:220)
>>> at org.apache.hadoop.hive.cli.CliDriver.processLine(CliDriver.java:423)
>>> at org.apache.hadoop.hive.cli.CliDriver.processLine(CliDriver.java:359)
>>> at org.apache.hadoop.hive.cli.CliDriver.processReader(CliDriver.java:456)
>>> at org.apache.hadoop.hive.cli.CliDriver.processFile(CliDriver.java:466)
>>> at org.apache.hadoop.hive.cli.CliDriver.executeDriver(CliDriver.java:749)
>>> at org.apache.hadoop.hive.cli.CliDriver.run(CliDriver.java:686)
>>> at org.apache.hadoop.hive.cli.CliDriver.main(CliDriver.java:625)
>>> at org.apache.oozie.action.hadoop.HiveMain.runHive(HiveMain.java:316)
>>> at org.apache.oozie.action.hadoop.HiveMain.run(HiveMain.java:277)
>>> at org.apache.oozie.action.hadoop.LauncherMain.run(LauncherMain.java:38)
>>> at org.apache.oozie.action.hadoop.HiveMain.main(HiveMain.java:66)
>>> at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
>>> at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
>>> at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
>>> at java.lang.reflect.Method.invoke(Method.java:606)
>>> at org.apache.oozie.action.hadoop.LauncherMapper.map(LauncherMapper.java:225)
>>> at org.apache.hadoop.mapred.MapRunner.run(MapRunner.java:54)
>>> at org.apache.hadoop.mapred.MapTask.runOldMapper(MapTask.java:430)
>>> at org.apache.hadoop.mapred.MapTask.run(MapTask.java:342)
>>> at org.apache.hadoop.mapred.YarnChild$2.run(YarnChild.java:168)
>>> at java.security.AccessController.doPrivileged(Native Method)
>>> at javax.security.auth.Subject.doAs(Subject.java:415)
>>> at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1594)
>>> at org.apache.hadoop.mapred.YarnChild.main(YarnChild.java:163)
>>>
>>> 8872 [main] ERROR hive.ql.metadata.Hive - NoSuchObjectException(message:default.recallloan table not found)
>>> at org.apache.hadoop.hive.metastore.api.ThriftHiveMetastore$get_table_result$get_table_resultStandardScheme.read(ThriftHiveMetastore.java:29338)
>>> at org.apache.hadoop.hive.metastore.api.ThriftHiveMetastore$get_table_result$get_table_resultStandardScheme.read(ThriftHiveMetastore.java:29306)
>>> at org.apache.hadoop.hive.metastore.api.ThriftHiveMetastore$get_table_result.read(ThriftHiveMetastore.java:29237)
>>> at org.apache.thrift.TServiceClient.receiveBase(TServiceClient.java:78)
>>> at org.apache.hadoop.hive.metastore.api.ThriftHiveMetastore$Client.recv_get_table(ThriftHiveMetastore.java:1036)
>>> at org.apache.hadoop.hive.metastore.api.ThriftHiveMetastore$Client.get_table(ThriftHiveMetastore.java:1022)
>>> at org.apache.hadoop.hive.metastore.HiveMetaStoreClient.getTable(HiveMetaStoreClient.java:997)
>>> at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
>>> at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
>>> at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
>>> at java.lang.reflect.Method.invoke(Method.java:606)
>>> at org.apache.hadoop.hive.metastore.RetryingMetaStoreClient.invoke(RetryingMetaStoreClient.java:89)
>>> at com.sun.proxy.$Proxy18.getTable(Unknown Source)
>>> at org.apache.hadoop.hive.ql.metadata.Hive.getTable(Hive.java:976)
>>> at org.apache.hadoop.hive.ql.metadata.Hive.getTable(Hive.java:918)
>>> at org.apache.hadoop.hive.ql.parse.SemanticAnalyzer.getMetaData(SemanticAnalyzer.java:1263)
>>> at org.apache.hadoop.hive.ql.parse.SemanticAnalyzer.getMetaData(SemanticAnalyzer.java:1232)
>>> at org.apache.hadoop.hive.ql.parse.SemanticAnalyzer.analyzeInternal(SemanticAnalyzer.java:9252)
>>> at org.apache.hadoop.hive.ql.parse.BaseSemanticAnalyzer.analyze(BaseSemanticAnalyzer.java:327)
>>> at org.apache.hadoop.hive.ql.Driver.compile(Driver.java:427)
>>> at org.apache.hadoop.hive.ql.Driver.compile(Driver.java:323)
>>> at org.apache.hadoop.hive.ql.Driver.compileInternal(Driver.java:980)
>>> at org.apache.hadoop.hive.ql.Driver.runInternal(Driver.java:1045)
>>> at org.apache.hadoop.hive.ql.Driver.run(Driver.java:916)
>>> at org.apache.hadoop.hive.ql.Driver.run(Driver.java:906)
>>> at org.apache.hadoop.hive.cli.CliDriver.processLocalCmd(CliDriver.java:268)
>>> at org.apache.hadoop.hive.cli.CliDriver.processCmd(CliDriver.java:220)
>>> at org.apache.hadoop.hive.cli.CliDriver.processLine(CliDriver.java:423)
>>> at org.apache.hadoop.hive.cli.CliDriver.processLine(CliDriver.java:359)
>>> at org.apache.hadoop.hive.cli.CliDriver.processReader(CliDriver.java:456)
>>> at org.apache.hadoop.hive.cli.CliDriver.processFile(CliDriver.java:466)
>>> at org.apache.hadoop.hive.cli.CliDriver.executeDriver(CliDriver.java:749)
>>> at org.apache.hadoop.hive.cli.CliDriver.run(CliDriver.java:686)
>>> at org.apache.hadoop.hive.cli.CliDriver.main(CliDriver.java:625)
>>> at org.apache.oozie.action.hadoop.HiveMain.runHive(HiveMain.java:316)
>>> at org.apache.oozie.action.hadoop.HiveMain.run(HiveMain.java:277)
>>> at org.apache.oozie.action.hadoop.LauncherMain.run(LauncherMain.java:38)
>>> at org.apache.oozie.action.hadoop.HiveMain.main(HiveMain.java:66)
>>> at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
>>> at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
>>> at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
>>> at java.lang.reflect.Method.invoke(Method.java:606)
>>> at org.apache.oozie.action.hadoop.LauncherMapper.map(LauncherMapper.java:225)
>>> at org.apache.hadoop.mapred.MapRunner.run(MapRunner.java:54)
>>> at org.apache.hadoop.mapred.MapTask.runOldMapper(MapTask.java:430)
>>> at org.apache.hadoop.mapred.MapTask.run(MapTask.java:342)
>>> at org.apache.hadoop.mapred.YarnChild$2.run(YarnChild.java:168)
>>> at java.security.AccessController.doPrivileged(Native Method)
>>> at javax.security.auth.Subject.doAs(Subject.java:415)
>>> at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1594)
>>> at org.apache.hadoop.mapred.YarnChild.main(YarnChild.java:163)
>>>
>>> 9700 [main] ERROR hive.ql.metadata.Hive - NoSuchObjectException(message:default.loanmob table not found)
>>> at org.apache.hadoop.hive.metastore.api.ThriftHiveMetastore$get_table_result$get_table_resultStandardScheme.read(ThriftHiveMetastore.java:29338)
>>> at org.apache.hadoop.hive.metastore.api.ThriftHiveMetastore$get_table_result$get_table_resultStandardScheme.read(ThriftHiveMetastore.java:29306)
>>> at org.apache.hadoop.hive.metastore.api.ThriftHiveMetastore$get_table_result.read(ThriftHiveMetastore.java:29237)
>>> at org.apache.thrift.TServiceClient.receiveBase(TServiceClient.java:78)
>>> at org.apache.hadoop.hive.metastore.api.ThriftHiveMetastore$Client.recv_get_table(ThriftHiveMetastore.java:1036)
>>> at org.apache.hadoop.hive.metastore.api.ThriftHiveMetastore$Client.get_table(ThriftHiveMetastore.java:1022)
>>> at org.apache.hadoop.hive.metastore.HiveMetaStoreClient.getTable(HiveMetaStoreClient.java:997)
>>> at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
>>> at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
>>> at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
>>> at java.lang.reflect.Method.invoke(Method.java:606)
>>> at org.apache.hadoop.hive.metastore.RetryingMetaStoreClient.invoke(RetryingMetaStoreClient.java:89)
>>> at com.sun.proxy.$Proxy18.getTable(Unknown Source)
>>> at org.apache.hadoop.hive.ql.metadata.Hive.getTable(Hive.java:976)
>>> at org.apache.hadoop.hive.ql.metadata.Hive.getTable(Hive.java:918)
>>> at org.apache.hadoop.hive.ql.parse.SemanticAnalyzer.getMetaData(SemanticAnalyzer.java:1263)
>>> at org.apache.hadoop.hive.ql.parse.SemanticAnalyzer.getMetaData(SemanticAnalyzer.java:1232)
>>> at org.apache.hadoop.hive.ql.parse.SemanticAnalyzer.analyzeInternal(SemanticAnalyzer.java:9252)
>>> at org.apache.hadoop.hive.ql.parse.BaseSemanticAnalyzer.analyze(BaseSemanticAnalyzer.java:327)
>>> at org.apache.hadoop.hive.ql.Driver.compile(Driver.java:427)
>>> at org.apache.hadoop.hive.ql.Driver.compile(Driver.java:323)
>>> at org.apache.hadoop.hive.ql.Driver.compileInternal(Driver.java:980)
>>> at org.apache.hadoop.hive.ql.Driver.runInternal(Driver.java:1045)
>>> at org.apache.hadoop.hive.ql.Driver.run(Driver.java:916)
>>> at org.apache.hadoop.hive.ql.Driver.run(Driver.java:906)
>>> at org.apache.hadoop.hive.cli.CliDriver.processLocalCmd(CliDriver.java:268)
>>> at org.apache.hadoop.hive.cli.CliDriver.processCmd(CliDriver.java:220)
>>> at org.apache.hadoop.hive.cli.CliDriver.processLine(CliDriver.java:423)
>>> at org.apache.hadoop.hive.cli.CliDriver.processLine(CliDriver.java:359)
>>> at org.apache.hadoop.hive.cli.CliDriver.processReader(CliDriver.java:456)
>>> at org.apache.hadoop.hive.cli.CliDriver.processFile(CliDriver.java:466)
>>> at org.apache.hadoop.hive.cli.CliDriver.executeDriver(CliDriver.java:749)
>>> at org.apache.hadoop.hive.cli.CliDriver.run(CliDriver.java:686)
>>> at org.apache.hadoop.hive.cli.CliDriver.main(CliDriver.java:625)
>>> at org.apache.oozie.action.hadoop.HiveMain.runHive(HiveMain.java:316)
>>> at org.apache.oozie.action.hadoop.HiveMain.run(HiveMain.java:277)
>>> at org.apache.oozie.action.hadoop.LauncherMain.run(LauncherMain.java:38)
>>> at org.apache.oozie.action.hadoop.HiveMain.main(HiveMain.java:66)
>>> at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
>>> at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
>>> at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
>>> at java.lang.reflect.Method.invoke(Method.java:606)
>>> at org.apache.oozie.action.hadoop.LauncherMapper.map(LauncherMapper.java:225)
>>> at org.apache.hadoop.mapred.MapRunner.run(MapRunner.java:54)
>>> at org.apache.hadoop.mapred.MapTask.runOldMapper(MapTask.java:430)
>>> at org.apache.hadoop.mapred.MapTask.run(MapTask.java:342)
>>> at org.apache.hadoop.mapred.YarnChild$2.run(YarnChild.java:168)
>>> at java.security.AccessController.doPrivileged(Native Method)
>>> at javax.security.auth.Subject.doAs(Subject.java:415)
>>> at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1594)
>>> at org.apache.hadoop.mapred.YarnChild.main(YarnChild.java:163)
>>>
>>> 9708 [main] INFO org.apache.hadoop.hive.ql.parse.SemanticAnalyzer - Get metadata for subqueries
>>> 9708 [main] INFO org.apache.hadoop.hive.ql.parse.SemanticAnalyzer - Get metadata for source tables
>>> 9798 [main] INFO org.apache.hadoop.hive.ql.parse.SemanticAnalyzer - Get metadata for subqueries
>>> 9798 [main] INFO org.apache.hadoop.hive.ql.parse.SemanticAnalyzer - Get metadata for destination tables
>>> 9798 [main] INFO org.apache.hadoop.hive.ql.parse.SemanticAnalyzer - Get metadata for source tables
>>> 9815 [main] INFO org.apache.hadoop.hive.ql.parse.SemanticAnalyzer - Get metadata for subqueries
>>> 9815 [main] INFO org.apache.hadoop.hive.ql.parse.SemanticAnalyzer - Get metadata for destination tables
>>> 9815 [main] INFO org.apache.hadoop.hive.ql.parse.SemanticAnalyzer - Get metadata for source tables
>>> 9815 [main] INFO org.apache.hadoop.hive.ql.parse.SemanticAnalyzer - Get metadata for subqueries
>>> 9815 [main] INFO org.apache.hadoop.hive.ql.parse.SemanticAnalyzer - Get metadata for source tables
>>> 9827 [main] INFO org.apache.hadoop.hive.ql.parse.SemanticAnalyzer - Get metadata for subqueries
>>> 9827 [main] INFO org.apache.hadoop.hive.ql.parse.SemanticAnalyzer - Get metadata for destination tables
>>> 9827 [main] INFO org.apache.hadoop.hive.ql.parse.SemanticAnalyzer - Get metadata for destination tables
>>> 9827 [main] INFO org.apache.hadoop.hive.ql.parse.SemanticAnalyzer - Get metadata for source tables
>>> 9852 [main] INFO org.apache.hadoop.hive.ql.parse.SemanticAnalyzer - Get metadata for subqueries
>>> 9852 [main] INFO org.apache.hadoop.hive.ql.parse.SemanticAnalyzer - Get metadata for destination tables
>>> 9852 [main] INFO org.apache.hadoop.hive.ql.parse.SemanticAnalyzer - Get metadata for destination tables
>>> 9876 [main] INFO org.apache.hadoop.hive.ql.exec.Utilities - Create dirs hdfs://hadev/tmp/hive-svc-yarn/hive_2015-05-15_13-58-05_500_5122268870471366216-1 with permission rwxrwxrwx recursive false
>>> 9894 [main] INFO org.apache.hadoop.hive.ql.parse.SemanticAnalyzer - Completed getting MetaData in Semantic Analysis
>>> 10277 [main] INFO org.apache.hadoop.hive.ql.parse.SemanticAnalyzer - Get metadata for source tables
>>> 10289 [main] INFO org.apache.hadoop.hive.ql.parse.SemanticAnalyzer - Get metadata for subqueries
>>> 10290 [main] INFO org.apache.hadoop.hive.ql.parse.SemanticAnalyzer - Get metadata for destination tables
>>> 10294 [main] INFO org.apache.hadoop.hive.ql.parse.SemanticAnalyzer - Get metadata for source tables
>>> 10294 [main] INFO org.apache.hadoop.hive.ql.parse.SemanticAnalyzer - Get metadata for subqueries
>>> 10294 [main] INFO org.apache.hadoop.hive.ql.parse.SemanticAnalyzer - Get metadata for source tables
>>> 10320 [main] INFO org.apache.hadoop.hive.ql.parse.SemanticAnalyzer - Get metadata for subqueries
>>> 10321 [main] INFO org.apache.hadoop.hive.ql.parse.SemanticAnalyzer - Get metadata for destination tables
>>> 10321 [main] INFO org.apache.hadoop.hive.ql.parse.SemanticAnalyzer - Get metadata for destination tables
>>> 10816 [main] INFO org.apache.hadoop.hive.ql.parse.SemanticAnalyzer - Set stats collection dir : hdfs://hadev/tmp/hive-svc-yarn/hive_2015-05-15_13-58-05_500_5122268870471366216-1/-ext-10002
>>>
>>>
>>>
>>
>
>>> at org.apache.oozie.action.hadoop.LauncherMapper.map(LauncherMapper.java:225)
>>> at org.apache.hadoop.mapred.MapRunner.run(MapRunner.java:54)
>>> at org.apache.hadoop.mapred.MapTask.runOldMapper(MapTask.java:430)
>>> at org.apache.hadoop.mapred.MapTask.run(MapTask.java:342)
>>> at org.apache.hadoop.mapred.YarnChild$2.run(YarnChild.java:168)
>>> at java.security.AccessController.doPrivileged(Native Method)
>>> at javax.security.auth.Subject.doAs(Subject.java:415)
>>> at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1594)
>>> at org.apache.hadoop.mapred.YarnChild.main(YarnChild.java:163)
>>>
>>> 9708 [main] INFO org.apache.hadoop.hive.ql.parse.SemanticAnalyzer - Get metadata for subqueries
>>> 9708 [main] INFO org.apache.hadoop.hive.ql.parse.SemanticAnalyzer - Get metadata for source tables
>>> 9798 [main] INFO org.apache.hadoop.hive.ql.parse.SemanticAnalyzer - Get metadata for subqueries
>>> 9798 [main] INFO org.apache.hadoop.hive.ql.parse.SemanticAnalyzer - Get metadata for destination tables
>>> 9798 [main] INFO org.apache.hadoop.hive.ql.parse.SemanticAnalyzer - Get metadata for source tables
>>> 9815 [main] INFO org.apache.hadoop.hive.ql.parse.SemanticAnalyzer - Get metadata for subqueries
>>> 9815 [main] INFO org.apache.hadoop.hive.ql.parse.SemanticAnalyzer - Get metadata for destination tables
>>> 9815 [main] INFO org.apache.hadoop.hive.ql.parse.SemanticAnalyzer - Get metadata for source tables
>>> 9815 [main] INFO org.apache.hadoop.hive.ql.parse.SemanticAnalyzer - Get metadata for subqueries
>>> 9815 [main] INFO org.apache.hadoop.hive.ql.parse.SemanticAnalyzer - Get metadata for source tables
>>> 9827 [main] INFO org.apache.hadoop.hive.ql.parse.SemanticAnalyzer - Get metadata for subqueries
>>> 9827 [main] INFO org.apache.hadoop.hive.ql.parse.SemanticAnalyzer - Get metadata for destination tables
>>> 9827 [main] INFO org.apache.hadoop.hive.ql.parse.SemanticAnalyzer - Get metadata for destination tables
>>> 9827 [main] INFO org.apache.hadoop.hive.ql.parse.SemanticAnalyzer - Get metadata for source tables
>>> 9852 [main] INFO org.apache.hadoop.hive.ql.parse.SemanticAnalyzer - Get metadata for subqueries
>>> 9852 [main] INFO org.apache.hadoop.hive.ql.parse.SemanticAnalyzer - Get metadata for destination tables
>>> 9852 [main] INFO org.apache.hadoop.hive.ql.parse.SemanticAnalyzer - Get metadata for destination tables
>>> 9876 [main] INFO org.apache.hadoop.hive.ql.exec.Utilities - Create dirs hdfs://hadev/tmp/hive-svc-yarn/hive_2015-05-15_13-58-05_500_5122268870471366216-1 with permission rwxrwxrwx recursive false
>>> 9894 [main] INFO org.apache.hadoop.hive.ql.parse.SemanticAnalyzer - Completed getting MetaData in Semantic Analysis
>>> 10277 [main] INFO org.apache.hadoop.hive.ql.parse.SemanticAnalyzer - Get metadata for source tables
>>> 10289 [main] INFO org.apache.hadoop.hive.ql.parse.SemanticAnalyzer - Get metadata for subqueries
>>> 10290 [main] INFO org.apache.hadoop.hive.ql.parse.SemanticAnalyzer - Get metadata for destination tables
>>> 10294 [main] INFO org.apache.hadoop.hive.ql.parse.SemanticAnalyzer - Get metadata for source tables
>>> 10294 [main] INFO org.apache.hadoop.hive.ql.parse.SemanticAnalyzer - Get metadata for subqueries
>>> 10294 [main] INFO org.apache.hadoop.hive.ql.parse.SemanticAnalyzer - Get metadata for source tables
>>> 10320 [main] INFO org.apache.hadoop.hive.ql.parse.SemanticAnalyzer - Get metadata for subqueries
>>> 10321 [main] INFO org.apache.hadoop.hive.ql.parse.SemanticAnalyzer - Get metadata for destination tables
>>> 10321 [main] INFO org.apache.hadoop.hive.ql.parse.SemanticAnalyzer - Get metadata for destination tables
>>> 10816 [main] INFO org.apache.hadoop.hive.ql.parse.SemanticAnalyzer - Set stats collection dir : hdfs://hadev/tmp/hive-svc-yarn/hive_2015-05-15_13-58-05_500_5122268870471366216-1/-ext-10002
>>>
>>>
>>>
>>
>
Re: query uses WITH blocks and throws exception if run as Oozie hive action (hive-0.13.1)
Posted by Alexander Pivovarov <ap...@gmail.com>.
Looks like I found it:
https://issues.apache.org/jira/browse/HIVE-9409
public class UDTFOperator
...
- protected final Log LOG = LogFactory.getLog(this.getClass().getName());
+ protected static final Log LOG = LogFactory.getLog(UDTFOperator.class.getName());
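The HIVE-9409 change works because serializers such as Kryo walk an object's instance fields when shipping the query plan to a task: a non-static LOG field drags the concrete commons-logging implementation class into the stream, while a static field is never serialized at all, so the executing task no longer needs that class on its classpath. A minimal sketch of the instance-vs-static distinction using plain java.io serialization (class and field names here are invented for illustration; Kryo's default FieldSerializer treats static fields the same way):

```java
import java.io.ByteArrayOutputStream;
import java.io.ObjectOutputStream;
import java.io.Serializable;

public class StaticFieldDemo {
    // Stand-in for an operator carrying a logger-like field (hypothetical names)
    static class Operator implements Serializable {
        // Instance field: written into the serialized stream, so the
        // deserializing JVM must be able to resolve its value's class
        // (the failure mode before HIVE-9409)
        String instanceTag = "instance-logger";
        // Static field: skipped by serialization entirely
        // (the reason the fix makes LOG static)
        static String staticTag = "static-logger";
    }

    // Serializes an Operator and reports whether the raw stream bytes
    // contain the given marker string
    static boolean serializedFormContains(String marker) throws Exception {
        ByteArrayOutputStream bos = new ByteArrayOutputStream();
        try (ObjectOutputStream oos = new ObjectOutputStream(bos)) {
            oos.writeObject(new Operator());
        }
        return new String(bos.toByteArray(), "ISO-8859-1").contains(marker);
    }

    public static void main(String[] args) throws Exception {
        System.out.println(serializedFormContains("instance-logger")); // true
        System.out.println(serializedFormContains("static-logger"));   // false
    }
}
```

The same principle explains the KryoException above: the serialized MapWork contained the UDTFOperator's instance LOG field, whose runtime class (SLF4JLocationAwareLog) was absent from the task's classpath.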
On Fri, May 15, 2015 at 4:17 PM, Alexander Pivovarov <ap...@gmail.com>
wrote:
> I also noticed another error message in logs
>
> 10848 [main] ERROR org.apache.hadoop.hive.ql.exec.tez.TezJobMonitor -
> Status: Failed
> 10849 [main] ERROR org.apache.hadoop.hive.ql.exec.tez.TezJobMonitor -
> Vertex failed, vertexName=Map 32, vertexId=vertex_1431616132488_6430_1_24,
> diagnostics=[Vertex Input: dual initializer failed.,
> org.apache.hive.com.esotericsoftware.kryo.KryoException: Unable to find
> class: org.apache.commons.logging.impl.SLF4JLocationAwareLog
> Serialization trace:
> LOG (org.apache.hadoop.hive.ql.exec.UDTFOperator)
> childOperators (org.apache.hadoop.hive.ql.exec.SelectOperator)
> childOperators (org.apache.hadoop.hive.ql.exec.TableScanOperator)
> aliasToWork (org.apache.hadoop.hive.ql.plan.MapWork)]
>
> One of the WITH blocks used the explode() UDTF.
> I replaced it with "select ... union all select ... union all select ..."
> and the query works fine now.
>
> Do you know anything about UDTF and Kryo issues fixed after 0.13.1?
>
>
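As a hedged sketch of that workaround (the real query isn't shown, so the values below are invented), a row-generating explode() in a WITH block can be rewritten as explicit UNION ALL branches, which keeps UDTFOperator out of the serialized plan:

```sql
-- Before (hypothetical): explode() places a UDTFOperator in the plan
with codes as (
  select explode(array(1, 2, 3)) as code
)
select count(*) from codes;

-- After: same three rows, no UDTF involved
with codes as (
  select 1 as code
  union all select 2 as code
  union all select 3 as code
)
select count(*) from codes;
```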
> On Fri, May 15, 2015 at 3:20 PM, Alexander Pivovarov <apivovarov@gmail.com
> > wrote:
>
>> Looks like it was fixed in hive-0.14
>> https://issues.apache.org/jira/browse/HIVE-7079
>>
>> On Fri, May 15, 2015 at 2:26 PM, Alexander Pivovarov <
>> apivovarov@gmail.com> wrote:
>>
>>> Hi Everyone
>>>
>>> I'm using hive-0.13.1 (HDP-2.1.5) and getting the following stacktrace
>>> if I run my query (which has a WITH block) via Oozie. (BTW, the query
>>> works fine in the CLI.)
>>>
>>> I can't post the exact query, but the structure is similar to:
>>>
>>> create table my_consumer
>>> as
>>> with sacusaloan as (select distinct e,f,g from E)
>>>
>>> select A.a, A.b, A.c,
>>> if(sacusaloan.id is null, 0, 1) as sacusaloan_status
>>> from (select a,b,c from A) A
>>> left join sacusaloan on (...)
>>>
>>> 8799 [main] INFO hive.ql.parse.ParseDriver - Parse Completed
>>> 8799 [main] INFO org.apache.hadoop.hive.ql.log.PerfLogger - </PERFLOG method=parse start=1431723485500 end=1431723485602 duration=102 from=org.apache.hadoop.hive.ql.Driver>
>>> 8799 [main] INFO org.apache.hadoop.hive.ql.log.PerfLogger - <PERFLOG method=semanticAnalyze from=org.apache.hadoop.hive.ql.Driver>
>>> 8834 [main] INFO org.apache.hadoop.hive.ql.parse.SemanticAnalyzer - Starting Semantic Analysis
>>> 8837 [main] INFO org.apache.hadoop.hive.ql.parse.SemanticAnalyzer - Creating table wk_qualified_outsource_loan_consumer position=13
>>> 8861 [main] INFO org.apache.hadoop.hive.ql.parse.SemanticAnalyzer - Completed phase 1 of Semantic Analysis
>>> 8861 [main] INFO org.apache.hadoop.hive.ql.parse.SemanticAnalyzer - Get metadata for source tables
>>> 8865 [main] ERROR hive.ql.metadata.Hive - NoSuchObjectException(message:default.sacusaloan table not found)
>>> at org.apache.hadoop.hive.metastore.api.ThriftHiveMetastore$get_table_result$get_table_resultStandardScheme.read(ThriftHiveMetastore.java:29338)
>>> at org.apache.hadoop.hive.metastore.api.ThriftHiveMetastore$get_table_result$get_table_resultStandardScheme.read(ThriftHiveMetastore.java:29306)
>>> at org.apache.hadoop.hive.metastore.api.ThriftHiveMetastore$get_table_result.read(ThriftHiveMetastore.java:29237)
>>> at org.apache.thrift.TServiceClient.receiveBase(TServiceClient.java:78)
>>> at org.apache.hadoop.hive.metastore.api.ThriftHiveMetastore$Client.recv_get_table(ThriftHiveMetastore.java:1036)
>>> at org.apache.hadoop.hive.metastore.api.ThriftHiveMetastore$Client.get_table(ThriftHiveMetastore.java:1022)
>>> at org.apache.hadoop.hive.metastore.HiveMetaStoreClient.getTable(HiveMetaStoreClient.java:997)
>>> at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
>>> at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
>>> at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
>>> at java.lang.reflect.Method.invoke(Method.java:606)
>>> at org.apache.hadoop.hive.metastore.RetryingMetaStoreClient.invoke(RetryingMetaStoreClient.java:89)
>>> at com.sun.proxy.$Proxy18.getTable(Unknown Source)
>>> at org.apache.hadoop.hive.ql.metadata.Hive.getTable(Hive.java:976)
>>> at org.apache.hadoop.hive.ql.metadata.Hive.getTable(Hive.java:918)
>>> at org.apache.hadoop.hive.ql.parse.SemanticAnalyzer.getMetaData(SemanticAnalyzer.java:1263)
>>> at org.apache.hadoop.hive.ql.parse.SemanticAnalyzer.getMetaData(SemanticAnalyzer.java:1232)
>>> at org.apache.hadoop.hive.ql.parse.SemanticAnalyzer.analyzeInternal(SemanticAnalyzer.java:9252)
>>> at org.apache.hadoop.hive.ql.parse.BaseSemanticAnalyzer.analyze(BaseSemanticAnalyzer.java:327)
>>> at org.apache.hadoop.hive.ql.Driver.compile(Driver.java:427)
>>> at org.apache.hadoop.hive.ql.Driver.compile(Driver.java:323)
>>> at org.apache.hadoop.hive.ql.Driver.compileInternal(Driver.java:980)
>>> at org.apache.hadoop.hive.ql.Driver.runInternal(Driver.java:1045)
>>> at org.apache.hadoop.hive.ql.Driver.run(Driver.java:916)
>>> at org.apache.hadoop.hive.ql.Driver.run(Driver.java:906)
>>> at org.apache.hadoop.hive.cli.CliDriver.processLocalCmd(CliDriver.java:268)
>>> at org.apache.hadoop.hive.cli.CliDriver.processCmd(CliDriver.java:220)
>>> at org.apache.hadoop.hive.cli.CliDriver.processLine(CliDriver.java:423)
>>> at org.apache.hadoop.hive.cli.CliDriver.processLine(CliDriver.java:359)
>>> at org.apache.hadoop.hive.cli.CliDriver.processReader(CliDriver.java:456)
>>> at org.apache.hadoop.hive.cli.CliDriver.processFile(CliDriver.java:466)
>>> at org.apache.hadoop.hive.cli.CliDriver.executeDriver(CliDriver.java:749)
>>> at org.apache.hadoop.hive.cli.CliDriver.run(CliDriver.java:686)
>>> at org.apache.hadoop.hive.cli.CliDriver.main(CliDriver.java:625)
>>> at org.apache.oozie.action.hadoop.HiveMain.runHive(HiveMain.java:316)
>>> at org.apache.oozie.action.hadoop.HiveMain.run(HiveMain.java:277)
>>> at org.apache.oozie.action.hadoop.LauncherMain.run(LauncherMain.java:38)
>>> at org.apache.oozie.action.hadoop.HiveMain.main(HiveMain.java:66)
>>> at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
>>> at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
>>> at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
>>> at java.lang.reflect.Method.invoke(Method.java:606)
>>> at org.apache.oozie.action.hadoop.LauncherMapper.map(LauncherMapper.java:225)
>>> at org.apache.hadoop.mapred.MapRunner.run(MapRunner.java:54)
>>> at org.apache.hadoop.mapred.MapTask.runOldMapper(MapTask.java:430)
>>> at org.apache.hadoop.mapred.MapTask.run(MapTask.java:342)
>>> at org.apache.hadoop.mapred.YarnChild$2.run(YarnChild.java:168)
>>> at java.security.AccessController.doPrivileged(Native Method)
>>> at javax.security.auth.Subject.doAs(Subject.java:415)
>>> at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1594)
>>> at org.apache.hadoop.mapred.YarnChild.main(YarnChild.java:163)
>>>
>>> 8872 [main] ERROR hive.ql.metadata.Hive - NoSuchObjectException(message:default.recallloan table not found)
>>> at org.apache.hadoop.hive.metastore.api.ThriftHiveMetastore$get_table_result$get_table_resultStandardScheme.read(ThriftHiveMetastore.java:29338)
>>> at org.apache.hadoop.hive.metastore.api.ThriftHiveMetastore$get_table_result$get_table_resultStandardScheme.read(ThriftHiveMetastore.java:29306)
>>> at org.apache.hadoop.hive.metastore.api.ThriftHiveMetastore$get_table_result.read(ThriftHiveMetastore.java:29237)
>>> at org.apache.thrift.TServiceClient.receiveBase(TServiceClient.java:78)
>>> at org.apache.hadoop.hive.metastore.api.ThriftHiveMetastore$Client.recv_get_table(ThriftHiveMetastore.java:1036)
>>> at org.apache.hadoop.hive.metastore.api.ThriftHiveMetastore$Client.get_table(ThriftHiveMetastore.java:1022)
>>> at org.apache.hadoop.hive.metastore.HiveMetaStoreClient.getTable(HiveMetaStoreClient.java:997)
>>> at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
>>> at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
>>> at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
>>> at java.lang.reflect.Method.invoke(Method.java:606)
>>> at org.apache.hadoop.hive.metastore.RetryingMetaStoreClient.invoke(RetryingMetaStoreClient.java:89)
>>> at com.sun.proxy.$Proxy18.getTable(Unknown Source)
>>> at org.apache.hadoop.hive.ql.metadata.Hive.getTable(Hive.java:976)
>>> at org.apache.hadoop.hive.ql.metadata.Hive.getTable(Hive.java:918)
>>> at org.apache.hadoop.hive.ql.parse.SemanticAnalyzer.getMetaData(SemanticAnalyzer.java:1263)
>>> at org.apache.hadoop.hive.ql.parse.SemanticAnalyzer.getMetaData(SemanticAnalyzer.java:1232)
>>> at org.apache.hadoop.hive.ql.parse.SemanticAnalyzer.analyzeInternal(SemanticAnalyzer.java:9252)
>>> at org.apache.hadoop.hive.ql.parse.BaseSemanticAnalyzer.analyze(BaseSemanticAnalyzer.java:327)
>>> at org.apache.hadoop.hive.ql.Driver.compile(Driver.java:427)
>>> at org.apache.hadoop.hive.ql.Driver.compile(Driver.java:323)
>>> at org.apache.hadoop.hive.ql.Driver.compileInternal(Driver.java:980)
>>> at org.apache.hadoop.hive.ql.Driver.runInternal(Driver.java:1045)
>>> at org.apache.hadoop.hive.ql.Driver.run(Driver.java:916)
>>> at org.apache.hadoop.hive.ql.Driver.run(Driver.java:906)
>>> at org.apache.hadoop.hive.cli.CliDriver.processLocalCmd(CliDriver.java:268)
>>> at org.apache.hadoop.hive.cli.CliDriver.processCmd(CliDriver.java:220)
>>> at org.apache.hadoop.hive.cli.CliDriver.processLine(CliDriver.java:423)
>>> at org.apache.hadoop.hive.cli.CliDriver.processLine(CliDriver.java:359)
>>> at org.apache.hadoop.hive.cli.CliDriver.processReader(CliDriver.java:456)
>>> at org.apache.hadoop.hive.cli.CliDriver.processFile(CliDriver.java:466)
>>> at org.apache.hadoop.hive.cli.CliDriver.executeDriver(CliDriver.java:749)
>>> at org.apache.hadoop.hive.cli.CliDriver.run(CliDriver.java:686)
>>> at org.apache.hadoop.hive.cli.CliDriver.main(CliDriver.java:625)
>>> at org.apache.oozie.action.hadoop.HiveMain.runHive(HiveMain.java:316)
>>> at org.apache.oozie.action.hadoop.HiveMain.run(HiveMain.java:277)
>>> at org.apache.oozie.action.hadoop.LauncherMain.run(LauncherMain.java:38)
>>> at org.apache.oozie.action.hadoop.HiveMain.main(HiveMain.java:66)
>>> at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
>>> at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
>>> at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
>>> at java.lang.reflect.Method.invoke(Method.java:606)
>>> at org.apache.oozie.action.hadoop.LauncherMapper.map(LauncherMapper.java:225)
>>> at org.apache.hadoop.mapred.MapRunner.run(MapRunner.java:54)
>>> at org.apache.hadoop.mapred.MapTask.runOldMapper(MapTask.java:430)
>>> at org.apache.hadoop.mapred.MapTask.run(MapTask.java:342)
>>> at org.apache.hadoop.mapred.YarnChild$2.run(YarnChild.java:168)
>>> at java.security.AccessController.doPrivileged(Native Method)
>>> at javax.security.auth.Subject.doAs(Subject.java:415)
>>> at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1594)
>>> at org.apache.hadoop.mapred.YarnChild.main(YarnChild.java:163)
>>>
>>> 9700 [main] ERROR hive.ql.metadata.Hive - NoSuchObjectException(message:default.loanmob table not found)
>>> at org.apache.hadoop.hive.metastore.api.ThriftHiveMetastore$get_table_result$get_table_resultStandardScheme.read(ThriftHiveMetastore.java:29338)
>>> at org.apache.hadoop.hive.metastore.api.ThriftHiveMetastore$get_table_result$get_table_resultStandardScheme.read(ThriftHiveMetastore.java:29306)
>>> at org.apache.hadoop.hive.metastore.api.ThriftHiveMetastore$get_table_result.read(ThriftHiveMetastore.java:29237)
>>> at org.apache.thrift.TServiceClient.receiveBase(TServiceClient.java:78)
>>> at org.apache.hadoop.hive.metastore.api.ThriftHiveMetastore$Client.recv_get_table(ThriftHiveMetastore.java:1036)
>>> at org.apache.hadoop.hive.metastore.api.ThriftHiveMetastore$Client.get_table(ThriftHiveMetastore.java:1022)
>>> at org.apache.hadoop.hive.metastore.HiveMetaStoreClient.getTable(HiveMetaStoreClient.java:997)
>>> at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
>>> at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
>>> at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
>>> at java.lang.reflect.Method.invoke(Method.java:606)
>>> at org.apache.hadoop.hive.metastore.RetryingMetaStoreClient.invoke(RetryingMetaStoreClient.java:89)
>>> at com.sun.proxy.$Proxy18.getTable(Unknown Source)
>>> at org.apache.hadoop.hive.ql.metadata.Hive.getTable(Hive.java:976)
>>> at org.apache.hadoop.hive.ql.metadata.Hive.getTable(Hive.java:918)
>>> at org.apache.hadoop.hive.ql.parse.SemanticAnalyzer.getMetaData(SemanticAnalyzer.java:1263)
>>> at org.apache.hadoop.hive.ql.parse.SemanticAnalyzer.getMetaData(SemanticAnalyzer.java:1232)
>>> at org.apache.hadoop.hive.ql.parse.SemanticAnalyzer.analyzeInternal(SemanticAnalyzer.java:9252)
>>> at org.apache.hadoop.hive.ql.parse.BaseSemanticAnalyzer.analyze(BaseSemanticAnalyzer.java:327)
>>> at org.apache.hadoop.hive.ql.Driver.compile(Driver.java:427)
>>> at org.apache.hadoop.hive.ql.Driver.compile(Driver.java:323)
>>> at org.apache.hadoop.hive.ql.Driver.compileInternal(Driver.java:980)
>>> at org.apache.hadoop.hive.ql.Driver.runInternal(Driver.java:1045)
>>> at org.apache.hadoop.hive.ql.Driver.run(Driver.java:916)
>>> at org.apache.hadoop.hive.ql.Driver.run(Driver.java:906)
>>> at org.apache.hadoop.hive.cli.CliDriver.processLocalCmd(CliDriver.java:268)
>>> at org.apache.hadoop.hive.cli.CliDriver.processCmd(CliDriver.java:220)
>>> at org.apache.hadoop.hive.cli.CliDriver.processLine(CliDriver.java:423)
>>> at org.apache.hadoop.hive.cli.CliDriver.processLine(CliDriver.java:359)
>>> at org.apache.hadoop.hive.cli.CliDriver.processReader(CliDriver.java:456)
>>> at org.apache.hadoop.hive.cli.CliDriver.processFile(CliDriver.java:466)
>>> at org.apache.hadoop.hive.cli.CliDriver.executeDriver(CliDriver.java:749)
>>> at org.apache.hadoop.hive.cli.CliDriver.run(CliDriver.java:686)
>>> at org.apache.hadoop.hive.cli.CliDriver.main(CliDriver.java:625)
>>> at org.apache.oozie.action.hadoop.HiveMain.runHive(HiveMain.java:316)
>>> at org.apache.oozie.action.hadoop.HiveMain.run(HiveMain.java:277)
>>> at org.apache.oozie.action.hadoop.LauncherMain.run(LauncherMain.java:38)
>>> at org.apache.oozie.action.hadoop.HiveMain.main(HiveMain.java:66)
>>> at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
>>> at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
>>> at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
>>> at java.lang.reflect.Method.invoke(Method.java:606)
>>> at org.apache.oozie.action.hadoop.LauncherMapper.map(LauncherMapper.java:225)
>>> at org.apache.hadoop.mapred.MapRunner.run(MapRunner.java:54)
>>> at org.apache.hadoop.mapred.MapTask.runOldMapper(MapTask.java:430)
>>> at org.apache.hadoop.mapred.MapTask.run(MapTask.java:342)
>>> at org.apache.hadoop.mapred.YarnChild$2.run(YarnChild.java:168)
>>> at java.security.AccessController.doPrivileged(Native Method)
>>> at javax.security.auth.Subject.doAs(Subject.java:415)
>>> at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1594)
>>> at org.apache.hadoop.mapred.YarnChild.main(YarnChild.java:163)
>>>
>>> 9708 [main] INFO org.apache.hadoop.hive.ql.parse.SemanticAnalyzer - Get metadata for subqueries
>>> 9708 [main] INFO org.apache.hadoop.hive.ql.parse.SemanticAnalyzer - Get metadata for source tables
>>> 9798 [main] INFO org.apache.hadoop.hive.ql.parse.SemanticAnalyzer - Get metadata for subqueries
>>> 9798 [main] INFO org.apache.hadoop.hive.ql.parse.SemanticAnalyzer - Get metadata for destination tables
>>> 9798 [main] INFO org.apache.hadoop.hive.ql.parse.SemanticAnalyzer - Get metadata for source tables
>>> 9815 [main] INFO org.apache.hadoop.hive.ql.parse.SemanticAnalyzer - Get metadata for subqueries
>>> 9815 [main] INFO org.apache.hadoop.hive.ql.parse.SemanticAnalyzer - Get metadata for destination tables
>>> 9815 [main] INFO org.apache.hadoop.hive.ql.parse.SemanticAnalyzer - Get metadata for source tables
>>> 9815 [main] INFO org.apache.hadoop.hive.ql.parse.SemanticAnalyzer - Get metadata for subqueries
>>> 9815 [main] INFO org.apache.hadoop.hive.ql.parse.SemanticAnalyzer - Get metadata for source tables
>>> 9827 [main] INFO org.apache.hadoop.hive.ql.parse.SemanticAnalyzer - Get metadata for subqueries
>>> 9827 [main] INFO org.apache.hadoop.hive.ql.parse.SemanticAnalyzer - Get metadata for destination tables
>>> 9827 [main] INFO org.apache.hadoop.hive.ql.parse.SemanticAnalyzer - Get metadata for destination tables
>>> 9827 [main] INFO org.apache.hadoop.hive.ql.parse.SemanticAnalyzer - Get metadata for source tables
>>> 9852 [main] INFO org.apache.hadoop.hive.ql.parse.SemanticAnalyzer - Get metadata for subqueries
>>> 9852 [main] INFO org.apache.hadoop.hive.ql.parse.SemanticAnalyzer - Get metadata for destination tables
>>> 9852 [main] INFO org.apache.hadoop.hive.ql.parse.SemanticAnalyzer - Get metadata for destination tables
>>> 9876 [main] INFO org.apache.hadoop.hive.ql.exec.Utilities - Create dirs hdfs://hadev/tmp/hive-svc-yarn/hive_2015-05-15_13-58-05_500_5122268870471366216-1 with permission rwxrwxrwx recursive false
>>> 9894 [main] INFO org.apache.hadoop.hive.ql.parse.SemanticAnalyzer - Completed getting MetaData in Semantic Analysis
>>> 10277 [main] INFO org.apache.hadoop.hive.ql.parse.SemanticAnalyzer - Get metadata for source tables
>>> 10289 [main] INFO org.apache.hadoop.hive.ql.parse.SemanticAnalyzer - Get metadata for subqueries
>>> 10290 [main] INFO org.apache.hadoop.hive.ql.parse.SemanticAnalyzer - Get metadata for destination tables
>>> 10294 [main] INFO org.apache.hadoop.hive.ql.parse.SemanticAnalyzer - Get metadata for source tables
>>> 10294 [main] INFO org.apache.hadoop.hive.ql.parse.SemanticAnalyzer - Get metadata for subqueries
>>> 10294 [main] INFO org.apache.hadoop.hive.ql.parse.SemanticAnalyzer - Get metadata for source tables
>>> 10320 [main] INFO org.apache.hadoop.hive.ql.parse.SemanticAnalyzer - Get metadata for subqueries
>>> 10321 [main] INFO org.apache.hadoop.hive.ql.parse.SemanticAnalyzer - Get metadata for destination tables
>>> 10321 [main] INFO org.apache.hadoop.hive.ql.parse.SemanticAnalyzer - Get metadata for destination tables
>>> 10816 [main] INFO org.apache.hadoop.hive.ql.parse.SemanticAnalyzer - Set stats collection dir : hdfs://hadev/tmp/hive-svc-yarn/hive_2015-05-15_13-58-05_500_5122268870471366216-1/-ext-10002
>>>
>>>
>>>
>>
>
Re: query uses WITH blocks and throws exception if run as Oozie hive
action (hive-0.13.1)
Posted by Alexander Pivovarov <ap...@gmail.com>.
Looks like I found it
https://issues.apache.org/jira/browse/HIVE-9409
public class UDTFOperator
...
- protected final Log LOG = LogFactory.getLog(this.getClass().getName());
+ protected static final Log LOG =
LogFactory.getLog(UDTFOperator.class.getName());
On Fri, May 15, 2015 at 4:17 PM, Alexander Pivovarov <ap...@gmail.com>
wrote:
> I also noticed another error message in logs
>
> 10848 [main] ERROR org.apache.hadoop.hive.ql.exec.tez.TezJobMonitor -
> Status: Failed
> 10849 [main] ERROR org.apache.hadoop.hive.ql.exec.tez.TezJobMonitor -
> Vertex failed, vertexName=Map 32, vertexId=vertex_1431616132488_6430_1_24,
> diagnostics=[Vertex Input: dual initializer failed.,
> org.apache.hive.com.esotericsoftware.kryo.KryoException: Unable to find
> class: org.apache.commons.logging.impl.SLF4JLocationAwareLog
> Serialization trace:
> LOG (org.apache.hadoop.hive.ql.exec.UDTFOperator)
> childOperators (org.apache.hadoop.hive.ql.exec.SelectOperator)
> childOperators (org.apache.hadoop.hive.ql.exec.TableScanOperator)
> aliasToWork (org.apache.hadoop.hive.ql.plan.MapWork)]
>
> one of the WITH blocks had explode() UDTF
> I replaced it with "select ... union all select ... union all select ..."
> and query is working fine now.
>
> Do you know anything about UDTF and Kryo issues fixed after 0.13.1?
>
>
> On Fri, May 15, 2015 at 3:20 PM, Alexander Pivovarov <apivovarov@gmail.com
> > wrote:
>
>> Looks like it was fixed in hive-0.14
>> https://issues.apache.org/jira/browse/HIVE-7079
>>
>> On Fri, May 15, 2015 at 2:26 PM, Alexander Pivovarov <
>> apivovarov@gmail.com> wrote:
>>
>>> Hi Everyone
>>>
>>> I'm using hive-0.13.1 (HDP-2.1.5) and getting the following stacktrace
>>> if run my query (which has WITH block) via Oozie. (BTW, the query works
>>> fine in CLI)
>>>
>>> I can't put exact query but the structure is similar to
>>>
>>> create table my_consumer
>>> as
>>> with sacusaloan as (select distinct e,f,g from E)
>>>
>>> select A.a, A.b, A.c,
>>> if(sacusaloan.id is null, 0, 1) as sacusaloan_status
>>> from (select a,b,c from A) A
>>> left join sacusaloan on (...)
>>>
>>> 8799 [main] INFO hive.ql.parse.ParseDriver - Parse Completed
>>> 8799 [main] INFO org.apache.hadoop.hive.ql.log.PerfLogger - </PERFLOG method=parse start=1431723485500 end=1431723485602 duration=102 from=org.apache.hadoop.hive.ql.Driver>
>>> 8799 [main] INFO org.apache.hadoop.hive.ql.log.PerfLogger - <PERFLOG method=semanticAnalyze from=org.apache.hadoop.hive.ql.Driver>
>>> 8834 [main] INFO org.apache.hadoop.hive.ql.parse.SemanticAnalyzer - Starting Semantic Analysis
>>> 8837 [main] INFO org.apache.hadoop.hive.ql.parse.SemanticAnalyzer - Creating table wk_qualified_outsource_loan_consumer position=13
>>> 8861 [main] INFO org.apache.hadoop.hive.ql.parse.SemanticAnalyzer - Completed phase 1 of Semantic Analysis
>>> 8861 [main] INFO org.apache.hadoop.hive.ql.parse.SemanticAnalyzer - Get metadata for source tables
>>> 8865 [main] ERROR hive.ql.metadata.Hive - NoSuchObjectException(message:default.sacusaloan table not found)
>>> at org.apache.hadoop.hive.metastore.api.ThriftHiveMetastore$get_table_result$get_table_resultStandardScheme.read(ThriftHiveMetastore.java:29338)
>>> at org.apache.hadoop.hive.metastore.api.ThriftHiveMetastore$get_table_result$get_table_resultStandardScheme.read(ThriftHiveMetastore.java:29306)
>>> at org.apache.hadoop.hive.metastore.api.ThriftHiveMetastore$get_table_result.read(ThriftHiveMetastore.java:29237)
>>> at org.apache.thrift.TServiceClient.receiveBase(TServiceClient.java:78)
>>> at org.apache.hadoop.hive.metastore.api.ThriftHiveMetastore$Client.recv_get_table(ThriftHiveMetastore.java:1036)
>>> at org.apache.hadoop.hive.metastore.api.ThriftHiveMetastore$Client.get_table(ThriftHiveMetastore.java:1022)
>>> at org.apache.hadoop.hive.metastore.HiveMetaStoreClient.getTable(HiveMetaStoreClient.java:997)
>>> at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
>>> at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
>>> at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
>>> at java.lang.reflect.Method.invoke(Method.java:606)
>>> at org.apache.hadoop.hive.metastore.RetryingMetaStoreClient.invoke(RetryingMetaStoreClient.java:89)
>>> at com.sun.proxy.$Proxy18.getTable(Unknown Source)
>>> at org.apache.hadoop.hive.ql.metadata.Hive.getTable(Hive.java:976)
>>> at org.apache.hadoop.hive.ql.metadata.Hive.getTable(Hive.java:918)
>>> at org.apache.hadoop.hive.ql.parse.SemanticAnalyzer.getMetaData(SemanticAnalyzer.java:1263)
>>> at org.apache.hadoop.hive.ql.parse.SemanticAnalyzer.getMetaData(SemanticAnalyzer.java:1232)
>>> at org.apache.hadoop.hive.ql.parse.SemanticAnalyzer.analyzeInternal(SemanticAnalyzer.java:9252)
>>> at org.apache.hadoop.hive.ql.parse.BaseSemanticAnalyzer.analyze(BaseSemanticAnalyzer.java:327)
>>> at org.apache.hadoop.hive.ql.Driver.compile(Driver.java:427)
>>> at org.apache.hadoop.hive.ql.Driver.compile(Driver.java:323)
>>> at org.apache.hadoop.hive.ql.Driver.compileInternal(Driver.java:980)
>>> at org.apache.hadoop.hive.ql.Driver.runInternal(Driver.java:1045)
>>> at org.apache.hadoop.hive.ql.Driver.run(Driver.java:916)
>>> at org.apache.hadoop.hive.ql.Driver.run(Driver.java:906)
>>> at org.apache.hadoop.hive.cli.CliDriver.processLocalCmd(CliDriver.java:268)
>>> at org.apache.hadoop.hive.cli.CliDriver.processCmd(CliDriver.java:220)
>>> at org.apache.hadoop.hive.cli.CliDriver.processLine(CliDriver.java:423)
>>> at org.apache.hadoop.hive.cli.CliDriver.processLine(CliDriver.java:359)
>>> at org.apache.hadoop.hive.cli.CliDriver.processReader(CliDriver.java:456)
>>> at org.apache.hadoop.hive.cli.CliDriver.processFile(CliDriver.java:466)
>>> at org.apache.hadoop.hive.cli.CliDriver.executeDriver(CliDriver.java:749)
>>> at org.apache.hadoop.hive.cli.CliDriver.run(CliDriver.java:686)
>>> at org.apache.hadoop.hive.cli.CliDriver.main(CliDriver.java:625)
>>> at org.apache.oozie.action.hadoop.HiveMain.runHive(HiveMain.java:316)
>>> at org.apache.oozie.action.hadoop.HiveMain.run(HiveMain.java:277)
>>> at org.apache.oozie.action.hadoop.LauncherMain.run(LauncherMain.java:38)
>>> at org.apache.oozie.action.hadoop.HiveMain.main(HiveMain.java:66)
>>> at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
>>> at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
>>> at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
>>> at java.lang.reflect.Method.invoke(Method.java:606)
>>> at org.apache.oozie.action.hadoop.LauncherMapper.map(LauncherMapper.java:225)
>>> at org.apache.hadoop.mapred.MapRunner.run(MapRunner.java:54)
>>> at org.apache.hadoop.mapred.MapTask.runOldMapper(MapTask.java:430)
>>> at org.apache.hadoop.mapred.MapTask.run(MapTask.java:342)
>>> at org.apache.hadoop.mapred.YarnChild$2.run(YarnChild.java:168)
>>> at java.security.AccessController.doPrivileged(Native Method)
>>> at javax.security.auth.Subject.doAs(Subject.java:415)
>>> at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1594)
>>> at org.apache.hadoop.mapred.YarnChild.main(YarnChild.java:163)
>>>
>>> 8872 [main] ERROR hive.ql.metadata.Hive - NoSuchObjectException(message:default.recallloan table not found)
>>>    [identical Thrift/SemanticAnalyzer stack trace omitted]
>>>
>>> 9700 [main] ERROR hive.ql.metadata.Hive - NoSuchObjectException(message:default.loanmob table not found)
>>>    [identical Thrift/SemanticAnalyzer stack trace omitted]
>>>
>>> 9708 [main] INFO org.apache.hadoop.hive.ql.parse.SemanticAnalyzer - Get metadata for subqueries
>>> 9708 [main] INFO org.apache.hadoop.hive.ql.parse.SemanticAnalyzer - Get metadata for source tables
>>> 9798 [main] INFO org.apache.hadoop.hive.ql.parse.SemanticAnalyzer - Get metadata for subqueries
>>> 9798 [main] INFO org.apache.hadoop.hive.ql.parse.SemanticAnalyzer - Get metadata for destination tables
>>> 9798 [main] INFO org.apache.hadoop.hive.ql.parse.SemanticAnalyzer - Get metadata for source tables
>>> 9815 [main] INFO org.apache.hadoop.hive.ql.parse.SemanticAnalyzer - Get metadata for subqueries
>>> 9815 [main] INFO org.apache.hadoop.hive.ql.parse.SemanticAnalyzer - Get metadata for destination tables
>>> 9815 [main] INFO org.apache.hadoop.hive.ql.parse.SemanticAnalyzer - Get metadata for source tables
>>> 9815 [main] INFO org.apache.hadoop.hive.ql.parse.SemanticAnalyzer - Get metadata for subqueries
>>> 9815 [main] INFO org.apache.hadoop.hive.ql.parse.SemanticAnalyzer - Get metadata for source tables
>>> 9827 [main] INFO org.apache.hadoop.hive.ql.parse.SemanticAnalyzer - Get metadata for subqueries
>>> 9827 [main] INFO org.apache.hadoop.hive.ql.parse.SemanticAnalyzer - Get metadata for destination tables
>>> 9827 [main] INFO org.apache.hadoop.hive.ql.parse.SemanticAnalyzer - Get metadata for destination tables
>>> 9827 [main] INFO org.apache.hadoop.hive.ql.parse.SemanticAnalyzer - Get metadata for source tables
>>> 9852 [main] INFO org.apache.hadoop.hive.ql.parse.SemanticAnalyzer - Get metadata for subqueries
>>> 9852 [main] INFO org.apache.hadoop.hive.ql.parse.SemanticAnalyzer - Get metadata for destination tables
>>> 9852 [main] INFO org.apache.hadoop.hive.ql.parse.SemanticAnalyzer - Get metadata for destination tables
>>> 9876 [main] INFO org.apache.hadoop.hive.ql.exec.Utilities - Create dirs hdfs://hadev/tmp/hive-svc-yarn/hive_2015-05-15_13-58-05_500_5122268870471366216-1 with permission rwxrwxrwx recursive false
>>> 9894 [main] INFO org.apache.hadoop.hive.ql.parse.SemanticAnalyzer - Completed getting MetaData in Semantic Analysis
>>> 10277 [main] INFO org.apache.hadoop.hive.ql.parse.SemanticAnalyzer - Get metadata for source tables
>>> 10289 [main] INFO org.apache.hadoop.hive.ql.parse.SemanticAnalyzer - Get metadata for subqueries
>>> 10290 [main] INFO org.apache.hadoop.hive.ql.parse.SemanticAnalyzer - Get metadata for destination tables
>>> 10294 [main] INFO org.apache.hadoop.hive.ql.parse.SemanticAnalyzer - Get metadata for source tables
>>> 10294 [main] INFO org.apache.hadoop.hive.ql.parse.SemanticAnalyzer - Get metadata for subqueries
>>> 10294 [main] INFO org.apache.hadoop.hive.ql.parse.SemanticAnalyzer - Get metadata for source tables
>>> 10320 [main] INFO org.apache.hadoop.hive.ql.parse.SemanticAnalyzer - Get metadata for subqueries
>>> 10321 [main] INFO org.apache.hadoop.hive.ql.parse.SemanticAnalyzer - Get metadata for destination tables
>>> 10321 [main] INFO org.apache.hadoop.hive.ql.parse.SemanticAnalyzer - Get metadata for destination tables
>>> 10816 [main] INFO org.apache.hadoop.hive.ql.parse.SemanticAnalyzer - Set stats collection dir : hdfs://hadev/tmp/hive-svc-yarn/hive_2015-05-15_13-58-05_500_5122268870471366216-1/-ext-10002
>>>
>>>
>>>
>>
>
Re: query uses WITH blocks and throws exception if run as Oozie hive
action (hive-0.13.1)
Posted by Alexander Pivovarov <ap...@gmail.com>.
I also noticed another error message in the logs:
10848 [main] ERROR org.apache.hadoop.hive.ql.exec.tez.TezJobMonitor -
Status: Failed
10849 [main] ERROR org.apache.hadoop.hive.ql.exec.tez.TezJobMonitor -
Vertex failed, vertexName=Map 32, vertexId=vertex_1431616132488_6430_1_24,
diagnostics=[Vertex Input: dual initializer failed.,
org.apache.hive.com.esotericsoftware.kryo.KryoException: Unable to find
class: org.apache.commons.logging.impl.SLF4JLocationAwareLog
Serialization trace:
LOG (org.apache.hadoop.hive.ql.exec.UDTFOperator)
childOperators (org.apache.hadoop.hive.ql.exec.SelectOperator)
childOperators (org.apache.hadoop.hive.ql.exec.TableScanOperator)
aliasToWork (org.apache.hadoop.hive.ql.plan.MapWork)]
One of the WITH blocks used the explode() UDTF.
I replaced it with "select ... union all select ... union all select ..."
and the query works fine now.
Does anyone know of UDTF/Kryo serialization issues fixed after 0.13.1?
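For anyone hitting the same Tez/Kryo failure, here is a minimal sketch of the
workaround shape (all table and column names below are made up for
illustration; the real query is not shown):

```sql
-- Failing shape (hypothetical): a WITH block whose body uses the explode()
-- UDTF. On hive-0.13.1 with Tez this can fail during vertex initialization
-- with "KryoException: Unable to find class:
-- org.apache.commons.logging.impl.SLF4JLocationAwareLog" while
-- deserializing the UDTFOperator in the map plan.
WITH codes AS (
  SELECT t.code
  FROM dual
  LATERAL VIEW explode(array('A', 'B', 'C')) t AS code
)
SELECT * FROM codes;

-- Workaround shape: produce the same small row set with UNION ALL instead,
-- so no UDTF operator ends up in the serialized plan. (The UNION ALL is
-- wrapped in a subquery because older Hive versions only allow it there.)
WITH codes AS (
  SELECT u.code FROM (
    SELECT 'A' AS code
    UNION ALL SELECT 'B' AS code
    UNION ALL SELECT 'C' AS code
  ) u
)
SELECT * FROM codes;
```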
On Fri, May 15, 2015 at 3:20 PM, Alexander Pivovarov <ap...@gmail.com>
wrote:
> Looks like it was fixed in hive-0.14
> https://issues.apache.org/jira/browse/HIVE-7079
>
> On Fri, May 15, 2015 at 2:26 PM, Alexander Pivovarov <apivovarov@gmail.com
> > wrote:
>
>> Hi Everyone
>>
>> I'm using hive-0.13.1 (HDP-2.1.5) and getting the following stacktrace
>> if run my query (which has WITH block) via Oozie. (BTW, the query works
>> fine in CLI)
>>
>> I can't put exact query but the structure is similar to
>>
>> create table my_consumer
>> as
>> with sacusaloan as (select distinct e,f,g from E)
>>
>> select A.a, A.b, A.c,
>> if(sacusaloan.id is null, 0, 1) as sacusaloan_status
>> from (select a,b,c from A) A
>> left join sacusaloan on (...)
>>
>> 8799 [main] INFO hive.ql.parse.ParseDriver - Parse Completed
>> 8799 [main] INFO org.apache.hadoop.hive.ql.log.PerfLogger - </PERFLOG method=parse start=1431723485500 end=1431723485602 duration=102 from=org.apache.hadoop.hive.ql.Driver>
>> 8799 [main] INFO org.apache.hadoop.hive.ql.log.PerfLogger - <PERFLOG method=semanticAnalyze from=org.apache.hadoop.hive.ql.Driver>
>> 8834 [main] INFO org.apache.hadoop.hive.ql.parse.SemanticAnalyzer - Starting Semantic Analysis
>> 8837 [main] INFO org.apache.hadoop.hive.ql.parse.SemanticAnalyzer - Creating table wk_qualified_outsource_loan_consumer position=13
>> 8861 [main] INFO org.apache.hadoop.hive.ql.parse.SemanticAnalyzer - Completed phase 1 of Semantic Analysis
>> 8861 [main] INFO org.apache.hadoop.hive.ql.parse.SemanticAnalyzer - Get metadata for source tables
>> 8865 [main] ERROR hive.ql.metadata.Hive - NoSuchObjectException(message:default.sacusaloan table not found)
>> at org.apache.hadoop.hive.metastore.api.ThriftHiveMetastore$get_table_result$get_table_resultStandardScheme.read(ThriftHiveMetastore.java:29338)
>> at org.apache.hadoop.hive.metastore.api.ThriftHiveMetastore$get_table_result$get_table_resultStandardScheme.read(ThriftHiveMetastore.java:29306)
>> at org.apache.hadoop.hive.metastore.api.ThriftHiveMetastore$get_table_result.read(ThriftHiveMetastore.java:29237)
>> at org.apache.thrift.TServiceClient.receiveBase(TServiceClient.java:78)
>> at org.apache.hadoop.hive.metastore.api.ThriftHiveMetastore$Client.recv_get_table(ThriftHiveMetastore.java:1036)
>> at org.apache.hadoop.hive.metastore.api.ThriftHiveMetastore$Client.get_table(ThriftHiveMetastore.java:1022)
>> at org.apache.hadoop.hive.metastore.HiveMetaStoreClient.getTable(HiveMetaStoreClient.java:997)
>> at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
>> at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
>> at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
>> at java.lang.reflect.Method.invoke(Method.java:606)
>> at org.apache.hadoop.hive.metastore.RetryingMetaStoreClient.invoke(RetryingMetaStoreClient.java:89)
>> at com.sun.proxy.$Proxy18.getTable(Unknown Source)
>> at org.apache.hadoop.hive.ql.metadata.Hive.getTable(Hive.java:976)
>> at org.apache.hadoop.hive.ql.metadata.Hive.getTable(Hive.java:918)
>> at org.apache.hadoop.hive.ql.parse.SemanticAnalyzer.getMetaData(SemanticAnalyzer.java:1263)
>> at org.apache.hadoop.hive.ql.parse.SemanticAnalyzer.getMetaData(SemanticAnalyzer.java:1232)
>> at org.apache.hadoop.hive.ql.parse.SemanticAnalyzer.analyzeInternal(SemanticAnalyzer.java:9252)
>> at org.apache.hadoop.hive.ql.parse.BaseSemanticAnalyzer.analyze(BaseSemanticAnalyzer.java:327)
>> at org.apache.hadoop.hive.ql.Driver.compile(Driver.java:427)
>> at org.apache.hadoop.hive.ql.Driver.compile(Driver.java:323)
>> at org.apache.hadoop.hive.ql.Driver.compileInternal(Driver.java:980)
>> at org.apache.hadoop.hive.ql.Driver.runInternal(Driver.java:1045)
>> at org.apache.hadoop.hive.ql.Driver.run(Driver.java:916)
>> at org.apache.hadoop.hive.ql.Driver.run(Driver.java:906)
>> at org.apache.hadoop.hive.cli.CliDriver.processLocalCmd(CliDriver.java:268)
>> at org.apache.hadoop.hive.cli.CliDriver.processCmd(CliDriver.java:220)
>> at org.apache.hadoop.hive.cli.CliDriver.processLine(CliDriver.java:423)
>> at org.apache.hadoop.hive.cli.CliDriver.processLine(CliDriver.java:359)
>> at org.apache.hadoop.hive.cli.CliDriver.processReader(CliDriver.java:456)
>> at org.apache.hadoop.hive.cli.CliDriver.processFile(CliDriver.java:466)
>> at org.apache.hadoop.hive.cli.CliDriver.executeDriver(CliDriver.java:749)
>> at org.apache.hadoop.hive.cli.CliDriver.run(CliDriver.java:686)
>> at org.apache.hadoop.hive.cli.CliDriver.main(CliDriver.java:625)
>> at org.apache.oozie.action.hadoop.HiveMain.runHive(HiveMain.java:316)
>> at org.apache.oozie.action.hadoop.HiveMain.run(HiveMain.java:277)
>> at org.apache.oozie.action.hadoop.LauncherMain.run(LauncherMain.java:38)
>> at org.apache.oozie.action.hadoop.HiveMain.main(HiveMain.java:66)
>> at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
>> at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
>> at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
>> at java.lang.reflect.Method.invoke(Method.java:606)
>> at org.apache.oozie.action.hadoop.LauncherMapper.map(LauncherMapper.java:225)
>> at org.apache.hadoop.mapred.MapRunner.run(MapRunner.java:54)
>> at org.apache.hadoop.mapred.MapTask.runOldMapper(MapTask.java:430)
>> at org.apache.hadoop.mapred.MapTask.run(MapTask.java:342)
>> at org.apache.hadoop.mapred.YarnChild$2.run(YarnChild.java:168)
>> at java.security.AccessController.doPrivileged(Native Method)
>> at javax.security.auth.Subject.doAs(Subject.java:415)
>> at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1594)
>> at org.apache.hadoop.mapred.YarnChild.main(YarnChild.java:163)
>>
>> 8872 [main] ERROR hive.ql.metadata.Hive - NoSuchObjectException(message:default.recallloan table not found)
>> [stack trace identical to the sacusaloan trace above; omitted]
>>
>> 9700 [main] ERROR hive.ql.metadata.Hive - NoSuchObjectException(message:default.loanmob table not found)
>> [stack trace identical to the sacusaloan trace above; omitted]
>>
>> 9708 [main] INFO org.apache.hadoop.hive.ql.parse.SemanticAnalyzer - Get metadata for subqueries
>> 9708 [main] INFO org.apache.hadoop.hive.ql.parse.SemanticAnalyzer - Get metadata for source tables
>> 9798 [main] INFO org.apache.hadoop.hive.ql.parse.SemanticAnalyzer - Get metadata for subqueries
>> 9798 [main] INFO org.apache.hadoop.hive.ql.parse.SemanticAnalyzer - Get metadata for destination tables
>> 9798 [main] INFO org.apache.hadoop.hive.ql.parse.SemanticAnalyzer - Get metadata for source tables
>> 9815 [main] INFO org.apache.hadoop.hive.ql.parse.SemanticAnalyzer - Get metadata for subqueries
>> 9815 [main] INFO org.apache.hadoop.hive.ql.parse.SemanticAnalyzer - Get metadata for destination tables
>> 9815 [main] INFO org.apache.hadoop.hive.ql.parse.SemanticAnalyzer - Get metadata for source tables
>> 9815 [main] INFO org.apache.hadoop.hive.ql.parse.SemanticAnalyzer - Get metadata for subqueries
>> 9815 [main] INFO org.apache.hadoop.hive.ql.parse.SemanticAnalyzer - Get metadata for source tables
>> 9827 [main] INFO org.apache.hadoop.hive.ql.parse.SemanticAnalyzer - Get metadata for subqueries
>> 9827 [main] INFO org.apache.hadoop.hive.ql.parse.SemanticAnalyzer - Get metadata for destination tables
>> 9827 [main] INFO org.apache.hadoop.hive.ql.parse.SemanticAnalyzer - Get metadata for destination tables
>> 9827 [main] INFO org.apache.hadoop.hive.ql.parse.SemanticAnalyzer - Get metadata for source tables
>> 9852 [main] INFO org.apache.hadoop.hive.ql.parse.SemanticAnalyzer - Get metadata for subqueries
>> 9852 [main] INFO org.apache.hadoop.hive.ql.parse.SemanticAnalyzer - Get metadata for destination tables
>> 9852 [main] INFO org.apache.hadoop.hive.ql.parse.SemanticAnalyzer - Get metadata for destination tables
>> 9876 [main] INFO org.apache.hadoop.hive.ql.exec.Utilities - Create dirs hdfs://hadev/tmp/hive-svc-yarn/hive_2015-05-15_13-58-05_500_5122268870471366216-1 with permission rwxrwxrwx recursive false
>> 9894 [main] INFO org.apache.hadoop.hive.ql.parse.SemanticAnalyzer - Completed getting MetaData in Semantic Analysis
>> 10277 [main] INFO org.apache.hadoop.hive.ql.parse.SemanticAnalyzer - Get metadata for source tables
>> 10289 [main] INFO org.apache.hadoop.hive.ql.parse.SemanticAnalyzer - Get metadata for subqueries
>> 10290 [main] INFO org.apache.hadoop.hive.ql.parse.SemanticAnalyzer - Get metadata for destination tables
>> 10294 [main] INFO org.apache.hadoop.hive.ql.parse.SemanticAnalyzer - Get metadata for source tables
>> 10294 [main] INFO org.apache.hadoop.hive.ql.parse.SemanticAnalyzer - Get metadata for subqueries
>> 10294 [main] INFO org.apache.hadoop.hive.ql.parse.SemanticAnalyzer - Get metadata for source tables
>> 10320 [main] INFO org.apache.hadoop.hive.ql.parse.SemanticAnalyzer - Get metadata for subqueries
>> 10321 [main] INFO org.apache.hadoop.hive.ql.parse.SemanticAnalyzer - Get metadata for destination tables
>> 10321 [main] INFO org.apache.hadoop.hive.ql.parse.SemanticAnalyzer - Get metadata for destination tables
>> 10816 [main] INFO org.apache.hadoop.hive.ql.parse.SemanticAnalyzer - Set stats collection dir : hdfs://hadev/tmp/hive-svc-yarn/hive_2015-05-15_13-58-05_500_5122268870471366216-1/-ext-10002
>>
>>
>>
>
>> at org.apache.hadoop.hive.ql.metadata.Hive.getTable(Hive.java:918)
>> at org.apache.hadoop.hive.ql.parse.SemanticAnalyzer.getMetaData(SemanticAnalyzer.java:1263)
>> at org.apache.hadoop.hive.ql.parse.SemanticAnalyzer.getMetaData(SemanticAnalyzer.java:1232)
>> at org.apache.hadoop.hive.ql.parse.SemanticAnalyzer.analyzeInternal(SemanticAnalyzer.java:9252)
>> at org.apache.hadoop.hive.ql.parse.BaseSemanticAnalyzer.analyze(BaseSemanticAnalyzer.java:327)
>> at org.apache.hadoop.hive.ql.Driver.compile(Driver.java:427)
>> at org.apache.hadoop.hive.ql.Driver.compile(Driver.java:323)
>> at org.apache.hadoop.hive.ql.Driver.compileInternal(Driver.java:980)
>> at org.apache.hadoop.hive.ql.Driver.runInternal(Driver.java:1045)
>> at org.apache.hadoop.hive.ql.Driver.run(Driver.java:916)
>> at org.apache.hadoop.hive.ql.Driver.run(Driver.java:906)
>> at org.apache.hadoop.hive.cli.CliDriver.processLocalCmd(CliDriver.java:268)
>> at org.apache.hadoop.hive.cli.CliDriver.processCmd(CliDriver.java:220)
>> at org.apache.hadoop.hive.cli.CliDriver.processLine(CliDriver.java:423)
>> at org.apache.hadoop.hive.cli.CliDriver.processLine(CliDriver.java:359)
>> at org.apache.hadoop.hive.cli.CliDriver.processReader(CliDriver.java:456)
>> at org.apache.hadoop.hive.cli.CliDriver.processFile(CliDriver.java:466)
>> at org.apache.hadoop.hive.cli.CliDriver.executeDriver(CliDriver.java:749)
>> at org.apache.hadoop.hive.cli.CliDriver.run(CliDriver.java:686)
>> at org.apache.hadoop.hive.cli.CliDriver.main(CliDriver.java:625)
>> at org.apache.oozie.action.hadoop.HiveMain.runHive(HiveMain.java:316)
>> at org.apache.oozie.action.hadoop.HiveMain.run(HiveMain.java:277)
>> at org.apache.oozie.action.hadoop.LauncherMain.run(LauncherMain.java:38)
>> at org.apache.oozie.action.hadoop.HiveMain.main(HiveMain.java:66)
>> at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
>> at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
>> at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
>> at java.lang.reflect.Method.invoke(Method.java:606)
>> at org.apache.oozie.action.hadoop.LauncherMapper.map(LauncherMapper.java:225)
>> at org.apache.hadoop.mapred.MapRunner.run(MapRunner.java:54)
>> at org.apache.hadoop.mapred.MapTask.runOldMapper(MapTask.java:430)
>> at org.apache.hadoop.mapred.MapTask.run(MapTask.java:342)
>> at org.apache.hadoop.mapred.YarnChild$2.run(YarnChild.java:168)
>> at java.security.AccessController.doPrivileged(Native Method)
>> at javax.security.auth.Subject.doAs(Subject.java:415)
>> at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1594)
>> at org.apache.hadoop.mapred.YarnChild.main(YarnChild.java:163)
>>
>> 8872 [main] ERROR hive.ql.metadata.Hive - NoSuchObjectException(message:default.recallloan table not found)
>> (stack trace identical to the sacusaloan one above)
>>
>> 9700 [main] ERROR hive.ql.metadata.Hive - NoSuchObjectException(message:default.loanmob table not found)
>> (stack trace identical to the sacusaloan one above)
>>
>> 9708 [main] INFO org.apache.hadoop.hive.ql.parse.SemanticAnalyzer - Get metadata for subqueries
>> 9708 [main] INFO org.apache.hadoop.hive.ql.parse.SemanticAnalyzer - Get metadata for source tables
>> 9798 [main] INFO org.apache.hadoop.hive.ql.parse.SemanticAnalyzer - Get metadata for subqueries
>> 9798 [main] INFO org.apache.hadoop.hive.ql.parse.SemanticAnalyzer - Get metadata for destination tables
>> 9798 [main] INFO org.apache.hadoop.hive.ql.parse.SemanticAnalyzer - Get metadata for source tables
>> 9815 [main] INFO org.apache.hadoop.hive.ql.parse.SemanticAnalyzer - Get metadata for subqueries
>> 9815 [main] INFO org.apache.hadoop.hive.ql.parse.SemanticAnalyzer - Get metadata for destination tables
>> 9815 [main] INFO org.apache.hadoop.hive.ql.parse.SemanticAnalyzer - Get metadata for source tables
>> 9815 [main] INFO org.apache.hadoop.hive.ql.parse.SemanticAnalyzer - Get metadata for subqueries
>> 9815 [main] INFO org.apache.hadoop.hive.ql.parse.SemanticAnalyzer - Get metadata for source tables
>> 9827 [main] INFO org.apache.hadoop.hive.ql.parse.SemanticAnalyzer - Get metadata for subqueries
>> 9827 [main] INFO org.apache.hadoop.hive.ql.parse.SemanticAnalyzer - Get metadata for destination tables
>> 9827 [main] INFO org.apache.hadoop.hive.ql.parse.SemanticAnalyzer - Get metadata for destination tables
>> 9827 [main] INFO org.apache.hadoop.hive.ql.parse.SemanticAnalyzer - Get metadata for source tables
>> 9852 [main] INFO org.apache.hadoop.hive.ql.parse.SemanticAnalyzer - Get metadata for subqueries
>> 9852 [main] INFO org.apache.hadoop.hive.ql.parse.SemanticAnalyzer - Get metadata for destination tables
>> 9852 [main] INFO org.apache.hadoop.hive.ql.parse.SemanticAnalyzer - Get metadata for destination tables
>> 9876 [main] INFO org.apache.hadoop.hive.ql.exec.Utilities - Create dirs hdfs://hadev/tmp/hive-svc-yarn/hive_2015-05-15_13-58-05_500_5122268870471366216-1 with permission rwxrwxrwx recursive false
>> 9894 [main] INFO org.apache.hadoop.hive.ql.parse.SemanticAnalyzer - Completed getting MetaData in Semantic Analysis
>> 10277 [main] INFO org.apache.hadoop.hive.ql.parse.SemanticAnalyzer - Get metadata for source tables
>> 10289 [main] INFO org.apache.hadoop.hive.ql.parse.SemanticAnalyzer - Get metadata for subqueries
>> 10290 [main] INFO org.apache.hadoop.hive.ql.parse.SemanticAnalyzer - Get metadata for destination tables
>> 10294 [main] INFO org.apache.hadoop.hive.ql.parse.SemanticAnalyzer - Get metadata for source tables
>> 10294 [main] INFO org.apache.hadoop.hive.ql.parse.SemanticAnalyzer - Get metadata for subqueries
>> 10294 [main] INFO org.apache.hadoop.hive.ql.parse.SemanticAnalyzer - Get metadata for source tables
>> 10320 [main] INFO org.apache.hadoop.hive.ql.parse.SemanticAnalyzer - Get metadata for subqueries
>> 10321 [main] INFO org.apache.hadoop.hive.ql.parse.SemanticAnalyzer - Get metadata for destination tables
>> 10321 [main] INFO org.apache.hadoop.hive.ql.parse.SemanticAnalyzer - Get metadata for destination tables
>> 10816 [main] INFO org.apache.hadoop.hive.ql.parse.SemanticAnalyzer - Set stats collection dir : hdfs://hadev/tmp/hive-svc-yarn/hive_2015-05-15_13-58-05_500_5122268870471366216-1/-ext-10002
>>
>>
>>
>
>> 10320 [main] INFO org.apache.hadoop.hive.ql.parse.SemanticAnalyzer - Get metadata for subqueries
>> 10321 [main] INFO org.apache.hadoop.hive.ql.parse.SemanticAnalyzer - Get metadata for destination tables
>> 10321 [main] INFO org.apache.hadoop.hive.ql.parse.SemanticAnalyzer - Get metadata for destination tables
>> 10816 [main] INFO org.apache.hadoop.hive.ql.parse.SemanticAnalyzer - Set stats collection dir : hdfs://hadev/tmp/hive-svc-yarn/hive_2015-05-15_13-58-05_500_5122268870471366216-1/-ext-10002
>>
>>
>>
>
Re: query uses WITH blocks and throws exception if run as Oozie hive
action (hive-0.13.1)
Posted by Alexander Pivovarov <ap...@gmail.com>.
I also noticed another error message in the logs:
10848 [main] ERROR org.apache.hadoop.hive.ql.exec.tez.TezJobMonitor -
Status: Failed
10849 [main] ERROR org.apache.hadoop.hive.ql.exec.tez.TezJobMonitor -
Vertex failed, vertexName=Map 32, vertexId=vertex_1431616132488_6430_1_24,
diagnostics=[Vertex Input: dual initializer failed.,
org.apache.hive.com.esotericsoftware.kryo.KryoException: Unable to find
class: org.apache.commons.logging.impl.SLF4JLocationAwareLog
Serialization trace:
LOG (org.apache.hadoop.hive.ql.exec.UDTFOperator)
childOperators (org.apache.hadoop.hive.ql.exec.SelectOperator)
childOperators (org.apache.hadoop.hive.ql.exec.TableScanOperator)
aliasToWork (org.apache.hadoop.hive.ql.plan.MapWork)]
One of the WITH blocks used the explode() UDTF.
I replaced it with "select ... union all select ... union all select ..."
and the query works fine now.
Does anyone know of UDTF/Kryo serialization issues that were fixed after 0.13.1?
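For the record, the rewrite was essentially the following (a sketch only; `dual` here is a one-row helper table, and the code values and names are made up, not the real query):

```sql
-- Before (failed under Oozie/Tez with the Kryo error above): a CTE that
-- builds a small code list with the explode() UDTF
with codes as (
  select code
  from dual
  lateral view explode(array('A', 'B', 'C')) t as code
)
select * from codes;

-- After (works): the same list built with UNION ALL, so no UDTF operator
-- ends up in the serialized plan
with codes as (
  select 'A' as code from dual
  union all
  select 'B' as code from dual
  union all
  select 'C' as code from dual
)
select * from codes;
```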
On Fri, May 15, 2015 at 3:20 PM, Alexander Pivovarov <ap...@gmail.com>
wrote:
> Looks like it was fixed in hive-0.14
> https://issues.apache.org/jira/browse/HIVE-7079
>
> On Fri, May 15, 2015 at 2:26 PM, Alexander Pivovarov <apivovarov@gmail.com
> > wrote:
>
Re: query uses WITH blocks and throws exception if run as Oozie hive
action (hive-0.13.1)
Posted by Alexander Pivovarov <ap...@gmail.com>.
Looks like it was fixed in hive-0.14
https://issues.apache.org/jira/browse/HIVE-7079
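Until we upgrade, a workaround sketch on 0.13 (using the same illustrative names as the query structure above; the join condition is made up) is to inline the CTE body as a derived table so the CTAS statement has no WITH clause at all:

```sql
-- Sketch: same shape as the failing query, but the WITH block is folded
-- into the FROM clause, so semantic analysis never looks up "sacusaloan"
-- as a metastore table
create table my_consumer as
select A.a, A.b, A.c,
       if(sacusaloan.e is null, 0, 1) as sacusaloan_status
from (select a, b, c from A) A
left join (select distinct e, f, g from E) sacusaloan
  on (A.a = sacusaloan.e);  -- join condition is illustrative
```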
On Fri, May 15, 2015 at 2:26 PM, Alexander Pivovarov <ap...@gmail.com>
wrote:
> at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
> at java.lang.reflect.Method.invoke(Method.java:606)
> at org.apache.oozie.action.hadoop.LauncherMapper.map(LauncherMapper.java:225)
> at org.apache.hadoop.mapred.MapRunner.run(MapRunner.java:54)
> at org.apache.hadoop.mapred.MapTask.runOldMapper(MapTask.java:430)
> at org.apache.hadoop.mapred.MapTask.run(MapTask.java:342)
> at org.apache.hadoop.mapred.YarnChild$2.run(YarnChild.java:168)
> at java.security.AccessController.doPrivileged(Native Method)
> at javax.security.auth.Subject.doAs(Subject.java:415)
> at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1594)
> at org.apache.hadoop.mapred.YarnChild.main(YarnChild.java:163)
>
> 9708 [main] INFO org.apache.hadoop.hive.ql.parse.SemanticAnalyzer - Get metadata for subqueries
> 9708 [main] INFO org.apache.hadoop.hive.ql.parse.SemanticAnalyzer - Get metadata for source tables
> 9798 [main] INFO org.apache.hadoop.hive.ql.parse.SemanticAnalyzer - Get metadata for subqueries
> 9798 [main] INFO org.apache.hadoop.hive.ql.parse.SemanticAnalyzer - Get metadata for destination tables
> 9798 [main] INFO org.apache.hadoop.hive.ql.parse.SemanticAnalyzer - Get metadata for source tables
> 9815 [main] INFO org.apache.hadoop.hive.ql.parse.SemanticAnalyzer - Get metadata for subqueries
> 9815 [main] INFO org.apache.hadoop.hive.ql.parse.SemanticAnalyzer - Get metadata for destination tables
> 9815 [main] INFO org.apache.hadoop.hive.ql.parse.SemanticAnalyzer - Get metadata for source tables
> 9815 [main] INFO org.apache.hadoop.hive.ql.parse.SemanticAnalyzer - Get metadata for subqueries
> 9815 [main] INFO org.apache.hadoop.hive.ql.parse.SemanticAnalyzer - Get metadata for source tables
> 9827 [main] INFO org.apache.hadoop.hive.ql.parse.SemanticAnalyzer - Get metadata for subqueries
> 9827 [main] INFO org.apache.hadoop.hive.ql.parse.SemanticAnalyzer - Get metadata for destination tables
> 9827 [main] INFO org.apache.hadoop.hive.ql.parse.SemanticAnalyzer - Get metadata for destination tables
> 9827 [main] INFO org.apache.hadoop.hive.ql.parse.SemanticAnalyzer - Get metadata for source tables
> 9852 [main] INFO org.apache.hadoop.hive.ql.parse.SemanticAnalyzer - Get metadata for subqueries
> 9852 [main] INFO org.apache.hadoop.hive.ql.parse.SemanticAnalyzer - Get metadata for destination tables
> 9852 [main] INFO org.apache.hadoop.hive.ql.parse.SemanticAnalyzer - Get metadata for destination tables
> 9876 [main] INFO org.apache.hadoop.hive.ql.exec.Utilities - Create dirs hdfs://hadev/tmp/hive-svc-yarn/hive_2015-05-15_13-58-05_500_5122268870471366216-1 with permission rwxrwxrwx recursive false
> 9894 [main] INFO org.apache.hadoop.hive.ql.parse.SemanticAnalyzer - Completed getting MetaData in Semantic Analysis
> 10277 [main] INFO org.apache.hadoop.hive.ql.parse.SemanticAnalyzer - Get metadata for source tables
> 10289 [main] INFO org.apache.hadoop.hive.ql.parse.SemanticAnalyzer - Get metadata for subqueries
> 10290 [main] INFO org.apache.hadoop.hive.ql.parse.SemanticAnalyzer - Get metadata for destination tables
> 10294 [main] INFO org.apache.hadoop.hive.ql.parse.SemanticAnalyzer - Get metadata for source tables
> 10294 [main] INFO org.apache.hadoop.hive.ql.parse.SemanticAnalyzer - Get metadata for subqueries
> 10294 [main] INFO org.apache.hadoop.hive.ql.parse.SemanticAnalyzer - Get metadata for source tables
> 10320 [main] INFO org.apache.hadoop.hive.ql.parse.SemanticAnalyzer - Get metadata for subqueries
> 10321 [main] INFO org.apache.hadoop.hive.ql.parse.SemanticAnalyzer - Get metadata for destination tables
> 10321 [main] INFO org.apache.hadoop.hive.ql.parse.SemanticAnalyzer - Get metadata for destination tables
> 10816 [main] INFO org.apache.hadoop.hive.ql.parse.SemanticAnalyzer - Set stats collection dir : hdfs://hadev/tmp/hive-svc-yarn/hive_2015-05-15_13-58-05_500_5122268870471366216-1/-ext-10002
>
>
>
Re: query uses WITH blocks and throws exception if run as Oozie hive
action (hive-0.13.1)
Posted by Alexander Pivovarov <ap...@gmail.com>.
Looks like this was fixed in Hive 0.14:
https://issues.apache.org/jira/browse/HIVE-7079
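Until an upgrade to 0.14 is possible, one common workaround on 0.13.x is to inline the WITH block as a derived table, so the analyzer never tries to resolve the CTE name against the metastore. A sketch based on the query shape above (table and column names are the same placeholders as in the example; the join condition is elided in the original and left elided here):

```sql
-- Workaround sketch for Hive 0.13.x (assumes the placeholder schema above):
-- inline the CTE body as a derived table instead of using WITH, so
-- "sacusaloan" is never looked up as a metastore table.
create table my_consumer
as
select A.a, A.b, A.c,
       if(sacusaloan.id is null, 0, 1) as sacusaloan_status
from (select a, b, c from A) A
left join (select distinct e, f, g from E) sacusaloan
  on (...);  -- join condition as in the original query
```

Another option, if the intermediate result is reused several times, is to materialize it first with a separate `CREATE TABLE ... AS SELECT` and drop it afterwards.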
On Fri, May 15, 2015 at 2:26 PM, Alexander Pivovarov <ap...@gmail.com>
wrote:
> Hi Everyone
>
> I'm using hive-0.13.1 (HDP-2.1.5) and getting the following stack trace
> when I run my query (which has a WITH block) via Oozie. (BTW, the query
> works fine in the CLI.)
>
> I can't post the exact query, but its structure is similar to:
>
> create table my_consumer
> as
> with sacusaloan as (select distinct e,f,g from E)
>
> select A.a, A.b, A.c,
> if(sacusaloan.id is null, 0, 1) as sacusaloan_status
> from (select a,b,c from A) A
> left join sacusaloan on (...)
>
> 8799 [main] INFO hive.ql.parse.ParseDriver - Parse Completed
> 8799 [main] INFO org.apache.hadoop.hive.ql.log.PerfLogger - </PERFLOG method=parse start=1431723485500 end=1431723485602 duration=102 from=org.apache.hadoop.hive.ql.Driver>
> 8799 [main] INFO org.apache.hadoop.hive.ql.log.PerfLogger - <PERFLOG method=semanticAnalyze from=org.apache.hadoop.hive.ql.Driver>
> 8834 [main] INFO org.apache.hadoop.hive.ql.parse.SemanticAnalyzer - Starting Semantic Analysis
> 8837 [main] INFO org.apache.hadoop.hive.ql.parse.SemanticAnalyzer - Creating table wk_qualified_outsource_loan_consumer position=13
> 8861 [main] INFO org.apache.hadoop.hive.ql.parse.SemanticAnalyzer - Completed phase 1 of Semantic Analysis
> 8861 [main] INFO org.apache.hadoop.hive.ql.parse.SemanticAnalyzer - Get metadata for source tables
> 8865 [main] ERROR hive.ql.metadata.Hive - NoSuchObjectException(message:default.sacusaloan table not found)
> at org.apache.hadoop.hive.metastore.api.ThriftHiveMetastore$get_table_result$get_table_resultStandardScheme.read(ThriftHiveMetastore.java:29338)
> at org.apache.hadoop.hive.metastore.api.ThriftHiveMetastore$get_table_result$get_table_resultStandardScheme.read(ThriftHiveMetastore.java:29306)
> at org.apache.hadoop.hive.metastore.api.ThriftHiveMetastore$get_table_result.read(ThriftHiveMetastore.java:29237)
> at org.apache.thrift.TServiceClient.receiveBase(TServiceClient.java:78)
> at org.apache.hadoop.hive.metastore.api.ThriftHiveMetastore$Client.recv_get_table(ThriftHiveMetastore.java:1036)
> at org.apache.hadoop.hive.metastore.api.ThriftHiveMetastore$Client.get_table(ThriftHiveMetastore.java:1022)
> at org.apache.hadoop.hive.metastore.HiveMetaStoreClient.getTable(HiveMetaStoreClient.java:997)
> at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
> at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
> at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
> at java.lang.reflect.Method.invoke(Method.java:606)
> at org.apache.hadoop.hive.metastore.RetryingMetaStoreClient.invoke(RetryingMetaStoreClient.java:89)
> at com.sun.proxy.$Proxy18.getTable(Unknown Source)
> at org.apache.hadoop.hive.ql.metadata.Hive.getTable(Hive.java:976)
> at org.apache.hadoop.hive.ql.metadata.Hive.getTable(Hive.java:918)
> at org.apache.hadoop.hive.ql.parse.SemanticAnalyzer.getMetaData(SemanticAnalyzer.java:1263)
> at org.apache.hadoop.hive.ql.parse.SemanticAnalyzer.getMetaData(SemanticAnalyzer.java:1232)
> at org.apache.hadoop.hive.ql.parse.SemanticAnalyzer.analyzeInternal(SemanticAnalyzer.java:9252)
> at org.apache.hadoop.hive.ql.parse.BaseSemanticAnalyzer.analyze(BaseSemanticAnalyzer.java:327)
> at org.apache.hadoop.hive.ql.Driver.compile(Driver.java:427)
> at org.apache.hadoop.hive.ql.Driver.compile(Driver.java:323)
> at org.apache.hadoop.hive.ql.Driver.compileInternal(Driver.java:980)
> at org.apache.hadoop.hive.ql.Driver.runInternal(Driver.java:1045)
> at org.apache.hadoop.hive.ql.Driver.run(Driver.java:916)
> at org.apache.hadoop.hive.ql.Driver.run(Driver.java:906)
> at org.apache.hadoop.hive.cli.CliDriver.processLocalCmd(CliDriver.java:268)
> at org.apache.hadoop.hive.cli.CliDriver.processCmd(CliDriver.java:220)
> at org.apache.hadoop.hive.cli.CliDriver.processLine(CliDriver.java:423)
> at org.apache.hadoop.hive.cli.CliDriver.processLine(CliDriver.java:359)
> at org.apache.hadoop.hive.cli.CliDriver.processReader(CliDriver.java:456)
> at org.apache.hadoop.hive.cli.CliDriver.processFile(CliDriver.java:466)
> at org.apache.hadoop.hive.cli.CliDriver.executeDriver(CliDriver.java:749)
> at org.apache.hadoop.hive.cli.CliDriver.run(CliDriver.java:686)
> at org.apache.hadoop.hive.cli.CliDriver.main(CliDriver.java:625)
> at org.apache.oozie.action.hadoop.HiveMain.runHive(HiveMain.java:316)
> at org.apache.oozie.action.hadoop.HiveMain.run(HiveMain.java:277)
> at org.apache.oozie.action.hadoop.LauncherMain.run(LauncherMain.java:38)
> at org.apache.oozie.action.hadoop.HiveMain.main(HiveMain.java:66)
> at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
> at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
> at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
> at java.lang.reflect.Method.invoke(Method.java:606)
> at org.apache.oozie.action.hadoop.LauncherMapper.map(LauncherMapper.java:225)
> at org.apache.hadoop.mapred.MapRunner.run(MapRunner.java:54)
> at org.apache.hadoop.mapred.MapTask.runOldMapper(MapTask.java:430)
> at org.apache.hadoop.mapred.MapTask.run(MapTask.java:342)
> at org.apache.hadoop.mapred.YarnChild$2.run(YarnChild.java:168)
> at java.security.AccessController.doPrivileged(Native Method)
> at javax.security.auth.Subject.doAs(Subject.java:415)
> at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1594)
> at org.apache.hadoop.mapred.YarnChild.main(YarnChild.java:163)
>
> 8872 [main] ERROR hive.ql.metadata.Hive - NoSuchObjectException(message:default.recallloan table not found)
> at org.apache.hadoop.hive.metastore.api.ThriftHiveMetastore$get_table_result$get_table_resultStandardScheme.read(ThriftHiveMetastore.java:29338)
> at org.apache.hadoop.hive.metastore.api.ThriftHiveMetastore$get_table_result$get_table_resultStandardScheme.read(ThriftHiveMetastore.java:29306)
> at org.apache.hadoop.hive.metastore.api.ThriftHiveMetastore$get_table_result.read(ThriftHiveMetastore.java:29237)
> at org.apache.thrift.TServiceClient.receiveBase(TServiceClient.java:78)
> at org.apache.hadoop.hive.metastore.api.ThriftHiveMetastore$Client.recv_get_table(ThriftHiveMetastore.java:1036)
> at org.apache.hadoop.hive.metastore.api.ThriftHiveMetastore$Client.get_table(ThriftHiveMetastore.java:1022)
> at org.apache.hadoop.hive.metastore.HiveMetaStoreClient.getTable(HiveMetaStoreClient.java:997)
> at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
> at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
> at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
> at java.lang.reflect.Method.invoke(Method.java:606)
> at org.apache.hadoop.hive.metastore.RetryingMetaStoreClient.invoke(RetryingMetaStoreClient.java:89)
> at com.sun.proxy.$Proxy18.getTable(Unknown Source)
> at org.apache.hadoop.hive.ql.metadata.Hive.getTable(Hive.java:976)
> at org.apache.hadoop.hive.ql.metadata.Hive.getTable(Hive.java:918)
> at org.apache.hadoop.hive.ql.parse.SemanticAnalyzer.getMetaData(SemanticAnalyzer.java:1263)
> at org.apache.hadoop.hive.ql.parse.SemanticAnalyzer.getMetaData(SemanticAnalyzer.java:1232)
> at org.apache.hadoop.hive.ql.parse.SemanticAnalyzer.analyzeInternal(SemanticAnalyzer.java:9252)
> at org.apache.hadoop.hive.ql.parse.BaseSemanticAnalyzer.analyze(BaseSemanticAnalyzer.java:327)
> at org.apache.hadoop.hive.ql.Driver.compile(Driver.java:427)
> at org.apache.hadoop.hive.ql.Driver.compile(Driver.java:323)
> at org.apache.hadoop.hive.ql.Driver.compileInternal(Driver.java:980)
> at org.apache.hadoop.hive.ql.Driver.runInternal(Driver.java:1045)
> at org.apache.hadoop.hive.ql.Driver.run(Driver.java:916)
> at org.apache.hadoop.hive.ql.Driver.run(Driver.java:906)
> at org.apache.hadoop.hive.cli.CliDriver.processLocalCmd(CliDriver.java:268)
> at org.apache.hadoop.hive.cli.CliDriver.processCmd(CliDriver.java:220)
> at org.apache.hadoop.hive.cli.CliDriver.processLine(CliDriver.java:423)
> at org.apache.hadoop.hive.cli.CliDriver.processLine(CliDriver.java:359)
> at org.apache.hadoop.hive.cli.CliDriver.processReader(CliDriver.java:456)
> at org.apache.hadoop.hive.cli.CliDriver.processFile(CliDriver.java:466)
> at org.apache.hadoop.hive.cli.CliDriver.executeDriver(CliDriver.java:749)
> at org.apache.hadoop.hive.cli.CliDriver.run(CliDriver.java:686)
> at org.apache.hadoop.hive.cli.CliDriver.main(CliDriver.java:625)
> at org.apache.oozie.action.hadoop.HiveMain.runHive(HiveMain.java:316)
> at org.apache.oozie.action.hadoop.HiveMain.run(HiveMain.java:277)
> at org.apache.oozie.action.hadoop.LauncherMain.run(LauncherMain.java:38)
> at org.apache.oozie.action.hadoop.HiveMain.main(HiveMain.java:66)
> at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
> at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
> at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
> at java.lang.reflect.Method.invoke(Method.java:606)
> at org.apache.oozie.action.hadoop.LauncherMapper.map(LauncherMapper.java:225)
> at org.apache.hadoop.mapred.MapRunner.run(MapRunner.java:54)
> at org.apache.hadoop.mapred.MapTask.runOldMapper(MapTask.java:430)
> at org.apache.hadoop.mapred.MapTask.run(MapTask.java:342)
> at org.apache.hadoop.mapred.YarnChild$2.run(YarnChild.java:168)
> at java.security.AccessController.doPrivileged(Native Method)
> at javax.security.auth.Subject.doAs(Subject.java:415)
> at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1594)
> at org.apache.hadoop.mapred.YarnChild.main(YarnChild.java:163)
>
> 9700 [main] ERROR hive.ql.metadata.Hive - NoSuchObjectException(message:default.loanmob table not found)
> at org.apache.hadoop.hive.metastore.api.ThriftHiveMetastore$get_table_result$get_table_resultStandardScheme.read(ThriftHiveMetastore.java:29338)
> at org.apache.hadoop.hive.metastore.api.ThriftHiveMetastore$get_table_result$get_table_resultStandardScheme.read(ThriftHiveMetastore.java:29306)
> at org.apache.hadoop.hive.metastore.api.ThriftHiveMetastore$get_table_result.read(ThriftHiveMetastore.java:29237)
> at org.apache.thrift.TServiceClient.receiveBase(TServiceClient.java:78)
> at org.apache.hadoop.hive.metastore.api.ThriftHiveMetastore$Client.recv_get_table(ThriftHiveMetastore.java:1036)
> at org.apache.hadoop.hive.metastore.api.ThriftHiveMetastore$Client.get_table(ThriftHiveMetastore.java:1022)
> at org.apache.hadoop.hive.metastore.HiveMetaStoreClient.getTable(HiveMetaStoreClient.java:997)
> at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
> at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
> at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
> at java.lang.reflect.Method.invoke(Method.java:606)
> at org.apache.hadoop.hive.metastore.RetryingMetaStoreClient.invoke(RetryingMetaStoreClient.java:89)
> at com.sun.proxy.$Proxy18.getTable(Unknown Source)
> at org.apache.hadoop.hive.ql.metadata.Hive.getTable(Hive.java:976)
> at org.apache.hadoop.hive.ql.metadata.Hive.getTable(Hive.java:918)
> at org.apache.hadoop.hive.ql.parse.SemanticAnalyzer.getMetaData(SemanticAnalyzer.java:1263)
> at org.apache.hadoop.hive.ql.parse.SemanticAnalyzer.getMetaData(SemanticAnalyzer.java:1232)
> at org.apache.hadoop.hive.ql.parse.SemanticAnalyzer.analyzeInternal(SemanticAnalyzer.java:9252)
> at org.apache.hadoop.hive.ql.parse.BaseSemanticAnalyzer.analyze(BaseSemanticAnalyzer.java:327)
> at org.apache.hadoop.hive.ql.Driver.compile(Driver.java:427)
> at org.apache.hadoop.hive.ql.Driver.compile(Driver.java:323)
> at org.apache.hadoop.hive.ql.Driver.compileInternal(Driver.java:980)
> at org.apache.hadoop.hive.ql.Driver.runInternal(Driver.java:1045)
> at org.apache.hadoop.hive.ql.Driver.run(Driver.java:916)
> at org.apache.hadoop.hive.ql.Driver.run(Driver.java:906)
> at org.apache.hadoop.hive.cli.CliDriver.processLocalCmd(CliDriver.java:268)
> at org.apache.hadoop.hive.cli.CliDriver.processCmd(CliDriver.java:220)
> at org.apache.hadoop.hive.cli.CliDriver.processLine(CliDriver.java:423)
> at org.apache.hadoop.hive.cli.CliDriver.processLine(CliDriver.java:359)
> at org.apache.hadoop.hive.cli.CliDriver.processReader(CliDriver.java:456)
> at org.apache.hadoop.hive.cli.CliDriver.processFile(CliDriver.java:466)
> at org.apache.hadoop.hive.cli.CliDriver.executeDriver(CliDriver.java:749)
> at org.apache.hadoop.hive.cli.CliDriver.run(CliDriver.java:686)
> at org.apache.hadoop.hive.cli.CliDriver.main(CliDriver.java:625)
> at org.apache.oozie.action.hadoop.HiveMain.runHive(HiveMain.java:316)
> at org.apache.oozie.action.hadoop.HiveMain.run(HiveMain.java:277)
> at org.apache.oozie.action.hadoop.LauncherMain.run(LauncherMain.java:38)
> at org.apache.oozie.action.hadoop.HiveMain.main(HiveMain.java:66)
> at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
> at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
> at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
> at java.lang.reflect.Method.invoke(Method.java:606)
> at org.apache.oozie.action.hadoop.LauncherMapper.map(LauncherMapper.java:225)
> at org.apache.hadoop.mapred.MapRunner.run(MapRunner.java:54)
> at org.apache.hadoop.mapred.MapTask.runOldMapper(MapTask.java:430)
> at org.apache.hadoop.mapred.MapTask.run(MapTask.java:342)
> at org.apache.hadoop.mapred.YarnChild$2.run(YarnChild.java:168)
> at java.security.AccessController.doPrivileged(Native Method)
> at javax.security.auth.Subject.doAs(Subject.java:415)
> at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1594)
> at org.apache.hadoop.mapred.YarnChild.main(YarnChild.java:163)
>
> 9708 [main] INFO org.apache.hadoop.hive.ql.parse.SemanticAnalyzer - Get metadata for subqueries
> 9708 [main] INFO org.apache.hadoop.hive.ql.parse.SemanticAnalyzer - Get metadata for source tables
> 9798 [main] INFO org.apache.hadoop.hive.ql.parse.SemanticAnalyzer - Get metadata for subqueries
> 9798 [main] INFO org.apache.hadoop.hive.ql.parse.SemanticAnalyzer - Get metadata for destination tables
> 9798 [main] INFO org.apache.hadoop.hive.ql.parse.SemanticAnalyzer - Get metadata for source tables
> 9815 [main] INFO org.apache.hadoop.hive.ql.parse.SemanticAnalyzer - Get metadata for subqueries
> 9815 [main] INFO org.apache.hadoop.hive.ql.parse.SemanticAnalyzer - Get metadata for destination tables
> 9815 [main] INFO org.apache.hadoop.hive.ql.parse.SemanticAnalyzer - Get metadata for source tables
> 9815 [main] INFO org.apache.hadoop.hive.ql.parse.SemanticAnalyzer - Get metadata for subqueries
> 9815 [main] INFO org.apache.hadoop.hive.ql.parse.SemanticAnalyzer - Get metadata for source tables
> 9827 [main] INFO org.apache.hadoop.hive.ql.parse.SemanticAnalyzer - Get metadata for subqueries
> 9827 [main] INFO org.apache.hadoop.hive.ql.parse.SemanticAnalyzer - Get metadata for destination tables
> 9827 [main] INFO org.apache.hadoop.hive.ql.parse.SemanticAnalyzer - Get metadata for destination tables
> 9827 [main] INFO org.apache.hadoop.hive.ql.parse.SemanticAnalyzer - Get metadata for source tables
> 9852 [main] INFO org.apache.hadoop.hive.ql.parse.SemanticAnalyzer - Get metadata for subqueries
> 9852 [main] INFO org.apache.hadoop.hive.ql.parse.SemanticAnalyzer - Get metadata for destination tables
> 9852 [main] INFO org.apache.hadoop.hive.ql.parse.SemanticAnalyzer - Get metadata for destination tables
> 9876 [main] INFO org.apache.hadoop.hive.ql.exec.Utilities - Create dirs hdfs://hadev/tmp/hive-svc-yarn/hive_2015-05-15_13-58-05_500_5122268870471366216-1 with permission rwxrwxrwx recursive false
> 9894 [main] INFO org.apache.hadoop.hive.ql.parse.SemanticAnalyzer - Completed getting MetaData in Semantic Analysis
> 10277 [main] INFO org.apache.hadoop.hive.ql.parse.SemanticAnalyzer - Get metadata for source tables
> 10289 [main] INFO org.apache.hadoop.hive.ql.parse.SemanticAnalyzer - Get metadata for subqueries
> 10290 [main] INFO org.apache.hadoop.hive.ql.parse.SemanticAnalyzer - Get metadata for destination tables
> 10294 [main] INFO org.apache.hadoop.hive.ql.parse.SemanticAnalyzer - Get metadata for source tables
> 10294 [main] INFO org.apache.hadoop.hive.ql.parse.SemanticAnalyzer - Get metadata for subqueries
> 10294 [main] INFO org.apache.hadoop.hive.ql.parse.SemanticAnalyzer - Get metadata for source tables
> 10320 [main] INFO org.apache.hadoop.hive.ql.parse.SemanticAnalyzer - Get metadata for subqueries
> 10321 [main] INFO org.apache.hadoop.hive.ql.parse.SemanticAnalyzer - Get metadata for destination tables
> 10321 [main] INFO org.apache.hadoop.hive.ql.parse.SemanticAnalyzer - Get metadata for destination tables
> 10816 [main] INFO org.apache.hadoop.hive.ql.parse.SemanticAnalyzer - Set stats collection dir : hdfs://hadev/tmp/hive-svc-yarn/hive_2015-05-15_13-58-05_500_5122268870471366216-1/-ext-10002
>
>
>
> at org.apache.hadoop.hive.metastore.RetryingMetaStoreClient.invoke(RetryingMetaStoreClient.java:89)
> at com.sun.proxy.$Proxy18.getTable(Unknown Source)
> at org.apache.hadoop.hive.ql.metadata.Hive.getTable(Hive.java:976)
> at org.apache.hadoop.hive.ql.metadata.Hive.getTable(Hive.java:918)
> at org.apache.hadoop.hive.ql.parse.SemanticAnalyzer.getMetaData(SemanticAnalyzer.java:1263)
> at org.apache.hadoop.hive.ql.parse.SemanticAnalyzer.getMetaData(SemanticAnalyzer.java:1232)
> at org.apache.hadoop.hive.ql.parse.SemanticAnalyzer.analyzeInternal(SemanticAnalyzer.java:9252)
> at org.apache.hadoop.hive.ql.parse.BaseSemanticAnalyzer.analyze(BaseSemanticAnalyzer.java:327)
> at org.apache.hadoop.hive.ql.Driver.compile(Driver.java:427)
> at org.apache.hadoop.hive.ql.Driver.compile(Driver.java:323)
> at org.apache.hadoop.hive.ql.Driver.compileInternal(Driver.java:980)
> at org.apache.hadoop.hive.ql.Driver.runInternal(Driver.java:1045)
> at org.apache.hadoop.hive.ql.Driver.run(Driver.java:916)
> at org.apache.hadoop.hive.ql.Driver.run(Driver.java:906)
> at org.apache.hadoop.hive.cli.CliDriver.processLocalCmd(CliDriver.java:268)
> at org.apache.hadoop.hive.cli.CliDriver.processCmd(CliDriver.java:220)
> at org.apache.hadoop.hive.cli.CliDriver.processLine(CliDriver.java:423)
> at org.apache.hadoop.hive.cli.CliDriver.processLine(CliDriver.java:359)
> at org.apache.hadoop.hive.cli.CliDriver.processReader(CliDriver.java:456)
> at org.apache.hadoop.hive.cli.CliDriver.processFile(CliDriver.java:466)
> at org.apache.hadoop.hive.cli.CliDriver.executeDriver(CliDriver.java:749)
> at org.apache.hadoop.hive.cli.CliDriver.run(CliDriver.java:686)
> at org.apache.hadoop.hive.cli.CliDriver.main(CliDriver.java:625)
> at org.apache.oozie.action.hadoop.HiveMain.runHive(HiveMain.java:316)
> at org.apache.oozie.action.hadoop.HiveMain.run(HiveMain.java:277)
> at org.apache.oozie.action.hadoop.LauncherMain.run(LauncherMain.java:38)
> at org.apache.oozie.action.hadoop.HiveMain.main(HiveMain.java:66)
> at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
> at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
> at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
> at java.lang.reflect.Method.invoke(Method.java:606)
> at org.apache.oozie.action.hadoop.LauncherMapper.map(LauncherMapper.java:225)
> at org.apache.hadoop.mapred.MapRunner.run(MapRunner.java:54)
> at org.apache.hadoop.mapred.MapTask.runOldMapper(MapTask.java:430)
> at org.apache.hadoop.mapred.MapTask.run(MapTask.java:342)
> at org.apache.hadoop.mapred.YarnChild$2.run(YarnChild.java:168)
> at java.security.AccessController.doPrivileged(Native Method)
> at javax.security.auth.Subject.doAs(Subject.java:415)
> at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1594)
> at org.apache.hadoop.mapred.YarnChild.main(YarnChild.java:163)
>
> 9708 [main] INFO org.apache.hadoop.hive.ql.parse.SemanticAnalyzer - Get metadata for subqueries
> 9708 [main] INFO org.apache.hadoop.hive.ql.parse.SemanticAnalyzer - Get metadata for source tables
> 9798 [main] INFO org.apache.hadoop.hive.ql.parse.SemanticAnalyzer - Get metadata for subqueries
> 9798 [main] INFO org.apache.hadoop.hive.ql.parse.SemanticAnalyzer - Get metadata for destination tables
> 9798 [main] INFO org.apache.hadoop.hive.ql.parse.SemanticAnalyzer - Get metadata for source tables
> 9815 [main] INFO org.apache.hadoop.hive.ql.parse.SemanticAnalyzer - Get metadata for subqueries
> 9815 [main] INFO org.apache.hadoop.hive.ql.parse.SemanticAnalyzer - Get metadata for destination tables
> 9815 [main] INFO org.apache.hadoop.hive.ql.parse.SemanticAnalyzer - Get metadata for source tables
> 9815 [main] INFO org.apache.hadoop.hive.ql.parse.SemanticAnalyzer - Get metadata for subqueries
> 9815 [main] INFO org.apache.hadoop.hive.ql.parse.SemanticAnalyzer - Get metadata for source tables
> 9827 [main] INFO org.apache.hadoop.hive.ql.parse.SemanticAnalyzer - Get metadata for subqueries
> 9827 [main] INFO org.apache.hadoop.hive.ql.parse.SemanticAnalyzer - Get metadata for destination tables
> 9827 [main] INFO org.apache.hadoop.hive.ql.parse.SemanticAnalyzer - Get metadata for destination tables
> 9827 [main] INFO org.apache.hadoop.hive.ql.parse.SemanticAnalyzer - Get metadata for source tables
> 9852 [main] INFO org.apache.hadoop.hive.ql.parse.SemanticAnalyzer - Get metadata for subqueries
> 9852 [main] INFO org.apache.hadoop.hive.ql.parse.SemanticAnalyzer - Get metadata for destination tables
> 9852 [main] INFO org.apache.hadoop.hive.ql.parse.SemanticAnalyzer - Get metadata for destination tables
> 9876 [main] INFO org.apache.hadoop.hive.ql.exec.Utilities - Create dirs hdfs://hadev/tmp/hive-svc-yarn/hive_2015-05-15_13-58-05_500_5122268870471366216-1 with permission rwxrwxrwx recursive false
> 9894 [main] INFO org.apache.hadoop.hive.ql.parse.SemanticAnalyzer - Completed getting MetaData in Semantic Analysis
> 10277 [main] INFO org.apache.hadoop.hive.ql.parse.SemanticAnalyzer - Get metadata for source tables
> 10289 [main] INFO org.apache.hadoop.hive.ql.parse.SemanticAnalyzer - Get metadata for subqueries
> 10290 [main] INFO org.apache.hadoop.hive.ql.parse.SemanticAnalyzer - Get metadata for destination tables
> 10294 [main] INFO org.apache.hadoop.hive.ql.parse.SemanticAnalyzer - Get metadata for source tables
> 10294 [main] INFO org.apache.hadoop.hive.ql.parse.SemanticAnalyzer - Get metadata for subqueries
> 10294 [main] INFO org.apache.hadoop.hive.ql.parse.SemanticAnalyzer - Get metadata for source tables
> 10320 [main] INFO org.apache.hadoop.hive.ql.parse.SemanticAnalyzer - Get metadata for subqueries
> 10321 [main] INFO org.apache.hadoop.hive.ql.parse.SemanticAnalyzer - Get metadata for destination tables
> 10321 [main] INFO org.apache.hadoop.hive.ql.parse.SemanticAnalyzer - Get metadata for destination tables
> 10816 [main] INFO org.apache.hadoop.hive.ql.parse.SemanticAnalyzer - Set stats collection dir : hdfs://hadev/tmp/hive-svc-yarn/hive_2015-05-15_13-58-05_500_5122268870471366216-1/-ext-10002
Re: query uses WITH blocks and throws exception if run as Oozie hive action (hive-0.13.1)
Posted by Alexander Pivovarov <ap...@gmail.com>.
Looks like it was fixed in hive-0.14
https://issues.apache.org/jira/browse/HIVE-7079
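Until an upgrade to 0.14 is possible, one workaround for this class of failure is to inline the WITH block as a derived table, so the analyzer never tries to resolve the CTE alias against the metastore. A rough, untested sketch based on the query structure from the original post (table and column names are the placeholders used there, and the join condition is elided there as well):

```sql
-- Workaround sketch for hive-0.13.1 under Oozie: inline the CTE body
-- as a derived table instead of declaring it in a WITH block.
-- Names (my_consumer, A, E, sacusaloan) are the placeholders from the
-- original post; the ON clause is elided, as in the original.
create table my_consumer
as
select A.a, A.b, A.c,
       if(sacusaloan.id is null, 0, 1) as sacusaloan_status
from (select a, b, c from A) A
left join (select distinct e, f, g from E) sacusaloan
  on (...);
```

This keeps the same query shape but avoids the WITH clause entirely, which is the part HIVE-7079 reports as broken when the statement is compiled through the Oozie Hive action.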
On Fri, May 15, 2015 at 2:26 PM, Alexander Pivovarov <ap...@gmail.com>
wrote:
> Hi Everyone
>
> I'm using hive-0.13.1 (HDP-2.1.5) and I get the following stacktrace
> when I run my query (which has a WITH block) via Oozie. (BTW, the query
> works fine in the CLI.)
>
> I can't post the exact query, but the structure is similar to this:
>
> create table my_consumer
> as
> with sacusaloan as (select distinct e,f,g from E)
>
> select A.a, A.b, A.c,
> if(sacusaloan.id is null, 0, 1) as sacusaloan_status
> from (select a,b,c from A) A
> left join sacusaloan on (...)
>
> 8799 [main] INFO hive.ql.parse.ParseDriver - Parse Completed
> 8799 [main] INFO org.apache.hadoop.hive.ql.log.PerfLogger - </PERFLOG method=parse start=1431723485500 end=1431723485602 duration=102 from=org.apache.hadoop.hive.ql.Driver>
> 8799 [main] INFO org.apache.hadoop.hive.ql.log.PerfLogger - <PERFLOG method=semanticAnalyze from=org.apache.hadoop.hive.ql.Driver>
> 8834 [main] INFO org.apache.hadoop.hive.ql.parse.SemanticAnalyzer - Starting Semantic Analysis
> 8837 [main] INFO org.apache.hadoop.hive.ql.parse.SemanticAnalyzer - Creating table wk_qualified_outsource_loan_consumer position=13
> 8861 [main] INFO org.apache.hadoop.hive.ql.parse.SemanticAnalyzer - Completed phase 1 of Semantic Analysis
> 8861 [main] INFO org.apache.hadoop.hive.ql.parse.SemanticAnalyzer - Get metadata for source tables
> 8865 [main] ERROR hive.ql.metadata.Hive - NoSuchObjectException(message:default.sacusaloan table not found)
>
> 8872 [main] ERROR hive.ql.metadata.Hive - NoSuchObjectException(message:default.recallloan table not found)
>
> 9700 [main] ERROR hive.ql.metadata.Hive - NoSuchObjectException(message:default.loanmob table not found)