Posted to user@drill.apache.org by Li HM <hm...@gmail.com> on 2014/11/08 00:09:34 UTC

Help: Drill 0.7.0 unable to query hdfs file

Freshly compiled Drill 0.7.0 against hadoop-2.5.1. Whenever I query an
HDFS file, I get the following error from sqlline:

Query failed: Failure while running sql.

Error: exception while executing query: Failure while executing query.
(state=,code=0)

Checking the drillbit log, I see long Java stack traces. Does anybody
know what the issue might be?

The fatal one looks like a missing class:
Caused by: java.lang.NoClassDefFoundError:
org/apache/hadoop/yarn/api/ApplicationClientProtocolPB. But the class
is in hadoop-yarn-common-2.5.1.jar:

> jar tf hadoop-yarn-common-2.5.1.jar | grep "org/apache/hadoop/yarn/api/ApplicationClientProtocolPB.class"
org/apache/hadoop/yarn/api/ApplicationClientProtocolPB.class
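A `jar tf` hit only proves the class is inside the jar on disk; the NoClassDefFoundError means the drillbit JVM could not resolve the class at runtime, which usually points to that jar (or one of its dependencies) not being on the drillbit's effective classpath. A minimal, hypothetical probe (not part of Drill; the class name is just the default argument) that can be run with the same classpath the drillbit uses, to tell the two cases apart:

```java
// Hypothetical helper, not part of Drill: asks the current JVM's
// classloader whether a class name is resolvable. If "jar tf" finds the
// class but this prints MISSING when run with the drillbit's classpath,
// the jar is simply not being loaded by the drillbit.
public class ClasspathProbe {
    static boolean isLoadable(String className) {
        try {
            // initialize=false: resolve the class without running its
            // static initializers
            Class.forName(className, false,
                    ClasspathProbe.class.getClassLoader());
            return true;
        } catch (ClassNotFoundException | NoClassDefFoundError e) {
            return false;
        }
    }

    public static void main(String[] args) {
        String cls = args.length > 0 ? args[0]
                : "org.apache.hadoop.yarn.api.ApplicationClientProtocolPB";
        System.out.println(cls + " " + (isLoadable(cls) ? "FOUND" : "MISSING"));
    }
}
```

Running it as `java -cp <drillbit classpath> ClasspathProbe` shows whether the class the error names is actually visible to that classpath.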

Please help, thanks!

2014-11-07 22:41:45,283
[84d32586-a999-4b4b-a05c-620058a29fb6:frag:0:0] WARN
o.a.d.exec.work.foreman.QueryStatus - Update finished query state :
COMPLETED
2014-11-07 22:41:52,136 [UserServer-1] WARN
o.a.d.exec.work.foreman.QueryStatus - Update running or pending query
state : PENDING
2014-11-07 22:41:54,697 [a1edbba9-b61a-4eb8-a0bd-8afa41a3db52:foreman]
WARN  o.a.d.e.s.dfs.WorkspaceSchemaFactory - Failure while trying to
load .drill file.
java.io.IOException: Failed on local exception: java.io.IOException:
Couldn't set up IO streams; Host Details : local host is:
"*.*.*.*/*.*.*.*"; destination host is: "*.*.*.*":8020;
        at org.apache.hadoop.net.NetUtils.wrapException(NetUtils.java:764)
~[hadoop-common-2.5.1.jar:na]
        at org.apache.hadoop.ipc.Client.call(Client.java:1375)
~[hadoop-common-2.5.1.jar:na]
        at org.apache.hadoop.ipc.Client.call(Client.java:1324)
~[hadoop-common-2.5.1.jar:na]
        at org.apache.hadoop.ipc.ProtobufRpcEngine$Invoker.invoke(ProtobufRpcEngine.java:206)
~[hadoop-common-2.5.1.jar:na]
        at com.sun.proxy.$Proxy42.getFileInfo(Unknown Source) ~[na:na]
        at org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolTranslatorPB.getFileInfo(ClientNamenodeProtocolTranslatorPB.java:707)
~[hadoop-hdfs-2.5.1.jar:na]
        at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
~[na:1.7.0_17]
        at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
~[na:1.7.0_17]
        at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
~[na:1.7.0_17]
        at java.lang.reflect.Method.invoke(Method.java:601) ~[na:1.7.0_17]
        at org.apache.hadoop.io.retry.RetryInvocationHandler.invokeMethod(RetryInvocationHandler.java:187)
~[hadoop-common-2.5.1.jar:na]
        at org.apache.hadoop.io.retry.RetryInvocationHandler.invoke(RetryInvocationHandler.java:102)
~[hadoop-common-2.5.1.jar:na]
        at com.sun.proxy.$Proxy43.getFileInfo(Unknown Source) ~[na:na]
        at org.apache.hadoop.hdfs.DFSClient.getFileInfo(DFSClient.java:1785)
~[hadoop-hdfs-2.5.1.jar:na]
        at org.apache.hadoop.hdfs.DistributedFileSystem$17.doCall(DistributedFileSystem.java:1068)
~[hadoop-hdfs-2.5.1.jar:na]
        at org.apache.hadoop.hdfs.DistributedFileSystem$17.doCall(DistributedFileSystem.java:1064)
~[hadoop-hdfs-2.5.1.jar:na]
        at org.apache.hadoop.fs.FileSystemLinkResolver.resolve(FileSystemLinkResolver.java:81)
~[hadoop-common-2.5.1.jar:na]
        at org.apache.hadoop.hdfs.DistributedFileSystem.getFileStatus(DistributedFileSystem.java:1064)
~[hadoop-hdfs-2.5.1.jar:na]
        at org.apache.hadoop.fs.Globber.getFileStatus(Globber.java:59)
~[hadoop-common-2.5.1.jar:na]
        at org.apache.hadoop.fs.Globber.matchFilter(Globber.java:276)
~[hadoop-common-2.5.1.jar:na]
        at org.apache.hadoop.fs.Globber.applyFilters(Globber.java:258)
~[hadoop-common-2.5.1.jar:na]
        at org.apache.hadoop.fs.Globber.glob(Globber.java:226)
~[hadoop-common-2.5.1.jar:na]
        at org.apache.hadoop.fs.Globber.glob(Globber.java:177)
~[hadoop-common-2.5.1.jar:na]
        at org.apache.hadoop.fs.FileSystem.globStatus(FileSystem.java:1623)
~[hadoop-common-2.5.1.jar:na]
        at org.apache.drill.exec.dotdrill.DotDrillUtil.getDotDrills(DotDrillUtil.java:57)
~[drill-java-exec-0.7.0-incubating-SNAPSHOT-rebuffed.jar:0.7.0-incubating-SNAPSHOT]
        at org.apache.drill.exec.store.dfs.WorkspaceSchemaFactory$WorkspaceSchema.getTable(WorkspaceSchemaFactory.java:259)
~[drill-java-exec-0.7.0-incubating-SNAPSHOT-rebuffed.jar:0.7.0-incubating-SNAPSHOT]
        at org.apache.drill.exec.store.dfs.FileSystemSchemaFactory$FileSystemSchema.getTable(FileSystemSchemaFactory.java:97)
[drill-java-exec-0.7.0-incubating-SNAPSHOT-rebuffed.jar:0.7.0-incubating-SNAPSHOT]
        at net.hydromatic.optiq.jdbc.SimpleOptiqSchema.getTable(SimpleOptiqSchema.java:75)
[optiq-core-0.9-drill-r6.jar:na]
        at net.hydromatic.optiq.prepare.OptiqCatalogReader.getTableFrom(OptiqCatalogReader.java:87)
[optiq-core-0.9-drill-r6.jar:na]
        at net.hydromatic.optiq.prepare.OptiqCatalogReader.getTable(OptiqCatalogReader.java:70)
[optiq-core-0.9-drill-r6.jar:na]
        at net.hydromatic.optiq.prepare.OptiqCatalogReader.getTable(OptiqCatalogReader.java:1)
[optiq-core-0.9-drill-r6.jar:na]
        at org.eigenbase.sql.validate.EmptyScope.getTableNamespace(EmptyScope.java:67)
[optiq-core-0.9-drill-r6.jar:na]
        at org.eigenbase.sql.validate.IdentifierNamespace.validateImpl(IdentifierNamespace.java:75)
[optiq-core-0.9-drill-r6.jar:na]
        at org.eigenbase.sql.validate.AbstractNamespace.validate(AbstractNamespace.java:85)
[optiq-core-0.9-drill-r6.jar:na]
        at org.eigenbase.sql.validate.SqlValidatorImpl.validateNamespace(SqlValidatorImpl.java:779)
[optiq-core-0.9-drill-r6.jar:na]
        at org.eigenbase.sql.validate.SqlValidatorImpl.validateQuery(SqlValidatorImpl.java:768)
[optiq-core-0.9-drill-r6.jar:na]
        at org.eigenbase.sql.validate.SqlValidatorImpl.validateFrom(SqlValidatorImpl.java:2599)
[optiq-core-0.9-drill-r6.jar:na]
        at org.eigenbase.sql.validate.SqlValidatorImpl.validateSelect(SqlValidatorImpl.java:2807)
[optiq-core-0.9-drill-r6.jar:na]
        at org.eigenbase.sql.validate.SelectNamespace.validateImpl(SelectNamespace.java:60)
[optiq-core-0.9-drill-r6.jar:na]
        at org.eigenbase.sql.validate.AbstractNamespace.validate(AbstractNamespace.java:85)
[optiq-core-0.9-drill-r6.jar:na]
        at org.eigenbase.sql.validate.SqlValidatorImpl.validateNamespace(SqlValidatorImpl.java:779)
[optiq-core-0.9-drill-r6.jar:na]
        at org.eigenbase.sql.validate.SqlValidatorImpl.validateQuery(SqlValidatorImpl.java:768)
[optiq-core-0.9-drill-r6.jar:na]
        at org.eigenbase.sql.SqlSelect.validate(SqlSelect.java:208)
[optiq-core-0.9-drill-r6.jar:na]
        at org.eigenbase.sql.validate.SqlValidatorImpl.validateScopedExpression(SqlValidatorImpl.java:742)
[optiq-core-0.9-drill-r6.jar:na]
        at org.eigenbase.sql.validate.SqlValidatorImpl.validate(SqlValidatorImpl.java:458)
[optiq-core-0.9-drill-r6.jar:na]
        at net.hydromatic.optiq.prepare.PlannerImpl.validate(PlannerImpl.java:173)
[optiq-core-0.9-drill-r6.jar:na]
        at org.apache.drill.exec.planner.sql.handlers.DefaultSqlHandler.validateNode(DefaultSqlHandler.java:145)
[drill-java-exec-0.7.0-incubating-SNAPSHOT-rebuffed.jar:0.7.0-incubating-SNAPSHOT]
        at org.apache.drill.exec.planner.sql.handlers.DefaultSqlHandler.getPlan(DefaultSqlHandler.java:125)
[drill-java-exec-0.7.0-incubating-SNAPSHOT-rebuffed.jar:0.7.0-incubating-SNAPSHOT]
        at org.apache.drill.exec.planner.sql.DrillSqlWorker.getPlan(DrillSqlWorker.java:132)
[drill-java-exec-0.7.0-incubating-SNAPSHOT-rebuffed.jar:0.7.0-incubating-SNAPSHOT]
        at org.apache.drill.exec.work.foreman.Foreman.runSQL(Foreman.java:384)
[drill-java-exec-0.7.0-incubating-SNAPSHOT-rebuffed.jar:0.7.0-incubating-SNAPSHOT]
        at org.apache.drill.exec.work.foreman.Foreman.run(Foreman.java:204)
[drill-java-exec-0.7.0-incubating-SNAPSHOT-rebuffed.jar:0.7.0-incubating-SNAPSHOT]
        at org.apache.drill.exec.work.WorkManager$RunnableWrapper.run(WorkManager.java:249)
[drill-java-exec-0.7.0-incubating-SNAPSHOT-rebuffed.jar:0.7.0-incubating-SNAPSHOT]
        at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1145)
[na:1.7.0_17]
        at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:615)
[na:1.7.0_17]
        at java.lang.Thread.run(Thread.java:722) [na:1.7.0_17]
Caused by: java.io.IOException: Couldn't set up IO streams
        at org.apache.hadoop.ipc.Client$Connection.setupIOstreams(Client.java:753)
~[hadoop-common-2.5.1.jar:na]
        at org.apache.hadoop.ipc.Client$Connection.access$2900(Client.java:368)
~[hadoop-common-2.5.1.jar:na]
        at org.apache.hadoop.ipc.Client.getConnection(Client.java:1423)
~[hadoop-common-2.5.1.jar:na]
        at org.apache.hadoop.ipc.Client.call(Client.java:1342)
~[hadoop-common-2.5.1.jar:na]
        ... 53 common frames omitted
Caused by: java.lang.NoClassDefFoundError:
org/apache/hadoop/yarn/api/ApplicationClientProtocolPB
        at org.apache.hadoop.yarn.security.client.ClientRMSecurityInfo.getTokenInfo(ClientRMSecurityInfo.java:65)
~[hadoop-yarn-common-2.5.1.jar:na]
        at org.apache.hadoop.security.SecurityUtil.getTokenInfo(SecurityUtil.java:327)
~[hadoop-common-2.5.1.jar:na]
        at org.apache.hadoop.security.SaslRpcClient.getServerToken(SaslRpcClient.java:262)
~[hadoop-common-2.5.1.jar:na]
        at org.apache.hadoop.security.SaslRpcClient.createSaslClient(SaslRpcClient.java:218)
~[hadoop-common-2.5.1.jar:na]
        at org.apache.hadoop.security.SaslRpcClient.selectSaslClient(SaslRpcClient.java:158)
~[hadoop-common-2.5.1.jar:na]
        at org.apache.hadoop.security.SaslRpcClient.saslConnect(SaslRpcClient.java:388)
~[hadoop-common-2.5.1.jar:na]
        at org.apache.hadoop.ipc.Client$Connection$2.run(Client.java:702)
~[hadoop-common-2.5.1.jar:na]
        at org.apache.hadoop.ipc.Client$Connection$2.run(Client.java:698)
~[hadoop-common-2.5.1.jar:na]
        at java.security.AccessController.doPrivileged(Native Method)
~[na:1.7.0_17]
        at javax.security.auth.Subject.doAs(Subject.java:415) ~[na:1.7.0_17]
        at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1637)
~[hadoop-common-2.5.1.jar:na]
        at org.apache.hadoop.ipc.Client$Connection.setupIOstreams(Client.java:697)
~[hadoop-common-2.5.1.jar:na]
        ... 56 common frames omitted
2014-11-07 22:41:59,250 [a1edbba9-b61a-4eb8-a0bd-8afa41a3db52:foreman]
WARN  o.a.d.e.s.dfs.WorkspaceSchemaFactory - Failure while trying to
load .drill file.
java.io.IOException: Failed on local exception: java.io.IOException:
Couldn't set up IO streams; Host Details : local host is: "*.*.*.*";
destination host is: "*.*.*.*":8020;
        at org.apache.hadoop.net.NetUtils.wrapException(NetUtils.java:764)
~[hadoop-common-2.5.1.jar:na]
        at org.apache.hadoop.ipc.Client.call(Client.java:1375)
~[hadoop-common-2.5.1.jar:na]
        at org.apache.hadoop.ipc.Client.call(Client.java:1324)
~[hadoop-common-2.5.1.jar:na]
        at org.apache.hadoop.ipc.ProtobufRpcEngine$Invoker.invoke(ProtobufRpcEngine.java:206)
~[hadoop-common-2.5.1.jar:na]
        at com.sun.proxy.$Proxy42.getFileInfo(Unknown Source) ~[na:na]
        at org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolTranslatorPB.getFileInfo(ClientNamenodeProtocolTranslatorPB.java:707)
~[hadoop-hdfs-2.5.1.jar:na]

Re: Help: Drill 0.7.0 unable to query hdfs file

Posted by Hmxxyy <hm...@gmail.com>.
Yes, the Hadoop cluster uses Kerberos authentication.

Sent from my iPhone
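(Editor's note on why Kerberos matters here: the innermost "Caused by" in the trace fires inside SASL negotiation — SecurityUtil.getTokenInfo consults YARN's ClientRMSecurityInfo, which references the missing class — a path that is only taken against a secure cluster. As a hedged, non-Drill-specific sketch, a Kerberized HDFS client typically needs security properties like these in a core-site.xml visible on its classpath, plus valid Kerberos credentials; values below are placeholders, not the poster's configuration:)

```xml
<!-- Hedged sketch of standard Hadoop secure-mode client settings;
     property names are real Hadoop keys, values are illustrative. -->
<property>
  <name>hadoop.security.authentication</name>
  <value>kerberos</value>
</property>
<property>
  <name>hadoop.security.authorization</name>
  <value>true</value>
</property>
```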

> On Nov 7, 2014, at 11:22 PM, Aditya <ad...@gmail.com> wrote:
> 
> Are you running against a secure HDFS cluster?
> 
>> On Fri, Nov 7, 2014 at 10:00 PM, Hmxxyy <hm...@gmail.com> wrote:
>> 
>> Anybody has any clue?
>> 
>> Sent from my iPhone
>> 
>>> On Nov 7, 2014, at 3:09 PM, Li HM <hm...@gmail.com> wrote:
>>> 
>>> [snip]

Re: Help: Drill 0.7.0 unable to query hdfs file

Posted by Li HM <hm...@gmail.com>.
Anything special needed for a secure cluster?

On Friday, November 7, 2014, Aditya <ad...@gmail.com> wrote:

> Are you running against a secure HDFS cluster?
>
> On Fri, Nov 7, 2014 at 10:00 PM, Hmxxyy <hmxxyy@gmail.com> wrote:
>
> > Anybody has any clue?
> >
> > Sent from my iPhone
> >
> > > On Nov 7, 2014, at 3:09 PM, Li HM <hmxxyy@gmail.com> wrote:
> > >
> > > [snip]
> > > [optiq-core-0.9-drill-r6.jar:na]
> > >        at
> > net.hydromatic.optiq.prepare.PlannerImpl.validate(PlannerImpl.java:173)
> > > [optiq-core-0.9-drill-r6.jar:na]
> > >        at
> >
> org.apache.drill.exec.planner.sql.handlers.DefaultSqlHandler.validateNode(DefaultSqlHandler.java:145)
> > >
> >
> [drill-java-exec-0.7.0-incubating-SNAPSHOT-rebuffed.jar:0.7.0-incubating-SNAPSHOT]
> > >        at
> >
> org.apache.drill.exec.planner.sql.handlers.DefaultSqlHandler.getPlan(DefaultSqlHandler.java:125)
> > >
> >
> [drill-java-exec-0.7.0-incubating-SNAPSHOT-rebuffed.jar:0.7.0-incubating-SNAPSHOT]
> > >        at
> >
> org.apache.drill.exec.planner.sql.DrillSqlWorker.getPlan(DrillSqlWorker.java:132)
> > >
> >
> [drill-java-exec-0.7.0-incubating-SNAPSHOT-rebuffed.jar:0.7.0-incubating-SNAPSHOT]
> > >        at
> > org.apache.drill.exec.work.foreman.Foreman.runSQL(Foreman.java:384)
> > >
> >
> [drill-java-exec-0.7.0-incubating-SNAPSHOT-rebuffed.jar:0.7.0-incubating-SNAPSHOT]
> > >        at
> > org.apache.drill.exec.work.foreman.Foreman.run(Foreman.java:204)
> > >
> >
> [drill-java-exec-0.7.0-incubating-SNAPSHOT-rebuffed.jar:0.7.0-incubating-SNAPSHOT]
> > >        at
> >
> org.apache.drill.exec.work.WorkManager$RunnableWrapper.run(WorkManager.java:249)
> > >
> >
> [drill-java-exec-0.7.0-incubating-SNAPSHOT-rebuffed.jar:0.7.0-incubating-SNAPSHOT]
> > >        at
> >
> java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1145)
> > > [na:1.7.0_17]
> > >        at
> >
> java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:615)
> > > [na:1.7.0_17]
> > >        at java.lang.Thread.run(Thread.java:722) [na:1.7.0_17]
> > > Caused by: java.io.IOException: Couldn't set up IO streams
> > >        at
> > org.apache.hadoop.ipc.Client$Connection.setupIOstreams(Client.java:753)
> > > ~[hadoop-common-2.5.1.jar:na]
> > >        at
> > org.apache.hadoop.ipc.Client$Connection.access$2900(Client.java:368)
> > > ~[hadoop-common-2.5.1.jar:na]
> > >        at org.apache.hadoop.ipc.Client.getConnection(Client.java:1423)
> > > ~[hadoop-common-2.5.1.jar:na]
> > >        at org.apache.hadoop.ipc.Client.call(Client.java:1342)
> > > ~[hadoop-common-2.5.1.jar:na]
> > >        ... 53 common frames omitted
> > > Caused by: java.lang.NoClassDefFoundError:
> > > org/apache/hadoop/yarn/api/ApplicationClientProtocolPB
> > >        at
> >
> org.apache.hadoop.yarn.security.client.ClientRMSecurityInfo.getTokenInfo(ClientRMSecurityInfo.java:65)
> > > ~[hadoop-yarn-common-2.5.1.jar:na]
> > >        at
> >
> org.apache.hadoop.security.SecurityUtil.getTokenInfo(SecurityUtil.java:327)
> > > ~[hadoop-common-2.5.1.jar:na]
> > >        at
> >
> org.apache.hadoop.security.SaslRpcClient.getServerToken(SaslRpcClient.java:262)
> > > ~[hadoop-common-2.5.1.jar:na]
> > >        at
> >
> org.apache.hadoop.security.SaslRpcClient.createSaslClient(SaslRpcClient.java:218)
> > > ~[hadoop-common-2.5.1.jar:na]
> > >        at
> >
> org.apache.hadoop.security.SaslRpcClient.selectSaslClient(SaslRpcClient.java:158)
> > > ~[hadoop-common-2.5.1.jar:na]
> > >        at
> >
> org.apache.hadoop.security.SaslRpcClient.saslConnect(SaslRpcClient.java:388)
> > > ~[hadoop-common-2.5.1.jar:na]
> > >        at
> org.apache.hadoop.ipc.Client$Connection$2.run(Client.java:702)
> > > ~[hadoop-common-2.5.1.jar:na]
> > >        at
> org.apache.hadoop.ipc.Client$Connection$2.run(Client.java:698)
> > > ~[hadoop-common-2.5.1.jar:na]
> > >        at java.security.AccessController.doPrivileged(Native Method)
> > > ~[na:1.7.0_17]
> > >        at javax.security.auth.Subject.doAs(Subject.java:415)
> > ~[na:1.7.0_17]
> > >        at
> >
> org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1637)
> > > ~[hadoop-common-2.5.1.jar:na]
> > >        at
> > org.apache.hadoop.ipc.Client$Connection.setupIOstreams(Client.java:697)
> > > ~[hadoop-common-2.5.1.jar:na]
> > >        ... 56 common frames omitted
> > > 2014-11-07 22:41:59,250 [a1edbba9-b61a-4eb8-a0bd-8afa41a3db52:foreman]
> > > WARN  o.a.d.e.s.dfs.WorkspaceSchemaFactory - Failure while trying to
> > > load .drill file.
> > > java.io.IOException: Failed on local exception: java.io.IOException:
> > > Couldn't set up IO streams; Host Details : local host is: "*.*.*.*";
> > > destination host is: "*.*.*.*":8020;
> > >        at
> org.apache.hadoop.net.NetUtils.wrapException(NetUtils.java:764)
> > > ~[hadoop-common-2.5.1.jar:na]
> > >        at org.apache.hadoop.ipc.Client.call(Client.java:1375)
> > > ~[hadoop-common-2.5.1.jar:na]
> > >        at org.apache.hadoop.ipc.Client.call(Client.java:1324)
> > > ~[hadoop-common-2.5.1.jar:na]
> > >        at
> >
> org.apache.hadoop.ipc.ProtobufRpcEngine$Invoker.invoke(ProtobufRpcEngine.java:206)
> > > ~[hadoop-common-2.5.1.jar:na]
> > >        at com.sun.proxy.$Proxy42.getFileInfo(Unknown Source) ~[na:na]
> > >        at
> >
> org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolTranslatorPB.getFileInfo(ClientNamenodeProtocolTranslatorPB.java:707)
> > > ~[hadoop-hdfs-2.5.1.jar:na]
> >
>

Re: Help: Drill 0.7.0 unable to query hdfs file

Posted by Aditya <ad...@gmail.com>.
Are you running against a secure HDFS cluster?

On Fri, Nov 7, 2014 at 10:00 PM, Hmxxyy <hm...@gmail.com> wrote:

> Anybody has any clue?
>
> Sent from my iPhone
>
> > On Nov 7, 2014, at 3:09 PM, Li HM <hm...@gmail.com> wrote:
> >
> > Freshly compiled Drill 0.7.0 with hadoop-2.5.1. Whenever I query an HDFS
> > file, I get the following error from sqlline:
> >
> > Query failed: Failure while running sql.
> >
> > Error: exception while executing query: Failure while executing query.
> > (state=,code=0)
> >
> > Checking the drillbit log, I see long Java exceptions. Does anybody
> > know what the issue might be?
> >
> > The fatal one looks like a missing class.
> > Caused by: java.lang.NoClassDefFoundError:
> > org/apache/hadoop/yarn/api/ApplicationClientProtocolPB, but the class
> > is in hadoop-yarn-common-2.5.1.jar
> >
> >> jar tf hadoop-yarn-common-2.5.1.jar | grep "org/apache/hadoop/yarn/api/ApplicationClientProtocolPB.class"
> > org/apache/hadoop/yarn/api/ApplicationClientProtocolPB.class
> >
> > Please help, thanks!
> >

Re: Help: Drill 0.7.0 unable to query hdfs file

Posted by Hmxxyy <hm...@gmail.com>.
Anybody has any clue?

Sent from my iPhone

> On Nov 7, 2014, at 3:09 PM, Li HM <hm...@gmail.com> wrote:
> 
> Freshly compiled Drill 0.7.0 with hadoop-2.5.1. Whenever I query an HDFS
> file, I get the following error from sqlline:
> 
> Query failed: Failure while running sql.
> 
> Error: exception while executing query: Failure while executing query.
> (state=,code=0)
> 
> Checking the drillbit log, I see long Java exceptions. Does anybody
> know what the issue might be?
> 
> The fatal one looks like a missing class.
> Caused by: java.lang.NoClassDefFoundError:
> org/apache/hadoop/yarn/api/ApplicationClientProtocolPB, but the class
> is in hadoop-yarn-common-2.5.1.jar
> 
>> jar tf hadoop-yarn-common-2.5.1.jar | grep "org/apache/hadoop/yarn/api/ApplicationClientProtocolPB.class"
> org/apache/hadoop/yarn/api/ApplicationClientProtocolPB.class
> 
> Please help, thanks!
> 
> 2014-11-07 22:41:45,283
> [84d32586-a999-4b4b-a05c-620058a29fb6:frag:0:0] WARN
> o.a.d.exec.work.foreman.QueryStatus - Update finished query state :
> COMPLETED
> 2014-11-07 22:41:52,136 [UserServer-1] WARN
> o.a.d.exec.work.foreman.QueryStatus - Update running or pending query
> state : PENDING
> 2014-11-07 22:41:54,697 [a1edbba9-b61a-4eb8-a0bd-8afa41a3db52:foreman]
> WARN  o.a.d.e.s.dfs.WorkspaceSchemaFactory - Failure while trying to
> load .drill file.
> java.io.IOException: Failed on local exception: java.io.IOException:
> Couldn't set up IO streams; Host Details : local host is:
> "*.*.*.*/*.*.*.*"; destination host is: "*.*.*.*":8020;
>        at org.apache.hadoop.net.NetUtils.wrapException(NetUtils.java:764)
> ~[hadoop-common-2.5.1.jar:na]
>        at org.apache.hadoop.ipc.Client.call(Client.java:1375)
> ~[hadoop-common-2.5.1.jar:na]
>        at org.apache.hadoop.ipc.Client.call(Client.java:1324)
> ~[hadoop-common-2.5.1.jar:na]
>        at org.apache.hadoop.ipc.ProtobufRpcEngine$Invoker.invoke(ProtobufRpcEngine.java:206)
> ~[hadoop-common-2.5.1.jar:na]
>        at com.sun.proxy.$Proxy42.getFileInfo(Unknown Source) ~[na:na]
>        at org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolTranslatorPB.getFileInfo(ClientNamenodeProtocolTranslatorPB.java:707)
> ~[hadoop-hdfs-2.5.1.jar:na]
>        at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
> ~[na:1.7.0_17]
>        at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
> ~[na:1.7.0_17]
>        at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
> ~[na:1.7.0_17]
>        at java.lang.reflect.Method.invoke(Method.java:601) ~[na:1.7.0_17]
>        at org.apache.hadoop.io.retry.RetryInvocationHandler.invokeMethod(RetryInvocationHandler.java:187)
> ~[hadoop-common-2.5.1.jar:na]
>        at org.apache.hadoop.io.retry.RetryInvocationHandler.invoke(RetryInvocationHandler.java:102)
> ~[hadoop-common-2.5.1.jar:na]
>        at com.sun.proxy.$Proxy43.getFileInfo(Unknown Source) ~[na:na]
>        at org.apache.hadoop.hdfs.DFSClient.getFileInfo(DFSClient.java:1785)
> ~[hadoop-hdfs-2.5.1.jar:na]
>        at org.apache.hadoop.hdfs.DistributedFileSystem$17.doCall(DistributedFileSystem.java:1068)
> ~[hadoop-hdfs-2.5.1.jar:na]
>        at org.apache.hadoop.hdfs.DistributedFileSystem$17.doCall(DistributedFileSystem.java:1064)
> ~[hadoop-hdfs-2.5.1.jar:na]
>        at org.apache.hadoop.fs.FileSystemLinkResolver.resolve(FileSystemLinkResolver.java:81)
> ~[hadoop-common-2.5.1.jar:na]
>        at org.apache.hadoop.hdfs.DistributedFileSystem.getFileStatus(DistributedFileSystem.java:1064)
> ~[hadoop-hdfs-2.5.1.jar:na]
>        at org.apache.hadoop.fs.Globber.getFileStatus(Globber.java:59)
> ~[hadoop-common-2.5.1.jar:na]
>        at org.apache.hadoop.fs.Globber.matchFilter(Globber.java:276)
> ~[hadoop-common-2.5.1.jar:na]
>        at org.apache.hadoop.fs.Globber.applyFilters(Globber.java:258)
> ~[hadoop-common-2.5.1.jar:na]
>        at org.apache.hadoop.fs.Globber.glob(Globber.java:226)
> ~[hadoop-common-2.5.1.jar:na]
>        at org.apache.hadoop.fs.Globber.glob(Globber.java:177)
> ~[hadoop-common-2.5.1.jar:na]
>        at org.apache.hadoop.fs.FileSystem.globStatus(FileSystem.java:1623)
> ~[hadoop-common-2.5.1.jar:na]
>        at org.apache.drill.exec.dotdrill.DotDrillUtil.getDotDrills(DotDrillUtil.java:57)
> ~[drill-java-exec-0.7.0-incubating-SNAPSHOT-rebuffed.jar:0.7.0-incubating-SNAPSHOT]
>        at org.apache.drill.exec.store.dfs.WorkspaceSchemaFactory$WorkspaceSchema.getTable(WorkspaceSchemaFactory.java:259)
> ~[drill-java-exec-0.7.0-incubating-SNAPSHOT-rebuffed.jar:0.7.0-incubating-SNAPSHOT]
>        at org.apache.drill.exec.store.dfs.FileSystemSchemaFactory$FileSystemSchema.getTable(FileSystemSchemaFactory.java:97)
> [drill-java-exec-0.7.0-incubating-SNAPSHOT-rebuffed.jar:0.7.0-incubating-SNAPSHOT]
>        at net.hydromatic.optiq.jdbc.SimpleOptiqSchema.getTable(SimpleOptiqSchema.java:75)
> [optiq-core-0.9-drill-r6.jar:na]
>        at net.hydromatic.optiq.prepare.OptiqCatalogReader.getTableFrom(OptiqCatalogReader.java:87)
> [optiq-core-0.9-drill-r6.jar:na]
>        at net.hydromatic.optiq.prepare.OptiqCatalogReader.getTable(OptiqCatalogReader.java:70)
> [optiq-core-0.9-drill-r6.jar:na]
>        at net.hydromatic.optiq.prepare.OptiqCatalogReader.getTable(OptiqCatalogReader.java:1)
> [optiq-core-0.9-drill-r6.jar:na]
>        at org.eigenbase.sql.validate.EmptyScope.getTableNamespace(EmptyScope.java:67)
> [optiq-core-0.9-drill-r6.jar:na]
>        at org.eigenbase.sql.validate.IdentifierNamespace.validateImpl(IdentifierNamespace.java:75)
> [optiq-core-0.9-drill-r6.jar:na]
>        at org.eigenbase.sql.validate.AbstractNamespace.validate(AbstractNamespace.java:85)
> [optiq-core-0.9-drill-r6.jar:na]
>        at org.eigenbase.sql.validate.SqlValidatorImpl.validateNamespace(SqlValidatorImpl.java:779)
> [optiq-core-0.9-drill-r6.jar:na]
>        at org.eigenbase.sql.validate.SqlValidatorImpl.validateQuery(SqlValidatorImpl.java:768)
> [optiq-core-0.9-drill-r6.jar:na]
>        at org.eigenbase.sql.validate.SqlValidatorImpl.validateFrom(SqlValidatorImpl.java:2599)
> [optiq-core-0.9-drill-r6.jar:na]
>        at org.eigenbase.sql.validate.SqlValidatorImpl.validateSelect(SqlValidatorImpl.java:2807)
> [optiq-core-0.9-drill-r6.jar:na]
>        at org.eigenbase.sql.validate.SelectNamespace.validateImpl(SelectNamespace.java:60)
> [optiq-core-0.9-drill-r6.jar:na]
>        at org.eigenbase.sql.validate.AbstractNamespace.validate(AbstractNamespace.java:85)
> [optiq-core-0.9-drill-r6.jar:na]
>        at org.eigenbase.sql.validate.SqlValidatorImpl.validateNamespace(SqlValidatorImpl.java:779)
> [optiq-core-0.9-drill-r6.jar:na]
>        at org.eigenbase.sql.validate.SqlValidatorImpl.validateQuery(SqlValidatorImpl.java:768)
> [optiq-core-0.9-drill-r6.jar:na]
>        at org.eigenbase.sql.SqlSelect.validate(SqlSelect.java:208)
> [optiq-core-0.9-drill-r6.jar:na]
>        at org.eigenbase.sql.validate.SqlValidatorImpl.validateScopedExpression(SqlValidatorImpl.java:742)
> [optiq-core-0.9-drill-r6.jar:na]
>        at org.eigenbase.sql.validate.SqlValidatorImpl.validate(SqlValidatorImpl.java:458)
> [optiq-core-0.9-drill-r6.jar:na]
>        at net.hydromatic.optiq.prepare.PlannerImpl.validate(PlannerImpl.java:173)
> [optiq-core-0.9-drill-r6.jar:na]
>        at org.apache.drill.exec.planner.sql.handlers.DefaultSqlHandler.validateNode(DefaultSqlHandler.java:145)
> [drill-java-exec-0.7.0-incubating-SNAPSHOT-rebuffed.jar:0.7.0-incubating-SNAPSHOT]
>        at org.apache.drill.exec.planner.sql.handlers.DefaultSqlHandler.getPlan(DefaultSqlHandler.java:125)
> [drill-java-exec-0.7.0-incubating-SNAPSHOT-rebuffed.jar:0.7.0-incubating-SNAPSHOT]
>        at org.apache.drill.exec.planner.sql.DrillSqlWorker.getPlan(DrillSqlWorker.java:132)
> [drill-java-exec-0.7.0-incubating-SNAPSHOT-rebuffed.jar:0.7.0-incubating-SNAPSHOT]
>        at org.apache.drill.exec.work.foreman.Foreman.runSQL(Foreman.java:384)
> [drill-java-exec-0.7.0-incubating-SNAPSHOT-rebuffed.jar:0.7.0-incubating-SNAPSHOT]
>        at org.apache.drill.exec.work.foreman.Foreman.run(Foreman.java:204)
> [drill-java-exec-0.7.0-incubating-SNAPSHOT-rebuffed.jar:0.7.0-incubating-SNAPSHOT]
>        at org.apache.drill.exec.work.WorkManager$RunnableWrapper.run(WorkManager.java:249)
> [drill-java-exec-0.7.0-incubating-SNAPSHOT-rebuffed.jar:0.7.0-incubating-SNAPSHOT]
>        at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1145)
> [na:1.7.0_17]
>        at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:615)
> [na:1.7.0_17]
>        at java.lang.Thread.run(Thread.java:722) [na:1.7.0_17]
> Caused by: java.io.IOException: Couldn't set up IO streams
>        at org.apache.hadoop.ipc.Client$Connection.setupIOstreams(Client.java:753)
> ~[hadoop-common-2.5.1.jar:na]
>        at org.apache.hadoop.ipc.Client$Connection.access$2900(Client.java:368)
> ~[hadoop-common-2.5.1.jar:na]
>        at org.apache.hadoop.ipc.Client.getConnection(Client.java:1423)
> ~[hadoop-common-2.5.1.jar:na]
>        at org.apache.hadoop.ipc.Client.call(Client.java:1342)
> ~[hadoop-common-2.5.1.jar:na]
>        ... 53 common frames omitted
> Caused by: java.lang.NoClassDefFoundError:
> org/apache/hadoop/yarn/api/ApplicationClientProtocolPB
>        at org.apache.hadoop.yarn.security.client.ClientRMSecurityInfo.getTokenInfo(ClientRMSecurityInfo.java:65)
> ~[hadoop-yarn-common-2.5.1.jar:na]
>        at org.apache.hadoop.security.SecurityUtil.getTokenInfo(SecurityUtil.java:327)
> ~[hadoop-common-2.5.1.jar:na]
>        at org.apache.hadoop.security.SaslRpcClient.getServerToken(SaslRpcClient.java:262)
> ~[hadoop-common-2.5.1.jar:na]
>        at org.apache.hadoop.security.SaslRpcClient.createSaslClient(SaslRpcClient.java:218)
> ~[hadoop-common-2.5.1.jar:na]
>        at org.apache.hadoop.security.SaslRpcClient.selectSaslClient(SaslRpcClient.java:158)
> ~[hadoop-common-2.5.1.jar:na]
>        at org.apache.hadoop.security.SaslRpcClient.saslConnect(SaslRpcClient.java:388)
> ~[hadoop-common-2.5.1.jar:na]
>        at org.apache.hadoop.ipc.Client$Connection$2.run(Client.java:702)
> ~[hadoop-common-2.5.1.jar:na]
>        at org.apache.hadoop.ipc.Client$Connection$2.run(Client.java:698)
> ~[hadoop-common-2.5.1.jar:na]
>        at java.security.AccessController.doPrivileged(Native Method)
> ~[na:1.7.0_17]
>        at javax.security.auth.Subject.doAs(Subject.java:415) ~[na:1.7.0_17]
>        at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1637)
> ~[hadoop-common-2.5.1.jar:na]
>        at org.apache.hadoop.ipc.Client$Connection.setupIOstreams(Client.java:697)
> ~[hadoop-common-2.5.1.jar:na]
>        ... 56 common frames omitted
> 2014-11-07 22:41:59,250 [a1edbba9-b61a-4eb8-a0bd-8afa41a3db52:foreman]
> WARN  o.a.d.e.s.dfs.WorkspaceSchemaFactory - Failure while trying to
> load .drill file.
> java.io.IOException: Failed on local exception: java.io.IOException:
> Couldn't set up IO streams; Host Details : local host is: "*.*.*.*";
> destination host is: "*.*.*.*":8020;
>        at org.apache.hadoop.net.NetUtils.wrapException(NetUtils.java:764)
> ~[hadoop-common-2.5.1.jar:na]
>        at org.apache.hadoop.ipc.Client.call(Client.java:1375)
> ~[hadoop-common-2.5.1.jar:na]
>        at org.apache.hadoop.ipc.Client.call(Client.java:1324)
> ~[hadoop-common-2.5.1.jar:na]
>        at org.apache.hadoop.ipc.ProtobufRpcEngine$Invoker.invoke(ProtobufRpcEngine.java:206)
> ~[hadoop-common-2.5.1.jar:na]
>        at com.sun.proxy.$Proxy42.getFileInfo(Unknown Source) ~[na:na]
>        at org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolTranslatorPB.getFileInfo(ClientNamenodeProtocolTranslatorPB.java:707)
> ~[hadoop-hdfs-2.5.1.jar:na]