Posted to issues@spark.apache.org by "Marcelo Vanzin (JIRA)" <ji...@apache.org> on 2015/07/28 00:18:04 UTC

[jira] [Resolved] (SPARK-8988) Driver log links are missing on secure cluster

     [ https://issues.apache.org/jira/browse/SPARK-8988?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]

Marcelo Vanzin resolved SPARK-8988.
-----------------------------------
       Resolution: Fixed
         Assignee: Hari Shreedharan
    Fix Version/s: 1.5.0

> Driver log links are missing on secure cluster
> ----------------------------------------------
>
>                 Key: SPARK-8988
>                 URL: https://issues.apache.org/jira/browse/SPARK-8988
>             Project: Spark
>          Issue Type: Bug
>          Components: YARN
>    Affects Versions: 1.4.0
>            Reporter: Hari Shreedharan
>            Assignee: Hari Shreedharan
>             Fix For: 1.5.0
>
>
> On a secure cluster, the {{NodeReports}} API throws an exception:
> {code}
> INFO cluster.YarnClusterSchedulerBackend: Node Report API is not available in the version of YARN being used, so AM logs link will not appear in application UI
> java.io.IOException: Failed on local exception: java.io.IOException: org.apache.hadoop.security.AccessControlException: Client cannot authenticate via:[TOKEN, KERBEROS]; Host Details : local host is: "hsp-4.vpc.cloudera.com/172.28.195.51"; destination host is: "hsp-1.vpc.cloudera.com":8032; 
> 	at org.apache.hadoop.net.NetUtils.wrapException(NetUtils.java:772)
> 	at org.apache.hadoop.ipc.Client.call(Client.java:1472)
> 	at org.apache.hadoop.ipc.Client.call(Client.java:1399)
> 	at org.apache.hadoop.ipc.ProtobufRpcEngine$Invoker.invoke(ProtobufRpcEngine.java:232)
> 	at com.sun.proxy.$Proxy22.getClusterNodes(Unknown Source)
> 	at org.apache.hadoop.yarn.api.impl.pb.client.ApplicationClientProtocolPBClientImpl.getClusterNodes(ApplicationClientProtocolPBClientImpl.java:262)
> 	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
> 	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
> 	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
> 	at java.lang.reflect.Method.invoke(Method.java:606)
> 	at org.apache.hadoop.io.retry.RetryInvocationHandler.invokeMethod(RetryInvocationHandler.java:187)
> 	at org.apache.hadoop.io.retry.RetryInvocationHandler.invoke(RetryInvocationHandler.java:102)
> 	at com.sun.proxy.$Proxy23.getClusterNodes(Unknown Source)
> 	at org.apache.hadoop.yarn.client.api.impl.YarnClientImpl.getNodeReports(YarnClientImpl.java:475)
> 	at org.apache.spark.scheduler.cluster.YarnClusterSchedulerBackend$$anonfun$getDriverLogUrls$1.apply(YarnClusterSchedulerBackend.scala:92)
> 	at org.apache.spark.scheduler.cluster.YarnClusterSchedulerBackend$$anonfun$getDriverLogUrls$1.apply(YarnClusterSchedulerBackend.scala:73)
> 	at scala.Option.foreach(Option.scala:236)
> 	at org.apache.spark.scheduler.cluster.YarnClusterSchedulerBackend.getDriverLogUrls(YarnClusterSchedulerBackend.scala:73)
> 	at org.apache.spark.SparkContext.postApplicationStart(SparkContext.scala:2015)
> 	at org.apache.spark.SparkContext.<init>(SparkContext.scala:553)
> 	at org.apache.spark.streaming.StreamingContext$.createNewSparkContext(StreamingContext.scala:842)
> 	at org.apache.spark.streaming.StreamingContext.<init>(StreamingContext.scala:80)
> 	at org.apache.spark.testing.testing.HdfsWordCount$.main(HdfsWordCount.scala:41)
> 	at org.apache.spark.testing.testing.HdfsWordCount.main(HdfsWordCount.scala)
> 	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
> 	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
> 	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
> 	at java.lang.reflect.Method.invoke(Method.java:606)
> 	at org.apache.spark.deploy.yarn.ApplicationMaster$$anon$2.run(ApplicationMaster.scala:504)
> Caused by: java.io.IOException: org.apache.hadoop.security.AccessControlException: Client cannot authenticate via:[TOKEN, KERBEROS]
> 	at org.apache.hadoop.ipc.Client$Connection$1.run(Client.java:680)
> 	at java.security.AccessController.doPrivileged(Native Method)
> 	at javax.security.auth.Subject.doAs(Subject.java:415)
> 	at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1671)
> 	at org.apache.hadoop.ipc.Client$Connection.handleSaslConnectionFailure(Client.java:643)
> 	at org.apache.hadoop.ipc.Client$Connection.setupIOstreams(Client.java:730)
> 	at org.apache.hadoop.ipc.Client$Connection.access$2800(Client.java:368)
> 	at org.apache.hadoop.ipc.Client.getConnection(Client.java:1521)
> 	at org.apache.hadoop.ipc.Client.call(Client.java:1438)
> 	... 27 more
> Caused by: org.apache.hadoop.security.AccessControlException: Client cannot authenticate via:[TOKEN, KERBEROS]
> 	at org.apache.hadoop.security.SaslRpcClient.selectSaslClient(SaslRpcClient.java:172)
> 	at org.apache.hadoop.security.SaslRpcClient.saslConnect(SaslRpcClient.java:396)
> 	at org.apache.hadoop.ipc.Client$Connection.setupSaslConnection(Client.java:553)
> 	at org.apache.hadoop.ipc.Client$Connection.access$1800(Client.java:368)
> 	at org.apache.hadoop.ipc.Client$Connection$2.run(Client.java:722)
> 	at org.apache.hadoop.ipc.Client$Connection$2.run(Client.java:718)
> 	at java.security.AccessController.doPrivileged(Native Method)
> 	at javax.security.auth.Subject.doAs(Subject.java:415)
> 	at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1671)
> 	at org.apache.hadoop.ipc.Client$Connection.setupIOstreams(Client.java:717)
> 	... 30 more
> {code}
> As a result, the link to the driver logs is missing from the application UI in secure mode.
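> The log line above shows Spark already catches this failure and degrades gracefully, but the link is then simply dropped. As a rough sketch of one way to restore it without the {{NodeReports}} RPC (and hence without needing a ResourceManager token), the link could be derived from the environment variables YARN sets inside every container; the {{DriverLogLink}} helper below and the NodeManager {{containerlogs}} URL layout are illustrative assumptions, not the committed fix:
> {code}
> // Hypothetical sketch: build the driver container's log link from the
> // environment YARN provides to the container itself, so no RM RPC
> // (and no Kerberos/token negotiation with the RM) is needed.
> object DriverLogLink {
>   def fromContainerEnv(): Option[String] =
>     for {
>       nmHost      <- sys.env.get("NM_HOST")       // NodeManager host
>       nmHttpPort  <- sys.env.get("NM_HTTP_PORT")  // NM web UI port
>       containerId <- sys.env.get("CONTAINER_ID")  // this container's id
>       user        <- sys.env.get("USER")          // user running the AM
>     } yield s"http://$nmHost:$nmHttpPort/node/containerlogs/$containerId/$user"
> }
> {code}
> Because these variables come from the NodeManager that launched the container rather than from an RM RPC, the secure-cluster authentication path that fails above is never exercised.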



--
This message was sent by Atlassian JIRA
(v6.3.4#6332)

---------------------------------------------------------------------
To unsubscribe, e-mail: issues-unsubscribe@spark.apache.org
For additional commands, e-mail: issues-help@spark.apache.org