Posted to user@spark.apache.org by Gerard Maas <ge...@gmail.com> on 2014/09/18 12:29:51 UTC

[SparkStreaming] task failure with 'Unknown exception in doAs'

My Spark Streaming job (running on Spark 1.0.2) stopped working today and
consistently throws the exception below.
No code has changed, so I'm really puzzled about the cause of the issue.
It looks like a security issue at the HDFS level.  Has anybody seen this
exception and maybe knows the root cause?

14/09/18 10:16:27 ERROR UserGroupInformation: PriviledgedActionException
as:********** (auth:SIMPLE) cause:java.util.concurrent.TimeoutException:
Futures timed out after [30 seconds]
Exception in thread "main" java.lang.reflect.UndeclaredThrowableException:
Unknown exception in doAs
at
org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1421)
at
org.apache.spark.deploy.SparkHadoopUtil.runAsSparkUser(SparkHadoopUtil.scala:52)
at
org.apache.spark.executor.CoarseGrainedExecutorBackend$.run(CoarseGrainedExecutorBackend.scala:113)
at
org.apache.spark.executor.CoarseGrainedExecutorBackend$.main(CoarseGrainedExecutorBackend.scala:154)
at
org.apache.spark.executor.CoarseGrainedExecutorBackend.main(CoarseGrainedExecutorBackend.scala)
Caused by: java.security.PrivilegedActionException:
java.util.concurrent.TimeoutException: Futures timed out after [30 seconds]
at java.security.AccessController.doPrivileged(Native Method)
at javax.security.auth.Subject.doAs(Subject.java:415)
at
org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1408)
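
If I'm reading the trace right, the 'Unknown exception in doAs' part is just
Hadoop's UserGroupInformation.doAs re-wrapping whatever the privileged action
threw: SparkHadoopUtil.runAsSparkUser starts the executor backend inside doAs,
and a checked exception that doAs does not declare (here the TimeoutException,
presumably from the executor failing to talk to the driver within the 30s
timeout) comes back as an UndeclaredThrowableException. A minimal sketch of
that wrapping, just to illustrate (the runWrapped helper is made up, not Spark
code):

  import java.lang.reflect.UndeclaredThrowableException
  import java.security.PrivilegedExceptionAction
  import java.util.concurrent.TimeoutException

  import org.apache.hadoop.security.UserGroupInformation

  object DoAsWrappingSketch {

    // Made-up helper: runs `body` under UserGroupInformation.doAs, the same
    // Hadoop API that SparkHadoopUtil.runAsSparkUser uses.
    def runWrapped(user: String)(body: => Unit): Unit = {
      val ugi = UserGroupInformation.createRemoteUser(user)
      ugi.doAs(new PrivilegedExceptionAction[Unit] {
        override def run(): Unit = body
      })
    }

    def main(args: Array[String]): Unit = {
      try {
        // TimeoutException is a checked exception that doAs does not declare,
        // so Hadoop logs "PriviledgedActionException as:<user> cause:..." and
        // rethrows "Unknown exception in doAs", as in the trace above.
        runWrapped("spark") {
          throw new TimeoutException("Futures timed out after [30 seconds]")
        }
      } catch {
        case e: UndeclaredThrowableException =>
          // the real failure is buried in the cause chain
          println("wrapped cause: " + e.getCause)
      }
    }
  }

So the interesting part seems to be the TimeoutException itself rather than the
security wrapper, which may be why it only looks like an HDFS/security problem.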


Any hints?

-kr, Gerard.

Re: [SparkStreaming] task failure with 'Unknown exception in doAs'

Posted by Gerard Maas <ge...@gmail.com>.
Found it!  (with sweat on my forehead)

The job was actually running on Mesos using a Spark 1.1.0 executor.

I guess there's some incompatibility between the 1.0.2 driver and the 1.1.0
executor - still quite weird.
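
For anyone who runs into the same thing: on Mesos the executors run whatever
Spark distribution spark.executor.uri (or SPARK_EXECUTOR_URI) points at, so the
fix on my side was to point that back at a build matching the driver. Something
along these lines (the master URL and tarball path are just examples, not my
actual setup):

  import org.apache.spark.{SparkConf, SparkContext}

  object PinnedExecutorVersion {
    def main(args: Array[String]): Unit = {
      val conf = new SparkConf()
        .setAppName("my-streaming-job")
        .setMaster("mesos://zk://zk1:2181/mesos")  // example Mesos master URL
        // Make the Mesos executors download the same Spark release the driver
        // was built against; mismatched driver/executor versions can fail in
        // non-obvious ways, like the doAs/TimeoutException above.
        .set("spark.executor.uri",
          "hdfs:///frameworks/spark/spark-1.0.2-bin-hadoop2.tgz")  // example path

      val sc = new SparkContext(conf)
      // ... build the StreamingContext and the rest of the job on top of sc ...
      sc.stop()
    }
  }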

-kr, Gerard.
