Posted to issues@spark.apache.org by "Marcelo Vanzin (JIRA)" <ji...@apache.org> on 2016/09/20 17:19:20 UTC
[jira] [Created] (SPARK-17611) YarnShuffleServiceSuite swallows exceptions, doesn't really test a few things
Marcelo Vanzin created SPARK-17611:
--------------------------------------
Summary: YarnShuffleServiceSuite swallows exceptions, doesn't really test a few things
Key: SPARK-17611
URL: https://issues.apache.org/jira/browse/SPARK-17611
Project: Spark
Issue Type: Bug
Components: YARN
Reporter: Marcelo Vanzin
Priority: Minor
If you look at the logs, you'll see exceptions like this:
{noformat}
16/09/20 10:03:45.081 pool-1-thread-1-ScalaTest-running-YarnShuffleServiceSuite ERROR YarnShuffleService: Exception when initializing application application_0_0001
java.lang.NullPointerException
at io.netty.buffer.Unpooled.wrappedBuffer(Unpooled.java:183)
at org.apache.spark.network.util.JavaUtils.bytesToString(JavaUtils.java:80)
at org.apache.spark.network.sasl.ShuffleSecretManager.registerApp(ShuffleSecretManager.java:62)
at org.apache.spark.network.yarn.YarnShuffleService.initializeApplication(YarnShuffleService.java:254)
at org.apache.spark.network.yarn.YarnShuffleServiceSuite$$anonfun$1.apply$mcV$sp(YarnShuffleServiceSuite.scala:89)
at org.apache.spark.network.yarn.YarnShuffleServiceSuite$$anonfun$1.apply(YarnShuffleServiceSuite.scala:81)
at org.apache.spark.network.yarn.YarnShuffleServiceSuite$$anonfun$1.apply(YarnShuffleServiceSuite.scala:81)
{noformat}
The underlying cause is that the suite tests with authentication enabled but supplies a null auth secret. Catch blocks in the shuffle service code swallow the resulting exceptions, and the suite never verifies the resulting state, so the tests pass even though the underlying code is failing.
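The failure mode can be sketched in a few lines. This is a hedged, self-contained illustration, not Spark's actual code: {{SwallowDemo}}, its {{registered}} map, and its {{bytesToString}} helper are stand-ins for the real {{ShuffleSecretManager}}/{{JavaUtils}} classes. It shows how a broad catch inside an init method swallows the NPE raised by a null secret, so a test that merely calls the method and checks "no exception" passes while the app was never actually registered.

```java
import java.nio.charset.StandardCharsets;
import java.util.HashMap;
import java.util.Map;

public class SwallowDemo {
    // Stand-in for the secret store inside ShuffleSecretManager.
    static final Map<String, String> registered = new HashMap<>();

    // Mimics JavaUtils.bytesToString: throws NullPointerException
    // when the secret bytes are null.
    static String bytesToString(byte[] b) {
        return new String(b, StandardCharsets.UTF_8);
    }

    static void initializeApplication(String appId, byte[] secret) {
        try {
            registered.put(appId, bytesToString(secret));
        } catch (Exception e) {
            // The exception is logged and dropped; the caller never sees it.
            System.err.println("Exception when initializing application "
                + appId + ": " + e);
        }
    }

    public static void main(String[] args) {
        // NPE is thrown and swallowed; nothing is registered.
        initializeApplication("application_0_0001", null);
        System.out.println(registered.containsKey("application_0_0001"));
    }
}
```

A stronger test would assert on the observable state after initialization (e.g. that the application's secret is actually registered) rather than relying on the absence of a thrown exception.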
--
This message was sent by Atlassian JIRA
(v6.3.4#6332)