Posted to issues@spark.apache.org by "Sean Owen (JIRA)" <ji...@apache.org> on 2015/05/14 15:06:01 UTC
[jira] [Resolved] (SPARK-7635) SparkContextSchedulerCreationSuite tests may fail due to unrecognized UnsatisfiedLinkError message.
[ https://issues.apache.org/jira/browse/SPARK-7635?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]
Sean Owen resolved SPARK-7635.
------------------------------
Resolution: Fixed
Fix Version/s: 1.5.0
Assignee: Tim Ellison
I think that was literally fixed yesterday by [~tellison]'s commit: https://github.com/apache/spark/commit/51030b8a9d4f3feb7a5d2249cc867fd6a06f0336
Coincidence?
I'm using this as its JIRA, retroactively.
> SparkContextSchedulerCreationSuite tests may fail due to unrecognized UnsatisfiedLinkError message.
> ---------------------------------------------------------------------------------------------------
>
> Key: SPARK-7635
> URL: https://issues.apache.org/jira/browse/SPARK-7635
> Project: Spark
> Issue Type: Bug
> Components: Spark Core
> Affects Versions: 1.3.1
> Environment: IBM Java
> Reporter: Matthew Brandyberry
> Assignee: Tim Ellison
> Priority: Minor
> Labels: test
> Fix For: 1.5.0
>
> Original Estimate: 1h
> Remaining Estimate: 1h
>
> When mesos is not available, these tests fail because IBM Java and OpenJDK word the UnsatisfiedLinkError message differently:
> - mesos fine-grained *** FAILED ***
> "mesos (Not found in java.library.path)" did not contain "no mesos in" (SparkContextSchedulerCreationSuite.scala:162)
> - mesos coarse-grained *** FAILED ***
> "mesos (Not found in java.library.path)" did not contain "no mesos in" (SparkContextSchedulerCreationSuite.scala:162)
> - mesos with zookeeper *** FAILED ***
> "mesos (Not found in java.library.path)" did not contain "no mesos in" (SparkContextSchedulerCreationSuite.scala:162)
> PR to be submitted shortly.
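
For illustration, here is a minimal ScalaTest sketch of a vendor-neutral assertion that would pass on both JVMs. This is a standalone example, not the actual change from the commit above; the suite name and test body are hypothetical:

    import org.scalatest.FunSuite

    // Hypothetical standalone suite illustrating a JVM-vendor-neutral check.
    class NativeLibraryMessageSuite extends FunSuite {
      test("missing mesos native library yields a message naming the library") {
        val e = intercept[UnsatisfiedLinkError] {
          System.loadLibrary("mesos") // throws when libmesos is not installed
        }
        // OpenJDK:  "no mesos in java.library.path"
        // IBM Java: "mesos (Not found in java.library.path)"
        // Asserting only on the library name works on both JVMs.
        assert(e.getMessage.contains("mesos"))
      }
    }

Matching on the library name alone avoids coupling the test to any one JVM vendor's message format.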