Posted to issues@spark.apache.org by "Sean Owen (JIRA)" <ji...@apache.org> on 2019/03/01 23:10:00 UTC

[jira] [Resolved] (SPARK-26506) RegressionMetrics fails in Spark 2.4

     [ https://issues.apache.org/jira/browse/SPARK-26506?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]

Sean Owen resolved SPARK-26506.
-------------------------------
    Resolution: Cannot Reproduce

There's not enough info here. What's the error? These work in general, per integration tests. Please reopen if it can be narrowed down, ideally with some kind of reproduction.
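For reference, here is a minimal sketch of the kind of self-contained reproduction that would help, assuming PySpark and pyspark.mllib.evaluation (the app name and sample values are illustrative). Evaluating the RDD to compute a metric is the code path that launches the Python worker, so on an affected setup this should fail with the reported "python worker failed to connect back" error:

    # Minimal repro sketch: RegressionMetrics over an RDD of
    # (prediction, observation) pairs.
    from pyspark import SparkContext
    from pyspark.mllib.evaluation import RegressionMetrics

    sc = SparkContext("local[2]", "SPARK-26506-repro")  # app name is illustrative
    pairs = sc.parallelize([(2.5, 3.0), (0.0, -0.5), (2.0, 2.0), (8.0, 7.0)])
    metrics = RegressionMetrics(pairs)
    # On an affected setup, computing the metrics below is what fails.
    print(metrics.rootMeanSquaredError)
    print(metrics.r2)
    sc.stop()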

> RegressionMetrics fails in Spark 2.4
> ------------------------------------
>
>                 Key: SPARK-26506
>                 URL: https://issues.apache.org/jira/browse/SPARK-26506
>             Project: Spark
>          Issue Type: Bug
>          Components: MLlib
>    Affects Versions: 2.4.0
>         Environment: Windows, using the Anaconda stack with Spark 2.4 and Java JDK 1.8.0_181. It may also affect Unix when running Spark 2.4; I'm not sure, because my workplace, where I use Spark on Unix, is still on Spark 2.2.
> The bug does not appear to cause an error in either 2.3 or 2.2, on either Windows or Unix.
>            Reporter: Casey Bennett
>            Priority: Major
>
> RegressionMetrics fails in Spark 2.4 when running via Anaconda on a Windows machine. A Java error comes back saying that "python worker failed to connect back". This makes all the evaluation metrics ([https://spark.apache.org/docs/2.2.0/mllib-evaluation-metrics.html]) unusable for scoring model performance.
> Reverted to Spark 2.3 and did not have this issue; also tested 2.2 and did not have it. So it appears to be a bug specific to Spark 2.4. It likely also affects other evaluation metric types, e.g. BinaryClassificationMetrics.



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)

---------------------------------------------------------------------
To unsubscribe, e-mail: issues-unsubscribe@spark.apache.org
For additional commands, e-mail: issues-help@spark.apache.org