Posted to issues@spark.apache.org by "Apache Spark (Jira)" <ji...@apache.org> on 2020/08/06 18:01:00 UTC

[jira] [Assigned] (SPARK-32506) flaky test: pyspark.mllib.tests.test_streaming_algorithms.StreamingLinearRegressionWithTests

     [ https://issues.apache.org/jira/browse/SPARK-32506?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]

Apache Spark reassigned SPARK-32506:
------------------------------------

    Assignee:     (was: Apache Spark)

> flaky test: pyspark.mllib.tests.test_streaming_algorithms.StreamingLinearRegressionWithTests
> --------------------------------------------------------------------------------------------
>
>                 Key: SPARK-32506
>                 URL: https://issues.apache.org/jira/browse/SPARK-32506
>             Project: Spark
>          Issue Type: Test
>          Components: MLlib
>    Affects Versions: 3.1.0
>            Reporter: Wenchen Fan
>            Priority: Major
>
> {code}
> FAIL: test_train_prediction (pyspark.mllib.tests.test_streaming_algorithms.StreamingLinearRegressionWithTests)
> Test that error on test data improves as model is trained.
> ----------------------------------------------------------------------
> Traceback (most recent call last):
>   File "/home/runner/work/spark/spark/python/pyspark/mllib/tests/test_streaming_algorithms.py", line 466, in test_train_prediction
>     eventually(condition, timeout=180.0)
>   File "/home/runner/work/spark/spark/python/pyspark/testing/utils.py", line 81, in eventually
>     lastValue = condition()
>   File "/home/runner/work/spark/spark/python/pyspark/mllib/tests/test_streaming_algorithms.py", line 461, in condition
>     self.assertGreater(errors[1] - errors[-1], 2)
> AssertionError: 1.672640157855923 not greater than 2
> {code}
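
For context, the failing check follows a poll-until-true pattern: the test trains a StreamingLinearRegressionWithSGD model on a stream of input batches, records the prediction error on held-out data after each batch, and wraps the assertion in eventually() so it is retried until the timeout. Below is a minimal sketch of that pattern, using a simplified stand-in for pyspark.testing.utils.eventually (the errors list, polling interval, and condition body here are illustrative, not the exact test code):

{code}
import time

def eventually(condition, timeout=180.0, interval=0.5):
    """Retry `condition` until it returns True or `timeout` elapses.

    Simplified stand-in for pyspark.testing.utils.eventually, which the
    failing test uses to poll the streaming job's progress.
    """
    deadline = time.time() + timeout
    last_exc = None
    while time.time() < deadline:
        try:
            if condition() is True:
                return
        except AssertionError as e:
            # Keep retrying; the stream may still be catching up.
            last_exc = e
        time.sleep(interval)
    if last_exc is not None:
        raise last_exc
    raise AssertionError("Condition not met within %s seconds" % timeout)

# `errors` is assumed to collect the prediction error measured after each
# streaming batch; the real test appends to it from a DStream action.
errors = []

def condition():
    # The flaky assertion: the error after the last batch must be more
    # than 2 units lower than the error after the second batch.
    assert len(errors) >= 2, "not enough batches processed yet"
    assert errors[1] - errors[-1] > 2, "error did not improve enough"
    return True
{code}

Because the measured improvement depends on randomly generated streaming data, a run can land just under the hard-coded threshold (1.67 here against a required 2), which is why the test only fails intermittently.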



--
This message was sent by Atlassian Jira
(v8.3.4#803005)
