Posted to issues@spark.apache.org by "Kousuke Saruta (Jira)" <ji...@apache.org> on 2021/10/21 14:13:00 UTC

[jira] [Created] (SPARK-37086) Fix the R test of FPGrowthModel for Scala 2.13

Kousuke Saruta created SPARK-37086:
--------------------------------------

             Summary: Fix the R test of FPGrowthModel for Scala 2.13
                 Key: SPARK-37086
                 URL: https://issues.apache.org/jira/browse/SPARK-37086
             Project: Spark
          Issue Type: Bug
          Components: ML, R, Tests
    Affects Versions: 3.3.0
            Reporter: Kousuke Saruta
            Assignee: Kousuke Saruta


Similar to the issue filed in SPARK-37059, an R test of FPGrowthModel assumes that the records returned by FPGrowthModel.freqItemsets come back in a particular order, but no such ordering is guaranteed.
As a result, the test fails with Scala 2.13, where the underlying collection ordering differs.

{code}
 ══ Failed ══════════════════════════════════════════════════════════════════════
── 1. Failure (test_mllib_fpm.R:42:3): spark.fpGrowth ──────────────────────────
`expected_itemsets` not equivalent to `itemsets`.
Component “items”: Component 1: Component 1: 1 string mismatch
Component “items”: Component 2: Length mismatch: comparison on first 1 components
Component “items”: Component 2: Component 1: 1 string mismatch
Component “items”: Component 3: Length mismatch: comparison on first 1 components
Component “items”: Component 4: Length mismatch: comparison on first 1 components
Component “items”: Component 4: Component 1: 1 string mismatch
Component “items”: Component 5: Length mismatch: comparison on first 1 components
Component “items”: Component 5: Component 1: 1 string mismatch
Component “freq”: Mean relative difference: 0.5454545
{code}
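One way to make the test robust is to sort both the actual and the expected data frames on a stable key before comparing them, so the assertion no longer depends on the row order produced by freqItemsets. The following is a minimal sketch in R, not the actual patch; it assumes {{model}} and {{expected_itemsets}} as constructed in test_mllib_fpm.R.

{code}
# Sketch only: make the comparison order-insensitive instead of relying on
# the row order returned by spark.freqItemsets(), which differs on Scala 2.13.
itemsets <- collect(spark.freqItemsets(model))

# Sort actual and expected rows by a stable key (the concatenated item strings).
itemsets <- itemsets[order(sapply(itemsets$items, paste, collapse = ",")), ]
expected_itemsets <- expected_itemsets[
  order(sapply(expected_itemsets$items, paste, collapse = ",")), ]

expect_equivalent(expected_itemsets, itemsets)
{code}

An alternative is simply to update the expected rows to match the order actually produced under Scala 2.13, but sorting keeps the test independent of collection ordering in either Scala version.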



--
This message was sent by Atlassian Jira
(v8.3.4#803005)

---------------------------------------------------------------------
To unsubscribe, e-mail: issues-unsubscribe@spark.apache.org
For additional commands, e-mail: issues-help@spark.apache.org