Posted to issues@spark.apache.org by "Vyacheslav Baranov (JIRA)" <ji...@apache.org> on 2015/06/11 17:08:00 UTC

[jira] [Updated] (SPARK-8309) OpenHashMap doesn't work with more than 12M items

     [ https://issues.apache.org/jira/browse/SPARK-8309?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]

Vyacheslav Baranov updated SPARK-8309:
--------------------------------------
    Description: 
The problem can be demonstrated with the following test case:

{code}
  // Runs inside Spark's own test suite (OpenHashMap is private[spark]).
  import org.apache.spark.util.collection.OpenHashMap

  test("support for more than 12M items") {
    val cnt = 12000000 // 12M
    val map = new OpenHashMap[Int, Int](cnt)
    for (i <- 0 until cnt) {
      map(i) = 1
    }
    // Every key was inserted with value 1, so no entry should come back as 0
    // (the default Int value); a non-zero count means the map lost values.
    val numInvalidValues = map.iterator.count(_._2 == 0)
    assertResult(0)(numInvalidValues)
  }
{code}
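
For context beyond the original report: OpenHashMap is backed by OpenHashSet, which rounds its capacity up to a power of two and rehashes once the element count exceeds loadFactor * capacity (0.7 by default). Under those assumptions, a map sized for 12M entries starts with 2^24 = 16777216 slots, and since 0.7 * 2^24 is roughly 11.74M, inserting 12M items forces a rehash to 2^25 slots partway through the loop. That is presumably why the failure threshold sits near "12M items". A minimal sketch of the arithmetic (standalone Scala, not Spark code; the rounding helper mirrors what OpenHashSet is assumed to do):

{code}
// Hedged sketch of the capacity arithmetic behind the "12M" threshold.
// Assumed OpenHashSet behavior: power-of-two capacity, growth once the
// element count exceeds loadFactor * capacity, with loadFactor = 0.7.
object CapacityMath {
  def nextPowerOf2(n: Int): Int = {
    val highBit = Integer.highestOneBit(n)
    if (highBit == n) n else highBit << 1
  }

  def main(args: Array[String]): Unit = {
    val requested = 12000000
    val capacity = nextPowerOf2(requested)       // 16777216 = 2^24
    val growThreshold = (0.7 * capacity).toInt   // 11744051 < 12000000
    println(s"capacity=$capacity growThreshold=$growThreshold")
    // Inserting 12M items crosses the threshold, forcing a rehash to
    // 2^25 slots, the first capacity at which the reported bug appears.
  }
}
{code}

If that reading is right, the defect presumably lives in how OpenHashSet addresses positions at or above 2^24 after such a rehash, rather than in OpenHashMap itself.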

  was:
The problem can be demonstrated with the following test case:

{code:scala}
  test("support for more than 12M items") {
    val cnt = 12000000 // 12M
    val map = new OpenHashMap[Int, Int](cnt)
    for (i <- 0 until cnt) {
      map(i) = 1
    }
    val numInvalidValues = map.iterator.count(_._2 == 0)
    assertResult(0)(numInvalidValues)
  }

{code}


> OpenHashMap doesn't work with more than 12M items
> -------------------------------------------------
>
>                 Key: SPARK-8309
>                 URL: https://issues.apache.org/jira/browse/SPARK-8309
>             Project: Spark
>          Issue Type: Bug
>          Components: Spark Core
>    Affects Versions: 1.4.0
>            Reporter: Vyacheslav Baranov
>
> The problem can be demonstrated with the following test case:
> {code}
>   test("support for more than 12M items") {
>     val cnt = 12000000 // 12M
>     val map = new OpenHashMap[Int, Int](cnt)
>     for (i <- 0 until cnt) {
>       map(i) = 1
>     }
>     val numInvalidValues = map.iterator.count(_._2 == 0)
>     assertResult(0)(numInvalidValues)
>   }
> {code}



--
This message was sent by Atlassian JIRA
(v6.3.4#6332)

---------------------------------------------------------------------
To unsubscribe, e-mail: issues-unsubscribe@spark.apache.org
For additional commands, e-mail: issues-help@spark.apache.org