Posted to issues@spark.apache.org by "Apache Spark (JIRA)" <ji...@apache.org> on 2016/09/13 12:35:20 UTC
[jira] [Assigned] (SPARK-17524) RowBasedKeyValueBatchSuite always uses 64 mb page size
[ https://issues.apache.org/jira/browse/SPARK-17524?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]
Apache Spark reassigned SPARK-17524:
------------------------------------
Assignee: Apache Spark
> RowBasedKeyValueBatchSuite always uses 64 mb page size
> ------------------------------------------------------
>
> Key: SPARK-17524
> URL: https://issues.apache.org/jira/browse/SPARK-17524
> Project: Spark
> Issue Type: Improvement
> Components: Tests
> Affects Versions: 2.1.0
> Reporter: Adam Roberts
> Assignee: Apache Spark
> Priority: Minor
>
> The appendRowUntilExceedingPageSize test at sql/catalyst/src/test/java/org/apache/spark/sql/catalyst/expressions/RowBasedKeyValueBatchSuite.java always runs with the default page size, which is 64 MB.
> Users with less powerful machines (e.g. those with two cores) may choose a smaller spark.buffer.pageSize value in order to prevent problems acquiring memory.
> If this size is reduced, say to 1 MB, the test fails. Here is the problem scenario:
> We run with a 1,048,576-byte page size (1 MB)
> The default is 67,108,864 bytes (64 MB)
> Test fails: java.lang.AssertionError: expected:<14563> but was:<932067>
> 932,067 is 64x bigger than 14,563, and the default page size is 64x bigger than the 1 MB setting under which we see the failure
> The failure is at
> Assert.assertEquals(batch.numRows(), numRows);
> This minor improvement has the test use whatever page size the user has specified (it looks up spark.buffer.pageSize), preventing this failure for anyone testing Apache Spark on a box with a reduced page size.
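The two observed row counts above are consistent with a fixed per-row footprint: a page fits as many rows as its byte size allows, so a 64x larger page holds roughly 64x more rows. The sketch below illustrates that arithmetic; the ~72-byte per-row figure is an assumption inferred from the reported numbers, not taken from the actual RowBasedKeyValueBatch row layout.

```java
// Hypothetical sketch: why the assertion value scales with the page size.
// BYTES_PER_ROW is an assumed fixed per-row footprint chosen so that both
// reported counts (932,067 at 64 MB and 14,563 at 1 MB) fall out of the
// same formula; the real suite's rows may be laid out differently.
public class PageSizeSketch {
    static final long DEFAULT_PAGE_SIZE = 64L * 1024 * 1024; // 67,108,864 bytes
    static final long SMALL_PAGE_SIZE = 1L * 1024 * 1024;    // 1,048,576 bytes
    static final long BYTES_PER_ROW = 72;                    // assumed footprint

    // Rows that fit before the page is exhausted.
    static long expectedRows(long pageSizeBytes, long bytesPerRow) {
        return pageSizeBytes / bytesPerRow;
    }

    public static void main(String[] args) {
        System.out.println(expectedRows(DEFAULT_PAGE_SIZE, BYTES_PER_ROW)); // 932067
        System.out.println(expectedRows(SMALL_PAGE_SIZE, BYTES_PER_ROW));   // 14563
    }
}
```

A test hardcoding the 64 MB expectation therefore asserts 932,067 even when the batch was built with a 1 MB page that can only hold 14,563 rows, which matches the AssertionError above.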
--
This message was sent by Atlassian JIRA
(v6.3.4#6332)
---------------------------------------------------------------------
To unsubscribe, e-mail: issues-unsubscribe@spark.apache.org
For additional commands, e-mail: issues-help@spark.apache.org