Posted to reviews@spark.apache.org by GitBox <gi...@apache.org> on 2020/03/07 15:52:41 UTC

[GitHub] [spark] srowen commented on a change in pull request #27828: [SPARK-31068][SQL] Avoid IllegalArgumentException in broadcast exchange

URL: https://github.com/apache/spark/pull/27828#discussion_r389263729
 
 

 ##########
 File path: sql/core/src/main/scala/org/apache/spark/sql/execution/exchange/BroadcastExchangeExec.scala
 ##########
 @@ -87,9 +87,12 @@ case class BroadcastExchangeExec(
             val beforeCollect = System.nanoTime()
             // Use executeCollect/executeCollectIterator to avoid conversion to Scala types
             val (numRows, input) = child.executeCollectIterator()
-            if (numRows >= 512000000) {
+            // Since the maximum number of keys that BytesToBytesMap supports is 1 << 29,
+            // and only 70% of the slots can be used before growing in HashedRelation,
+            // here the limitation should not be over 341 million.
+            if (numRows >= (1 << 29) / 1.5) {
               throw new SparkException(
-                s"Cannot broadcast the table with 512 million or more rows: $numRows rows")
+                s"Cannot broadcast the table with 341 million or more rows: $numRows rows")
 
 Review comment:
   Maybe compute this limit in bytes and refer to it in the check and message, to be more specific
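
One way to read the suggestion: hoist the limit into a single named constant and interpolate it into both the check and the error message, so the two can never drift apart. The sketch below is illustrative only (the object and method names are hypothetical, not the actual Spark code); it keeps the diff's expression `(1 << 29) / 1.5`, which evaluates to 357,913,941 rows.

```scala
// Sketch of the review suggestion: one named constant, referenced by both
// the check and the message. Names here are hypothetical, not Spark's.
object BroadcastLimitSketch {
  // BytesToBytesMap supports at most 1 << 29 keys, and HashedRelation grows
  // the map before it is completely full, hence the division by 1.5
  // (taken from the diff under review).
  val MaxBroadcastTableRows: Long = ((1L << 29) / 1.5).toLong

  def assertBroadcastable(numRows: Long): Unit =
    if (numRows >= MaxBroadcastTableRows) {
      // The message quotes the same constant as the check, so it stays
      // accurate if the limit is ever changed.
      throw new IllegalArgumentException(
        s"Cannot broadcast the table with $MaxBroadcastTableRows or more rows: $numRows rows")
    }
}
```

A row count below the constant passes silently; anything at or above it raises with the exact limit in the message.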

----------------------------------------------------------------
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
users@infra.apache.org


With regards,
Apache Git Services

---------------------------------------------------------------------
To unsubscribe, e-mail: reviews-unsubscribe@spark.apache.org
For additional commands, e-mail: reviews-help@spark.apache.org