Posted to issues@spark.apache.org by "Sean R. Owen (Jira)" <ji...@apache.org> on 2020/03/16 01:22:00 UTC
[jira] [Resolved] (SPARK-31068) IllegalArgumentException in BroadcastExchangeExec
[ https://issues.apache.org/jira/browse/SPARK-31068?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]
Sean R. Owen resolved SPARK-31068.
----------------------------------
Fix Version/s: 3.1.0
Resolution: Fixed
Issue resolved by pull request 27828
[https://github.com/apache/spark/pull/27828]
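
The underlying failure is the IllegalArgumentException at the bottom of the stack trace quoted below: while building the broadcast hash relation, UnsafeHashedRelation constructs a BytesToBytesMap sized from an estimated row count, and a large enough build side pushes the requested initial capacity (670166426 here) past the map's hard limit of 1 << 29 = 536870912 slots. Clamping the estimate before constructing the map avoids the exception. Below is a minimal sketch of such a guard; the object and method names are illustrative, the constant mirrors BytesToBytesMap.MAX_CAPACITY, and the 1.5x sizing factor is an assumption about how the estimate is inflated, so this is not the exact patch from the pull request.

{code}
// Illustrative sketch, not the exact patch: clamp a row-count-based
// size estimate to a legal BytesToBytesMap initial capacity.
object CapacityGuard {
  // Mirrors BytesToBytesMap.MAX_CAPACITY (1 << 29 = 536870912); requesting
  // more slots than this throws IllegalArgumentException in the constructor.
  val MaxCapacity: Int = 1 << 29

  def initialCapacity(estimatedNumRows: Long): Int = {
    // Assumed sizing: request ~1.5x the estimated row count to keep the load
    // factor low, then clamp so an oversized estimate degrades to a full map
    // (extra growth and collisions) instead of failing the whole query.
    math.min((estimatedNumRows * 1.5 + 1).toLong, MaxCapacity.toLong).toInt
  }
}

// With the request from the stack trace below (670166426 slots, i.e. roughly
// 446777617 estimated rows at 1.5x), the guard yields MAX_CAPACITY instead
// of throwing.
assert(CapacityGuard.initialCapacity(446777617L) == CapacityGuard.MaxCapacity)
{code}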
> IllegalArgumentException in BroadcastExchangeExec
> -------------------------------------------------
>
> Key: SPARK-31068
> URL: https://issues.apache.org/jira/browse/SPARK-31068
> Project: Spark
> Issue Type: Bug
> Components: SQL
> Affects Versions: 3.0.0
> Reporter: Lantao Jin
> Assignee: Lantao Jin
> Priority: Major
> Fix For: 3.1.0
>
>
> {code}
> Caused by: org.apache.spark.SparkException: Failed to materialize query stage: BroadcastQueryStage 0
> +- BroadcastExchange HashedRelationBroadcastMode(List(input[0, string, true], input[1, bigint, true], input[2, int, true]))
>    +- *(1) Project [guid#138126, session_skey#138127L, seqnum#138132]
>       +- *(1) Filter ((((isnotnull(session_start_dt#138129) && (session_start_dt#138129 = 2020-01-01)) && isnotnull(seqnum#138132)) && isnotnull(session_skey#138127L)) && isnotnull(guid#138126))
>          +- *(1) FileScan parquet p_soj_cl_t.clav_events[guid#138126, session_skey#138127L, session_start_dt#138129, seqnum#138132] DataFilters: [isnotnull(session_start_dt#138129), (session_start_dt#138129 = 2020-01-01), isnotnull(seqnum#138..., Format: Parquet, Location: TahoeLogFileIndex[hdfs://hermes-rno/workspaces/P_SOJ_CL_T/clav_events], PartitionFilters: [], PushedFilters: [IsNotNull(session_start_dt), EqualTo(session_start_dt,2020-01-01), IsNotNull(seqnum), IsNotNull(..., ReadSchema: struct<guid:string,session_skey:bigint,session_start_dt:string,seqnum:int>, SelectedBucketsCount: 1000 out of 1000, UsedIndexes: []
>     at org.apache.spark.sql.execution.adaptive.AdaptiveSparkPlanExec$$anonfun$generateFinalPlan$3.apply(AdaptiveSparkPlanExec.scala:230)
>     at org.apache.spark.sql.execution.adaptive.AdaptiveSparkPlanExec$$anonfun$generateFinalPlan$3.apply(AdaptiveSparkPlanExec.scala:225)
>     at scala.collection.immutable.List.foreach(List.scala:381)
>     at org.apache.spark.sql.execution.adaptive.AdaptiveSparkPlanExec.generateFinalPlan(AdaptiveSparkPlanExec.scala:225)
>     ... 48 more
> Caused by: java.lang.IllegalArgumentException: Initial capacity 670166426 exceeds maximum capacity of 536870912
>     at org.apache.spark.unsafe.map.BytesToBytesMap.<init>(BytesToBytesMap.java:196)
>     at org.apache.spark.unsafe.map.BytesToBytesMap.<init>(BytesToBytesMap.java:219)
>     at org.apache.spark.sql.execution.joins.UnsafeHashedRelation$.apply(HashedRelation.scala:340)
>     at org.apache.spark.sql.execution.joins.HashedRelation$.apply(HashedRelation.scala:123)
>     at org.apache.spark.sql.execution.joins.HashedRelationBroadcastMode.transform(HashedRelation.scala:964)
>     at org.apache.spark.sql.execution.joins.HashedRelationBroadcastMode.transform(HashedRelation.scala:952)
>     at org.apache.spark.sql.execution.exchange.BroadcastExchangeExec$$anonfun$relationFuture$1$$anonfun$apply$9.apply(BroadcastExchangeExec.scala:220)
>     at org.apache.spark.sql.execution.exchange.BroadcastExchangeExec$$anonfun$relationFuture$1$$anonfun$apply$9.apply(BroadcastExchangeExec.scala:207)
>     at org.apache.spark.sql.execution.SQLExecution$.withExecutionId(SQLExecution.scala:128)
>     at org.apache.spark.sql.execution.exchange.BroadcastExchangeExec$$anonfun$relationFuture$1.apply(BroadcastExchangeExec.scala:206)
>     at org.apache.spark.sql.execution.exchange.BroadcastExchangeExec$$anonfun$relationFuture$1.apply(BroadcastExchangeExec.scala:172)
>     at scala.concurrent.impl.Future$PromiseCompletingRunnable.liftedTree1$1(Future.scala:24)
>     at scala.concurrent.impl.Future$PromiseCompletingRunnable.run(Future.scala:24)
>     ... 3 more
> {code}
--
This message was sent by Atlassian Jira
(v8.3.4#803005)