Posted to issues@spark.apache.org by "yucai (JIRA)" <ji...@apache.org> on 2015/12/11 06:42:11 UTC
[jira] [Created] (SPARK-12275) No plan for BroadcastHint in some condition
yucai created SPARK-12275:
-----------------------------
Summary: No plan for BroadcastHint in some condition
Key: SPARK-12275
URL: https://issues.apache.org/jira/browse/SPARK-12275
Project: Spark
Issue Type: Bug
Components: SQL
Affects Versions: 1.6.0
Reporter: yucai
*Summary*
No physical plan is generated for BroadcastHint under some conditions.
*Test Case*
{code}
val df1 = Seq((1, "1"), (2, "2")).toDF("key", "value")
val parquetTempFile = "%s/SPARK-xxxx_%d.parquet".format(System.getProperty("java.io.tmpdir"), scala.util.Random.nextInt)
df1.write.parquet(parquetTempFile)
val pf1 = sqlContext.read.parquet(parquetTempFile)
df1.join(broadcast(pf1)).count()
{code}
*Result*
It throws the following exception:
{code}
scala> df1.join(broadcast(pf1)).count()
java.lang.AssertionError: assertion failed: No plan for BroadcastHint
+- Relation[key#6,value#7] ParquetRelation[hdfs://10.1.0.20:8020/tmp/SPARK-xxxx_1817830406.parquet]
at scala.Predef$.assert(Predef.scala:179)
at org.apache.spark.sql.catalyst.planning.QueryPlanner.plan(QueryPlanner.scala:59)
at org.apache.spark.sql.catalyst.planning.QueryPlanner.planLater(QueryPlanner.scala:54)
at org.apache.spark.sql.execution.SparkStrategies$BasicOperators$.apply(SparkStrategies.scala:336)
at org.apache.spark.sql.catalyst.planning.QueryPlanner$$anonfun$1.apply(QueryPlanner.scala:58)
at org.apache.spark.sql.catalyst.planning.QueryPlanner$$anonfun$1.apply(QueryPlanner.scala:58)
at scala.collection.Iterator$$anon$13.hasNext(Iterator.scala:371)
at org.apache.spark.sql.catalyst.planning.QueryPlanner.plan(QueryPlanner.scala:59)
at org.apache.spark.sql.catalyst.planning.QueryPlanner.planLater(QueryPlanner.scala:54)
{code}
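*Note*

The assertion in the stack trace comes from Catalyst's QueryPlanner: each strategy maps a logical node to zero or more physical plans, and if every strategy returns nothing for a node, plan() asserts with "No plan for ...". The sketch below is a simplified, Spark-free model of that mechanism (toy class and method names, not Spark's actual implementation), showing how a strategy that lacks a case for BroadcastHint triggers exactly this failure:

{code}
// Toy model of Catalyst's QueryPlanner.plan. A strategy turns a logical
// node into candidate physical plans; when no strategy matches, the
// planner has "No plan for" that node and the assertion fires.
sealed trait LogicalPlan
case class Relation(name: String) extends LogicalPlan
case class BroadcastHint(child: LogicalPlan) extends LogicalPlan

sealed trait PhysicalPlan
case class Scan(name: String) extends PhysicalPlan

object ToyPlanner {
  // A strategy that only knows Relation -- analogous to a BasicOperators
  // strategy with no case for BroadcastHint.
  def basicOperators(p: LogicalPlan): Seq[PhysicalPlan] = p match {
    case Relation(n) => Seq(Scan(n))
    case _           => Nil
  }

  def plan(p: LogicalPlan,
           strategies: Seq[LogicalPlan => Seq[PhysicalPlan]]): PhysicalPlan = {
    val candidates = strategies.iterator.flatMap(s => s(p))
    // Mirrors the assert at QueryPlanner.scala:59 in the trace above.
    assert(candidates.hasNext, s"No plan for $p")
    candidates.next()
  }
}

// Planning a plain Relation succeeds, but planning a BroadcastHint node
// throws java.lang.AssertionError: assertion failed: No plan for BroadcastHint(...)
ToyPlanner.plan(Relation("t"), Seq(ToyPlanner.basicOperators _))
{code}

In the real bug, the join strategies only recognize BroadcastHint in specific join positions, so the hint node can reach a strategy with no matching case and planning fails as shown.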
--
This message was sent by Atlassian JIRA
(v6.3.4#6332)
---------------------------------------------------------------------
To unsubscribe, e-mail: issues-unsubscribe@spark.apache.org
For additional commands, e-mail: issues-help@spark.apache.org