Posted to issues@spark.apache.org by "Karan (JIRA)" <ji...@apache.org> on 2018/08/13 22:38:00 UTC

[jira] [Updated] (SPARK-25107) Spark 2.2.0 Upgrade Issue : Throwing TreeNodeException: makeCopy, tree: CatalogRelation Errors

     [ https://issues.apache.org/jira/browse/SPARK-25107?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]

Karan updated SPARK-25107:
--------------------------
    Description: 
I am in the process of upgrading from Spark 1.6 to Spark 2.2.

I have a two-stage query that I run through hiveContext.

1) First query:

{code:sql}
hiveContext.sql("SELECT *, ROW_NUMBER() OVER (PARTITION BY ConfigID, rowid ORDER BY date DESC) AS ro
  FROM (SELECT DISTINCT PC.ConfigID, VPM.seqNo, VC.rowid, VC.recordid, VC.data, CASE
          WHEN pcs.score BETWEEN PC.from AND PC.to
           AND ((PC.csacnt IS NOT NULL AND CC.status = 4
           AND CC.cnclusion = mc.ca) OR (PC.csacnt IS NULL)) THEN 1 ELSE 0 END AS Flag
        FROM maindata VC
        INNER JOIN scoretable pcs ON VC.s_recordid = pcs.s_recordid
        INNER JOIN cnfgtable PC ON PC.subid = VC.subid
        INNER JOIN prdtable VPM ON PC.configID = VPM.CNFG_ID
        LEFT JOIN casetable CC ON CC.rowid = VC.rowid
        LEFT JOIN prdcnfg mc ON mc.configID = PC.configID
        WHERE VC.date BETWEEN VPM.StartDate AND VPM.EndDate) A
  WHERE A.Flag = 1").registerTempTable("stage1")
{code}

2) Second query:

{code:sql}
hiveContext.sql("SELECT DISTINCT t1.ConfigID AS cnfg_id, vct.*
  FROM stage1 t1
  INNER JOIN stage1 t2 ON t1.rowid = t2.rowid AND t1.ConfigID = t2.ConfigID
  INNER JOIN cnfgtable PCR ON PCR.ConfigID = t2.ConfigID
  INNER JOIN maindata vct ON vct.recordid = t1.recordid
  WHERE t2.ro = PCR.datacount")
{code}

The same query sequence works fine in Spark 1.6 but fails with the exception below in Spark 2.2. The exception is thrown while analyzing the 2nd query.
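For reference, a minimal sketch of the pattern the two queries boil down to (hypothetical column subset; assumes the Hive-backed table maindata exists): register a temp table derived from a Hive table, then self-join that temp table while also joining the underlying Hive table again in the same query. The exception occurs when the analyzer deduplicates the re-used relation.

{code:scala}
// Hypothetical minimal repro sketch, not the full production query.
// stage1 is derived from the Hive table maindata ...
val stage1 = hiveContext.sql("SELECT recordid, rowid, ConfigID FROM maindata")
stage1.registerTempTable("stage1")

// ... and the second query self-joins stage1 AND joins maindata again,
// so the analyzer must deduplicate attributes of the repeated relation.
hiveContext.sql(
  """SELECT DISTINCT t1.ConfigID, vct.*
    |FROM stage1 t1
    |INNER JOIN stage1 t2 ON t1.rowid = t2.rowid
    |INNER JOIN maindata vct ON vct.recordid = t1.recordid""".stripMargin)
{code}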

 

{noformat}
org.apache.spark.sql.catalyst.errors.package$TreeNodeException: makeCopy, tree:
CatalogRelation `ilink_perf_athenaprod`.`maindata`, org.apache.hadoop.hive.serde2.lazy.LazySimpleSerDe, [recordid#1506... 89 more fields]
 at org.apache.spark.sql.catalyst.errors.package$.attachTree(package.scala:56)
 at org.apache.spark.sql.catalyst.trees.TreeNode.makeCopy(TreeNode.scala:385)
 at org.apache.spark.sql.catalyst.plans.QueryPlan.mapExpressions(QueryPlan.scala:300)
 at org.apache.spark.sql.catalyst.plans.QueryPlan.transformExpressionsDown(QueryPlan.scala:258)
 at org.apache.spark.sql.catalyst.plans.QueryPlan.transformExpressions(QueryPlan.scala:249)
 at org.apache.spark.sql.catalyst.analysis.Analyzer$ResolveReferences$$anonfun$5.applyOrElse(Analyzer.scala:722)
 at org.apache.spark.sql.catalyst.analysis.Analyzer$ResolveReferences$$anonfun$5.applyOrElse(Analyzer.scala:721)
 at org.apache.spark.sql.catalyst.trees.TreeNode$$anonfun$transformUp$1.apply(TreeNode.scala:289)
 at org.apache.spark.sql.catalyst.trees.TreeNode$$anonfun$transformUp$1.apply(TreeNode.scala:289)
 at org.apache.spark.sql.catalyst.trees.CurrentOrigin$.withOrigin(TreeNode.scala:70)
 at org.apache.spark.sql.catalyst.trees.TreeNode.transformUp(TreeNode.scala:288)
 at org.apache.spark.sql.catalyst.trees.TreeNode$$anonfun$3.apply(TreeNode.scala:286)
 at org.apache.spark.sql.catalyst.trees.TreeNode$$anonfun$3.apply(TreeNode.scala:286)
 at org.apache.spark.sql.catalyst.trees.TreeNode$$anonfun$4.apply(TreeNode.scala:306)
 at org.apache.spark.sql.catalyst.trees.TreeNode.mapProductIterator(TreeNode.scala:187)
 at org.apache.spark.sql.catalyst.trees.TreeNode.mapChildren(TreeNode.scala:304)
 at org.apache.spark.sql.catalyst.trees.TreeNode.transformUp(TreeNode.scala:286)
 [... the preceding 6 TreeNode frames repeat 17 more times while the analyzer recurses down the plan ...]
 at org.apache.spark.sql.catalyst.analysis.Analyzer$ResolveReferences$.org$apache$spark$sql$catalyst$analysis$Analyzer$ResolveReferences$$dedupRight(Analyzer.scala:721)
 at org.apache.spark.sql.catalyst.analysis.Analyzer$ResolveReferences$$anonfun$apply$9.applyOrElse(Analyzer.scala:817)
 at org.apache.spark.sql.catalyst.analysis.Analyzer$ResolveReferences$$anonfun$apply$9.applyOrElse(Analyzer.scala:790)
 at org.apache.spark.sql.catalyst.plans.logical.LogicalPlan$$anonfun$resolveOperators$1.apply(LogicalPlan.scala:62)
 at org.apache.spark.sql.catalyst.plans.logical.LogicalPlan$$anonfun$resolveOperators$1.apply(LogicalPlan.scala:62)
 at org.apache.spark.sql.catalyst.trees.CurrentOrigin$.withOrigin(TreeNode.scala:70)
 at org.apache.spark.sql.catalyst.plans.logical.LogicalPlan.resolveOperators(LogicalPlan.scala:61)
 at org.apache.spark.sql.catalyst.plans.logical.LogicalPlan$$anonfun$1.apply(LogicalPlan.scala:59)
 at org.apache.spark.sql.catalyst.plans.logical.LogicalPlan$$anonfun$1.apply(LogicalPlan.scala:59)
 at org.apache.spark.sql.catalyst.trees.TreeNode$$anonfun$4.apply(TreeNode.scala:306)
 at org.apache.spark.sql.catalyst.trees.TreeNode.mapProductIterator(TreeNode.scala:187)
 at org.apache.spark.sql.catalyst.trees.TreeNode.mapChildren(TreeNode.scala:304)
 at org.apache.spark.sql.catalyst.plans.logical.LogicalPlan.resolveOperators(LogicalPlan.scala:59)
 [... the preceding 6 LogicalPlan frames repeat 4 more times ...]
 at org.apache.spark.sql.catalyst.analysis.Analyzer$ResolveReferences$.apply(Analyzer.scala:790)
 at org.apache.spark.sql.catalyst.analysis.Analyzer$ResolveReferences$.apply(Analyzer.scala:668)
 at org.apache.spark.sql.catalyst.rules.RuleExecutor$$anonfun$execute$1$$anonfun$apply$1.apply(RuleExecutor.scala:85)
 at org.apache.spark.sql.catalyst.rules.RuleExecutor$$anonfun$execute$1$$anonfun$apply$1.apply(RuleExecutor.scala:82)
 at scala.collection.LinearSeqOptimized$class.foldLeft(LinearSeqOptimized.scala:124)
 at scala.collection.immutable.List.foldLeft(List.scala:84)
 at org.apache.spark.sql.catalyst.rules.RuleExecutor$$anonfun$execute$1.apply(RuleExecutor.scala:82)
 at org.apache.spark.sql.catalyst.rules.RuleExecutor$$anonfun$execute$1.apply(RuleExecutor.scala:74)
 at scala.collection.immutable.List.foreach(List.scala:381)
 at org.apache.spark.sql.catalyst.rules.RuleExecutor.execute(RuleExecutor.scala:74)
 at org.apache.spark.sql.execution.QueryExecution.analyzed$lzycompute(QueryExecution.scala:69)
 at org.apache.spark.sql.execution.QueryExecution.analyzed(QueryExecution.scala:67)
 at org.apache.spark.sql.execution.QueryExecution.assertAnalyzed(QueryExecution.scala:50)
 at org.apache.spark.sql.Dataset$.ofRows(Dataset.scala:66)
 at org.apache.spark.sql.SparkSession.sql(SparkSession.scala:623)
 at org.apache.spark.sql.SQLContext.sql(SQLContext.scala:691)
 at Main.StandardAdvParamRules$.predictorScoreRuleExecute(StandardAdvParamRules.scala:398)
 at Main.StandardAdvParamRules$$anonfun$execute$1.apply(StandardAdvParamRules.scala:213)
 at Main.StandardAdvParamRules$$anonfun$execute$1.apply(StandardAdvParamRules.scala:195)
 at scala.collection.immutable.Map$Map1.foreach(Map.scala:116)
 at Main.StandardAdvParamRules$.execute(StandardAdvParamRules.scala:195)
 at Main.RuleExecution$.runRules(RuleExecution.scala:64)
 at Main.RuleEngine$.main(RuleEngine.scala:35)
 at Main.RuleEngine.main(RuleEngine.scala)
 at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
 at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
 at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
 at java.lang.reflect.Method.invoke(Method.java:498)
 at org.apache.spark.deploy.SparkSubmit$.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:755)
 at org.apache.spark.deploy.SparkSubmit$.doRunMain$1(SparkSubmit.scala:180)
 at org.apache.spark.deploy.SparkSubmit$.submit(SparkSubmit.scala:205)
 at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:119)
 at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)
Caused by: java.lang.reflect.InvocationTargetException
 at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
 at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62)
 at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
 at java.lang.reflect.Constructor.newInstance(Constructor.java:423)
 at org.apache.spark.sql.catalyst.trees.TreeNode$$anonfun$makeCopy$1$$anonfun$apply$13.apply(TreeNode.scala:411)
 at org.apache.spark.sql.catalyst.trees.TreeNode$$anonfun$makeCopy$1$$anonfun$apply$13.apply(TreeNode.scala:411)
 at org.apache.spark.sql.catalyst.trees.CurrentOrigin$.withOrigin(TreeNode.scala:70)
 at org.apache.spark.sql.catalyst.trees.TreeNode$$anonfun$makeCopy$1.apply(TreeNode.scala:410)
 at org.apache.spark.sql.catalyst.trees.TreeNode$$anonfun$makeCopy$1.apply(TreeNode.scala:385)
 at org.apache.spark.sql.catalyst.errors.package$.attachTree(package.scala:52)
 ... 188 more
Caused by: java.lang.AssertionError: assertion failed
 at scala.Predef$.assert(Predef.scala:156)
 at org.apache.spark.sql.catalyst.catalog.CatalogRelation.<init>(interface.scala:410)
 ... 198 more
{noformat}

{{ at org.apache.spark.sql.catalyst.plans.logical.LogicalPlan$$anonfun$1.apply(LogicalPlan.scala:59)}}
{{ at org.apache.spark.sql.catalyst.plans.logical.LogicalPlan$$anonfun$1.apply(LogicalPlan.scala:59)}}
{{ at org.apache.spark.sql.catalyst.trees.TreeNode$$anonfun$4.apply(TreeNode.scala:306)}}
{{ at org.apache.spark.sql.catalyst.trees.TreeNode.mapProductIterator(TreeNode.scala:187)}}
{{ at org.apache.spark.sql.catalyst.trees.TreeNode.mapChildren(TreeNode.scala:304)}}
{{ at org.apache.spark.sql.catalyst.plans.logical.LogicalPlan.resolveOperators(LogicalPlan.scala:59)}}
{{ at org.apache.spark.sql.catalyst.plans.logical.LogicalPlan$$anonfun$1.apply(LogicalPlan.scala:59)}}
{{ at org.apache.spark.sql.catalyst.plans.logical.LogicalPlan$$anonfun$1.apply(LogicalPlan.scala:59)}}
{{ at org.apache.spark.sql.catalyst.trees.TreeNode$$anonfun$4.apply(TreeNode.scala:306)}}
{{ at org.apache.spark.sql.catalyst.trees.TreeNode.mapProductIterator(TreeNode.scala:187)}}
{{ at org.apache.spark.sql.catalyst.trees.TreeNode.mapChildren(TreeNode.scala:304)}}
{{ at org.apache.spark.sql.catalyst.plans.logical.LogicalPlan.resolveOperators(LogicalPlan.scala:59)}}
{{ at org.apache.spark.sql.catalyst.plans.logical.LogicalPlan$$anonfun$1.apply(LogicalPlan.scala:59)}}
{{ at org.apache.spark.sql.catalyst.plans.logical.LogicalPlan$$anonfun$1.apply(LogicalPlan.scala:59)}}
{{ at org.apache.spark.sql.catalyst.trees.TreeNode$$anonfun$4.apply(TreeNode.scala:306)}}
{{ at org.apache.spark.sql.catalyst.trees.TreeNode.mapProductIterator(TreeNode.scala:187)}}
{{ at org.apache.spark.sql.catalyst.trees.TreeNode.mapChildren(TreeNode.scala:304)}}
{{ at org.apache.spark.sql.catalyst.plans.logical.LogicalPlan.resolveOperators(LogicalPlan.scala:59)}}
{{ at org.apache.spark.sql.catalyst.plans.logical.LogicalPlan$$anonfun$1.apply(LogicalPlan.scala:59)}}
{{ at org.apache.spark.sql.catalyst.plans.logical.LogicalPlan$$anonfun$1.apply(LogicalPlan.scala:59)}}
{{ at org.apache.spark.sql.catalyst.trees.TreeNode$$anonfun$4.apply(TreeNode.scala:306)}}
{{ at org.apache.spark.sql.catalyst.trees.TreeNode.mapProductIterator(TreeNode.scala:187)}}
{{ at org.apache.spark.sql.catalyst.trees.TreeNode.mapChildren(TreeNode.scala:304)}}
{{ at org.apache.spark.sql.catalyst.plans.logical.LogicalPlan.resolveOperators(LogicalPlan.scala:59)}}
{{ at org.apache.spark.sql.catalyst.plans.logical.LogicalPlan$$anonfun$1.apply(LogicalPlan.scala:59)}}
{{ at org.apache.spark.sql.catalyst.plans.logical.LogicalPlan$$anonfun$1.apply(LogicalPlan.scala:59)}}
{{ at org.apache.spark.sql.catalyst.trees.TreeNode$$anonfun$4.apply(TreeNode.scala:306)}}
{{ at org.apache.spark.sql.catalyst.trees.TreeNode.mapProductIterator(TreeNode.scala:187)}}
{{ at org.apache.spark.sql.catalyst.trees.TreeNode.mapChildren(TreeNode.scala:304)}}
{{ at org.apache.spark.sql.catalyst.plans.logical.LogicalPlan.resolveOperators(LogicalPlan.scala:59)}}
{{ at org.apache.spark.sql.catalyst.analysis.Analyzer$ResolveReferences$.apply(Analyzer.scala:790)}}
{{ at org.apache.spark.sql.catalyst.analysis.Analyzer$ResolveReferences$.apply(Analyzer.scala:668)}}
{{ at org.apache.spark.sql.catalyst.rules.RuleExecutor$$anonfun$execute$1$$anonfun$apply$1.apply(RuleExecutor.scala:85)}}
{{ at org.apache.spark.sql.catalyst.rules.RuleExecutor$$anonfun$execute$1$$anonfun$apply$1.apply(RuleExecutor.scala:82)}}
{{ at scala.collection.LinearSeqOptimized$class.foldLeft(LinearSeqOptimized.scala:124)}}
{{ at scala.collection.immutable.List.foldLeft(List.scala:84)}}
{{ at org.apache.spark.sql.catalyst.rules.RuleExecutor$$anonfun$execute$1.apply(RuleExecutor.scala:82)}}
{{ at org.apache.spark.sql.catalyst.rules.RuleExecutor$$anonfun$execute$1.apply(RuleExecutor.scala:74)}}
{{ at scala.collection.immutable.List.foreach(List.scala:381)}}
{{ at org.apache.spark.sql.catalyst.rules.RuleExecutor.execute(RuleExecutor.scala:74)}}
{{ at org.apache.spark.sql.execution.QueryExecution.analyzed$lzycompute(QueryExecution.scala:69)}}
{{ at org.apache.spark.sql.execution.QueryExecution.analyzed(QueryExecution.scala:67)}}
{{ at org.apache.spark.sql.execution.QueryExecution.assertAnalyzed(QueryExecution.scala:50)}}
{{ at org.apache.spark.sql.Dataset$.ofRows(Dataset.scala:66)}}
{{ at org.apache.spark.sql.SparkSession.sql(SparkSession.scala:623)}}
{{ at org.apache.spark.sql.SQLContext.sql(SQLContext.scala:691)}}
{{ at Main.StandardAdvParamRules$.predictorScoreRuleExecute(StandardAdvParamRules.scala:398)}}
{{ at Main.StandardAdvParamRules$$anonfun$execute$1.apply(StandardAdvParamRules.scala:213)}}
{{ at Main.StandardAdvParamRules$$anonfun$execute$1.apply(StandardAdvParamRules.scala:195)}}
{{ at scala.collection.immutable.Map$Map1.foreach(Map.scala:116)}}
{{ at Main.StandardAdvParamRules$.execute(StandardAdvParamRules.scala:195)}}
{{ at Main.RuleExecution$.runRules(RuleExecution.scala:64)}}
{{ at Main.RuleEngine$.main(RuleEngine.scala:35)}}
{{ at Main.RuleEngine.main(RuleEngine.scala)}}
{{ at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)}}
{{ at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)}}
{{ at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)}}
{{ at java.lang.reflect.Method.invoke(Method.java:498)}}
{{ at org.apache.spark.deploy.SparkSubmit$.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:755)}}
{{ at org.apache.spark.deploy.SparkSubmit$.doRunMain$1(SparkSubmit.scala:180)}}
{{ at org.apache.spark.deploy.SparkSubmit$.submit(SparkSubmit.scala:205)}}
{{ at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:119)}}
{{ at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)}}
{{Caused by: java.lang.reflect.InvocationTargetException}}
{{ at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)}}
{{ at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62)}}
{{ at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)}}
{{ at java.lang.reflect.Constructor.newInstance(Constructor.java:423)}}
{{ at org.apache.spark.sql.catalyst.trees.TreeNode$$anonfun$makeCopy$1$$anonfun$apply$13.apply(TreeNode.scala:411)}}
{{ at org.apache.spark.sql.catalyst.trees.TreeNode$$anonfun$makeCopy$1$$anonfun$apply$13.apply(TreeNode.scala:411)}}
{{ at org.apache.spark.sql.catalyst.trees.CurrentOrigin$.withOrigin(TreeNode.scala:70)}}
{{ at org.apache.spark.sql.catalyst.trees.TreeNode$$anonfun$makeCopy$1.apply(TreeNode.scala:410)}}
{{ at org.apache.spark.sql.catalyst.trees.TreeNode$$anonfun$makeCopy$1.apply(TreeNode.scala:385)}}
{{ at org.apache.spark.sql.catalyst.errors.package$.attachTree(package.scala:52)}}
{{ ... 188 more}}
{{Caused by: java.lang.AssertionError: assertion failed}}
{{ at scala.Predef$.assert(Predef.scala:156)}}
{{ at org.apache.spark.sql.catalyst.catalog.CatalogRelation.<init>(interface.scala:410)}}
{{ ... 198 more}}
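The failure occurs in the analyzer's `dedupRight` step, which tries to `makeCopy` the Hive `CatalogRelation` for `maindata` because both queries reference it and `stage1` is self-joined. A possible workaround (a sketch only, not verified against this cluster; table and column names are taken from the queries above) is to materialize the first query's result before running the self-join, so the second query no longer forces the analyzer to clone the Hive relation:

```scala
// Sketch of a possible workaround (unverified): persist the stage1 result
// before the self-join so the analyzer does not have to duplicate the
// underlying Hive CatalogRelation. Queries are abbreviated copies of the
// ones in the description above.
val stage1 = hiveContext.sql(
  """SELECT *, ROW_NUMBER() OVER (PARTITION BY ConfigID, rowid ORDER BY date DESC) AS ro
    |FROM (SELECT DISTINCT PC.ConfigID, VPM.seqNo, VC.rowid, VC.recordid, VC.data,
    |        CASE WHEN pcs.score BETWEEN PC.from AND PC.to
    |               AND ((PC.csacnt IS NOT NULL AND CC.status = 4
    |                     AND CC.cnclusion = mc.ca) OR (PC.csacnt IS NULL))
    |             THEN 1 ELSE 0 END AS Flag
    |      FROM maindata VC
    |      INNER JOIN scoretable pcs ON VC.s_recordid = pcs.s_recordid
    |      INNER JOIN cnfgtable PC ON PC.subid = VC.subid
    |      INNER JOIN prdtable VPM ON PC.configID = VPM.CNFG_ID
    |      LEFT JOIN casetable CC ON CC.rowid = VC.rowid
    |      LEFT JOIN prdcnfg mc ON mc.configID = PC.configID
    |      WHERE VC.date BETWEEN VPM.StartDate AND VPM.EndDate) A
    |WHERE A.Flag = 1""".stripMargin)

// Caching cuts the plan's lineage back to the Hive table; also note that
// registerTempTable is deprecated in Spark 2.x in favor of
// createOrReplaceTempView.
stage1.cache()
stage1.createOrReplaceTempView("stage1")

val result = hiveContext.sql(
  """SELECT DISTINCT t1.ConfigID AS cnfg_id, vct.*
    |FROM stage1 t1
    |INNER JOIN stage1 t2 ON t1.rowid = t2.rowid AND t1.ConfigID = t2.ConfigID
    |INNER JOIN cnfgtable PCR ON PCR.ConfigID = t2.ConfigID
    |INNER JOIN maindata vct ON vct.recordid = t1.recordid
    |WHERE t2.ro = PCR.datacount""".stripMargin)
```

If caching alone is not enough, writing `stage1` out (e.g. `stage1.write.saveAsTable(...)`) and reading it back would break the lineage completely; both are workarounds, not a fix for the underlying assertion in `CatalogRelation`.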
> Spark 2.2.0 Upgrade Issue : Throwing TreeNodeException: makeCopy, tree: CatalogRelation Errors
> ----------------------------------------------------------------------------------------------
>
>                 Key: SPARK-25107
>                 URL: https://issues.apache.org/jira/browse/SPARK-25107
>             Project: Spark
>          Issue Type: Bug
>          Components: Spark Core, SQL
>    Affects Versions: 2.2.0
>         Environment: Spark Version : 2.2.0.cloudera2
>            Reporter: Karan
>            Priority: Major
>



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)

---------------------------------------------------------------------
To unsubscribe, e-mail: issues-unsubscribe@spark.apache.org
For additional commands, e-mail: issues-help@spark.apache.org