Posted to user@spark.apache.org by Andrés Ivaldi <ia...@gmail.com> on 2017/04/11 19:22:52 UTC

Exception on Join with Spark 2.1

Hello, I'm running Spark embedded. Everything was fine with Spark 2.0.2, but
after updating to 2.1.0 I'm having problems when joining two Datasets.

The queries are generated dynamically, but essentially I have two Datasets:
one with a window function applied, and the other the same Dataset before the
window function is applied. Roughly, the shape is like the sketch below.
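
This is only a simplified, hypothetical reconstruction (the real queries are
built at runtime); the column names match the physical plan in the log below,
and the Expand/spark_grouping_id nodes there come from a rollup-style
aggregation:

import org.apache.spark.sql.SparkSession
import org.apache.spark.sql.functions._

val spark = SparkSession.builder().master("local[2]").getOrCreate()

// Base Dataset: read from parquet and cached
// (the InMemoryRelation node in the plan below)
val base = spark.read.parquet("/tmp/vs/cache/dataSet-16")
  .select(col("Cantidad"), col("TiempoAAMM"))
  .cache()

// Aggregated Dataset built on top of the cached one
// (rollup produces the Expand/spark_grouping_id nodes in the plan)
val aggregated = base
  .rollup(col("TiempoAAMM"))
  .agg(sum(col("Cantidad")).as("Cantidad_SUM"))
  .where(col("TiempoAAMM").isNotNull)
  .orderBy(col("TiempoAAMM"))

// Each side can be shown on its own; only the join fails
aggregated.join(base, Seq("TiempoAAMM")).show()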

I can read the data from each Dataset on its own, but the join fails with
this exception:


[11/04/2017 15:55:12 - default-akka.actor.default-dispatcher-6] ERROR ar.com.visionaris.engine.Engine - org.apache.spark.sql.catalyst.errors.package$TreeNodeException: execute, tree:
Exchange hashpartitioning(TiempoAAMM#67, 2)
+- *Sort [TiempoAAMM#67 ASC NULLS FIRST], true, 0
   +- *Sort [TiempoAAMM#67 ASC NULLS FIRST], true, 0
      +- Exchange rangepartitioning(TiempoAAMM#67 ASC NULLS FIRST, 2)
         +- *HashAggregate(keys=[TiempoAAMM#67, spark_grouping_id#65], functions=[sum(Cantidad#23)], output=[TiempoAAMM#67, Cantidad_SUM#96])
            +- Exchange hashpartitioning(TiempoAAMM#67, spark_grouping_id#65, 2)
               +- *HashAggregate(keys=[TiempoAAMM#67, spark_grouping_id#65], functions=[partial_sum(Cantidad#23)], output=[TiempoAAMM#67, spark_grouping_id#65, sum#160])
                  +- *Filter isnotnull(TiempoAAMM#67)
                     +- *!Expand [ArrayBuffer(Cantidad#23, TiempoAAMM#36, 0), ArrayBuffer(Cantidad#23, null, 1)], [Cantidad#23, TiempoAAMM#67, spark_grouping_id#65]
                        +- InMemoryTableScan [Cantidad#23]
                              +- InMemoryRelation [Cantidad#23, TiempoAAMM#36], true, 10000, StorageLevel(disk, memory, deserialized, 1 replicas)
                                    +- *Project [Cantidad#6 AS Cantidad#23, TiempoAAMM#4 AS TiempoAAMM#36]
                                       +- *FileScan parquet [TiempoAAMM#4,Cantidad#6] Batched: true, Format: Parquet, Location: InMemoryFileIndex[file:/tmp/vs/cache/dataSet-16], PartitionFilters: [], PushedFilters: [], ReadSchema: struct<TiempoAAMM:string,Cantidad:decimal(18,2)>

at org.apache.spark.sql.catalyst.errors.package$.attachTree(package.scala:56)
at org.apache.spark.sql.execution.exchange.ShuffleExchange.doExecute(ShuffleExchange.scala:112)
at org.apache.spark.sql.execution.SparkPlan$$anonfun$execute$1.apply(SparkPlan.scala:114)
at org.apache.spark.sql.execution.SparkPlan$$anonfun$execute$1.apply(SparkPlan.scala:114)
at org.apache.spark.sql.execution.SparkPlan$$anonfun$executeQuery$1.apply(SparkPlan.scala:135)
at org.apache.spark.rdd.RDDOperationScope$.withScope(RDDOperationScope.scala:151)
at org.apache.spark.sql.execution.SparkPlan.executeQuery(SparkPlan.scala:132)
at org.apache.spark.sql.execution.SparkPlan.execute(SparkPlan.scala:113)
at org.apache.spark.sql.execution.InputAdapter.inputRDDs(WholeStageCodegenExec.scala:235)
at org.apache.spark.sql.execution.SortExec.inputRDDs(SortExec.scala:121)
at org.apache.spark.sql.execution.WholeStageCodegenExec.doExecute(WholeStageCodegenExec.scala:368)
at org.apache.spark.sql.execution.SparkPlan$$anonfun$execute$1.apply(SparkPlan.scala:114)
at org.apache.spark.sql.execution.SparkPlan$$anonfun$execute$1.apply(SparkPlan.scala:114)
at org.apache.spark.sql.execution.SparkPlan$$anonfun$executeQuery$1.apply(SparkPlan.scala:135)
at org.apache.spark.rdd.RDDOperationScope$.withScope(RDDOperationScope.scala:151)
at org.apache.spark.sql.execution.SparkPlan.executeQuery(SparkPlan.scala:132)
at org.apache.spark.sql.execution.SparkPlan.execute(SparkPlan.scala:113)
at org.apache.spark.sql.execution.InputAdapter.doExecute(WholeStageCodegenExec.scala:227)
at org.apache.spark.sql.execution.SparkPlan$$anonfun$execute$1.apply(SparkPlan.scala:114)
at org.apache.spark.sql.execution.SparkPlan$$anonfun$execute$1.apply(SparkPlan.scala:114)
at org.apache.spark.sql.execution.SparkPlan$$anonfun$executeQuery$1.apply(SparkPlan.scala:135)
at org.apache.spark.rdd.RDDOperationScope$.withScope(RDDOperationScope.scala:151)
at org.apache.spark.sql.execution.SparkPlan.executeQuery(SparkPlan.scala:132)
at org.apache.spark.sql.execution.SparkPlan.execute(SparkPlan.scala:113)
at org.apache.spark.sql.execution.joins.SortMergeJoinExec.inputRDDs(SortMergeJoinExec.scala:326)
at org.apache.spark.sql.execution.ProjectExec.inputRDDs(basicPhysicalOperators.scala:42)
at org.apache.spark.sql.execution.WholeStageCodegenExec.doExecute(WholeStageCodegenExec.scala:368)
at org.apache.spark.sql.execution.SparkPlan$$anonfun$execute$1.apply(SparkPlan.scala:114)
at org.apache.spark.sql.execution.SparkPlan$$anonfun$execute$1.apply(SparkPlan.scala:114)
at org.apache.spark.sql.execution.SparkPlan$$anonfun$executeQuery$1.apply(SparkPlan.scala:135)
at org.apache.spark.rdd.RDDOperationScope$.withScope(RDDOperationScope.scala:151)
at org.apache.spark.sql.execution.SparkPlan.executeQuery(SparkPlan.scala:132)
at org.apache.spark.sql.execution.SparkPlan.execute(SparkPlan.scala:113)
at org.apache.spark.sql.execution.SparkPlan.getByteArrayRdd(SparkPlan.scala:225)
at org.apache.spark.sql.execution.SparkPlan.executeTake(SparkPlan.scala:308)
at org.apache.spark.sql.execution.CollectLimitExec.executeCollect(limit.scala:38)
at org.apache.spark.sql.Dataset$$anonfun$org$apache$spark$sql$Dataset$$execute$1$1.apply(Dataset.scala:2371)
at org.apache.spark.sql.execution.SQLExecution$.withNewExecutionId(SQLExecution.scala:57)
at org.apache.spark.sql.Dataset.withNewExecutionId(Dataset.scala:2765)
at org.apache.spark.sql.Dataset.org$apache$spark$sql$Dataset$$execute$1(Dataset.scala:2370)
at org.apache.spark.sql.Dataset.org$apache$spark$sql$Dataset$$collect(Dataset.scala:2377)
at org.apache.spark.sql.Dataset$$anonfun$head$1.apply(Dataset.scala:2113)
at org.apache.spark.sql.Dataset$$anonfun$head$1.apply(Dataset.scala:2112)
at org.apache.spark.sql.Dataset.withTypedCallback(Dataset.scala:2795)
at org.apache.spark.sql.Dataset.head(Dataset.scala:2112)
at org.apache.spark.sql.Dataset.take(Dataset.scala:2327)
at org.apache.spark.sql.Dataset.showString(Dataset.scala:248)
at org.apache.spark.sql.Dataset.show(Dataset.scala:636)
at org.apache.spark.sql.Dataset.show(Dataset.scala:595)
at org.apache.spark.sql.Dataset.show(Dataset.scala:604)
at ar.com.visionaris.engine.spark.api.calculator.AnalysisCalculator$$anonfun$executeCalculation$7$$anonfun$apply$8.apply(AnalysisCalculation.scala:262)
at ar.com.visionaris.engine.spark.api.calculator.AnalysisCalculator$$anonfun$executeCalculation$7$$anonfun$apply$8.apply(AnalysisCalculation.scala:217)
at scala.collection.immutable.List.foreach(List.scala:381)
at ar.com.visionaris.engine.spark.api.calculator.AnalysisCalculator$$anonfun$executeCalculation$7.apply(AnalysisCalculation.scala:216)
at ar.com.visionaris.engine.spark.api.calculator.AnalysisCalculator$$anonfun$executeCalculation$7.apply(AnalysisCalculation.scala:212)
at scala.collection.immutable.List.foreach(List.scala:381)
at ar.com.visionaris.engine.spark.api.calculator.AnalysisCalculator.executeCalculation(AnalysisCalculation.scala:212)
at ar.com.visionaris.engine.Engine$.execute(Engine.scala:120)
at ar.com.visionaris.engine.http.rest.api.EngineRestApi$.execute(EngineRestApi.scala:98)
at ar.com.visionaris.engine.http.action.RestApiAction$$anonfun$1$$anonfun$apply$13$$anonfun$apply$14$$anonfun$apply$15$$anonfun$apply$16.apply(RestAction.scala:58)
at ar.com.visionaris.engine.http.action.RestApiAction$$anonfun$1$$anonfun$apply$13$$anonfun$apply$14$$anonfun$apply$15$$anonfun$apply$16.apply(RestAction.scala:58)
at akka.http.scaladsl.server.directives.RouteDirectives$$anonfun$complete$1.apply(RouteDirectives.scala:47)
at akka.http.scaladsl.server.directives.RouteDirectives$$anonfun$complete$1.apply(RouteDirectives.scala:47)
at akka.http.scaladsl.server.StandardRoute$$anon$1.apply(StandardRoute.scala:19)
at akka.http.scaladsl.server.StandardRoute$$anon$1.apply(StandardRoute.scala:19)
at akka.http.scaladsl.server.directives.BasicDirectives$$anonfun$mapRouteResult$1$$anonfun$apply$3.apply(BasicDirectives.scala:32)
at akka.http.scaladsl.server.directives.BasicDirectives$$anonfun$mapRouteResult$1$$anonfun$apply$3.apply(BasicDirectives.scala:32)
at akka.http.scaladsl.server.directives.FutureDirectives$$anonfun$onComplete$1$$anonfun$apply$1$$anonfun$apply$2.apply(FutureDirectives.scala:25)
at akka.http.scaladsl.server.directives.FutureDirectives$$anonfun$onComplete$1$$anonfun$apply$1$$anonfun$apply$2.apply(FutureDirectives.scala:25)
at akka.http.scaladsl.util.FastFuture$$anonfun$transformWith$extension0$1.apply(FastFuture.scala:37)
at akka.http.scaladsl.util.FastFuture$$anonfun$transformWith$extension0$1.apply(FastFuture.scala:37)
at akka.http.scaladsl.util.FastFuture$.akka$http$scaladsl$util$FastFuture$$strictTransform$1(FastFuture.scala:41)
at akka.http.scaladsl.util.FastFuture$.transformWith$extension1(FastFuture.scala:45)
at akka.http.scaladsl.util.FastFuture$.transformWith$extension0(FastFuture.scala:37)
at akka.http.scaladsl.server.directives.FutureDirectives$$anonfun$onComplete$1$$anonfun$apply$1.apply(FutureDirectives.scala:25)
at akka.http.scaladsl.server.directives.FutureDirectives$$anonfun$onComplete$1$$anonfun$apply$1.apply(FutureDirectives.scala:23)
at akka.http.scaladsl.server.directives.BasicDirectives$$anonfun$textract$1$$anonfun$apply$5.apply(BasicDirectives.scala:88)
at akka.http.scaladsl.server.directives.BasicDirectives$$anonfun$textract$1$$anonfun$apply$5.apply(BasicDirectives.scala:88)
at akka.http.scaladsl.server.directives.BasicDirectives$$anonfun$mapRouteResult$1$$anonfun$apply$3.apply(BasicDirectives.scala:32)
at akka.http.scaladsl.server.directives.BasicDirectives$$anonfun$mapRouteResult$1$$anonfun$apply$3.apply(BasicDirectives.scala:32)
at akka.http.scaladsl.server.directives.BasicDirectives$$anonfun$textract$1$$anonfun$apply$5.apply(BasicDirectives.scala:88)
at akka.http.scaladsl.server.directives.BasicDirectives$$anonfun$textract$1$$anonfun$apply$5.apply(BasicDirectives.scala:88)
at akka.http.scaladsl.server.directives.BasicDirectives$$anonfun$mapRequestContext$1$$anonfun$apply$1.apply(BasicDirectives.scala:23)
at akka.http.scaladsl.server.directives.BasicDirectives$$anonfun$mapRequestContext$1$$anonfun$apply$1.apply(BasicDirectives.scala:23)
at akka.http.scaladsl.server.directives.BasicDirectives$$anonfun$textract$1$$anonfun$apply$5.apply(BasicDirectives.scala:88)
at akka.http.scaladsl.server.directives.BasicDirectives$$anonfun$textract$1$$anonfun$apply$5.apply(BasicDirectives.scala:88)
at akka.http.scaladsl.server.RouteConcatenation$RouteWithConcatenation$$anonfun$$tilde$1$$anonfun$apply$1.apply(RouteConcatenation.scala:28)
at akka.http.scaladsl.server.RouteConcatenation$RouteWithConcatenation$$anonfun$$tilde$1$$anonfun$apply$1.apply(RouteConcatenation.scala:25)
at akka.http.scaladsl.util.FastFuture$.akka$http$scaladsl$util$FastFuture$$strictTransform$1(FastFuture.scala:41)
at akka.http.scaladsl.util.FastFuture$.transformWith$extension1(FastFuture.scala:45)
at akka.http.scaladsl.util.FastFuture$.flatMap$extension(FastFuture.scala:26)
at akka.http.scaladsl.server.RouteConcatenation$RouteWithConcatenation$$anonfun$$tilde$1.apply(RouteConcatenation.scala:25)
at akka.http.scaladsl.server.RouteConcatenation$RouteWithConcatenation$$anonfun$$tilde$1.apply(RouteConcatenation.scala:23)
at akka.http.scaladsl.server.RouteConcatenation$RouteWithConcatenation$$anonfun$$tilde$1.apply(RouteConcatenation.scala:25)
at akka.http.scaladsl.server.RouteConcatenation$RouteWithConcatenation$$anonfun$$tilde$1.apply(RouteConcatenation.scala:23)
at akka.http.scaladsl.server.RouteConcatenation$RouteWithConcatenation$$anonfun$$tilde$1.apply(RouteConcatenation.scala:25)
at akka.http.scaladsl.server.RouteConcatenation$RouteWithConcatenation$$anonfun$$tilde$1.apply(RouteConcatenation.scala:23)
at akka.http.scaladsl.server.RouteConcatenation$RouteWithConcatenation$$anonfun$$tilde$1.apply(RouteConcatenation.scala:25)
at akka.http.scaladsl.server.RouteConcatenation$RouteWithConcatenation$$anonfun$$tilde$1.apply(RouteConcatenation.scala:23)
at akka.http.scaladsl.server.directives.BasicDirectives$$anonfun$mapRequestContext$1$$anonfun$apply$1.apply(BasicDirectives.scala:23)
at akka.http.scaladsl.server.directives.BasicDirectives$$anonfun$mapRequestContext$1$$anonfun$apply$1.apply(BasicDirectives.scala:23)
at akka.http.scaladsl.server.directives.BasicDirectives$$anonfun$textract$1$$anonfun$apply$5.apply(BasicDirectives.scala:88)
at akka.http.scaladsl.server.directives.BasicDirectives$$anonfun$textract$1$$anonfun$apply$5.apply(BasicDirectives.scala:88)
at akka.http.scaladsl.server.RouteConcatenation$RouteWithConcatenation$$anonfun$$tilde$1$$anonfun$apply$1.apply(RouteConcatenation.scala:28)
at akka.http.scaladsl.server.RouteConcatenation$RouteWithConcatenation$$anonfun$$tilde$1$$anonfun$apply$1.apply(RouteConcatenation.scala:25)
at akka.http.scaladsl.util.FastFuture$.akka$http$scaladsl$util$FastFuture$$strictTransform$1(FastFuture.scala:41)
at akka.http.scaladsl.util.FastFuture$.transformWith$extension1(FastFuture.scala:45)
at akka.http.scaladsl.util.FastFuture$.flatMap$extension(FastFuture.scala:26)
at akka.http.scaladsl.server.RouteConcatenation$RouteWithConcatenation$$anonfun$$tilde$1.apply(RouteConcatenation.scala:25)
at akka.http.scaladsl.server.RouteConcatenation$RouteWithConcatenation$$anonfun$$tilde$1.apply(RouteConcatenation.scala:23)
at akka.http.scaladsl.server.RouteConcatenation$RouteWithConcatenation$$anonfun$$tilde$1.apply(RouteConcatenation.scala:25)
at akka.http.scaladsl.server.RouteConcatenation$RouteWithConcatenation$$anonfun$$tilde$1.apply(RouteConcatenation.scala:23)
at akka.http.scaladsl.server.RouteConcatenation$RouteWithConcatenation$$anonfun$$tilde$1.apply(RouteConcatenation.scala:25)
at akka.http.scaladsl.server.RouteConcatenation$RouteWithConcatenation$$anonfun$$tilde$1.apply(RouteConcatenation.scala:23)
at akka.http.scaladsl.server.directives.BasicDirectives$$anonfun$mapRouteResult$1$$anonfun$apply$3.apply(BasicDirectives.scala:32)
at akka.http.scaladsl.server.directives.BasicDirectives$$anonfun$mapRouteResult$1$$anonfun$apply$3.apply(BasicDirectives.scala:32)
at akka.http.scaladsl.server.directives.BasicDirectives$$anonfun$textract$1$$anonfun$apply$5.apply(BasicDirectives.scala:88)
at akka.http.scaladsl.server.directives.BasicDirectives$$anonfun$textract$1$$anonfun$apply$5.apply(BasicDirectives.scala:88)
at akka.http.scaladsl.server.RouteConcatenation$RouteWithConcatenation$$anonfun$$tilde$1.apply(RouteConcatenation.scala:25)
at akka.http.scaladsl.server.RouteConcatenation$RouteWithConcatenation$$anonfun$$tilde$1.apply(RouteConcatenation.scala:23)
at akka.http.scaladsl.server.directives.BasicDirectives$$anonfun$mapRouteResult$1$$anonfun$apply$3.apply(BasicDirectives.scala:32)
at akka.http.scaladsl.server.directives.BasicDirectives$$anonfun$mapRouteResult$1$$anonfun$apply$3.apply(BasicDirectives.scala:32)
at akka.http.scaladsl.server.directives.BasicDirectives$$anonfun$mapRouteResultWith$1$$anonfun$apply$4.apply(BasicDirectives.scala:35)
at akka.http.scaladsl.server.directives.BasicDirectives$$anonfun$mapRouteResultWith$1$$anonfun$apply$4.apply(BasicDirectives.scala:35)
at akka.http.scaladsl.server.directives.BasicDirectives$$anonfun$textract$1$$anonfun$apply$5.apply(BasicDirectives.scala:88)
at akka.http.scaladsl.server.directives.BasicDirectives$$anonfun$textract$1$$anonfun$apply$5.apply(BasicDirectives.scala:88)
at akka.http.scaladsl.server.directives.ExecutionDirectives$$anonfun$handleExceptions$1$$anonfun$apply$1.apply(ExecutionDirectives.scala:27)
at akka.http.scaladsl.server.directives.ExecutionDirectives$$anonfun$handleExceptions$1$$anonfun$apply$1.apply(ExecutionDirectives.scala:23)
at akka.http.scaladsl.server.Route$$anonfun$asyncHandler$1.apply(Route.scala:64)
at akka.http.scaladsl.server.Route$$anonfun$asyncHandler$1.apply(Route.scala:63)
at akka.stream.impl.fusing.MapAsync$$anon$9$$anon$10.onPush(Ops.scala:590)
at akka.stream.impl.fusing.GraphInterpreter.processElement$1(GraphInterpreter.scala:575)
at akka.stream.impl.fusing.GraphInterpreter.processEvent(GraphInterpreter.scala:586)
at akka.stream.impl.fusing.GraphInterpreter.execute(GraphInterpreter.scala:533)
at akka.stream.impl.fusing.GraphInterpreterShell.runBatch(ActorGraphInterpreter.scala:445)
at akka.stream.impl.fusing.GraphInterpreterShell.receive(ActorGraphInterpreter.scala:400)
at akka.stream.impl.fusing.ActorGraphInterpreter$$anonfun$receive$1.applyOrElse(ActorGraphInterpreter.scala:547)
at akka.actor.Actor$class.aroundReceive(Actor.scala:467)
at akka.stream.impl.fusing.ActorGraphInterpreter.aroundReceive(ActorGraphInterpreter.scala:495)
at akka.actor.ActorCell.receiveMessage(ActorCell.scala:516)
at akka.actor.ActorCell.invoke(ActorCell.scala:487)
at akka.dispatch.Mailbox.processMailbox(Mailbox.scala:238)
at akka.dispatch.Mailbox.run(Mailbox.scala:220)
at akka.dispatch.ForkJoinExecutorConfigurator$AkkaForkJoinTask.exec(AbstractDispatcher.scala:397)
at scala.concurrent.forkjoin.ForkJoinTask.doExec(ForkJoinTask.java:260)
at scala.concurrent.forkjoin.ForkJoinPool$WorkQueue.runTask(ForkJoinPool.java:1339)
at scala.concurrent.forkjoin.ForkJoinPool.runWorker(ForkJoinPool.java:1979)
at scala.concurrent.forkjoin.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:107)
Caused by: org.apache.spark.sql.catalyst.errors.package$TreeNodeException: execute, tree:
Exchange rangepartitioning(TiempoAAMM#67 ASC NULLS FIRST, 2)
+- *HashAggregate(keys=[TiempoAAMM#67, spark_grouping_id#65], functions=[sum(Cantidad#23)], output=[TiempoAAMM#67, Cantidad_SUM#96])
   +- Exchange hashpartitioning(TiempoAAMM#67, spark_grouping_id#65, 2)
      +- *HashAggregate(keys=[TiempoAAMM#67, spark_grouping_id#65], functions=[partial_sum(Cantidad#23)], output=[TiempoAAMM#67, spark_grouping_id#65, sum#160])
         +- *Filter isnotnull(TiempoAAMM#67)
            +- *!Expand [ArrayBuffer(Cantidad#23, TiempoAAMM#36, 0), ArrayBuffer(Cantidad#23, null, 1)], [Cantidad#23, TiempoAAMM#67, spark_grouping_id#65]
               +- InMemoryTableScan [Cantidad#23]
                     +- InMemoryRelation [Cantidad#23, TiempoAAMM#36], true, 10000, StorageLevel(disk, memory, deserialized, 1 replicas)
                           +- *Project [Cantidad#6 AS Cantidad#23, TiempoAAMM#4 AS TiempoAAMM#36]
                              +- *FileScan parquet [TiempoAAMM#4,Cantidad#6] Batched: true, Format: Parquet, Location: InMemoryFileIndex[file:/tmp/vs/cache/dataSet-16], PartitionFilters: [], PushedFilters: [], ReadSchema: struct<TiempoAAMM:string,Cantidad:decimal(18,2)>

at org.apache.spark.sql.catalyst.errors.package$.attachTree(package.scala:56)
at org.apache.spark.sql.execution.exchange.ShuffleExchange.doExecute(ShuffleExchange.scala:112)
at org.apache.spark.sql.execution.SparkPlan$$anonfun$execute$1.apply(SparkPlan.scala:114)
at org.apache.spark.sql.execution.SparkPlan$$anonfun$execute$1.apply(SparkPlan.scala:114)
at org.apache.spark.sql.execution.SparkPlan$$anonfun$executeQuery$1.apply(SparkPlan.scala:135)
at org.apache.spark.rdd.RDDOperationScope$.withScope(RDDOperationScope.scala:151)
at org.apache.spark.sql.execution.SparkPlan.executeQuery(SparkPlan.scala:132)
at org.apache.spark.sql.execution.SparkPlan.execute(SparkPlan.scala:113)
at org.apache.spark.sql.execution.InputAdapter.inputRDDs(WholeStageCodegenExec.scala:235)
at org.apache.spark.sql.execution.SortExec.inputRDDs(SortExec.scala:121)
at org.apache.spark.sql.execution.SortExec.inputRDDs(SortExec.scala:121)
at org.apache.spark.sql.execution.WholeStageCodegenExec.doExecute(WholeStageCodegenExec.scala:368)
at org.apache.spark.sql.execution.SparkPlan$$anonfun$execute$1.apply(SparkPlan.scala:114)
at org.apache.spark.sql.execution.SparkPlan$$anonfun$execute$1.apply(SparkPlan.scala:114)
at org.apache.spark.sql.execution.SparkPlan$$anonfun$executeQuery$1.apply(SparkPlan.scala:135)
at org.apache.spark.rdd.RDDOperationScope$.withScope(RDDOperationScope.scala:151)
at org.apache.spark.sql.execution.SparkPlan.executeQuery(SparkPlan.scala:132)
at org.apache.spark.sql.execution.SparkPlan.execute(SparkPlan.scala:113)
at org.apache.spark.sql.execution.exchange.ShuffleExchange.prepareShuffleDependency(ShuffleExchange.scala:85)
at org.apache.spark.sql.execution.exchange.ShuffleExchange$$anonfun$doExecute$1.apply(ShuffleExchange.scala:121)
at org.apache.spark.sql.execution.exchange.ShuffleExchange$$anonfun$doExecute$1.apply(ShuffleExchange.scala:112)
at org.apache.spark.sql.catalyst.errors.package$.attachTree(package.scala:52)
... 147 more
Caused by: org.apache.spark.sql.catalyst.errors.package$TreeNodeException: execute, tree:
Exchange hashpartitioning(TiempoAAMM#67, spark_grouping_id#65, 2)
+- *HashAggregate(keys=[TiempoAAMM#67, spark_grouping_id#65], functions=[partial_sum(Cantidad#23)], output=[TiempoAAMM#67, spark_grouping_id#65, sum#160])
   +- *Filter isnotnull(TiempoAAMM#67)
      +- *!Expand [ArrayBuffer(Cantidad#23, TiempoAAMM#36, 0), ArrayBuffer(Cantidad#23, null, 1)], [Cantidad#23, TiempoAAMM#67, spark_grouping_id#65]
         +- InMemoryTableScan [Cantidad#23]
               +- InMemoryRelation [Cantidad#23, TiempoAAMM#36], true, 10000, StorageLevel(disk, memory, deserialized, 1 replicas)
                     +- *Project [Cantidad#6 AS Cantidad#23, TiempoAAMM#4 AS TiempoAAMM#36]
                        +- *FileScan parquet [TiempoAAMM#4,Cantidad#6] Batched: true, Format: Parquet, Location: InMemoryFileIndex[file:/tmp/vs/cache/dataSet-16], PartitionFilters: [], PushedFilters: [], ReadSchema: struct<TiempoAAMM:string,Cantidad:decimal(18,2)>

at org.apache.spark.sql.catalyst.errors.package$.attachTree(package.scala:56)
at org.apache.spark.sql.execution.exchange.ShuffleExchange.doExecute(ShuffleExchange.scala:112)
at org.apache.spark.sql.execution.SparkPlan$$anonfun$execute$1.apply(SparkPlan.scala:114)
at org.apache.spark.sql.execution.SparkPlan$$anonfun$execute$1.apply(SparkPlan.scala:114)
at org.apache.spark.sql.execution.SparkPlan$$anonfun$executeQuery$1.apply(SparkPlan.scala:135)
at org.apache.spark.rdd.RDDOperationScope$.withScope(RDDOperationScope.scala:151)
at org.apache.spark.sql.execution.SparkPlan.executeQuery(SparkPlan.scala:132)
at org.apache.spark.sql.execution.SparkPlan.execute(SparkPlan.scala:113)
at org.apache.spark.sql.execution.InputAdapter.inputRDDs(WholeStageCodegenExec.scala:235)
at org.apache.spark.sql.execution.aggregate.HashAggregateExec.inputRDDs(HashAggregateExec.scala:141)
at org.apache.spark.sql.execution.WholeStageCodegenExec.doExecute(WholeStageCodegenExec.scala:368)
at org.apache.spark.sql.execution.SparkPlan$$anonfun$execute$1.apply(SparkPlan.scala:114)
at org.apache.spark.sql.execution.SparkPlan$$anonfun$execute$1.apply(SparkPlan.scala:114)
at org.apache.spark.sql.execution.SparkPlan$$anonfun$executeQuery$1.apply(SparkPlan.scala:135)
at org.apache.spark.rdd.RDDOperationScope$.withScope(RDDOperationScope.scala:151)
at org.apache.spark.sql.execution.SparkPlan.executeQuery(SparkPlan.scala:132)
at org.apache.spark.sql.execution.SparkPlan.execute(SparkPlan.scala:113)
at org.apache.spark.sql.execution.exchange.ShuffleExchange.prepareShuffleDependency(ShuffleExchange.scala:85)
at org.apache.spark.sql.execution.exchange.ShuffleExchange$$anonfun$doExecute$1.apply(ShuffleExchange.scala:121)
at org.apache.spark.sql.execution.exchange.ShuffleExchange$$anonfun$doExecute$1.apply(ShuffleExchange.scala:112)
at org.apache.spark.sql.catalyst.errors.package$.attachTree(package.scala:52)
... 168 more
Caused by: org.apache.spark.sql.catalyst.errors.package$TreeNodeException: Binding attribute, tree: TiempoAAMM#36
at org.apache.spark.sql.catalyst.errors.package$.attachTree(package.scala:56)
at org.apache.spark.sql.catalyst.expressions.BindReferences$$anonfun$bindReference$1.applyOrElse(BoundAttribute.scala:88)
at org.apache.spark.sql.catalyst.expressions.BindReferences$$anonfun$bindReference$1.applyOrElse(BoundAttribute.scala:87)
at org.apache.spark.sql.catalyst.trees.TreeNode$$anonfun$3.apply(TreeNode.scala:288)
at org.apache.spark.sql.catalyst.trees.TreeNode$$anonfun$3.apply(TreeNode.scala:288)
at org.apache.spark.sql.catalyst.trees.CurrentOrigin$.withOrigin(TreeNode.scala:70)
at org.apache.spark.sql.catalyst.trees.TreeNode.transformDown(TreeNode.scala:287)
at org.apache.spark.sql.catalyst.trees.TreeNode.transform(TreeNode.scala:277)
at org.apache.spark.sql.catalyst.expressions.BindReferences$.bindReference(BoundAttribute.scala:87)
at org.apache.spark.sql.execution.ExpandExec$$anonfun$5$$anonfun$apply$1.apply$mcVI$sp(ExpandExec.scala:169)
at scala.collection.immutable.Range.foreach$mVc$sp(Range.scala:160)
at org.apache.spark.sql.execution.ExpandExec$$anonfun$5.apply(ExpandExec.scala:167)
at org.apache.spark.sql.execution.ExpandExec$$anonfun$5.apply(ExpandExec.scala:165)
at scala.collection.TraversableLike$$anonfun$map$1.apply(TraversableLike.scala:234)
at scala.collection.TraversableLike$$anonfun$map$1.apply(TraversableLike.scala:234)
at scala.collection.mutable.ResizableArray$class.foreach(ResizableArray.scala:59)
at scala.collection.mutable.ArrayBuffer.foreach(ArrayBuffer.scala:48)
at scala.collection.TraversableLike$class.map(TraversableLike.scala:234)
at scala.collection.AbstractTraversable.map(Traversable.scala:104)
at org.apache.spark.sql.execution.ExpandExec.doConsume(ExpandExec.scala:165)
at org.apache.spark.sql.execution.CodegenSupport$class.consume(WholeStageCodegenExec.scala:153)
at org.apache.spark.sql.execution.InputAdapter.consume(WholeStageCodegenExec.scala:218)
at org.apache.spark.sql.execution.InputAdapter.doProduce(WholeStageCodegenExec.scala:246)
at org.apache.spark.sql.execution.CodegenSupport$$anonfun$produce$1.apply(WholeStageCodegenExec.scala:83)
at org.apache.spark.sql.execution.CodegenSupport$$anonfun$produce$1.apply(WholeStageCodegenExec.scala:78)
at org.apache.spark.sql.execution.SparkPlan$$anonfun$executeQuery$1.apply(SparkPlan.scala:135)
at org.apache.spark.rdd.RDDOperationScope$.withScope(RDDOperationScope.scala:151)
at org.apache.spark.sql.execution.SparkPlan.executeQuery(SparkPlan.scala:132)
at org.apache.spark.sql.execution.CodegenSupport$class.produce(WholeStageCodegenExec.scala:78)
at org.apache.spark.sql.execution.InputAdapter.produce(WholeStageCodegenExec.scala:218)
at org.apache.spark.sql.execution.ExpandExec.doProduce(ExpandExec.scala:93)
at org.apache.spark.sql.execution.CodegenSupport$$anonfun$produce$1.apply(WholeStageCodegenExec.scala:83)
at org.apache.spark.sql.execution.CodegenSupport$$anonfun$produce$1.apply(WholeStageCodegenExec.scala:78)
at org.apache.spark.sql.execution.SparkPlan$$anonfun$executeQuery$1.apply(SparkPlan.scala:135)
at org.apache.spark.rdd.RDDOperationScope$.withScope(RDDOperationScope.scala:151)
at org.apache.spark.sql.execution.SparkPlan.executeQuery(SparkPlan.scala:132)
at org.apache.spark.sql.execution.CodegenSupport$class.produce(WholeStageCodegenExec.scala:78)
at org.apache.spark.sql.execution.ExpandExec.produce(ExpandExec.scala:36)
at org.apache.spark.sql.execution.FilterExec.doProduce(basicPhysicalOperators.scala:128)
at org.apache.spark.sql.execution.CodegenSupport$$anonfun$produce$1.apply(WholeStageCodegenExec.scala:83)
at org.apache.spark.sql.execution.CodegenSupport$$anonfun$produce$1.apply(WholeStageCodegenExec.scala:78)
at org.apache.spark.sql.execution.SparkPlan$$anonfun$executeQuery$1.apply(SparkPlan.scala:135)
at org.apache.spark.rdd.RDDOperationScope$.withScope(RDDOperationScope.scala:151)
at org.apache.spark.sql.execution.SparkPlan.executeQuery(SparkPlan.scala:132)
at org.apache.spark.sql.execution.CodegenSupport$class.produce(WholeStageCodegenExec.scala:78)
at org.apache.spark.sql.execution.FilterExec.produce(basicPhysicalOperators.scala:88)
at org.apache.spark.sql.execution.aggregate.HashAggregateExec.doProduceWithKeys(HashAggregateExec.scala:598)
at org.apache.spark.sql.execution.aggregate.HashAggregateExec.doProduce(HashAggregateExec.scala:148)
at org.apache.spark.sql.execution.CodegenSupport$$anonfun$produce$1.apply(WholeStageCodegenExec.scala:83)
at org.apache.spark.sql.execution.CodegenSupport$$anonfun$produce$1.apply(WholeStageCodegenExec.scala:78)
at org.apache.spark.sql.execution.SparkPlan$$anonfun$executeQuery$1.apply(SparkPlan.scala:135)
at org.apache.spark.rdd.RDDOperationScope$.withScope(RDDOperationScope.scala:151)
at org.apache.spark.sql.execution.SparkPlan.executeQuery(SparkPlan.scala:132)
at org.apache.spark.sql.execution.CodegenSupport$class.produce(WholeStageCodegenExec.scala:78)
at org.apache.spark.sql.execution.aggregate.HashAggregateExec.produce(HashAggregateExec.scala:38)
at org.apache.spark.sql.execution.WholeStageCodegenExec.doCodeGen(WholeStageCodegenExec.scala:313)
at org.apache.spark.sql.execution.WholeStageCodegenExec.doExecute(WholeStageCodegenExec.scala:354)
at org.apache.spark.sql.execution.SparkPlan$$anonfun$execute$1.apply(SparkPlan.scala:114)
at org.apache.spark.sql.execution.SparkPlan$$anonfun$execute$1.apply(SparkPlan.scala:114)
at org.apache.spark.sql.execution.SparkPlan$$anonfun$executeQuery$1.apply(SparkPlan.scala:135)
at org.apache.spark.rdd.RDDOperationScope$.withScope(RDDOperationScope.scala:151)
at org.apache.spark.sql.execution.SparkPlan.executeQuery(SparkPlan.scala:132)
at org.apache.spark.sql.execution.SparkPlan.execute(SparkPlan.scala:113)
at org.apache.spark.sql.execution.exchange.ShuffleExchange.prepareShuffleDependency(ShuffleExchange.scala:85)
at org.apache.spark.sql.execution.exchange.ShuffleExchange$$anonfun$doExecute$1.apply(ShuffleExchange.scala:121)
at org.apache.spark.sql.execution.exchange.ShuffleExchange$$anonfun$doExecute$1.apply(ShuffleExchange.scala:112)
at org.apache.spark.sql.catalyst.errors.package$.attachTree(package.scala:52)
... 188 more
Caused by: java.lang.RuntimeException: Couldn't find TiempoAAMM#36 in [Cantidad#23]
at scala.sys.package$.error(package.scala:27)
at org.apache.spark.sql.catalyst.expressions.BindReferences$$anonfun$bindReference$1$$anonfun$applyOrElse$1.apply(BoundAttribute.scala:94)
at org.apache.spark.sql.catalyst.expressions.BindReferences$$anonfun$bindReference$1$$anonfun$applyOrElse$1.apply(BoundAttribute.scala:88)
at org.apache.spark.sql.catalyst.errors.package$.attachTree(package.scala:52)
... 254 more

From the last "Caused by" it looks like the Expand node still references
TiempoAAMM#36 while the cached InMemoryTableScan only outputs [Cantidad#23],
so the attribute can no longer be bound. For now I had to roll back to Spark
2.0.2.
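
Rolling back just means pinning the Spark artifacts to the last working
version again; an sbt example (assuming an sbt build; adjust for your tool):

// build.sbt -- pin Spark back to 2.0.2, where the join still works
libraryDependencies ++= Seq(
  "org.apache.spark" %% "spark-core" % "2.0.2",
  "org.apache.spark" %% "spark-sql"  % "2.0.2"
)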

Regards

-- 
Ing. Ivaldi Andres