Posted to reviews@spark.apache.org by GitBox <gi...@apache.org> on 2019/12/07 02:04:23 UTC

[GitHub] [spark] HyukjinKwon commented on a change in pull request #26787: [SPARK-30158][SQL][CORE] Seq -> Array for sc.parallelize for 2.13 compatibility; remove WrappedArray

URL: https://github.com/apache/spark/pull/26787#discussion_r355091865
 
 

 ##########
 File path: core/src/main/scala/org/apache/spark/executor/Executor.scala
 ##########
 @@ -566,7 +566,7 @@ private[spark] class Executor(
           val (accums, accUpdates) = collectAccumulatorsAndResetStatusOnFailure(taskStartTimeNs)
           // Here and below, put task metric peaks in a WrappedArray to expose them as a Seq
           // without requiring a copy.
-          val metricPeaks = WrappedArray.make(metricsPoller.getTaskMetricPeaks(taskId))
+          val metricPeaks = metricsPoller.getTaskMetricPeaks(taskId)
 
 Review comment:
   Sean, I think we should fix RowEncoder too if we're moving away from WrappedArray:
   
   https://github.com/apache/spark/blob/b917a6593dc969b9b766259eb8cbbd6e90e0dc53/sql/catalyst/src/main/scala/org/apache/spark/sql/catalyst/encoders/RowEncoder.scala#L294-L299
   
   Out of curiosity, what will we use as the replacement for WrappedArray? I'm asking because it might affect users directly:
   
   ```scala
   scala> spark.range(1).selectExpr("array(1)").collect()
   res0: Array[org.apache.spark.sql.Row] = Array([WrappedArray(1)])
   ```
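   For context on why the diff above can simply drop the `WrappedArray.make` call, here is a small sketch (not from the PR; names like `peaks` are illustrative) of the no-copy wrapping the removed line relied on. It assumes Scala 2.13, where `scala.collection.mutable.ArraySeq.make` plays the role `WrappedArray.make` played in 2.12: both expose an `Array` as a `Seq` that shares the backing array instead of copying it.
   
   ```scala
   // Sketch: exposing an Array as a Seq without copying (Scala 2.13).
   // In 2.12 this was WrappedArray.make; WrappedArray is deprecated in 2.13.
   object WrapDemo {
     def main(args: Array[String]): Unit = {
       val peaks: Array[Long] = Array(1L, 2L, 3L)
   
       // ArraySeq.make wraps the existing array -- no copy is made.
       val seq: scala.collection.Seq[Long] =
         scala.collection.mutable.ArraySeq.make(peaks)
   
       // Because the backing array is shared, mutation is visible through seq.
       peaks(0) = 42L
       println(seq.head) // prints 42
     }
   }
   ```
   
   The shared backing array is also why `collect()` results printed `WrappedArray(...)` to users in 2.12, which is the user-facing concern raised above.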

----------------------------------------------------------------
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
users@infra.apache.org


With regards,
Apache Git Services

---------------------------------------------------------------------
To unsubscribe, e-mail: reviews-unsubscribe@spark.apache.org
For additional commands, e-mail: reviews-help@spark.apache.org