Posted to reviews@spark.apache.org by GitBox <gi...@apache.org> on 2019/02/03 02:01:43 UTC

[GitHub] srowen commented on a change in pull request #23727: [SPARK-26817][CORE] Use System.nanoTime to measure time intervals

URL: https://github.com/apache/spark/pull/23727#discussion_r253284627
 
 

 ##########
 File path: core/src/main/scala/org/apache/spark/scheduler/ResultTask.scala
 ##########
 @@ -75,14 +75,14 @@ private[spark] class ResultTask[T, U](
   override def runTask(context: TaskContext): U = {
     // Deserialize the RDD and the func using the broadcast variables.
     val threadMXBean = ManagementFactory.getThreadMXBean
-    val deserializeStartTime = System.currentTimeMillis()
+    val deserializeStartTime = System.nanoTime()
     val deserializeStartCpuTime = if (threadMXBean.isCurrentThreadCpuTimeSupported) {
       threadMXBean.getCurrentThreadCpuTime
     } else 0L
     val ser = SparkEnv.get.closureSerializer.newInstance()
     val (rdd, func) = ser.deserialize[(RDD[T], (TaskContext, Iterator[T]) => U)](
       ByteBuffer.wrap(taskBinary.value), Thread.currentThread.getContextClassLoader)
-    _executorDeserializeTime = System.currentTimeMillis() - deserializeStartTime
+    _executorDeserializeTime = System.nanoTime() - deserializeStartTime
 
 Review comment:
   Same here... if the variable name changes, then at least in the diff we can see all the places where it was used, and we can more easily review that they're still correct.
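
   For illustration, here is a minimal, self-contained Scala sketch of the rename being suggested, assuming a hypothetical "Ns" suffix for the renamed fields; the suffix and the scaffolding around it are assumptions for illustration, not taken from the actual patch:

       object NanoTimingSketch {
         // Hypothetical field name; the "Ns" suffix makes the unit change
         // (milliseconds -> nanoseconds) visible at every use site in a diff.
         private var _executorDeserializeTimeNs: Long = 0L

         def runTask(): Unit = {
           // System.nanoTime is monotonic, so differences between two calls
           // measure elapsed time reliably; System.currentTimeMillis can jump
           // if the wall clock is adjusted (e.g. by NTP), corrupting intervals.
           val deserializeStartTimeNs = System.nanoTime()
           Thread.sleep(10) // stand-in for the real deserialization work
           _executorDeserializeTimeNs = System.nanoTime() - deserializeStartTimeNs
         }

         def main(args: Array[String]): Unit = {
           runTask()
           println(s"deserialize took ${_executorDeserializeTimeNs} ns")
         }
       }

   Giving the field a unit-suffixed name forces every read and write of it to show up in the diff, which is the reviewability benefit described above.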

----------------------------------------------------------------
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
users@infra.apache.org


With regards,
Apache Git Services

---------------------------------------------------------------------
To unsubscribe, e-mail: reviews-unsubscribe@spark.apache.org
For additional commands, e-mail: reviews-help@spark.apache.org