Posted to reviews@spark.apache.org by cloud-fan <gi...@git.apache.org> on 2018/09/18 04:06:19 UTC

[GitHub] spark pull request #22443: [SPARK-25339][TEST] Refactor FilterPushdownBenchm...

Github user cloud-fan commented on a diff in the pull request:

    https://github.com/apache/spark/pull/22443#discussion_r218293752
  
    --- Diff: sql/core/src/test/scala/org/apache/spark/sql/execution/benchmark/FilterPushdownBenchmark.scala ---
    @@ -17,29 +17,28 @@
     
     package org.apache.spark.sql.execution.benchmark
     
    -import java.io.{File, FileOutputStream, OutputStream}
    +import java.io.File
     
     import scala.util.{Random, Try}
     
    -import org.scalatest.{BeforeAndAfterEachTestData, Suite, TestData}
    -
     import org.apache.spark.SparkConf
    -import org.apache.spark.SparkFunSuite
     import org.apache.spark.sql.{DataFrame, SparkSession}
     import org.apache.spark.sql.functions.monotonically_increasing_id
     import org.apache.spark.sql.internal.SQLConf
     import org.apache.spark.sql.internal.SQLConf.ParquetOutputTimestampType
     import org.apache.spark.sql.types.{ByteType, Decimal, DecimalType, TimestampType}
    -import org.apache.spark.util.{Benchmark, Utils}
    +import org.apache.spark.util.{Benchmark, BenchmarkBase => FileBenchmarkBase, Utils}
     
     /**
      * Benchmark to measure read performance with Filter pushdown.
    - * To run this:
    - *  build/sbt "sql/test-only *FilterPushdownBenchmark"
    - *
    - * Results will be written to "benchmarks/FilterPushdownBenchmark-results.txt".
    + * To run this benchmark:
    + *  1. without sbt: bin/spark-submit --class <this class> <spark sql test jar>
    + *  2. build/sbt "sql/test:runMain <this class>"
    + *  3. generate result: SPARK_GENERATE_BENCHMARK_FILES=1 build/sbt "sql/test:runMain <this class>"
    --- End diff --
    
    Shall we print the benchmark result if `SPARK_GENERATE_BENCHMARK_FILES` is not set?
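
    The question above is about whether the benchmark report should still be visible
    when `SPARK_GENERATE_BENCHMARK_FILES` is not set. Below is a minimal sketch of one
    way a file-based benchmark runner could fall back to stdout in that case; the class
    `FileBenchmarkSketch` and its members are hypothetical and are not Spark's actual
    `BenchmarkBase` API.

        import java.io.{File, FileOutputStream, OutputStream}

        // Hypothetical sketch (not Spark's actual BenchmarkBase): write results to a
        // file when SPARK_GENERATE_BENCHMARK_FILES is set, otherwise print to stdout.
        abstract class FileBenchmarkSketch {
          // Subclasses write their benchmark report to `out`.
          def runBenchmarkSuite(out: OutputStream): Unit

          def main(args: Array[String]): Unit = {
            val generateFiles = sys.env.contains("SPARK_GENERATE_BENCHMARK_FILES")
            val out: OutputStream = if (generateFiles) {
              // Write to benchmarks/<ClassName>-results.txt when the env var is set.
              val file = new File(
                s"benchmarks/${getClass.getSimpleName.stripSuffix("$")}-results.txt")
              Option(file.getParentFile).foreach(_.mkdirs())
              new FileOutputStream(file)
            } else {
              // Otherwise print the result to the console so it is never silently dropped.
              System.out
            }
            try {
              runBenchmarkSuite(out)
            } finally {
              out.flush()
              if (generateFiles) out.close()
            }
          }
        }

    A concrete suite would then extend this as an object (for example,
    `object FilterPushdownBenchmark extends FileBenchmarkSketch { ... }`), so the same
    `main` serves both the file-generating mode and the plain console run.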

