Posted to reviews@spark.apache.org by GitBox <gi...@apache.org> on 2020/09/08 01:52:56 UTC

[GitHub] [spark] HyukjinKwon commented on a change in pull request #29667: [SPARK-32813][SQL] Get default config of vectorized reader if no active SparkSession

HyukjinKwon commented on a change in pull request #29667:
URL: https://github.com/apache/spark/pull/29667#discussion_r484608842



##########
File path: sql/core/src/main/scala/org/apache/spark/sql/execution/DataSourceScanExec.scala
##########
@@ -177,7 +177,11 @@ case class FileSourceScanExec(
 
   private lazy val needsUnsafeRowConversion: Boolean = {
     if (relation.fileFormat.isInstanceOf[ParquetSource]) {
-      SparkSession.getActiveSession.get.sessionState.conf.parquetVectorizedReaderEnabled
+      SparkSession.getActiveSession.orElse(SparkSession.getDefaultSession).map { session =>
+        session.sessionState.conf.parquetVectorizedReaderEnabled
+      }.getOrElse {
+        SQLConf.get.parquetVectorizedReaderEnabled

Review comment:
       @viirya, can't we just use `SQLConf.get.parquetVectorizedReaderEnabled` directly?
   Just to clarify, the behaviour we want here is:
   - Use the conf from the active Spark session.
   - If not, use the default one.
   ?
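
   For illustration only, a small hypothetical standalone sketch of the behaviour
   `SQLConf.get` already provides, as I understand it: it resolves the conf of the active
   SparkSession in the current thread and falls back to default values when no session is
   active. Identifiers and the config key are taken from the Spark codebase; the demo
   object itself is made up for this example.

       import org.apache.spark.sql.SparkSession
       import org.apache.spark.sql.internal.SQLConf

       object VectorizedReaderConfDemo {
         def main(args: Array[String]): Unit = {
           // No active session in this thread yet: SQLConf.get falls back to default
           // values, so this prints the default of spark.sql.parquet.enableVectorizedReader.
           println(s"Before session: ${SQLConf.get.parquetVectorizedReaderEnabled}")

           // Create a session that overrides the vectorized reader flag.
           val spark = SparkSession.builder()
             .master("local[1]")
             .config("spark.sql.parquet.enableVectorizedReader", "false")
             .getOrCreate()

           // The new session is now the active session in this thread, so SQLConf.get
           // reflects its setting and this prints false.
           println(s"With active session: ${SQLConf.get.parquetVectorizedReaderEnabled}")

           spark.stop()
         }
       }

   Note that `SQLConf.get` resolves the active session only (then falls back to defaults);
   whether the code should also consult `SparkSession.getDefaultSession`, as the diff above
   does, is exactly the point under discussion.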




----------------------------------------------------------------
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

For queries about this service, please contact Infrastructure at:
users@infra.apache.org



---------------------------------------------------------------------
To unsubscribe, e-mail: reviews-unsubscribe@spark.apache.org
For additional commands, e-mail: reviews-help@spark.apache.org