Posted to reviews@spark.apache.org by GitBox <gi...@apache.org> on 2019/04/02 13:43:22 UTC

[GitHub] [spark] cxzl25 commented on a change in pull request #23895: [SPARK-26992][STS] Fix STS scheduler pool correct delivery

URL: https://github.com/apache/spark/pull/23895#discussion_r271310966
 
 

 ##########
 File path: sql/hive-thriftserver/src/main/scala/org/apache/spark/sql/hive/thriftserver/SparkExecuteStatementOperation.scala
 ##########
 @@ -291,6 +287,20 @@ private[hive] class SparkExecuteStatementOperation(
       sqlContext.sparkContext.cancelJobGroup(statementId)
     }
   }
+
+  private def withSchedulerPool[T](body: => T): T = {
+    val pool = sessionToActivePool.get(parentSession.getSessionHandle)
+    if (pool != null) {
 
 Review comment:
   Because ```sessionToActivePool``` is only populated when the user-submitted SQL contains ```set spark.sql.thriftserver.scheduler.pool=xxx```, a session that never runs that command gets no entry, so for the default scheduler the lookup returns null as the pool name.
   
   My worry with setting it unconditionally is the ambiguity: the value only needs to be read after one has actually been put, even though setting it every time would add no overhead.
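   To make the first point concrete, here is a minimal, self-contained sketch of that populate-on-SET behavior; the map, the regex, and the plain-string session handle are simplified stand-ins, not the real thrift-server code:

```scala
import java.util.concurrent.ConcurrentHashMap

// Sketch only: a simplified stand-in for SparkSQLOperationManager's
// sessionToActivePool handling, to show why the default pool reads as null.
object SessionPoolSketch {
  private val sessionToActivePool = new ConcurrentHashMap[String, String]()

  // Only a `set spark.sql.thriftserver.scheduler.pool=xxx` statement puts a
  // value into the map; any other statement leaves it untouched.
  def recordPoolIfSet(session: String, statement: String): Unit = {
    val PoolSet = """(?i)set\s+spark\.sql\.thriftserver\.scheduler\.pool\s*=\s*(\S+)""".r
    statement.trim match {
      case PoolSet(pool) => sessionToActivePool.put(session, pool)
      case _             => // default scheduler: nothing is put
    }
  }

  def main(args: Array[String]): Unit = {
    val session = "session-1"
    recordPoolIfSet(session, "select 1")
    // No SET has run, so get() returns null -- the "default pool" case.
    assert(sessionToActivePool.get(session) == null)

    recordPoolIfSet(session, "set spark.sql.thriftserver.scheduler.pool=xxx")
    assert(sessionToActivePool.get(session) == "xxx")
  }
}
```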
   
   https://github.com/apache/spark/blob/43bf4ae6417fcb15d0fbc7880f14f307c164d464/sql/hive-thriftserver/src/main/scala/org/apache/spark/sql/hive/thriftserver/SparkExecuteStatementOperation.scala#L236-L239
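   The permalink above shows where the pool is currently applied conditionally. Since the diff quoted in this message is cut off at the reviewed line, here is a sketch of how the full helper might read, assuming it clears the thread-local property in a ```finally``` block (that cleanup detail is an assumption based on the linked lines, not quoted from the PR):

```scala
  // Sketch only: assumes the enclosing SparkExecuteStatementOperation class
  // and its sqlContext, sessionToActivePool, and parentSession members.
  private def withSchedulerPool[T](body: => T): T = {
    val pool = sessionToActivePool.get(parentSession.getSessionHandle)
    if (pool != null) {
      // Route jobs launched by this statement to the user's fair-scheduler pool.
      sqlContext.sparkContext.setLocalProperty("spark.scheduler.pool", pool)
    }
    try {
      body
    } finally {
      if (pool != null) {
        // Passing null removes the thread-local property, so later statements
        // on this thread fall back to the default pool.
        sqlContext.sparkContext.setLocalProperty("spark.scheduler.pool", null)
      }
    }
  }
```

   Guarding the reset with the same null check mirrors the conditional set discussed above, so statements that never chose a pool leave the thread-local property untouched.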
