Posted to reviews@spark.apache.org by GitBox <gi...@apache.org> on 2020/07/03 00:17:24 UTC

[GitHub] [spark] ueshin commented on a change in pull request #28986: [SPARK-32160][CORE][PYSPARK] Disallow to create SparkContext in executors.

ueshin commented on a change in pull request #28986:
URL: https://github.com/apache/spark/pull/28986#discussion_r449319134



##########
File path: core/src/main/scala/org/apache/spark/SparkContext.scala
##########
@@ -2554,6 +2557,19 @@ object SparkContext extends Logging {
     }
   }
 
+  /**
+   * Called to ensure that SparkContext is created or accessed only on the Driver.
+   *
+   * Throws an exception if a SparkContext is about to be created in executors.
+   */
+  private[spark] def assertOnDriver(): Unit = {
+    if (TaskContext.get != null) {

Review comment:
       Under local mode:
   
   ```
   scala> sc.range(0, 1).foreach { _ => new SparkContext(new SparkConf().setAppName("test").setMaster("local")) }
   java.lang.IllegalStateException: SparkContext should only be created and accessed on the driver.
   ...
   ```
   
   before this patch:
   
   ```
   scala> sc.range(0, 1).foreach { _ => new SparkContext(new SparkConf().setAppName("test").setMaster("local")) }
   org.apache.spark.SparkException: Only one SparkContext should be running in this JVM (see SPARK-2243).The currently running SparkContext was created at:
   org.apache.spark.sql.SparkSession$Builder.getOrCreate(SparkSession.scala:921)
   ...
   ```
   
   Although the exception is different, it fails either way.
   
   I think the new error message is more reasonable.
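   
   For reference, a minimal self-contained sketch of the check under review (not the exact patch); `TaskContext.get` returns null on the driver and a non-null context inside a running task, which is why the assertion also fires in local mode. The exception message is taken from the output above:
   
   ```
   import org.apache.spark.TaskContext
   
   // Sketch only: TaskContext.get is null on the driver and non-null inside a
   // running task, so a non-null value means this code is executing in a task
   // (on an executor, or on a task thread in local mode).
   def assertOnDriver(): Unit = {
     if (TaskContext.get != null) {
       throw new IllegalStateException(
         "SparkContext should only be created and accessed on the driver.")
     }
   }
   ```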






