Posted to issues@spark.apache.org by "Dongjoon Hyun (Jira)" <ji...@apache.org> on 2022/02/05 04:22:00 UTC
[jira] [Resolved] (SPARK-38073) NameError: name 'sc' is not defined when running driver with IPython and Python > 3.7
[ https://issues.apache.org/jira/browse/SPARK-38073?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]
Dongjoon Hyun resolved SPARK-38073.
-----------------------------------
Fix Version/s: 3.3.0
3.2.2
Resolution: Fixed
Issue resolved by pull request 35396
[https://github.com/apache/spark/pull/35396]
> NameError: name 'sc' is not defined when running driver with IPython and Python > 3.7
> -------------------------------------------------------------------------------------
>
> Key: SPARK-38073
> URL: https://issues.apache.org/jira/browse/SPARK-38073
> Project: Spark
> Issue Type: Bug
> Components: PySpark, Spark Shell
> Affects Versions: 3.2.0, 3.3.0
> Reporter: Maciej Szymkiewicz
> Assignee: Maciej Szymkiewicz
> Priority: Major
> Fix For: 3.3.0, 3.2.2
>
>
> When {{PYSPARK_DRIVER_PYTHON=$(which ipython) bin/pyspark}} is executed with Python >= 3.8, a function registered with atexit seems to be executed in a different scope than in Python 3.7.
> This results in {{NameError: name 'sc' is not defined}} on exit:
> {code:python}
> Welcome to
>       ____              __
>      / __/__  ___ _____/ /__
>     _\ \/ _ \/ _ `/ __/ '_/
>    /__ / .__/\_,_/_/ /_/\_\   version 3.3.0-SNAPSHOT
>       /_/
> Using Python version 3.8.12 (default, Oct 12 2021 21:57:06)
> Spark context Web UI available at http://192.168.0.198:4040
> Spark context available as 'sc' (master = local[*], app id = local-1643555855409).
> SparkSession available as 'spark'.
> In [1]:
> Do you really want to exit ([y]/n)? y
> Error in atexit._run_exitfuncs:
> Traceback (most recent call last):
> File "/path/to/spark/python/pyspark/shell.py", line 49, in <lambda>
> atexit.register(lambda: sc.stop())
> NameError: name 'sc' is not defined
> {code}
> This could easily be fixed by capturing the {{sc}} instance:
> {code:none}
> diff --git a/python/pyspark/shell.py b/python/pyspark/shell.py
> index f0c487877a..4164e3ab0c 100644
> --- a/python/pyspark/shell.py
> +++ b/python/pyspark/shell.py
> @@ -46,7 +46,7 @@ except Exception:
>
> sc = spark.sparkContext
> sql = spark.sql
> -atexit.register(lambda: sc.stop())
> +atexit.register((lambda sc: lambda: sc.stop())(sc))
>
> # for compatibility
> sqlContext = spark._wrapped
> {code}
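[Editor's note] The patch works because the outer lambda binds {{sc}} as a parameter at registration time, instead of looking it up in module globals when the exit handler finally runs. A minimal, self-contained sketch of the difference, using a hypothetical {{FakeContext}} class in place of a real {{SparkContext}}:

```python
# Hypothetical stand-in for SparkContext, just to make the sketch runnable.
class FakeContext:
    def __init__(self):
        self.stopped = False

    def stop(self):
        self.stopped = True


sc = FakeContext()

late = lambda: sc.stop()                       # looks up `sc` at call time
captured = (lambda sc: lambda: sc.stop())(sc)  # binds `sc` immediately

ctx = sc
del sc  # simulate the global being cleared before exit handlers run

name_error_raised = False
try:
    late()
except NameError:
    name_error_raised = True  # same failure mode as the reported bug

captured()  # still works: it holds its own reference to the context
```

An equivalent fix uses a default argument, {{atexit.register(lambda sc=sc: sc.stop())}}, since default values are likewise evaluated once at definition time.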
--
This message was sent by Atlassian Jira
(v8.20.1#820001)
---------------------------------------------------------------------
To unsubscribe, e-mail: issues-unsubscribe@spark.apache.org
For additional commands, e-mail: issues-help@spark.apache.org