Posted to issues@spark.apache.org by "Kaushal Prajapati (Jira)" <ji...@apache.org> on 2019/11/22 18:18:00 UTC

[jira] [Updated] (SPARK-30002) Reuse SparkSession in pyspark via Gateway

     [ https://issues.apache.org/jira/browse/SPARK-30002?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]

Kaushal Prajapati updated SPARK-30002:
--------------------------------------
    Description: 
In PySpark, a SparkContext is created from the user-supplied Spark configuration (or the defaults), and the backing JVM is launched internally through a py4j gateway.
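
For reference, the normal flow looks roughly like this (a minimal sketch; the app name and master below are placeholders):
{code:python}
from pyspark import SparkConf, SparkContext

# Default path: PySpark launches its own JVM through a fresh py4j gateway
# and creates a brand-new SparkContext inside it.
conf = SparkConf().setAppName("my-app").setMaster("local[*]")
sc = SparkContext(conf=conf)
{code}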

Say I have already launched a py4j gateway from another application; to make PySpark communicate with that same gateway, I have to set the following environment variables:

 
{code:java}
export PYSPARK_GATEWAY_PORT=12345
export PYSPARK_GATEWAY_SECRET=***********************
{code}
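
With those variables exported, the Python process attaches to the gateway that is already running instead of spawning its own JVM. A rough sketch of the client side (port and secret values are placeholders, reused from the example above):
{code:python}
import os
from pyspark import SparkConf, SparkContext

# Point PySpark at the gateway started by the other application.
os.environ["PYSPARK_GATEWAY_PORT"] = "12345"
os.environ["PYSPARK_GATEWAY_SECRET"] = "<auth token shared by that application>"

# launch_gateway(), which SparkContext calls internally, sees these variables
# and connects to the existing gateway instead of forking a new spark-submit.
# If the gateway JVM was started by spark-submit, its spark.* properties
# (master, app name, ...) are picked up here; otherwise set them explicitly.
sc = SparkContext(conf=SparkConf())
{code}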
 

However, when PySpark then creates its own SparkContext over that connection, it never checks whether a SparkContext is already running in the same JVM; it always constructs a new one.

Current code snippet:

 
{code:python}
def _initialize_context(self, jconf):
    """
    Initialize SparkContext in function to allow subclass specific initialization
    """
    # Always constructs a brand-new JavaSparkContext in the JVM, even if a
    # SparkContext is already running there.
    return self._jvm.JavaSparkContext(jconf){code}
 

 

After changing it to the following, reuse works fine for me:
{code:python}
def _initialize_context(self, jconf):
    """
    Initialize SparkContext in function to allow subclass specific initialization
    """
    # Reuse the SparkContext that is already active in this JVM (if any);
    # otherwise create a new one from the given conf.
    return self._jvm.JavaSparkContext(self._jvm.org.apache.spark.SparkContext.getOrCreate(jconf)){code}
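
With that change in place, a quick sanity check from the Python side (hypothetical, reusing the sc setup from the sketch above and assuming the other application already started a SparkContext in that JVM):
{code:python}
# The Python-side SparkContext now wraps the JVM context that was already
# running, so the application id matches what the other application sees.
print(sc.applicationId)
{code}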
 

This looks like a good candidate for an improvement.


> Reuse SparkSession in pyspark via Gateway
> -----------------------------------------
>
>                 Key: SPARK-30002
>                 URL: https://issues.apache.org/jira/browse/SPARK-30002
>             Project: Spark
>          Issue Type: Improvement
>          Components: PySpark
>    Affects Versions: 2.4.0, 2.4.4
>            Reporter: Kaushal Prajapati
>            Priority: Minor
>

