Posted to issues@spark.apache.org by "Apache Spark (JIRA)" <ji...@apache.org> on 2018/01/26 09:48:00 UTC

[jira] [Assigned] (SPARK-23228) Able to track Python-created SparkSession in JVM

     [ https://issues.apache.org/jira/browse/SPARK-23228?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]

Apache Spark reassigned SPARK-23228:
------------------------------------

    Assignee:     (was: Apache Spark)

> Able to track Python-created SparkSession in JVM
> ------------------------------------------------
>
>                 Key: SPARK-23228
>                 URL: https://issues.apache.org/jira/browse/SPARK-23228
>             Project: Spark
>          Issue Type: Improvement
>          Components: PySpark
>    Affects Versions: 2.4.0
>            Reporter: Saisai Shao
>            Priority: Minor
>
> Currently, if we write a {{SparkListener}} that uses {{SparkSession}} and load it into a PySpark application, the listener cannot see the {{SparkSession}} created by PySpark, so the {{assert}} in the listener below fails. To avoid this, the PySpark-created {{SparkSession}} should be registered as the JVM {{defaultSession}}. The failure is triggered by running a statement such as:
> {code}
> spark.sql("CREATE TABLE test (a INT)")
> {code}
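>
> Run from PySpark, that statement fires a {{CreateTableEvent}} on the JVM listener bus, which reaches the listener shown after this sketch. A minimal PySpark driver, assuming the listener class below is compiled into a jar on the driver classpath and registered via {{spark.extraListeners}}:
> {code}
> from pyspark.sql import SparkSession
>
> # Attach the JVM-side listener at startup; "TestSparkSession" is the class
> # defined below and must be on the driver classpath.
> spark = (SparkSession.builder
>          .appName("listener-repro")
>          .config("spark.extraListeners", "TestSparkSession")
>          .enableHiveSupport()  # plain CREATE TABLE needs the Hive catalog
>          .getOrCreate())
>
> # Fires CreateTableEvent; the listener then finds no JVM-side session.
> spark.sql("CREATE TABLE test (a INT)")
> {code}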
> {code}
> import org.apache.spark.internal.Logging
> import org.apache.spark.scheduler.{SparkListener, SparkListenerEvent}
> import org.apache.spark.sql.SparkSession
> import org.apache.spark.sql.catalyst.catalog.CreateTableEvent
>
> class TestSparkSession extends SparkListener with Logging {
>   override def onOtherEvent(event: SparkListenerEvent): Unit = {
>     event match {
>       case CreateTableEvent(db, table) =>
>         // In a PySpark application neither the active nor the default session
>         // is set on the JVM side, so this assert fails.
>         val session = SparkSession.getActiveSession.orElse(SparkSession.getDefaultSession)
>         assert(session.isDefined)
>         val tableInfo = session.get.sharedState.externalCatalog.getTable(db, table)
>         logInfo(s"Table info $tableInfo")
>       case e =>
>         logInfo(s"event $e")
>     }
>   }
> }
> {code}
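>
> One possible direction, sketched here using PySpark's internal py4j handles ({{_jvm}} and {{_jsparkSession}}, which are implementation details rather than public API): register the Python-created session as the JVM-side default so listeners can find it.
> {code}
> from pyspark.sql import SparkSession
>
> spark = SparkSession.builder.getOrCreate()
>
> # _jvm and _jsparkSession are py4j internals of the Python session; calling
> # the JVM-side SparkSession.setDefaultSession makes getDefaultSession (used
> # by the listener above) return this session.
> spark._jvm.SparkSession.setDefaultSession(spark._jsparkSession)
> {code}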



