Posted to reviews@spark.apache.org by JoshRosen <gi...@git.apache.org> on 2015/12/30 04:26:56 UTC

[GitHub] spark pull request: [SPARK-12579][SQL] Force user-specified JDBC d...

GitHub user JoshRosen opened a pull request:

    https://github.com/apache/spark/pull/10519

    [SPARK-12579][SQL] Force user-specified JDBC driver to take precedence

    Spark SQL's JDBC data source allows users to specify an explicit JDBC driver to load (using the `driver` argument), but in the current code it's possible that the user-specified driver will not be used when it comes time to actually create a JDBC connection.
    
    In a nutshell, the problem is that multiple JDBC drivers on the classpath may claim to handle the same subprotocol, so simply registering the user-provided driver class with our `DriverRegistry` and JDBC's `DriverManager` is not sufficient to ensure that it's actually used when creating the JDBC connection.
    
    This patch addresses this issue by first registering the user-specified driver with the DriverManager, then iterating over the DriverManager's registered drivers to find the matching driver and using it to create a connection (previously, we just called `DriverManager.getConnection()` directly).
    
    If a user did not specify a JDBC driver to use, then we call `DriverManager.getDriver` to determine the class of the driver to use and pass that class's name to executors. This guards against corner-case bugs in situations where the driver and executor JVMs have different sets of JDBC drivers on their classpaths (previously, there was a rare potential for `DriverManager.getConnection()` to use different drivers on the driver and executors if the user had not explicitly specified a JDBC driver class and the classpaths were different).
    
    This patch is inspired by a similar patch that I made to the `spark-redshift` library (https://github.com/databricks/spark-redshift/pull/143), which contains its own modified fork of some of Spark's JDBC data source code (for cross-Spark-version compatibility reasons).
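
    The driver-selection logic described above can be sketched as follows. This is a minimal, self-contained illustration in plain Java (the JDBC API itself is Java); `DriverSelection`, `DummyDriver`, and `findDriver` are hypothetical names for demonstration only, not Spark's actual API:

```java
import java.sql.Connection;
import java.sql.Driver;
import java.sql.DriverManager;
import java.sql.DriverPropertyInfo;
import java.sql.SQLFeatureNotSupportedException;
import java.util.Enumeration;
import java.util.Properties;
import java.util.logging.Logger;

public class DriverSelection {

    // Hypothetical stand-in driver used only for this demonstration; a real
    // application would register a vendor driver such as org.postgresql.Driver.
    public static class DummyDriver implements Driver {
        public Connection connect(String url, Properties info) { return null; }
        public boolean acceptsURL(String url) { return url.startsWith("jdbc:dummy:"); }
        public DriverPropertyInfo[] getPropertyInfo(String url, Properties info) { return new DriverPropertyInfo[0]; }
        public int getMajorVersion() { return 1; }
        public int getMinorVersion() { return 0; }
        public boolean jdbcCompliant() { return false; }
        public Logger getParentLogger() throws SQLFeatureNotSupportedException {
            throw new SQLFeatureNotSupportedException();
        }
    }

    // Instead of DriverManager.getConnection(url) -- which hands the URL to
    // whichever registered driver first accepts it -- scan the registered
    // drivers for the one whose class name matches the requested class.
    public static Driver findDriver(String driverClass) {
        Enumeration<Driver> drivers = DriverManager.getDrivers();
        while (drivers.hasMoreElements()) {
            Driver d = drivers.nextElement();
            if (d.getClass().getCanonicalName().equals(driverClass)) {
                return d;
            }
        }
        throw new IllegalStateException("Did not find registered driver with class " + driverClass);
    }

    public static void main(String[] args) throws Exception {
        DriverManager.registerDriver(new DummyDriver());
        Driver d = findDriver("DriverSelection.DummyDriver");
        System.out.println(d.acceptsURL("jdbc:dummy:db")); // prints "true"
    }
}
```

    Spark's actual patch additionally resolves the driver class name on the driver JVM and ships it to executors, where the lookup above runs inside the connection factory.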

You can merge this pull request into a Git repository by running:

    $ git pull https://github.com/JoshRosen/spark jdbc-driver-precedence

Alternatively you can review and apply these changes as the patch at:

    https://github.com/apache/spark/pull/10519.patch

To close this pull request, make a commit to your master/trunk branch
with (at least) the following in the commit message:

    This closes #10519
    
----
commit 3554d68fd38df399fa863c5c14110cc17a826038
Author: Josh Rosen <jo...@databricks.com>
Date:   2015-12-30T02:28:39Z

    Force user-specified JDBC driver to take precedence.

----


---
If your project is set up for it, you can reply to this email and have your
reply appear on GitHub as well. If your project does not have this feature
enabled and wishes so, or if the feature is enabled but not working, please
contact infrastructure at infrastructure@apache.org or file a JIRA ticket
with INFRA.
---

---------------------------------------------------------------------
To unsubscribe, e-mail: reviews-unsubscribe@spark.apache.org
For additional commands, e-mail: reviews-help@spark.apache.org


[GitHub] spark pull request: [SPARK-12579][SQL] Force user-specified JDBC d...

Posted by yhuai <gi...@git.apache.org>.
Github user yhuai commented on the pull request:

    https://github.com/apache/spark/pull/10519#issuecomment-168764660
  
    Also merging to branch 1.6.




[GitHub] spark pull request: [SPARK-12579][SQL] Force user-specified JDBC d...

Posted by AmplabJenkins <gi...@git.apache.org>.
Github user AmplabJenkins commented on the pull request:

    https://github.com/apache/spark/pull/10519#issuecomment-167938479
  
    Merged build finished. Test PASSed.




[GitHub] spark pull request: [SPARK-12579][SQL] Force user-specified JDBC d...

Posted by SparkQA <gi...@git.apache.org>.
Github user SparkQA commented on the pull request:

    https://github.com/apache/spark/pull/10519#issuecomment-168620703
  
    **[Test build #48647 has started](https://amplab.cs.berkeley.edu/jenkins/job/SparkPullRequestBuilder/48647/consoleFull)** for PR 10519 at commit [`3554d68`](https://github.com/apache/spark/commit/3554d68fd38df399fa863c5c14110cc17a826038).




[GitHub] spark pull request: [SPARK-12579][SQL] Force user-specified JDBC d...

Posted by yhuai <gi...@git.apache.org>.
Github user yhuai commented on the pull request:

    https://github.com/apache/spark/pull/10519#issuecomment-168762943
  
    LGTM




[GitHub] spark pull request: [SPARK-12579][SQL] Force user-specified JDBC d...

Posted by JoshRosen <gi...@git.apache.org>.
Github user JoshRosen commented on a diff in the pull request:

    https://github.com/apache/spark/pull/10519#discussion_r48762855
  
    --- Diff: sql/core/src/main/scala/org/apache/spark/sql/execution/datasources/jdbc/JdbcUtils.scala ---
    @@ -34,10 +35,31 @@ import org.apache.spark.sql.{DataFrame, Row}
     object JdbcUtils extends Logging {
     
       /**
    -   * Establishes a JDBC connection.
    +   * Returns a factory for creating connections to the given JDBC URL.
    +   *
    +   * @param url the JDBC url to connect to.
    +   * @param properties JDBC connection properties.
        */
    -  def createConnection(url: String, connectionProperties: Properties): Connection = {
    -    JDBCRDD.getConnector(connectionProperties.getProperty("driver"), url, connectionProperties)()
    +  def createConnectionFactory(url: String, properties: Properties): () => Connection = {
    +    val userSpecifiedDriverClass = Option(properties.getProperty("driver"))
    +    userSpecifiedDriverClass.foreach(DriverRegistry.register)
    +    // Performing this part of the logic on the driver guards against the corner-case where the
    +    // driver returned for a URL is different on the driver and executors due to classpath
    +    // differences.
    +    val driverClass: String = userSpecifiedDriverClass.getOrElse {
    +      DriverManager.getDriver(url).getClass.getCanonicalName
    +    }
    +    () => {
    +      userSpecifiedDriverClass.foreach(DriverRegistry.register)
    --- End diff --
    
    Yep, that's right: this function gets shipped to executors, where it's called to create connections.





[GitHub] spark pull request: [SPARK-12579][SQL] Force user-specified JDBC d...

Posted by SparkQA <gi...@git.apache.org>.
Github user SparkQA commented on the pull request:

    https://github.com/apache/spark/pull/10519#issuecomment-168645890
  
    **[Test build #48647 has finished](https://amplab.cs.berkeley.edu/jenkins/job/SparkPullRequestBuilder/48647/consoleFull)** for PR 10519 at commit [`3554d68`](https://github.com/apache/spark/commit/3554d68fd38df399fa863c5c14110cc17a826038).
     * This patch passes all tests.
     * This patch merges cleanly.
     * This patch adds no public classes.




[GitHub] spark pull request: [SPARK-12579][SQL] Force user-specified JDBC d...

Posted by AmplabJenkins <gi...@git.apache.org>.
Github user AmplabJenkins commented on the pull request:

    https://github.com/apache/spark/pull/10519#issuecomment-168646362
  
    Test PASSed.
    Refer to this link for build results (access rights to CI server needed): 
    https://amplab.cs.berkeley.edu/jenkins//job/SparkPullRequestBuilder/48647/
    Test PASSed.




[GitHub] spark pull request: [SPARK-12579][SQL] Force user-specified JDBC d...

Posted by JoshRosen <gi...@git.apache.org>.
Github user JoshRosen commented on a diff in the pull request:

    https://github.com/apache/spark/pull/10519#discussion_r48762926
  
    --- Diff: sql/core/src/main/scala/org/apache/spark/sql/execution/datasources/jdbc/JdbcUtils.scala ---
    @@ -34,10 +35,31 @@ import org.apache.spark.sql.{DataFrame, Row}
     object JdbcUtils extends Logging {
     
       /**
    -   * Establishes a JDBC connection.
    +   * Returns a factory for creating connections to the given JDBC URL.
    +   *
    +   * @param url the JDBC url to connect to.
    +   * @param properties JDBC connection properties.
        */
    -  def createConnection(url: String, connectionProperties: Properties): Connection = {
    -    JDBCRDD.getConnector(connectionProperties.getProperty("driver"), url, connectionProperties)()
    +  def createConnectionFactory(url: String, properties: Properties): () => Connection = {
    +    val userSpecifiedDriverClass = Option(properties.getProperty("driver"))
    +    userSpecifiedDriverClass.foreach(DriverRegistry.register)
    +    // Performing this part of the logic on the driver guards against the corner-case where the
    +    // driver returned for a URL is different on the driver and executors due to classpath
    +    // differences.
    +    val driverClass: String = userSpecifiedDriverClass.getOrElse {
    +      DriverManager.getDriver(url).getClass.getCanonicalName
    +    }
    +    () => {
    +      userSpecifiedDriverClass.foreach(DriverRegistry.register)
    +      val driver: Driver = DriverManager.getDrivers.asScala.collectFirst {
    +        case d: DriverWrapper if d.wrapped.getClass.getCanonicalName == driverClass => d
    --- End diff --
    
    This is the only real bit of trickiness here and was the part that was missing from my `spark-redshift` patch.
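
    To make that trickiness concrete: `DriverManager` ignores drivers that are not visible to its caller's classloader, which is why Spark registers a wrapper around drivers loaded from user-supplied jars, and why the matcher must compare against the *wrapped* driver's class. Below is a hedged, self-contained Java sketch; `WrapperMatch`, `FakeVendorDriver`, and this minimal `DriverWrapper` are hypothetical analogues for illustration, not Spark's internals:

```java
import java.sql.*;
import java.util.Enumeration;
import java.util.Properties;
import java.util.logging.Logger;

public class WrapperMatch {

    // Hypothetical vendor driver, standing in for one loaded from a user jar.
    public static class FakeVendorDriver implements Driver {
        public Connection connect(String url, Properties info) { return null; }
        public boolean acceptsURL(String url) { return url.startsWith("jdbc:fake:"); }
        public DriverPropertyInfo[] getPropertyInfo(String url, Properties info) { return new DriverPropertyInfo[0]; }
        public int getMajorVersion() { return 1; }
        public int getMinorVersion() { return 0; }
        public boolean jdbcCompliant() { return false; }
        public Logger getParentLogger() throws SQLFeatureNotSupportedException { throw new SQLFeatureNotSupportedException(); }
    }

    // Minimal delegating wrapper, analogous in spirit to Spark's internal
    // DriverWrapper: it is loaded by a classloader DriverManager trusts, so
    // the registration is accepted even though the wrapped driver's is not.
    public static class DriverWrapper implements Driver {
        public final Driver wrapped;
        public DriverWrapper(Driver wrapped) { this.wrapped = wrapped; }
        public Connection connect(String url, Properties info) throws SQLException { return wrapped.connect(url, info); }
        public boolean acceptsURL(String url) throws SQLException { return wrapped.acceptsURL(url); }
        public DriverPropertyInfo[] getPropertyInfo(String url, Properties info) throws SQLException { return wrapped.getPropertyInfo(url, info); }
        public int getMajorVersion() { return wrapped.getMajorVersion(); }
        public int getMinorVersion() { return wrapped.getMinorVersion(); }
        public boolean jdbcCompliant() { return wrapped.jdbcCompliant(); }
        public Logger getParentLogger() throws SQLFeatureNotSupportedException { return wrapped.getParentLogger(); }
    }

    // Matching only on d.getClass() would see "WrapperMatch.DriverWrapper"
    // and miss the user's driver; unwrapping first fixes that.
    public static Driver findDriver(String driverClass) {
        Enumeration<Driver> drivers = DriverManager.getDrivers();
        while (drivers.hasMoreElements()) {
            Driver d = drivers.nextElement();
            String name = (d instanceof DriverWrapper)
                ? ((DriverWrapper) d).wrapped.getClass().getCanonicalName()
                : d.getClass().getCanonicalName();
            if (name.equals(driverClass)) return d;
        }
        throw new IllegalStateException("Did not find registered driver with class " + driverClass);
    }

    public static void main(String[] args) throws Exception {
        DriverManager.registerDriver(new DriverWrapper(new FakeVendorDriver()));
        Driver d = findDriver("WrapperMatch.FakeVendorDriver");
        System.out.println(d instanceof DriverWrapper); // prints "true"
    }
}
```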




[GitHub] spark pull request: [SPARK-12579][SQL] Force user-specified JDBC d...

Posted by SparkQA <gi...@git.apache.org>.
Github user SparkQA commented on the pull request:

    https://github.com/apache/spark/pull/10519#issuecomment-167938457
  
    **[Test build #48446 has finished](https://amplab.cs.berkeley.edu/jenkins/job/SparkPullRequestBuilder/48446/consoleFull)** for PR 10519 at commit [`3554d68`](https://github.com/apache/spark/commit/3554d68fd38df399fa863c5c14110cc17a826038).
     * This patch passes all tests.
     * This patch merges cleanly.
     * This patch adds no public classes.




[GitHub] spark pull request: [SPARK-12579][SQL] Force user-specified JDBC d...

Posted by AmplabJenkins <gi...@git.apache.org>.
Github user AmplabJenkins commented on the pull request:

    https://github.com/apache/spark/pull/10519#issuecomment-167938480
  
    Test PASSed.
    Refer to this link for build results (access rights to CI server needed): 
    https://amplab.cs.berkeley.edu/jenkins//job/SparkPullRequestBuilder/48446/
    Test PASSed.




[GitHub] spark pull request: [SPARK-12579][SQL] Force user-specified JDBC d...

Posted by yhuai <gi...@git.apache.org>.
Github user yhuai commented on a diff in the pull request:

    https://github.com/apache/spark/pull/10519#discussion_r48758295
  
    --- Diff: sql/core/src/main/scala/org/apache/spark/sql/execution/datasources/jdbc/JdbcUtils.scala ---
    @@ -34,10 +35,31 @@ import org.apache.spark.sql.{DataFrame, Row}
     object JdbcUtils extends Logging {
     
       /**
    -   * Establishes a JDBC connection.
    +   * Returns a factory for creating connections to the given JDBC URL.
    +   *
    +   * @param url the JDBC url to connect to.
    +   * @param properties JDBC connection properties.
        */
    -  def createConnection(url: String, connectionProperties: Properties): Connection = {
    -    JDBCRDD.getConnector(connectionProperties.getProperty("driver"), url, connectionProperties)()
    +  def createConnectionFactory(url: String, properties: Properties): () => Connection = {
    +    val userSpecifiedDriverClass = Option(properties.getProperty("driver"))
    +    userSpecifiedDriverClass.foreach(DriverRegistry.register)
    +    // Performing this part of the logic on the driver guards against the corner-case where the
    +    // driver returned for a URL is different on the driver and executors due to classpath
    +    // differences.
    +    val driverClass: String = userSpecifiedDriverClass.getOrElse {
    +      DriverManager.getDriver(url).getClass.getCanonicalName
    +    }
    +    () => {
    +      userSpecifiedDriverClass.foreach(DriverRegistry.register)
    +      val driver: Driver = DriverManager.getDrivers.asScala.collectFirst {
    +        case d: DriverWrapper if d.wrapped.getClass.getCanonicalName == driverClass => d
    +        case d if d.getClass.getCanonicalName == driverClass => d
    +      }.getOrElse {
    +        throw new IllegalStateException(
    +          s"Did not find registered driver with class $driverClass")
    --- End diff --
    
    Maybe we can add some content asking users to check whether they have added the correct JDBC driver jar?




[GitHub] spark pull request: [SPARK-12579][SQL] Force user-specified JDBC d...

Posted by JoshRosen <gi...@git.apache.org>.
Github user JoshRosen commented on the pull request:

    https://github.com/apache/spark/pull/10519#issuecomment-168617582
  
    /cc @yhuai @marmbrus or @rxin for a review pass.




[GitHub] spark pull request: [SPARK-12579][SQL] Force user-specified JDBC d...

Posted by asfgit <gi...@git.apache.org>.
Github user asfgit closed the pull request at:

    https://github.com/apache/spark/pull/10519




[GitHub] spark pull request: [SPARK-12579][SQL] Force user-specified JDBC d...

Posted by JoshRosen <gi...@git.apache.org>.
Github user JoshRosen commented on the pull request:

    https://github.com/apache/spark/pull/10519#issuecomment-168617538
  
    Jenkins, retest this please.




[GitHub] spark pull request: [SPARK-12579][SQL] Force user-specified JDBC d...

Posted by yhuai <gi...@git.apache.org>.
Github user yhuai commented on a diff in the pull request:

    https://github.com/apache/spark/pull/10519#discussion_r48758188
  
    --- Diff: sql/core/src/main/scala/org/apache/spark/sql/execution/datasources/jdbc/JdbcUtils.scala ---
    @@ -34,10 +35,31 @@ import org.apache.spark.sql.{DataFrame, Row}
     object JdbcUtils extends Logging {
     
       /**
    -   * Establishes a JDBC connection.
    +   * Returns a factory for creating connections to the given JDBC URL.
    +   *
    +   * @param url the JDBC url to connect to.
    +   * @param properties JDBC connection properties.
        */
    -  def createConnection(url: String, connectionProperties: Properties): Connection = {
    -    JDBCRDD.getConnector(connectionProperties.getProperty("driver"), url, connectionProperties)()
    +  def createConnectionFactory(url: String, properties: Properties): () => Connection = {
    +    val userSpecifiedDriverClass = Option(properties.getProperty("driver"))
    +    userSpecifiedDriverClass.foreach(DriverRegistry.register)
    +    // Performing this part of the logic on the driver guards against the corner-case where the
    +    // driver returned for a URL is different on the driver and executors due to classpath
    +    // differences.
    +    val driverClass: String = userSpecifiedDriverClass.getOrElse {
    +      DriverManager.getDriver(url).getClass.getCanonicalName
    +    }
    +    () => {
    +      userSpecifiedDriverClass.foreach(DriverRegistry.register)
    --- End diff --
    
    This is the one that registers the right driver on the executor side, right?




[GitHub] spark pull request: [SPARK-12579][SQL] Force user-specified JDBC d...

Posted by SparkQA <gi...@git.apache.org>.
Github user SparkQA commented on the pull request:

    https://github.com/apache/spark/pull/10519#issuecomment-167931687
  
    **[Test build #48446 has started](https://amplab.cs.berkeley.edu/jenkins/job/SparkPullRequestBuilder/48446/consoleFull)** for PR 10519 at commit [`3554d68`](https://github.com/apache/spark/commit/3554d68fd38df399fa863c5c14110cc17a826038).




[GitHub] spark pull request: [SPARK-12579][SQL] Force user-specified JDBC d...

Posted by yhuai <gi...@git.apache.org>.
Github user yhuai commented on the pull request:

    https://github.com/apache/spark/pull/10519#issuecomment-168762994
  
    Merging to master.




[GitHub] spark pull request: [SPARK-12579][SQL] Force user-specified JDBC d...

Posted by JoshRosen <gi...@git.apache.org>.
Github user JoshRosen commented on the pull request:

    https://github.com/apache/spark/pull/10519#issuecomment-168435471
  
    I'd appreciate any feedback on how we can/should test this change and prevent this behavior from regressing in the future.




[GitHub] spark pull request: [SPARK-12579][SQL] Force user-specified JDBC d...

Posted by AmplabJenkins <gi...@git.apache.org>.
Github user AmplabJenkins commented on the pull request:

    https://github.com/apache/spark/pull/10519#issuecomment-168646360
  
    Merged build finished. Test PASSed.




[GitHub] spark pull request: [SPARK-12579][SQL] Force user-specified JDBC d...

Posted by yhuai <gi...@git.apache.org>.
Github user yhuai commented on the pull request:

    https://github.com/apache/spark/pull/10519#issuecomment-168751741
  
    Looks good to me. Just have a quick clarification question.

