Posted to reviews@spark.apache.org by GitBox <gi...@apache.org> on 2021/03/15 15:12:40 UTC

[GitHub] [spark] gengliangwang opened a new pull request #31840: [SPARK-34745][SQL] Unify overflow exception error message of integral types

gengliangwang opened a new pull request #31840:
URL: https://github.com/apache/spark/pull/31840


   <!--
   Thanks for sending a pull request!  Here are some tips for you:
     1. If this is your first time, please read our contributor guidelines: https://spark.apache.org/contributing.html
     2. Ensure you have added or run the appropriate tests for your PR: https://spark.apache.org/developer-tools.html
     3. If the PR is unfinished, add '[WIP]' in your PR title, e.g., '[WIP][SPARK-XXXX] Your PR title ...'.
     4. Be sure to keep the PR description updated to reflect all changes.
     5. Please write your PR title to summarize what this PR proposes.
     6. If possible, provide a concise example to reproduce the issue for a faster review.
     7. If you want to add a new configuration, please read the guideline first for naming configurations in
        'core/src/main/scala/org/apache/spark/internal/config/ConfigEntry.scala'.
   -->
   
   ### What changes were proposed in this pull request?
   <!--
   Please clarify what changes you are proposing. The purpose of this section is to outline the changes and how this PR fixes the issue. 
   If possible, please consider writing useful notes for better and faster reviews in your PR. See the examples below.
     1. If you refactor some codes with changing classes, showing the class hierarchy will help reviewers.
     2. If you fix some SQL features, you can provide some references of other DBMSes.
     3. If there is design documentation, please add the link.
     4. If there is a discussion in the mailing list, please add the link.
   -->
   Currently, the overflow exception error messages of integral types are inconsistent.
   For Byte/Short, the message is "... caused overflow".
   For Int/Long, the message is "int/long overflow", since Spark calls the "*Exact" methods (e.g. addExact, negateExact) from java.lang.Math.
   
   We should unify the error messages by changing the Byte/Short message to "tinyint/smallint overflow"
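   The discrepancy can be seen in a small standalone Java sketch. The manual range check below mirrors the style of Spark's `ByteExactNumeric.checkOverflow`, but already uses the proposed unified message; "tinyint overflow" is this PR's suggestion, not current Spark behavior:
   ```java
   public class OverflowMessages {
       // Manual range check in the style of ByteExactNumeric.checkOverflow,
       // but throwing the proposed unified message.
       static byte addBytes(byte x, byte y) {
           int res = x + y;
           if (res > Byte.MAX_VALUE || res < Byte.MIN_VALUE) {
               throw new ArithmeticException("tinyint overflow");
           }
           return (byte) res;
       }

       public static void main(String[] args) {
           try {
               Math.addExact(Integer.MAX_VALUE, 1);
           } catch (ArithmeticException e) {
               System.out.println(e.getMessage()); // JDK's own message: "integer overflow"
           }
           try {
               addBytes((byte) 127, (byte) 1);
           } catch (ArithmeticException e) {
               System.out.println(e.getMessage()); // proposed: "tinyint overflow"
           }
       }
   }
   ```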
   
   ### Why are the changes needed?
   <!--
   Please clarify why the changes are needed. For instance,
     1. If you propose a new API, clarify the use case for a new API.
     2. If you fix a bug, you can clarify why it is a bug.
   -->
   Standardize exception messages in Spark
   
   ### Does this PR introduce _any_ user-facing change?
   <!--
   Note that it means *any* user-facing change including all aspects such as the documentation fix.
   If yes, please clarify the previous behavior and the change this PR proposes - provide the console output, description and/or an example to show the behavior difference if possible.
   If possible, please also clarify if this is a user-facing change compared to the released Spark versions or within the unreleased branches such as master.
   If no, write 'No'.
   -->
   Yes, the overflow error message for Byte/Short changes from "... caused overflow" to "tinyint/smallint overflow".
   
   ### How was this patch tested?
   <!--
   If tests were added, say they were added here. Please make sure to add some test cases that check the changes thoroughly including negative and positive cases if possible.
   If it was tested in a way different from regular unit tests, please clarify how you tested step by step, ideally copy and paste-able, so that other reviewers can test and check, and descendants can verify in the future.
   If tests were not added, please describe why they were not added and/or why it was difficult to add.
   -->
   Unit tests.


----------------------------------------------------------------
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

For queries about this service, please contact Infrastructure at:
users@infra.apache.org



---------------------------------------------------------------------
To unsubscribe, e-mail: reviews-unsubscribe@spark.apache.org
For additional commands, e-mail: reviews-help@spark.apache.org


[GitHub] [spark] MaxGekk commented on a change in pull request #31840: [SPARK-34745][SQL] Unify overflow exception error message of integral types

Posted by GitBox <gi...@apache.org>.
MaxGekk commented on a change in pull request #31840:
URL: https://github.com/apache/spark/pull/31840#discussion_r594533710



##########
File path: sql/catalyst/src/main/scala/org/apache/spark/sql/types/numerics.scala
##########
@@ -26,7 +26,7 @@ import org.apache.spark.sql.types.Decimal.DecimalIsConflicted
 private[sql] object ByteExactNumeric extends ByteIsIntegral with Ordering.ByteOrdering {
   private def checkOverflow(res: Int, x: Byte, y: Byte, op: String): Unit = {
     if (res > Byte.MaxValue || res < Byte.MinValue) {
-      throw new ArithmeticException(s"$x $op $y caused overflow.")

Review comment:
       Actually, we have > 100 places where we use `*Exact` methods from the standard library. Such methods just throw the exception w/o any clues about inputs:
   ```java
       public static int multiplyExact(int x, int y) {
           long r = (long)x * (long)y;
           if ((int)r != r) {
               throw new ArithmeticException("integer overflow");
           }
           return (int)r;
       }
   ```
   As the first step, we could align our `ArithmeticException` to the standard library (just a few places). After that we can think of how we can improve such kind of exceptions in all possible places in Spark, and give users more context.






[GitHub] [spark] gengliangwang commented on a change in pull request #31840: [SPARK-34745][SQL] Unify overflow exception error message of integral types

Posted by GitBox <gi...@apache.org>.
gengliangwang commented on a change in pull request #31840:
URL: https://github.com/apache/spark/pull/31840#discussion_r594520355



##########
File path: sql/catalyst/src/main/scala/org/apache/spark/sql/types/numerics.scala
##########
@@ -26,7 +26,7 @@ import org.apache.spark.sql.types.Decimal.DecimalIsConflicted
 private[sql] object ByteExactNumeric extends ByteIsIntegral with Ordering.ByteOrdering {
   private def checkOverflow(res: Int, x: Byte, y: Byte, op: String): Unit = {
     if (res > Byte.MaxValue || res < Byte.MinValue) {
-      throw new ArithmeticException(s"$x $op $y caused overflow.")

Review comment:
       > It's a bit risky to implement the "exact" methods ourselves, as JDK may update them in future versions. I'd rather add a try-catch to change the error message.
   
    So which one do you prefer? 
    1. Changing the error message of byte/short overflow to simply "tinyint/smallint overflow" (I checked PostgreSQL and it also simply shows error messages like "ERROR: integer out of range")
    2. Add try/catch in int/long arithmetic operations and throw a new exception with details.
   
   cc @MaxGekk @cloud-fan @maropu 
   
   






[GitHub] [spark] SparkQA commented on pull request #31840: [SPARK-34745][SQL] Unify overflow exception error message of integral types

Posted by GitBox <gi...@apache.org>.
SparkQA commented on pull request #31840:
URL: https://github.com/apache/spark/pull/31840#issuecomment-799522809


   **[Test build #136069 has started](https://amplab.cs.berkeley.edu/jenkins/job/SparkPullRequestBuilder/136069/testReport)** for PR 31840 at commit [`ff16782`](https://github.com/apache/spark/commit/ff16782e86da3f16c032dbb4fac7f531aa023f48).




[GitHub] [spark] gengliangwang commented on a change in pull request #31840: [SPARK-34745][SQL] Unify overflow exception error message of integral types

Posted by GitBox <gi...@apache.org>.
gengliangwang commented on a change in pull request #31840:
URL: https://github.com/apache/spark/pull/31840#discussion_r594520355



##########
File path: sql/catalyst/src/main/scala/org/apache/spark/sql/types/numerics.scala
##########
@@ -26,7 +26,7 @@ import org.apache.spark.sql.types.Decimal.DecimalIsConflicted
 private[sql] object ByteExactNumeric extends ByteIsIntegral with Ordering.ByteOrdering {
   private def checkOverflow(res: Int, x: Byte, y: Byte, op: String): Unit = {
     if (res > Byte.MaxValue || res < Byte.MinValue) {
-      throw new ArithmeticException(s"$x $op $y caused overflow.")

Review comment:
       > It's a bit risky to implement the "exact" methods ourselves, as JDK may update them in future versions. I'd rather add a try-catch to change the error message.
   
    So which one do you prefer? 
    
    1. Changing the error message of byte/short overflow to simply "tinyint/smallint overflow" (I checked PostgreSQL and it also simply shows error messages like "ERROR: integer out of range")
    2. Add try/catch in int/long arithmetic operations and throw a new exception with details.
    3. Keep the current situation and don't do the unification.
   
   cc @MaxGekk @cloud-fan @maropu 
   
   






[GitHub] [spark] maropu commented on a change in pull request #31840: [SPARK-34745][SQL] Unify overflow exception error message of integral types

Posted by GitBox <gi...@apache.org>.
maropu commented on a change in pull request #31840:
URL: https://github.com/apache/spark/pull/31840#discussion_r594790320



##########
File path: sql/catalyst/src/main/scala/org/apache/spark/sql/types/numerics.scala
##########
@@ -26,7 +26,7 @@ import org.apache.spark.sql.types.Decimal.DecimalIsConflicted
 private[sql] object ByteExactNumeric extends ByteIsIntegral with Ordering.ByteOrdering {
   private def checkOverflow(res: Int, x: Byte, y: Byte, op: String): Unit = {
     if (res > Byte.MaxValue || res < Byte.MinValue) {
-      throw new ArithmeticException(s"$x $op $y caused overflow.")

Review comment:
       > For Int/Long, the message is "int/long overflow" since Spark is calling the "*Exact"(e.g. addExact, negateExact) methods from java.lang.Math.
   
    The unification itself looks nice, and I think we should add try-catch in the int/long cases first.






[GitHub] [spark] MaxGekk commented on a change in pull request #31840: [SPARK-34745][SQL] Unify overflow exception error message of integral types

Posted by GitBox <gi...@apache.org>.
MaxGekk commented on a change in pull request #31840:
URL: https://github.com/apache/spark/pull/31840#discussion_r594481549



##########
File path: sql/catalyst/src/main/scala/org/apache/spark/sql/types/numerics.scala
##########
@@ -26,7 +26,7 @@ import org.apache.spark.sql.types.Decimal.DecimalIsConflicted
 private[sql] object ByteExactNumeric extends ByteIsIntegral with Ordering.ByteOrdering {
   private def checkOverflow(res: Int, x: Byte, y: Byte, op: String): Unit = {
     if (res > Byte.MaxValue || res < Byte.MinValue) {
-      throw new ArithmeticException(s"$x $op $y caused overflow.")

Review comment:
       As an end user, I would prefer to see which particular values caused the issue. Unification is nice but it shouldn't make user experience worse, IMHO. 






[GitHub] [spark] AmplabJenkins commented on pull request #31840: [SPARK-34745][SQL] Unify overflow exception error message of integral types

Posted by GitBox <gi...@apache.org>.
AmplabJenkins commented on pull request #31840:
URL: https://github.com/apache/spark/pull/31840#issuecomment-799532556


   
   Refer to this link for build results (access rights to CI server needed): 
   https://amplab.cs.berkeley.edu/jenkins//job/SparkPullRequestBuilder-K8s/40652/
   




[GitHub] [spark] AmplabJenkins removed a comment on pull request #31840: [SPARK-34745][SQL] Unify overflow exception error message of integral types

Posted by GitBox <gi...@apache.org>.
AmplabJenkins removed a comment on pull request #31840:
URL: https://github.com/apache/spark/pull/31840#issuecomment-799532556


   
   Refer to this link for build results (access rights to CI server needed): 
   https://amplab.cs.berkeley.edu/jenkins//job/SparkPullRequestBuilder-K8s/40652/
   




[GitHub] [spark] gengliangwang commented on a change in pull request #31840: [SPARK-34745][SQL] Unify overflow exception error message of integral types

Posted by GitBox <gi...@apache.org>.
gengliangwang commented on a change in pull request #31840:
URL: https://github.com/apache/spark/pull/31840#discussion_r594497283



##########
File path: sql/catalyst/src/main/scala/org/apache/spark/sql/types/numerics.scala
##########
@@ -26,7 +26,7 @@ import org.apache.spark.sql.types.Decimal.DecimalIsConflicted
 private[sql] object ByteExactNumeric extends ByteIsIntegral with Ordering.ByteOrdering {
   private def checkOverflow(res: Int, x: Byte, y: Byte, op: String): Unit = {
     if (res > Byte.MaxValue || res < Byte.MinValue) {
-      throw new ArithmeticException(s"$x $op $y caused overflow.")

Review comment:
    Yes, I thought about this. 
    On the other hand, we could align the error message of int/long to the error message of byte/short, which is more user-friendly. Then basically we would have to re-implement the "exact" methods in Spark.
   cc @cloud-fan 
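    A sketch of what re-implementing an "exact" method with operand details could look like; the method name is illustrative, not an existing Spark API. The overflow test is the same one `java.lang.Math.addExact` uses:
    ```java
    public class DetailedExact {
        // The sum overflowed iff both operands have a sign different from
        // the result's sign (same check as java.lang.Math.addExact).
        static int addExactWithInputs(int x, int y) {
            int r = x + y;
            if (((x ^ r) & (y ^ r)) < 0) {
                throw new ArithmeticException(x + " + " + y + " caused overflow.");
            }
            return r;
        }
    }
    ```
    The downside, as noted in the review thread, is that such re-implementations could drift from future JDK versions.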








[GitHub] [spark] github-actions[bot] commented on pull request #31840: [SPARK-34745][SQL] Unify overflow exception error message of integral types

Posted by GitBox <gi...@apache.org>.
github-actions[bot] commented on pull request #31840:
URL: https://github.com/apache/spark/pull/31840#issuecomment-868086097


   We're closing this PR because it hasn't been updated in a while. This isn't a judgement on the merit of the PR in any way. It's just a way of keeping the PR queue manageable.
   If you'd like to revive this PR, please reopen it and ask a committer to remove the Stale tag!




[GitHub] [spark] gengliangwang commented on a change in pull request #31840: [SPARK-34745][SQL] Unify overflow exception error message of integral types

Posted by GitBox <gi...@apache.org>.
gengliangwang commented on a change in pull request #31840:
URL: https://github.com/apache/spark/pull/31840#discussion_r594871661



##########
File path: sql/catalyst/src/main/scala/org/apache/spark/sql/errors/QueryExecutionErrors.scala
##########
@@ -293,15 +293,6 @@ object QueryExecutionErrors {
     new IllegalStateException("table stats must be specified.")
   }
 
-  def unaryMinusCauseOverflowError(originValue: Short): ArithmeticException = {
-    new ArithmeticException(s"- $originValue caused overflow.")
-  }
-
-  def binaryArithmeticCauseOverflowError(
-      eval1: Short, symbol: String, eval2: Short): ArithmeticException = {
-    new ArithmeticException(s"$eval1 $symbol $eval2 caused overflow.")
-  }

Review comment:
       > Why did you remove these methods from this file?
   
    This PR aligns the error message of byte/short to int/long's. After the change, `binaryArithmeticCauseOverflowError` is no longer used anywhere. 






[GitHub] [spark] maropu commented on a change in pull request #31840: [SPARK-34745][SQL] Unify overflow exception error message of integral types

Posted by GitBox <gi...@apache.org>.
maropu commented on a change in pull request #31840:
URL: https://github.com/apache/spark/pull/31840#discussion_r594791555



##########
File path: sql/catalyst/src/main/scala/org/apache/spark/sql/errors/QueryExecutionErrors.scala
##########
@@ -293,15 +293,6 @@ object QueryExecutionErrors {
     new IllegalStateException("table stats must be specified.")
   }
 
-  def unaryMinusCauseOverflowError(originValue: Short): ArithmeticException = {
-    new ArithmeticException(s"- $originValue caused overflow.")
-  }
-
-  def binaryArithmeticCauseOverflowError(
-      eval1: Short, symbol: String, eval2: Short): ArithmeticException = {
-    new ArithmeticException(s"$eval1 $symbol $eval2 caused overflow.")
-  }

Review comment:
       Why did you remove these methods from this file? How about merging these funcs into a unified one like this?
   ```
     def arithmeticCauseOverflowError(overflowCause: String): ArithmeticException = {
       new ArithmeticException(s"$overflowCause caused overflow.")
     }
   ```






[GitHub] [spark] cloud-fan commented on a change in pull request #31840: [SPARK-34745][SQL] Unify overflow exception error message of integral types

Posted by GitBox <gi...@apache.org>.
cloud-fan commented on a change in pull request #31840:
URL: https://github.com/apache/spark/pull/31840#discussion_r594885844



##########
File path: sql/catalyst/src/main/scala/org/apache/spark/sql/catalyst/expressions/arithmetic.scala
##########
@@ -61,7 +61,7 @@ case class UnaryMinus(
         s"""
            |$javaType $originValue = ($javaType)($eval);
            |if ($originValue == $javaBoxedType.MIN_VALUE) {
-           |  throw QueryExecutionErrors.unaryMinusCauseOverflowError($originValue);
+           |  throw new ArithmeticException("${dataType.simpleString} overflow");

Review comment:
       shall we put this error message in `QueryExecutionErrors` like `unaryMinusCauseOverflowError`?






[GitHub] [spark] github-actions[bot] closed pull request #31840: [SPARK-34745][SQL] Unify overflow exception error message of integral types

Posted by GitBox <gi...@apache.org>.
github-actions[bot] closed pull request #31840:
URL: https://github.com/apache/spark/pull/31840


   




[GitHub] [spark] cloud-fan commented on a change in pull request #31840: [SPARK-34745][SQL] Unify overflow exception error message of integral types

Posted by GitBox <gi...@apache.org>.
cloud-fan commented on a change in pull request #31840:
URL: https://github.com/apache/spark/pull/31840#discussion_r594509676



##########
File path: sql/catalyst/src/main/scala/org/apache/spark/sql/types/numerics.scala
##########
@@ -26,7 +26,7 @@ import org.apache.spark.sql.types.Decimal.DecimalIsConflicted
 private[sql] object ByteExactNumeric extends ByteIsIntegral with Ordering.ByteOrdering {
   private def checkOverflow(res: Int, x: Byte, y: Byte, op: String): Unit = {
     if (res > Byte.MaxValue || res < Byte.MinValue) {
-      throw new ArithmeticException(s"$x $op $y caused overflow.")

Review comment:
       It's a bit risky to implement the "exact" methods ourselves, as JDK may update them in future versions. I'd rather add a try-catch to change the error message.
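    A minimal sketch of this try-catch approach (the wrapper name is illustrative): keep delegating to `java.lang.Math`, and only rewrite the message when it throws, so no overflow logic is duplicated:
    ```java
    public class WrappedExact {
        static int multiplyExactWithContext(int x, int y) {
            try {
                return Math.multiplyExact(x, y);
            } catch (ArithmeticException e) {
                // Rethrow with the operands included, without re-implementing the check.
                throw new ArithmeticException(x + " * " + y + " caused integer overflow");
            }
        }
    }
    ```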



