Posted to commits@spark.apache.org by ge...@apache.org on 2021/04/01 06:50:42 UTC

[spark] branch master updated: [SPARK-34881][SQL][FOLLOW-UP] Use multiline string for TryCast's expression description

This is an automated email from the ASF dual-hosted git repository.

gengliang pushed a commit to branch master
in repository https://gitbox.apache.org/repos/asf/spark.git


The following commit(s) were added to refs/heads/master by this push:
     new 8a2138d  [SPARK-34881][SQL][FOLLOW-UP] Use multiline string for TryCast's expression description
8a2138d is described below

commit 8a2138d09f489512e229c6a9e9860d7bf9ac6445
Author: Hyukjin Kwon <gu...@apache.org>
AuthorDate: Thu Apr 1 14:50:05 2021 +0800

    [SPARK-34881][SQL][FOLLOW-UP] Use multiline string for TryCast's expression description
    
    ### What changes were proposed in this pull request?
    
    This PR fixes a JDK 11 compilation failure:
    
    ```
    /home/runner/work/spark/spark/sql/catalyst/src/main/scala/org/apache/spark/sql/catalyst/expressions/TryCast.scala:35: error: annotation argument needs to be a constant; found: "_FUNC_(expr AS type) - Casts the value `expr` to the target data type `type`. ".+("This expression is identical to CAST with configuration `spark.sql.ansi.enabled` as ").+("true, except it returns NULL instead of raising an error. Note that the behavior of this ").+("expression doesn\'t depend on configuration  [...]
        "true, except it returns NULL instead of raising an error. Note that the behavior of this " +
    ```
    
    For whatever reason, the compiler does not recognize that the concatenated string is a constant. This PR simply switches the description to a multi-line string (which is arguably the more correct style anyway).
    
    Reference:
    
    https://github.com/apache/spark/blob/bd0990e3e813d17065c593fc74f383b494fe8146/sql/catalyst/src/main/scala/org/apache/spark/sql/catalyst/expressions/aggregate/ApproximatePercentile.scala#L53-L57
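    
    For illustration only, here is a minimal standalone sketch of the two string styles. It uses plain `val`s and a placeholder object name rather than the real Java-defined `ExpressionDescription` annotation (which is what actually imposes the constant-argument requirement), and the usage text is abbreviated:
    
    ```scala
    object UsageStringStyles {
      // `+`-concatenation of literals: an expression the compiler must
      // constant-fold before it can serve as a Java annotation argument;
      // the JDK 11 build in this PR rejected it in that position.
      final val concatenated: String =
        "_FUNC_(expr AS type) - Casts the value `expr` to the target data type `type`. " +
          "Returns NULL instead of raising an error."
    
      // Triple-quoted multiline string: a single literal, so it is trivially
      // a constant and sidesteps the folding issue.
      final val multiline: String = """
        _FUNC_(expr AS type) - Casts the value `expr` to the target data type `type`.
          Returns NULL instead of raising an error.
      """
    
      def main(args: Array[String]): Unit = {
        println(concatenated)
        println(multiline)
      }
    }
    ```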
    
    ### Why are the changes needed?
    
    To recover the build.
    
    ### Does this PR introduce _any_ user-facing change?
    
    No, dev-only.
    
    ### How was this patch tested?
    
    CI in this PR.
    
    Closes #32019 from HyukjinKwon/SPARK-34881.
    
    Lead-authored-by: Hyukjin Kwon <gu...@apache.org>
    Co-authored-by: HyukjinKwon <gu...@apache.org>
    Signed-off-by: Gengliang Wang <lt...@gmail.com>
---
 .../org/apache/spark/sql/catalyst/expressions/TryCast.scala    | 10 ++++++----
 1 file changed, 6 insertions(+), 4 deletions(-)

diff --git a/sql/catalyst/src/main/scala/org/apache/spark/sql/catalyst/expressions/TryCast.scala b/sql/catalyst/src/main/scala/org/apache/spark/sql/catalyst/expressions/TryCast.scala
index aba76db..cae25a2 100644
--- a/sql/catalyst/src/main/scala/org/apache/spark/sql/catalyst/expressions/TryCast.scala
+++ b/sql/catalyst/src/main/scala/org/apache/spark/sql/catalyst/expressions/TryCast.scala
@@ -30,10 +30,12 @@ import org.apache.spark.sql.types.DataType
  * session local timezone by an analyzer [[ResolveTimeZone]].
  */
 @ExpressionDescription(
-  usage = "_FUNC_(expr AS type) - Casts the value `expr` to the target data type `type`. " +
-    "This expression is identical to CAST with configuration `spark.sql.ansi.enabled` as " +
-    "true, except it returns NULL instead of raising an error. Note that the behavior of this " +
-    "expression doesn't depend on configuration `spark.sql.ansi.enabled`.",
+  usage = """
+    _FUNC_(expr AS type) - Casts the value `expr` to the target data type `type`.
+      This expression is identical to CAST with configuration `spark.sql.ansi.enabled` as
+      true, except it returns NULL instead of raising an error. Note that the behavior of this
+      expression doesn't depend on configuration `spark.sql.ansi.enabled`.
+  """,
   examples = """
     Examples:
       > SELECT _FUNC_('10' as int);
