Posted to commits@spark.apache.org by gu...@apache.org on 2020/02/24 07:50:02 UTC
[spark] branch branch-3.0 updated: [SPARK-30898][SQL] The behavior
of MakeDecimal should not depend on SQLConf.get
This is an automated email from the ASF dual-hosted git repository.
gurwls223 pushed a commit to branch branch-3.0
in repository https://gitbox.apache.org/repos/asf/spark.git
The following commit(s) were added to refs/heads/branch-3.0 by this push:
new cfbcc58 [SPARK-30898][SQL] The behavior of MakeDecimal should not depend on SQLConf.get
cfbcc58 is described below
commit cfbcc58639d9ba1e5eb96642bc8d335dfa415dfb
Author: Peter Toth <pe...@gmail.com>
AuthorDate: Mon Feb 24 16:48:48 2020 +0900
[SPARK-30898][SQL] The behavior of MakeDecimal should not depend on SQLConf.get
### What changes were proposed in this pull request?
This PR adds a new `nullOnOverflow` parameter to `MakeDecimal` so that its value no longer depends on `SQLConf.get` and cannot change during planning.
### Why are the changes needed?
This avoids the issue of the configuration changing between different phases of planning, which can silently alter a query plan and lead to crashes or data corruption.
### Does this PR introduce any user-facing change?
No.
### How was this patch tested?
Existing UTs.
Closes #27656 from peter-toth/SPARK-30898.
Authored-by: Peter Toth <pe...@gmail.com>
Signed-off-by: HyukjinKwon <gu...@apache.org>
(cherry picked from commit a372f76cbdfe1fb6b87f0ad7b1ced22604406f1a)
Signed-off-by: HyukjinKwon <gu...@apache.org>
---
.../sql/catalyst/expressions/decimalExpressions.scala | 16 ++++++++++++++--
1 file changed, 14 insertions(+), 2 deletions(-)
diff --git a/sql/catalyst/src/main/scala/org/apache/spark/sql/catalyst/expressions/decimalExpressions.scala b/sql/catalyst/src/main/scala/org/apache/spark/sql/catalyst/expressions/decimalExpressions.scala
index 7b2489e..9014ebf 100644
--- a/sql/catalyst/src/main/scala/org/apache/spark/sql/catalyst/expressions/decimalExpressions.scala
+++ b/sql/catalyst/src/main/scala/org/apache/spark/sql/catalyst/expressions/decimalExpressions.scala
@@ -45,9 +45,15 @@ case class UnscaledValue(child: Expression) extends UnaryExpression {
* Note: this expression is internal and created only by the optimizer,
* we don't need to do type check for it.
*/
-case class MakeDecimal(child: Expression, precision: Int, scale: Int) extends UnaryExpression {
+case class MakeDecimal(
+ child: Expression,
+ precision: Int,
+ scale: Int,
+ nullOnOverflow: Boolean) extends UnaryExpression {
- private val nullOnOverflow = !SQLConf.get.ansiEnabled
+ def this(child: Expression, precision: Int, scale: Int) = {
+ this(child, precision, scale, !SQLConf.get.ansiEnabled)
+ }
override def dataType: DataType = DecimalType(precision, scale)
override def nullable: Boolean = child.nullable || nullOnOverflow
@@ -83,6 +89,12 @@ case class MakeDecimal(child: Expression, precision: Int, scale: Int) extends Un
}
}
+object MakeDecimal {
+ def apply(child: Expression, precision: Int, scale: Int): MakeDecimal = {
+ new MakeDecimal(child, precision, scale)
+ }
+}
+
/**
* An expression used to wrap the children when promote the precision of DecimalType to avoid
* promote multiple times.
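The pattern in the diff above can be sketched in isolation: the config-dependent flag is read exactly once, when the expression is constructed, so later evaluation never re-reads the mutable session config. This is a minimal, hypothetical sketch, not Spark's actual classes; `Conf` stands in for `SQLConf`, and `MakeDecimalLike` for the real `MakeDecimal` expression:

```scala
// Stand-in for SQLConf: a mutable, session-scoped setting (assumption for illustration).
object Conf {
  var ansiEnabled: Boolean = false
}

// The flag is a plain constructor parameter, fixed at construction time.
case class MakeDecimalLike(
    unscaled: Long,
    precision: Int,
    scale: Int,
    nullOnOverflow: Boolean) {

  def eval(): Option[BigDecimal] = {
    val d = BigDecimal(unscaled) / BigDecimal(10).pow(scale)
    if (d.precision > precision) {
      // Behavior on overflow was decided when the expression was built,
      // not by whatever the config says at evaluation time.
      if (nullOnOverflow) None else throw new ArithmeticException("decimal overflow")
    } else {
      Some(d)
    }
  }
}

object MakeDecimalLike {
  // Mirrors the new companion apply in the commit: the config is
  // consulted exactly once, here, and never again.
  def apply(unscaled: Long, precision: Int, scale: Int): MakeDecimalLike =
    new MakeDecimalLike(unscaled, precision, scale, !Conf.ansiEnabled)
}
```

With this shape, an expression built while `Conf.ansiEnabled` is false keeps `nullOnOverflow = true` even if the config is flipped between optimization phases, which is exactly the stability the PR is after.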
---------------------------------------------------------------------
To unsubscribe, e-mail: commits-unsubscribe@spark.apache.org
For additional commands, e-mail: commits-help@spark.apache.org