Posted to issues@spark.apache.org by "Maxim Gekk (Jira)" <ji...@apache.org> on 2020/02/18 20:30:00 UTC

[jira] [Comment Edited] (SPARK-30858) IntegralDivide's dataType should not depend on SQLConf.get

    [ https://issues.apache.org/jira/browse/SPARK-30858?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=17039433#comment-17039433 ] 

Maxim Gekk edited comment on SPARK-30858 at 2/18/20 8:29 PM:
-------------------------------------------------------------

The *div* function is bound to this particular expression at [https://github.com/apache/spark/blob/919d551ddbf7575abe7fe47d4bbba62164d6d845/sql/catalyst/src/main/scala/org/apache/spark/sql/catalyst/analysis/FunctionRegistry.scala#L282]. I am not sure that we can replace it during analysis.


was (Author: maxgekk):
The *div* function binds on this particular expressions [https://github.com/apache/spark/blob/919d551ddbf7575abe7fe47d4bbba62164d6d845/sql/catalyst/src/main/scala/org/apache/spark/sql/catalyst/analysis/FunctionRegistry.scala#L282] . I am not sure that we can replace it during analysis.
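
A rough, self-contained Scala sketch of the registration pattern the comment refers to (all names below are illustrative stand-ins, not Spark's actual FunctionRegistry or catalyst API): the SQL name "div" is mapped straight to one concrete expression constructor when the registry is built, which is why substituting a different expression later, during analysis, is not straightforward.

{code:scala}
import scala.collection.mutable

// Illustrative stand-ins for catalyst expressions; not the real Spark classes.
trait ExpressionSketch
final case class IntegralDivideSketch(children: Seq[ExpressionSketch]) extends ExpressionSketch

// A toy function registry: each SQL function name is bound to exactly one
// expression builder at registration time.
object FunctionRegistrySketch {
  private val builders =
    mutable.Map.empty[String, Seq[ExpressionSketch] => ExpressionSketch]

  def register(name: String)(builder: Seq[ExpressionSketch] => ExpressionSketch): Unit =
    builders(name.toLowerCase) = builder

  def lookup(name: String, args: Seq[ExpressionSketch]): ExpressionSketch =
    builders(name.toLowerCase)(args)

  // Analogue of the binding at FunctionRegistry.scala#L282: "div" resolves
  // directly to the IntegralDivide expression, so the analyzer gets that
  // concrete class back and would have to rewrite it explicitly to produce
  // anything else.
  register("div")(args => IntegralDivideSketch(args))
}
{code}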

> IntegralDivide's dataType should not depend on SQLConf.get
> ----------------------------------------------------------
>
>                 Key: SPARK-30858
>                 URL: https://issues.apache.org/jira/browse/SPARK-30858
>             Project: Spark
>          Issue Type: Bug
>          Components: SQL
>    Affects Versions: 3.0.0
>            Reporter: Herman van Hövell
>            Priority: Blocker
>
> {{IntegralDivide}}'s dataType depends on the value of {{SQLConf.get.integralDivideReturnLong}}. This is a problem because the configuration can change between different phases of planning, and this can silently break a query plan, which can lead to crashes or data corruption.
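
To make the hazard described above concrete, here is a minimal, self-contained Scala sketch (illustrative names only, not Spark's SQLConf or catalyst classes) contrasting an expression whose dataType re-reads a mutable config on every call with one that captures the value once at construction time, i.e. during analysis:

{code:scala}
// Stand-in for SQLConf.get.integralDivideReturnLong: a setting that may change
// between planning phases (illustrative only, not the real SQLConf).
object ConfSketch {
  @volatile var integralDivideReturnLong: Boolean = true
}

sealed trait DataTypeSketch
case object LongTypeSketch extends DataTypeSketch
case object DecimalTypeSketch extends DataTypeSketch

// Problematic shape: dataType is recomputed on every access, so the same
// expression instance can report different types at analysis time and at
// execution time if the config flips in between.
final case class IntegralDivideUnstable() {
  def dataType: DataTypeSketch =
    if (ConfSketch.integralDivideReturnLong) LongTypeSketch else DecimalTypeSketch
}

// One possible shape of a fix: resolve the flag once, when the expression is
// constructed, and carry it as part of the expression itself.
final case class IntegralDivideStable(
    returnLong: Boolean = ConfSketch.integralDivideReturnLong) {
  def dataType: DataTypeSketch =
    if (returnLong) LongTypeSketch else DecimalTypeSketch
}

object Demo extends App {
  val unstable = IntegralDivideUnstable()
  val stable   = IntegralDivideStable()
  ConfSketch.integralDivideReturnLong = false   // config changes mid-planning
  println(unstable.dataType)  // DecimalTypeSketch -- type silently drifted
  println(stable.dataType)    // LongTypeSketch   -- captured at construction
}
{code}

Whether the actual fix captures the flag in the expression, pins the conf for the whole query, or takes some other route is up to the eventual patch; the sketch only shows why the lazily-read version is fragile.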



--
This message was sent by Atlassian Jira
(v8.3.4#803005)

---------------------------------------------------------------------
To unsubscribe, e-mail: issues-unsubscribe@spark.apache.org
For additional commands, e-mail: issues-help@spark.apache.org