Posted to issues@spark.apache.org by "Takeshi Yamamuro (Jira)" <ji...@apache.org> on 2019/10/30 06:54:00 UTC

[jira] [Created] (SPARK-29650) Discard a NULL constant in LIMIT

Takeshi Yamamuro created SPARK-29650:
----------------------------------------

             Summary: Discard a NULL constant in LIMIT
                 Key: SPARK-29650
                 URL: https://issues.apache.org/jira/browse/SPARK-29650
             Project: Spark
          Issue Type: Sub-task
          Components: SQL
    Affects Versions: 3.0.0
            Reporter: Takeshi Yamamuro


In PostgreSQL, a NULL constant is accepted in LIMIT and is simply ignored (treated as if no LIMIT clause were given).

In Spark, however, the same query throws the exception below:
{code:java}
select * from int8_tbl limit (case when random() < 0.5 then bigint(null) end);

org.apache.spark.sql.AnalysisException
The limit expression must evaluate to a constant value, but got CASE WHEN (`_nondeterministic` < CAST(0.5BD AS DOUBLE)) THEN CAST(NULL AS BIGINT) END; {code}
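For reference, the PostgreSQL semantics this issue asks for (a NULL limit behaves exactly like omitting the LIMIT clause) can be sketched in plain Python. The helper name {{apply_limit}} is hypothetical, purely illustrative, and not part of Spark or PostgreSQL:

```python
def apply_limit(rows, limit):
    """Mimic PostgreSQL LIMIT semantics: a NULL (None) limit is
    discarded, returning all rows, while a non-negative integer
    limit truncates the result."""
    if limit is None:  # NULL constant: behave as if no LIMIT was given
        return list(rows)
    return list(rows)[:limit]

# Illustrative stand-in for int8_tbl
int8_tbl = [101, 102, 103, 104, 105]

print(apply_limit(int8_tbl, None))  # NULL limit -> all five rows
print(apply_limit(int8_tbl, 2))     # ordinary limit -> first two rows
```

Under these semantics, a limit expression that folds to NULL at analysis time could simply be dropped instead of triggering the AnalysisException above.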



--
This message was sent by Atlassian Jira
(v8.3.4#803005)

---------------------------------------------------------------------
To unsubscribe, e-mail: issues-unsubscribe@spark.apache.org
For additional commands, e-mail: issues-help@spark.apache.org