Posted to issues@spark.apache.org by "Max Gekk (Jira)" <ji...@apache.org> on 2022/10/12 07:03:00 UTC
[jira] [Updated] (SPARK-40768) Migrate type check failures of bloom_filter_agg() onto error classes
[ https://issues.apache.org/jira/browse/SPARK-40768?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]
Max Gekk updated SPARK-40768:
-----------------------------
Description:
Replace TypeCheckFailure with DataTypeMismatch in the type checks of bloom_filter_agg():
https://github.com/apache/spark/blob/1f4e4c812a9dc6d7e35631c1663c1ba6f6d9b721/sql/catalyst/src/main/scala/org/apache/spark/sql/catalyst/expressions/aggregate/BloomFilterAggregate.scala#L66-L76
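The migration follows the same pattern used across the sibling sub-tasks: instead of returning TypeCheckFailure with a free-form message string, checkInputDataTypes() returns a structured DataTypeMismatch carrying an error sub-class and message parameters. A minimal sketch of one such check (the exact sub-class name, parameter keys, and required types here are assumptions for illustration; see BloomFilterAggregate.scala at the link above for the real checks):

```scala
import org.apache.spark.sql.catalyst.analysis.TypeCheckResult
import org.apache.spark.sql.catalyst.analysis.TypeCheckResult.DataTypeMismatch
import org.apache.spark.sql.types.LongType

// Before (free-form message, not tied to an error class):
//   TypeCheckFailure("Input to function bloom_filter_agg should be a long value")

// After (sketch): a DataTypeMismatch with a sub-class and parameters that
// the error framework renders from error-classes.json.
override def checkInputDataTypes(): TypeCheckResult = {
  if (!child.dataType.sameType(LongType)) {
    DataTypeMismatch(
      errorSubClass = "UNEXPECTED_INPUT_TYPE",   // assumed sub-class for illustration
      messageParameters = Map(
        "paramIndex" -> "1",
        "requiredType" -> toSQLType(LongType),   // toSQLType/toSQLExpr from QueryErrorsBase
        "inputSql" -> toSQLExpr(child),
        "inputType" -> toSQLType(child.dataType)))
  } else {
    TypeCheckResult.TypeCheckSuccess
  }
}
```

The benefit is that the failure is addressable by a stable error class name and its parameters can be asserted in tests via checkError(), rather than matching on a raw message string.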
was:
Replace TypeCheckFailure with DataTypeMismatch in the type checks of the interval expressions:
1. Average
https://github.com/apache/spark/blob/47d119dfc1a06ee2d520396129b4f09bc22d3fb7/sql/catalyst/src/main/scala/org/apache/spark/sql/catalyst/util/TypeUtils.scala#L78
2. ApproxCountDistinctForIntervals (3):
https://github.com/apache/spark/blob/08123a3795683238352e5bf55452de381349fdd9/sql/catalyst/src/main/scala/org/apache/spark/sql/catalyst/expressions/aggregate/ApproxCountDistinctForIntervals.scala#L80-L91
> Migrate type check failures of bloom_filter_agg() onto error classes
> --------------------------------------------------------------------
>
> Key: SPARK-40768
> URL: https://issues.apache.org/jira/browse/SPARK-40768
> Project: Spark
> Issue Type: Sub-task
> Components: SQL
> Affects Versions: 3.4.0
> Reporter: Max Gekk
> Priority: Major
>
> Replace TypeCheckFailure with DataTypeMismatch in the type checks of bloom_filter_agg():
> https://github.com/apache/spark/blob/1f4e4c812a9dc6d7e35631c1663c1ba6f6d9b721/sql/catalyst/src/main/scala/org/apache/spark/sql/catalyst/expressions/aggregate/BloomFilterAggregate.scala#L66-L76
--
This message was sent by Atlassian Jira
(v8.20.10#820010)
---------------------------------------------------------------------
To unsubscribe, e-mail: issues-unsubscribe@spark.apache.org
For additional commands, e-mail: issues-help@spark.apache.org