Posted to issues@spark.apache.org by "Dongjoon Hyun (Jira)" <ji...@apache.org> on 2020/03/02 19:27:00 UTC
[jira] [Resolved] (SPARK-30993) GenerateUnsafeRowJoiner corrupts
the value if the data type is a UDT and its SQL type has fixed length
[ https://issues.apache.org/jira/browse/SPARK-30993?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]
Dongjoon Hyun resolved SPARK-30993.
-----------------------------------
Fix Version/s: 3.0.0
Assignee: Jungtaek Lim
Resolution: Fixed
This is resolved via https://github.com/apache/spark/pull/27747 in `branch-3.0`.
And, https://github.com/apache/spark/pull/27761 is under review.
> GenerateUnsafeRowJoiner corrupts the value if the data type is a UDT and its SQL type has fixed length
> ---------------------------------------------------------------------------------------------------
>
> Key: SPARK-30993
> URL: https://issues.apache.org/jira/browse/SPARK-30993
> Project: Spark
> Issue Type: Bug
> Components: SQL
> Affects Versions: 2.3.4, 2.4.5, 3.0.0
> Reporter: Jungtaek Lim
> Assignee: Jungtaek Lim
> Priority: Major
> Fix For: 3.0.0
>
>
> This was reported on the user mailing list, though the mail thread suspected a behavior issue in mapGroupsWithState.
> [https://lists.apache.org/thread.html/r08b44a7afac4e4c971633d30b4e5d11bd7c0d6e28180e03b874ea58b%40%3Cuser.spark.apache.org%3E]
> The actual culprit is that a couple of methods don't handle UDT, which makes GenerateUnsafeRowJoiner generate incorrect code. Specifically, the issue occurs when the SQL type of the UDT has fixed length: GenerateUnsafeRowJoiner updates the offset position for all variable-length data, and due to this bug a fixed-length UDT field is treated as variable-length data and its value is modified.
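To make the corruption mechanism concrete, here is a minimal sketch in plain Scala (no Spark dependency; the object and method names are illustrative, not Spark's actual code). In an UnsafeRow, a variable-length field stores `(offset << 32) | size` in its 8-byte slot, so when two rows are joined the joiner must add a shift to the offset half of the slot. A fixed-length field stores the value itself inline; if it is misclassified as variable-length, the same shift is added to the value's upper bits and the value silently changes:

```scala
object OffsetShiftDemo {
  // Shift the offset half of a variable-length slot, as the joiner does
  // when the joined row's variable-length region moves by `shiftBytes`.
  def shiftOffset(slot: Long, shiftBytes: Long): Long =
    slot + (shiftBytes << 32)

  def main(args: Array[String]): Unit = {
    // Correct use: a variable-length slot with offset = 16, size = 8.
    val varSlot = (16L << 32) | 8L
    val shifted = shiftOffset(varSlot, 24)
    // Offset becomes 40, size stays 8.
    assert((shifted >>> 32) == 40 && (shifted & 0xFFFFFFFFL) == 8)

    // The bug: a fixed-length value stored inline (e.g. a UDT whose
    // SQL type is LongType) receives the same fix-up and is corrupted.
    val fixedValue = 42L
    val corrupted = shiftOffset(fixedValue, 24)
    assert(corrupted != fixedValue) // 42 + (24 << 32): value destroyed
  }
}
```

The fix therefore has to unwrap the UDT to its underlying SQL type before deciding whether a field is fixed- or variable-length, so fixed-length UDT fields are left untouched by the offset fix-up.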
--
This message was sent by Atlassian Jira
(v8.3.4#803005)
---------------------------------------------------------------------
To unsubscribe, e-mail: issues-unsubscribe@spark.apache.org
For additional commands, e-mail: issues-help@spark.apache.org