Posted to issues@spark.apache.org by "Bruce Robbins (Jira)" <ji...@apache.org> on 2022/08/19 21:32:00 UTC
[jira] [Commented] (SPARK-40152) Codegen compilation error when using split_part
[ https://issues.apache.org/jira/browse/SPARK-40152?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=17582045#comment-17582045 ]
Bruce Robbins commented on SPARK-40152:
---------------------------------------
This appears to be a simple case of missing semicolons in the generated Java code; the fix should be straightforward.
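For illustration only, here is a sketch of the shape of the problem (hypothetical variable names modeled on the error message, not the actual contents of generated.java). When a generated statement lacks its terminating semicolon, Janino parses the assignment and then expects the next token to be a type, which yields exactly the reported "Expression ... is not a type" error. The corrected form simply terminates each generated statement:

```java
public class CodegenSketch {
    public static void main(String[] args) {
        // Broken shape (hypothetical codegen output, missing semicolon):
        //   boolean project_isNull_0 = false
        //   UTF8String project_value_0 = ...
        // Janino then reports:
        //   Expression "project_isNull_0 = false" is not a type
        //
        // Fixed shape: every generated statement ends with a semicolon.
        boolean project_isNull_0 = false;
        // Mirrors the repro query: third part of '11.12.13' split on '.'
        String project_value_0 = "11.12.13".split("\\.")[2];
        System.out.println(project_isNull_0 + " " + project_value_0);
    }
}
```

The interpreted fallback mentioned below succeeds because it never goes through Janino, which is why the query still returns a result despite the codegen failure.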
> Codegen compilation error when using split_part
> -----------------------------------------------
>
> Key: SPARK-40152
> URL: https://issues.apache.org/jira/browse/SPARK-40152
> Project: Spark
> Issue Type: Bug
> Components: SQL
> Affects Versions: 3.3.0
> Reporter: Bruce Robbins
> Priority: Major
>
> The following query throws an error:
> {noformat}
> create or replace temp view v1 as
> select * from values
> ('11.12.13', '.', 3)
> as v1(col1, col2, col3);
> cache table v1;
> SELECT split_part(col1, col2, col3)
> from v1;
> {noformat}
> The error is:
> {noformat}
> 22/08/19 14:25:14 ERROR CodeGenerator: failed to compile: org.codehaus.commons.compiler.CompileException: File 'generated.java', Line 42, Column 1: Expression "project_isNull_0 = false" is not a type
> org.codehaus.commons.compiler.CompileException: File 'generated.java', Line 42, Column 1: Expression "project_isNull_0 = false" is not a type
> at org.codehaus.janino.Java$Atom.toTypeOrCompileException(Java.java:3934)
> at org.codehaus.janino.Parser.parseBlockStatement(Parser.java:1887)
> at org.codehaus.janino.Parser.parseBlockStatements(Parser.java:1811)
> at org.codehaus.janino.Parser.parseBlock(Parser.java:1792)
> at
> {noformat}
> In the end, {{split_part}} does successfully execute, although in interpreted mode.
--