Posted to issues@flink.apache.org by "Dian Fu (Jira)" <ji...@apache.org> on 2022/03/30 07:16:00 UTC

[jira] [Closed] (FLINK-26919) Table API program fails to compile with "Cannot resolve field [a], input field list:[EXPR$0]"

     [ https://issues.apache.org/jira/browse/FLINK-26919?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]

Dian Fu closed FLINK-26919.
---------------------------
    Resolution: Invalid

> Table API program fails to compile with "Cannot resolve field [a], input field list:[EXPR$0]"
> ---------------------------------------------------------------------------------------------
>
>                 Key: FLINK-26919
>                 URL: https://issues.apache.org/jira/browse/FLINK-26919
>             Project: Flink
>          Issue Type: Bug
>          Components: API / Python, Table SQL / Planner
>    Affects Versions: 1.14.0
>            Reporter: Dian Fu
>            Priority: Major
>
> For the following job:
> {code}
> import datetime as dt
> from pyflink.table import *
> from pyflink.table.window import Tumble
> from pyflink.table.expressions import col, lit, UNBOUNDED_RANGE, CURRENT_RANGE
> def make_table():
>     env_settings = EnvironmentSettings.in_batch_mode()
>     table_env = TableEnvironment.create(env_settings)
>     source_data_path = '///path/to/source/directory/test.csv'
>     sink_data_path = '///path/to/sink/directory/'
>     source_ddl = f"""
>         create table Orders(
>             a VARCHAR,
>             b BIGINT,
>             c BIGINT,
>             rowtime TIMESTAMP(3),
>             WATERMARK FOR rowtime AS rowtime - INTERVAL '1' SECOND
>         ) with (
>             'connector' = 'filesystem',
>             'format' = 'csv',
>             'path' = '{source_data_path}'
>         )
>             """
>     table_env.execute_sql(source_ddl)
>     sink_ddl = f"""
>     create table `Result`(
>         a VARCHAR,
>         cnt BIGINT
>     ) with (
>         'connector' = 'filesystem',
>         'format' = 'csv',
>         'path' = '{sink_data_path}'
>     )
>     """
>     table_env.execute_sql(sink_ddl)
>     orders = table_env.from_path('Orders')
>     orders.order_by(orders.a).select(orders.a, orders.b.count).execute_insert("Result").wait()
> if __name__ == '__main__':
>     make_table()
> {code}
> It fails to compile with the following exception:
> {code}
> org.apache.flink.table.api.ValidationException: Cannot resolve field [a], input field list:[EXPR$0].
> 	at org.apache.flink.table.expressions.resolver.rules.ReferenceResolverRule$ExpressionResolverVisitor.failForField(ReferenceResolverRule.java:93)
> 	at org.apache.flink.table.expressions.resolver.rules.ReferenceResolverRule$ExpressionResolverVisitor.lambda$null$3(ReferenceResolverRule.java:87)
> 	at java.util.Optional.orElseThrow(Optional.java:290)
> 	at org.apache.flink.table.expressions.resolver.rules.ReferenceResolverRule$ExpressionResolverVisitor.lambda$null$4(ReferenceResolverRule.java:85)
> 	at java.util.Optional.orElseGet(Optional.java:267)
> 	at org.apache.flink.table.expressions.resolver.rules.ReferenceResolverRule$ExpressionResolverVisitor.lambda$visit$5(ReferenceResolverRule.java:79)
> 	at java.util.Optional.orElseGet(Optional.java:267)
> 	at org.apache.flink.table.expressions.resolver.rules.ReferenceResolverRule$ExpressionResolverVisitor.visit(ReferenceResolverRule.java:73)
> 	at org.apache.flink.table.expressions.resolver.rules.ReferenceResolverRule$ExpressionResolverVisitor.visit(ReferenceResolverRule.java:51)
> 	at org.apache.flink.table.expressions.ApiExpressionVisitor.visit(ApiExpressionVisitor.java:29)
> 	at org.apache.flink.table.expressions.UnresolvedReferenceExpression.accept(UnresolvedReferenceExpression.java:59)
> 	at org.apache.flink.table.expressions.resolver.rules.ReferenceResolverRule.lambda$apply$0(ReferenceResolverRule.java:47)
> 	at java.util.stream.ReferencePipeline$3$1.accept(ReferencePipeline.java:193)
> 	at java.util.ArrayList$ArrayListSpliterator.forEachRemaining(ArrayList.java:1384)
> 	at java.util.stream.AbstractPipeline.copyInto(AbstractPipeline.java:482)
> 	at java.util.stream.AbstractPipeline.wrapAndCopyInto(AbstractPipeline.java:472)
> 	at java.util.stream.ReduceOps$ReduceOp.evaluateSequential(ReduceOps.java:708)
> 	at java.util.stream.AbstractPipeline.evaluate(AbstractPipeline.java:234)
> 	at java.util.stream.ReferencePipeline.collect(ReferencePipeline.java:499)
> 	at org.apache.flink.table.expressions.resolver.rules.ReferenceResolverRule.apply(ReferenceResolverRule.java:48)
> 	at org.apache.flink.table.expressions.resolver.ExpressionResolver.lambda$null$2(ExpressionResolver.java:241)
> 	at java.util.function.Function.lambda$andThen$1(Function.java:88)
> 	at java.util.function.Function.lambda$andThen$1(Function.java:88)
> 	at java.util.function.Function.lambda$andThen$1(Function.java:88)
> 	at java.util.function.Function.lambda$andThen$1(Function.java:88)
> 	at org.apache.flink.table.expressions.resolver.ExpressionResolver.resolve(ExpressionResolver.java:204)
> 	at org.apache.flink.table.operations.utils.OperationTreeBuilder.projectInternal(OperationTreeBuilder.java:194)
> 	at org.apache.flink.table.operations.utils.OperationTreeBuilder.project(OperationTreeBuilder.java:169)
> 	at org.apache.flink.table.api.internal.TableImpl.select(TableImpl.java:138)
> 	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
> 	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
> 	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
> 	at java.lang.reflect.Method.invoke(Method.java:498)
> 	at org.apache.flink.api.python.shaded.py4j.reflection.MethodInvoker.invoke(MethodInvoker.java:244)
> 	at org.apache.flink.api.python.shaded.py4j.reflection.ReflectionEngine.invoke(ReflectionEngine.java:357)
> 	at org.apache.flink.api.python.shaded.py4j.Gateway.invoke(Gateway.java:282)
> 	at org.apache.flink.api.python.shaded.py4j.commands.AbstractCommand.invokeMethod(AbstractCommand.java:132)
> 	at org.apache.flink.api.python.shaded.py4j.commands.CallCommand.execute(CallCommand.java:79)
> 	at org.apache.flink.api.python.shaded.py4j.GatewayConnection.run(GatewayConnection.java:238)
> 	at java.lang.Thread.run(Thread.java:748)
> {code}
> However, if we change orders.b.count to orders.b, it compiles successfully.
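>
> The "Invalid" resolution suggests the query itself is the problem rather than the planner: orders.b.count is an aggregate, and without a group_by the aggregation step leaves only a single unnamed column (EXPR$0) in scope, so the subsequent projection can no longer resolve a. Below is a minimal sketch of the presumably intended per-key count (assuming the goal is to count b per a; the cnt alias is hypothetical and simply matches the sink schema):
> {code}
> orders = table_env.from_path('Orders')
>
> # Hypothetical rewrite: group by `a` before aggregating, so `a` stays in the
> # output schema of the aggregation and the projection can resolve it.
> orders.group_by(orders.a) \
>     .select(orders.a, orders.b.count.alias('cnt')) \
>     .execute_insert("Result") \
>     .wait()
> {code}
> With an explicit group_by, the aggregate output contains both the grouping key a and the count, so the insert into Result should go through.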


