Posted to dev@flink.apache.org by "Robert Metzger (Jira)" <ji...@apache.org> on 2020/06/09 15:16:00 UTC

[jira] [Created] (FLINK-18221) SQL client YAML validation incomplete

Robert Metzger created FLINK-18221:
--------------------------------------

             Summary: SQL client YAML validation incomplete
                 Key: FLINK-18221
                 URL: https://issues.apache.org/jira/browse/FLINK-18221
             Project: Flink
          Issue Type: Bug
          Components: Table SQL / Client
    Affects Versions: 1.12.0
            Reporter: Robert Metzger


When defining the following {{table}} in the {{sql-client-defaults.yaml}} file:

{code}
  - name: sink
    type: sink-table
    connector:
      type: filesystem
      path: "gs://robert-playground/sqlout.csv"
    format:
      type: csv
      fields:
        - name: stringfield
          type: VARCHAR
  - name: customer
    type: source-table
{code}

you get the following error message:
{code}
Exception in thread "main" org.apache.flink.table.client.SqlClientException: Unexpected exception. This is a bug. Please consider filing an issue.
	at org.apache.flink.table.client.SqlClient.main(SqlClient.java:213)
Caused by: org.apache.flink.table.client.gateway.SqlExecutionException: Could not create execution context.
	at org.apache.flink.table.client.gateway.local.ExecutionContext$Builder.build(ExecutionContext.java:818)
	at org.apache.flink.table.client.gateway.local.LocalExecutor.openSession(LocalExecutor.java:230)
	at org.apache.flink.table.client.SqlClient.start(SqlClient.java:108)
	at org.apache.flink.table.client.SqlClient.main(SqlClient.java:201)
Caused by: org.apache.flink.table.api.TableException: Property with key 'schema' could not be found. This is a bug because the validation logic should have checked that before.
	at org.apache.flink.table.descriptors.DescriptorProperties.lambda$exceptionSupplier$47(DescriptorProperties.java:1569)
	at java.util.Optional.orElseThrow(Optional.java:290)
	at org.apache.flink.table.descriptors.DescriptorProperties.getTableSchema(DescriptorProperties.java:695)
	at org.apache.flink.table.catalog.CatalogTableImpl.fromProperties(CatalogTableImpl.java:97)
	at org.apache.flink.table.client.gateway.local.ExecutionContext.createTableSink(ExecutionContext.java:397)
	at org.apache.flink.table.client.gateway.local.ExecutionContext.lambda$initializeCatalogs$8(ExecutionContext.java:580)
	at java.util.LinkedHashMap.forEach(LinkedHashMap.java:684)
	at org.apache.flink.table.client.gateway.local.ExecutionContext.initializeCatalogs(ExecutionContext.java:575)
	at org.apache.flink.table.client.gateway.local.ExecutionContext.initializeTableEnvironment(ExecutionContext.java:512)
	at org.apache.flink.table.client.gateway.local.ExecutionContext.<init>(ExecutionContext.java:171)
	at org.apache.flink.table.client.gateway.local.ExecutionContext.<init>(ExecutionContext.java:124)
	at org.apache.flink.table.client.gateway.local.ExecutionContext$Builder.build(ExecutionContext.java:807)
	... 3 more
{code}

As a user, I would expect a friendlier error message that explains what is missing and where. It is also confusing that the exception itself says "This is a bug because the validation logic should have checked that before."
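For reference, the exception points at a missing {{schema}} section in the table definition. A definition that presumably passes validation would look roughly like the sketch below; this is only an illustration based on the documented SQL Client YAML table format, and the {{update-mode}} and {{data-type}} values are assumptions, not part of the original report:

{code}
  - name: sink
    type: sink-table
    update-mode: append              # assumption: append-only filesystem sink
    connector:
      type: filesystem
      path: "gs://robert-playground/sqlout.csv"
    format:
      type: csv
      fields:
        - name: stringfield
          type: VARCHAR
    schema:                          # the section the exception complains about
      - name: stringfield
        data-type: STRING            # assumed type; pick whatever matches the format field
{code}

Either way, the YAML validation should flag the missing {{schema}} key up front with a clear message, instead of failing deep inside {{CatalogTableImpl.fromProperties}} as shown in the stack trace above.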



--
This message was sent by Atlassian Jira
(v8.3.4#803005)