Posted to dev@flink.apache.org by "Jingsong Lee (Jira)" <ji...@apache.org> on 2020/02/13 08:35:00 UTC
[jira] [Created] (FLINK-16032) Depends on core classifier hive-exec in hive connector
Jingsong Lee created FLINK-16032:
------------------------------------
Summary: Depends on core classifier hive-exec in hive connector
Key: FLINK-16032
URL: https://issues.apache.org/jira/browse/FLINK-16032
Project: Flink
Issue Type: Bug
Components: Connectors / Hive
Reporter: Jingsong Lee
Fix For: 1.11.0
Currently we depend on the non-core classifier of hive-exec, which is an uber jar containing a large number of classes.
This makes parquet vectorization support very hard to build, because we rely on many deep Parquet APIs, and the bundled classes make it difficult to stay compatible with multiple Parquet versions.
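The proposed fix would switch the connector's dependency to the thin hive-exec artifact via the "core" classifier, so Flink can supply its own Parquet version. A minimal Maven sketch (the version property and exclusion list here are illustrative, not taken from the actual Flink POM):

```xml
<!-- Sketch: depend on the thin hive-exec jar via the "core" classifier,
     instead of the default uber jar that shades Parquet and other libraries. -->
<dependency>
  <groupId>org.apache.hive</groupId>
  <artifactId>hive-exec</artifactId>
  <version>${hive.version}</version>
  <classifier>core</classifier>
  <exclusions>
    <!-- Illustrative: exclude transitive deps that Flink provides itself -->
    <exclusion>
      <groupId>org.apache.parquet</groupId>
      <artifactId>parquet-hadoop-bundle</artifactId>
    </exclusion>
  </exclusions>
</dependency>
```

With the `core` classifier, Parquet is pulled in as a normal transitive dependency (or supplied by Flink) rather than being shaded into hive-exec, so a single Parquet version can be pinned for vectorized reads.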
--
This message was sent by Atlassian Jira
(v8.3.4#803005)