Posted to issues@flink.apache.org by "Rui Li (Jira)" <ji...@apache.org> on 2020/08/25 08:31:00 UTC

[jira] [Commented] (FLINK-19025) table sql write orc file but hive2.1.1 can not read

    [ https://issues.apache.org/jira/browse/FLINK-19025?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=17183862#comment-17183862 ] 

Rui Li commented on FLINK-19025:
--------------------------------

Hey [~McClone], could you provide instructions on how to reproduce the issue?
Please also note that we never guarantee that data written by the filesystem connector can be consumed by Hive. It's recommended to write the data through the Hive connector so that it stays compatible with Hive.
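
As a minimal sketch of that approach in Flink SQL (the catalog name, conf dir, and table names below are placeholders, assuming the target ORC table already exists in the Hive metastore):

    -- register a HiveCatalog backed by the Hive metastore
    CREATE CATALOG myhive WITH (
      'type' = 'hive',
      'hive-conf-dir' = '/opt/hive-conf'
    );
    USE CATALOG myhive;

    -- writing through the Hive connector produces ORC files
    -- that Hive itself can read back
    INSERT INTO orc_table SELECT id, name FROM source_table;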

> table sql write orc file but hive2.1.1 can not read
> ---------------------------------------------------
>
>                 Key: FLINK-19025
>                 URL: https://issues.apache.org/jira/browse/FLINK-19025
>             Project: Flink
>          Issue Type: Bug
>          Components: Connectors / ORC
>    Affects Versions: 1.11.0
>            Reporter: McClone
>            Priority: Blocker
>
> Table SQL writes an ORC file, but an external table created in Hive 2.1.1 cannot read the data, because Flink uses orc-core-1.5.6.jar while Hive 2.1.1 uses its own bundled ORC jar.
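
A minimal sketch of the setup being described (schema, paths, and table names are placeholders, not taken from the report):

    -- Flink side: filesystem connector writing ORC via orc-core 1.5.6
    CREATE TABLE fs_orc_sink (
      id BIGINT,
      name STRING
    ) WITH (
      'connector' = 'filesystem',
      'path' = 'hdfs:///tmp/orc_out',
      'format' = 'orc'
    );
    INSERT INTO fs_orc_sink SELECT id, name FROM source_table;

    -- Hive 2.1.1 side: external table over the same path,
    -- read with Hive's own (older) ORC reader
    CREATE EXTERNAL TABLE orc_ext (id BIGINT, name STRING)
    STORED AS ORC
    LOCATION 'hdfs:///tmp/orc_out';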



--
This message was sent by Atlassian Jira
(v8.3.4#803005)