Posted to issues@orc.apache.org by "Owen O'Malley (JIRA)" <ji...@apache.org> on 2016/04/22 18:39:13 UTC

[jira] [Resolved] (ORC-44) How to flush orc writer?

     [ https://issues.apache.org/jira/browse/ORC-44?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]

Owen O'Malley resolved ORC-44.
------------------------------
    Resolution: Information Provided
      Assignee: Owen O'Malley

> How to flush orc writer?
> ------------------------
>
>                 Key: ORC-44
>                 URL: https://issues.apache.org/jira/browse/ORC-44
>             Project: Orc
>          Issue Type: Bug
>         Environment: hadoop version: 2.5.0-cdh5.3.2
> hive version: 0.13.1
>            Reporter: Tao Li
>            Assignee: Owen O'Malley
>
> I am using the org.apache.hadoop.hive.ql.io.orc.Writer API to generate ORC files. I want to flush the in-memory data to HDFS. The close() method works for me, but it also closes the ORC file. Is there a method like flush() that I can use to flush the buffered data without closing the ORC file?
> {code:java}
> OrcFile.WriterOptions writerOptions = OrcFile.writerOptions(conf);
> writerOptions.inspector(deserializer.getObjectInspector());
> Writer writer = OrcFile.createWriter(new Path(file), writerOptions);
> {code}
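
For context, here is a minimal, self-contained sketch of the write path using the same Hive 0.13-era ORC API the reporter describes. The MyRow class, the reflection-based ObjectInspector, and the output path are illustrative assumptions (the original report obtained its inspector from a deserializer); in this version of the API, close() is the call that flushes buffered rows and finalizes the file.

{code:java}
import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.Path;
import org.apache.hadoop.hive.ql.io.orc.OrcFile;
import org.apache.hadoop.hive.ql.io.orc.Writer;
import org.apache.hadoop.hive.serde2.objectinspector.ObjectInspector;
import org.apache.hadoop.hive.serde2.objectinspector.ObjectInspectorFactory;

public class OrcWriteSketch {
  // Illustrative row type; the original report used a deserializer's
  // ObjectInspector instead of reflection.
  public static class MyRow {
    public int id;
    public String name;
    public MyRow(int id, String name) { this.id = id; this.name = name; }
  }

  public static void main(String[] args) throws Exception {
    Configuration conf = new Configuration();
    ObjectInspector inspector = ObjectInspectorFactory.getReflectionObjectInspector(
        MyRow.class, ObjectInspectorFactory.ObjectInspectorOptions.JAVA);

    OrcFile.WriterOptions writerOptions = OrcFile.writerOptions(conf)
        .inspector(inspector);
    Writer writer = OrcFile.createWriter(new Path("/tmp/example.orc"), writerOptions);

    // Rows are buffered in memory and written out stripe by stripe.
    for (int i = 0; i < 10000; i++) {
      writer.addRow(new MyRow(i, "row-" + i));
    }

    // close() flushes any buffered rows, writes the file footer, and closes
    // the underlying HDFS stream; this API exposes no separate flush().
    writer.close();
  }
}
{code}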


