Posted to issues@spark.apache.org by "Robert Kruszewski (JIRA)" <ji...@apache.org> on 2017/05/09 19:07:04 UTC
[jira] [Commented] (SPARK-20682) Support a new faster ORC data source based on Apache ORC
[ https://issues.apache.org/jira/browse/SPARK-20682?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16003321#comment-16003321 ]
Robert Kruszewski commented on SPARK-20682:
-------------------------------------------
Would it make sense to bring hive-storage-api into sql/core and allow people to use ORC without -Phive? Hive pulls in a ton of cruft, such as hive-exec, that technically isn't necessary for ORC. Looking at hive-storage-api, it's mostly bean classes that define types and filters.
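To make the suggestion concrete, here is a hedged sketch of what a pared-down ORC dependency set in sql/core might look like. The coordinates and versions below are illustrative assumptions, not taken from Spark's actual POM (Apache ORC does publish a "nohive" classifier for exactly this kind of use, but whether Spark would use it is an open question in this thread):

```xml
<!-- Hypothetical sql/core additions: ORC support without hive-exec.
     GroupIds/versions are illustrative, not verified against Spark's build. -->
<dependency>
  <groupId>org.apache.orc</groupId>
  <artifactId>orc-core</artifactId>
  <version>1.4.0</version>
  <classifier>nohive</classifier>
</dependency>
<dependency>
  <groupId>org.apache.hive</groupId>
  <artifactId>hive-storage-api</artifactId>
  <version>2.4.0</version>
</dependency>
```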
> Support a new faster ORC data source based on Apache ORC
> --------------------------------------------------------
>
> Key: SPARK-20682
> URL: https://issues.apache.org/jira/browse/SPARK-20682
> Project: Spark
> Issue Type: Bug
> Components: SQL
> Affects Versions: 1.4.1, 1.5.2, 1.6.3, 2.1.1
> Reporter: Dongjoon Hyun
>
> Since SPARK-2883, Apache Spark has supported Apache ORC inside the `sql/hive` module, with a Hive dependency. This issue aims to add a new and faster ORC data source inside `sql/core` and eventually to replace the old ORC data source. In this issue, the latest Apache ORC 1.4.0 (released yesterday) is used.
> There are four key benefits.
> - Speed: Uses Spark `ColumnarBatch` and ORC `RowBatch` together, which is faster than the current implementation in Spark.
> - Stability: Apache ORC 1.4.0 has many fixes, and we can depend more on the ORC community.
> - Usability: Users can use ORC data sources without the Hive module, i.e., without `-Phive`.
> - Maintainability: Reduces the Hive dependency and allows old legacy code to be removed later.
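From a user's point of view, the data source described above would presumably be reached through the standard DataFrame reader/writer API, the same way the existing Hive-based ORC source is. A minimal sketch (the `spark` session is assumed to exist, as in `spark-shell`, and the paths are illustrative):

```scala
// Hypothetical usage sketch: reading and writing ORC through the
// standard DataFrame API. Paths are placeholders, not real data.
val df = spark.read.format("orc").load("/data/events.orc")
df.printSchema()

// Write back out in ORC format.
df.write.format("orc").save("/data/events_copy.orc")
```

The point of the proposal is that this API stays unchanged while the implementation behind `format("orc")` moves from `sql/hive` to `sql/core`.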
--
This message was sent by Atlassian JIRA
(v6.3.15#6346)
---------------------------------------------------------------------
To unsubscribe, e-mail: issues-unsubscribe@spark.apache.org
For additional commands, e-mail: issues-help@spark.apache.org