Posted to issues@spark.apache.org by "Dongjoon Hyun (Jira)" <ji...@apache.org> on 2023/01/14 23:45:00 UTC
[jira] [Resolved] (SPARK-42012) Implement DataFrameReader.orc
[ https://issues.apache.org/jira/browse/SPARK-42012?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]
Dongjoon Hyun resolved SPARK-42012.
-----------------------------------
Fix Version/s: 3.4.0
Resolution: Fixed
Issue resolved by pull request 39567
[https://github.com/apache/spark/pull/39567]
> Implement DataFrameReader.orc
> -----------------------------
>
> Key: SPARK-42012
> URL: https://issues.apache.org/jira/browse/SPARK-42012
> Project: Spark
> Issue Type: Sub-task
> Components: Connect
> Affects Versions: 3.4.0
> Reporter: Hyukjin Kwon
> Assignee: Sandeep Singh
> Priority: Major
> Fix For: 3.4.0
>
>
> {code}
> pyspark/sql/tests/test_datasources.py:114 (DataSourcesParityTests.test_read_multiple_orc_file)
> self = <pyspark.sql.tests.connect.test_parity_datasources.DataSourcesParityTests testMethod=test_read_multiple_orc_file>
>
>     def test_read_multiple_orc_file(self):
> >       df = self.spark.read.orc(
>             [
>                 "python/test_support/sql/orc_partitioned/b=0/c=0",
>                 "python/test_support/sql/orc_partitioned/b=1/c=1",
>             ]
>         )
>
> ../test_datasources.py:116:
> _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
>
> self = <pyspark.sql.connect.readwriter.DataFrameReader object at 0x7fb170946b50>
> args = (['python/test_support/sql/orc_partitioned/b=0/c=0', 'python/test_support/sql/orc_partitioned/b=1/c=1'],)
> kwargs = {}
>
>     def orc(self, *args: Any, **kwargs: Any) -> None:
> >       raise NotImplementedError("orc() is not implemented.")
> E       NotImplementedError: orc() is not implemented.
>
> ../../connect/readwriter.py:228: NotImplementedError
> {code}
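Editor's note: the failing test above calls orc() with a list of paths, so the Spark Connect implementation needs to accept either a single path or a sequence of paths, matching the classic spark.read.orc() signature. A minimal plain-Python sketch of that path-normalizing pattern (class and attribute names here are hypothetical, not the real pyspark.sql.connect.readwriter code):

```python
from typing import List, Union


class DataFrameReader:
    """Toy reader illustrating the multi-path orc() signature.

    Purely illustrative -- not the actual Spark Connect implementation.
    """

    def __init__(self) -> None:
        self._format: str = ""
        self._paths: List[str] = []

    def orc(self, path: Union[str, List[str]]) -> "DataFrameReader":
        # Accept either a single path string or a list of paths,
        # as spark.read.orc() does in classic PySpark.
        self._format = "orc"
        self._paths = [path] if isinstance(path, str) else list(path)
        return self


reader = DataFrameReader().orc(
    [
        "python/test_support/sql/orc_partitioned/b=0/c=0",
        "python/test_support/sql/orc_partitioned/b=1/c=1",
    ]
)
print(reader._format, len(reader._paths))  # orc 2
```

Returning self keeps the call chainable, which is the convention the real reader API follows.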
--
This message was sent by Atlassian Jira
(v8.20.10#820010)
---------------------------------------------------------------------
To unsubscribe, e-mail: issues-unsubscribe@spark.apache.org
For additional commands, e-mail: issues-help@spark.apache.org