Posted to issues@spark.apache.org by "Apache Spark (Jira)" <ji...@apache.org> on 2021/09/07 00:18:00 UTC

[jira] [Commented] (SPARK-35803) Spark SQL does not support creating views using DataSource v2 based data sources

    [ https://issues.apache.org/jira/browse/SPARK-35803?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=17410844#comment-17410844 ] 

Apache Spark commented on SPARK-35803:
--------------------------------------

User 'planga82' has created a pull request for this issue:
https://github.com/apache/spark/pull/33922

> Spark SQL does not support creating views using DataSource v2 based data sources
> --------------------------------------------------------------------------------
>
>                 Key: SPARK-35803
>                 URL: https://issues.apache.org/jira/browse/SPARK-35803
>             Project: Spark
>          Issue Type: New Feature
>          Components: SQL
>    Affects Versions: 2.4.8, 3.1.2
>            Reporter: David Rabinowitz
>            Priority: Major
>
> When a temporary view is created in Spark SQL using an external data source, Spark tries to create the relevant relation via the DataSource.resolveRelation() method. Unlike DataFrameReader.load(), resolveRelation() does not check whether the provided data source implements the DataSourceV2 interface; instead it tries to use the RelationProvider trait to generate the relation.
> Furthermore, DataSourceV2Relation is not a subclass of BaseRelation, so it cannot be returned from resolveRelation().
> Lastly, I tried implementing the RelationProvider trait in my Java implementation of DataSourceV2, but the match inside resolveRelation() did not detect it as a RelationProvider.
>  
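The mismatch described in the issue can be sketched with local stand-ins for the provider interfaces. This is a minimal, self-contained model of the dispatch shape inside DataSource.resolveRelation(); the interface names here mirror Spark's traits but are hypothetical local definitions, not Spark's actual API:

```java
// Simplified model of the dispatch in DataSource.resolveRelation().
// All interfaces below are local stand-ins named after Spark's traits.
public class ResolveRelationSketch {
    interface BaseRelation {}
    interface RelationProvider { BaseRelation createRelation(); }
    interface TableProvider {}  // stand-in for the DataSource V2 entry point

    // A V2-style source: implements only the V2 interface, as in SPARK-35803.
    static class MyV2Source implements TableProvider {}

    // Mirrors the shape of the match: only the V1 provider interface is
    // handled, so a V2-only source falls through to the failure branch.
    static BaseRelation resolveRelation(Object provider) {
        if (provider instanceof RelationProvider) {
            return ((RelationProvider) provider).createRelation();
        }
        throw new IllegalStateException(
            provider.getClass().getSimpleName() + " is not a RelationProvider");
    }

    public static void main(String[] args) {
        try {
            resolveRelation(new MyV2Source());
        } catch (IllegalStateException e) {
            // prints: MyV2Source is not a RelationProvider
            System.out.println(e.getMessage());
        }
    }
}
```

As the issue notes, DataFrameReader.load() does perform the V2 check, so reading with spark.read.format(...).load() and calling createOrReplaceTempView on the resulting DataFrame sidesteps the problem until CREATE TEMPORARY VIEW ... USING handles V2 sources directly.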



--
This message was sent by Atlassian Jira
(v8.3.4#803005)

---------------------------------------------------------------------
To unsubscribe, e-mail: issues-unsubscribe@spark.apache.org
For additional commands, e-mail: issues-help@spark.apache.org