Posted to issues@spark.apache.org by "Wenchen Fan (JIRA)" <ji...@apache.org> on 2016/07/09 12:40:11 UTC
[jira] [Updated] (SPARK-16401) Data Source APIs: Extending RelationProvider and CreatableRelationProvider Without SchemaRelationProvider
[ https://issues.apache.org/jira/browse/SPARK-16401?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]
Wenchen Fan updated SPARK-16401:
--------------------------------
Fix Version/s: 2.1.0
> Data Source APIs: Extending RelationProvider and CreatableRelationProvider Without SchemaRelationProvider
> ---------------------------------------------------------------------------------------------------------
>
> Key: SPARK-16401
> URL: https://issues.apache.org/jira/browse/SPARK-16401
> Project: Spark
> Issue Type: Bug
> Components: SQL
> Affects Versions: 2.0.0
> Reporter: Xiao Li
> Assignee: Xiao Li
> Priority: Critical
> Fix For: 2.0.1, 2.1.0
>
>
> When users implement a data source API by extending only RelationProvider and CreatableRelationProvider (but not SchemaRelationProvider), they hit an error when the relation is resolved. A round trip through such a source reproduces it:
> {noformat}
> spark.read
>   .format("org.apache.spark.sql.test.DefaultSourceWithoutUserSpecifiedSchema")
>   .load()
>   .write
>   .format("org.apache.spark.sql.test.DefaultSourceWithoutUserSpecifiedSchema")
>   .save()
> {noformat}
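> A provider that triggers this path implements only the two interfaces named above. A minimal sketch of such a source (the class body and the dummy single-column schema are illustrative assumptions, not the actual test class in org.apache.spark.sql.test):
> {noformat}
> import org.apache.spark.sql.{DataFrame, SQLContext, SaveMode}
> import org.apache.spark.sql.sources.{BaseRelation, CreatableRelationProvider, RelationProvider}
> import org.apache.spark.sql.types.{LongType, StructField, StructType}
>
> // Extends only RelationProvider and CreatableRelationProvider --
> // deliberately not SchemaRelationProvider.
> class DefaultSource extends RelationProvider with CreatableRelationProvider {
>
>   // Read path: the source supplies its own schema.
>   override def createRelation(
>       ctx: SQLContext,
>       parameters: Map[String, String]): BaseRelation =
>     new BaseRelation {
>       override val sqlContext: SQLContext = ctx
>       override val schema: StructType = StructType(StructField("id", LongType) :: Nil)
>     }
>
>   // Write path: a real source would persist `data` here.
>   override def createRelation(
>       ctx: SQLContext,
>       mode: SaveMode,
>       parameters: Map[String, String],
>       data: DataFrame): BaseRelation =
>     createRelation(ctx, parameters)
> }
> {noformat}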
> The error they hit looks like this:
> {noformat}
> xyzDataSource does not allow user-specified schemas.;
> org.apache.spark.sql.AnalysisException: xyzDataSource does not allow user-specified schemas.;
> at org.apache.spark.sql.execution.datasources.DataSource.resolveRelation(DataSource.scala:319)
> at org.apache.spark.sql.execution.datasources.DataSource.write(DataSource.scala:494)
> at org.apache.spark.sql.DataFrameWriter.save(DataFrameWriter.scala:211)
> {noformat}
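> The failure comes from the match in DataSource.resolveRelation: DataFrameWriter.save routes the DataFrame's own schema through the user-specified-schema slot, and resolution rejects any plain RelationProvider that is handed a schema. Paraphrasing the 2.0.0 logic (a sketch of the relevant arms, not the exact source):
> {noformat}
> (providingClass.newInstance(), userSpecifiedSchema) match {
>   case (dataSource: SchemaRelationProvider, Some(schema)) =>
>     dataSource.createRelation(sqlContext, caseInsensitiveOptions, schema)
>   case (dataSource: RelationProvider, None) =>
>     dataSource.createRelation(sqlContext, caseInsensitiveOptions)
>   // This arm also fires on the write path, where the "user-specified"
>   // schema is really just the DataFrame's own schema:
>   case (_: RelationProvider, Some(_)) =>
>     throw new AnalysisException(s"$className does not allow user-specified schemas.")
> }
> {noformat}
> As I read the merged change, the fix relaxes the last arm: a RelationProvider may now resolve with a supplied schema, and an AnalysisException is thrown only if that schema disagrees with the schema of the relation the provider actually returns.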
--
This message was sent by Atlassian JIRA
(v6.3.4#6332)