Posted to issues@spark.apache.org by "Shixiong Zhu (JIRA)" <ji...@apache.org> on 2017/07/21 06:25:00 UTC
[jira] [Commented] (SPARK-21488) Make saveAsTable() and createOrReplaceTempView() return dataframe of created table/ created view
[ https://issues.apache.org/jira/browse/SPARK-21488?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16095846#comment-16095846 ]
Shixiong Zhu commented on SPARK-21488:
--------------------------------------
Unfortunately, this will break binary compatibility. This cannot be done until 3.0.0.
> Make saveAsTable() and createOrReplaceTempView() return dataframe of created table/ created view
> ------------------------------------------------------------------------------------------------
>
> Key: SPARK-21488
> URL: https://issues.apache.org/jira/browse/SPARK-21488
> Project: Spark
> Issue Type: Improvement
> Components: SQL
> Affects Versions: 2.2.0
> Reporter: Ruslan Dautkhanov
>
> It would be great to make saveAsTable() return a dataframe of the created table,
> so the result could be piped further, for example:
> {code}
> mv_table_df = (sqlc.sql('''
>     SELECT ...
>     FROM
>     ''')
>     .write.format("parquet").mode("overwrite")
>     .saveAsTable('test.parquet_table')
>     .createOrReplaceTempView('mv_table')
> )
> {code}
> Currently, the above code fails as expected with:
> {noformat}
> AttributeError: 'NoneType' object has no attribute 'createOrReplaceTempView'
> {noformat}
> If this is implemented, we can skip a step like
> {code}
> sqlc.sql('SELECT * FROM test.parquet_table').createOrReplaceTempView('mv_table')
> {code}
> We have this pattern very frequently.
> A further improvement could be made if createOrReplaceTempView() also returned a dataframe object, so that in one pipeline of functions
> we can
> - create an external table
> - create a dataframe reference to this newly created table, usable both from Spark SQL and as a Spark variable.
--
This message was sent by Atlassian JIRA
(v6.4.14#64029)
---------------------------------------------------------------------
To unsubscribe, e-mail: issues-unsubscribe@spark.apache.org
For additional commands, e-mail: issues-help@spark.apache.org