Posted to issues@spark.apache.org by "Hyukjin Kwon (JIRA)" <ji...@apache.org> on 2019/05/21 04:33:45 UTC
[jira] [Resolved] (SPARK-13212) Provide a way to unregister data sources from a SQLContext
[ https://issues.apache.org/jira/browse/SPARK-13212?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]
Hyukjin Kwon resolved SPARK-13212.
----------------------------------
Resolution: Incomplete
> Provide a way to unregister data sources from a SQLContext
> ----------------------------------------------------------
>
> Key: SPARK-13212
> URL: https://issues.apache.org/jira/browse/SPARK-13212
> Project: Spark
> Issue Type: Improvement
> Components: SQL
> Affects Versions: 1.6.0
> Reporter: Daniel Darabos
> Priority: Major
> Labels: bulk-closed
>
> We allow our users to run SQL queries on their data via a web interface. We create an isolated SQLContext with {{sqlContext.newSession()}}, create their DataFrames in this context, register them with {{registerTempTable}}, then execute the query with {{isolatedContext.sql(query)}}.
> The issue is that they have the full power of Spark SQL at their disposal. They can run {{SELECT * FROM csv.`/etc/passwd`}}. This specific syntax can be disabled by setting {{spark.sql.runSQLOnFiles}} (a private, undocumented configuration) to {{false}}. But creating a temporary table (http://spark.apache.org/docs/latest/sql-programming-guide.html#loading-data-programmatically) would still work if we had a HiveContext.
> As long as every DataSource on the classpath is readily available to any session, I don't think the security implications can be contained. So I think a nice solution would be to make the list of available DataSources a property of the SQLContext. Then for the isolated SQLContext we could simply remove all DataSources. This would allow more fine-grained use cases too.
> What do you think?
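The setup described in the issue could be sketched roughly as follows against the Spark 1.6-era API (a minimal illustration, not the reporter's actual code; the variable names, paths, and table name are hypothetical):

```scala
import org.apache.spark.{SparkConf, SparkContext}
import org.apache.spark.sql.SQLContext

// One shared context for the whole web service.
val sc = new SparkContext(new SparkConf().setAppName("sql-web-ui"))
val rootSqlContext = new SQLContext(sc)

// Per user request: an isolated session that shares the SparkContext
// and cached data, but keeps its own temp tables and SQL configuration.
val isolatedContext = rootSqlContext.newSession()

// Expose only the DataFrames this user is allowed to query.
val userDf = isolatedContext.read.json("/data/user-42/events.json")
userDf.registerTempTable("events")

// Run the user-supplied query inside the isolated session.
val result = isolatedContext.sql("SELECT COUNT(*) FROM events")
```

Isolation here covers temp tables and settings, but, as the issue points out, not the set of registered data sources.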
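The escape hatch and its partial mitigation, as described above, might look like this (a sketch; {{spark.sql.runSQLOnFiles}} was private and undocumented in 1.6, so the exact key and behavior should be verified against the Spark version in use):

```scala
// By default, a user can query arbitrary readable files directly:
isolatedContext.sql("SELECT * FROM csv.`/etc/passwd`")

// Setting the private flag to false blocks that specific syntax,
// so `csv.`/etc/passwd`` no longer resolves as a table:
isolatedContext.setConf("spark.sql.runSQLOnFiles", "false")
```

This only disables the direct-file syntax; with a HiveContext, `CREATE TEMPORARY TABLE ... USING ...` against any DataSource on the classpath would still work, which is what motivates making the available DataSources a property of the SQLContext.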
--
This message was sent by Atlassian JIRA
(v7.6.3#76005)