Posted to issues@spark.apache.org by "Hyukjin Kwon (JIRA)" <ji...@apache.org> on 2016/10/08 12:49:20 UTC
[jira] [Commented] (SPARK-11186) Caseness inconsistency between SQLContext and HiveContext
[ https://issues.apache.org/jira/browse/SPARK-11186?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=15557911#comment-15557911 ]
Hyukjin Kwon commented on SPARK-11186:
--------------------------------------
[~smolav] Could you confirm if this still happens in the current master?
> Caseness inconsistency between SQLContext and HiveContext
> ---------------------------------------------------------
>
> Key: SPARK-11186
> URL: https://issues.apache.org/jira/browse/SPARK-11186
> Project: Spark
> Issue Type: Bug
> Components: SQL
> Affects Versions: 1.5.1
> Reporter: Santiago M. Mola
> Priority: Minor
>
> The default catalog behaviour for case sensitivity ("caseness") differs between {{SQLContext}} and {{HiveContext}}.
> {code}
> import org.apache.spark.sql.SQLContext
> import org.apache.spark.sql.execution.datasources.LogicalRelation
> import org.apache.spark.sql.hive.HiveContext
> import org.apache.spark.sql.sources.BaseRelation
> import org.apache.spark.sql.types.StructType
>
> // Note: SQLContext#catalog is protected[sql], so these tests must live
> // under the org.apache.spark.sql package.
>
> test("Catalog caseness (SQL)") {
>   val sqlc = new SQLContext(sc)
>   val relationName = "MyTable"
>   // Register a minimal empty relation under a mixed-case name.
>   sqlc.catalog.registerTable(relationName :: Nil, LogicalRelation(new BaseRelation {
>     override def sqlContext: SQLContext = sqlc
>     override def schema: StructType = StructType(Nil)
>   }))
>   val tables = sqlc.tableNames()
>   // Passes: SQLContext's catalog is case-sensitive by default,
>   // so the name is preserved as "MyTable".
>   assert(tables.contains(relationName))
> }
>
> test("Catalog caseness (Hive)") {
>   val sqlc = new HiveContext(sc)
>   val relationName = "MyTable"
>   sqlc.catalog.registerTable(relationName :: Nil, LogicalRelation(new BaseRelation {
>     override def sqlContext: SQLContext = sqlc
>     override def schema: StructType = StructType(Nil)
>   }))
>   val tables = sqlc.tableNames()
>   // Fails: HiveContext's catalog is case-insensitive by default,
>   // so tableNames() returns "mytable" rather than "MyTable".
>   assert(tables.contains(relationName))
> }
> {code}
> Looking at {{HiveContext#SQLSession}}, I see this is the intended behaviour. But the reason this is needed seems undocumented (neither in the manual nor in the source code comments).
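> A possible workaround, as a minimal sketch (assuming both catalogs consult the {{spark.sql.caseSensitive}} conf, which {{HiveContext#SQLSession}} defaults to {{false}}), is to pin the setting explicitly so both contexts resolve names the same way:
> {code}
> // Sketch only: explicitly pin case sensitivity on both contexts so that
> // "MyTable" resolves identically. Case-insensitive matching is the safe
> // common denominator, since the Hive metastore lower-cases identifiers.
> val plain = new SQLContext(sc)
> val hive = new HiveContext(sc)
> plain.setConf("spark.sql.caseSensitive", "false")
> hive.setConf("spark.sql.caseSensitive", "false")
> {code}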
--
This message was sent by Atlassian JIRA
(v6.3.4#6332)
---------------------------------------------------------------------
To unsubscribe, e-mail: issues-unsubscribe@spark.apache.org
For additional commands, e-mail: issues-help@spark.apache.org