Posted to issues@spark.apache.org by "kevin yu (JIRA)" <ji...@apache.org> on 2015/10/19 20:52:27 UTC
[jira] [Commented] (SPARK-11186) Caseness inconsistency between SQLContext and HiveContext
[ https://issues.apache.org/jira/browse/SPARK-11186?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=14963835#comment-14963835 ]
kevin yu commented on SPARK-11186:
----------------------------------
Hello Santiago: How did you run the above code? Did you get a stack trace? I tried it in spark-shell and got the error below; it seems that SQLContext.catalog is a protected field, so it can't be accessed from outside the class.

scala> sqlc.catalog.registerTable(relationName :: Nil, LogicalRelation(new BaseRelation {
     |   override def sqlContext: SQLContext = sqlc
     |   override def schema: StructType = StructType(Nil)
     | }))
<console>:26: error: lazy value catalog in class SQLContext cannot be accessed in org.apache.spark.sql.SQLContext
 Access to protected value catalog not permitted because
 enclosing class $iwC is not a subclass of
 class SQLContext in package sql where target is defined
       sqlc.catalog.registerTable(relationName :: Nil, LogicalRelation(new BaseRelation {
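The REPL error above is Scala's ordinary access rule for protected members: only subclasses (and, for package-qualified modifiers like protected[sql], code in that package) may touch them, and the REPL's synthetic wrapper class ($iwC) is neither. A minimal Spark-free sketch of that rule, with illustrative names (Context, TestContext are stand-ins, not Spark classes):

```scala
// A protected member is visible to subclasses, not to arbitrary callers --
// the same rule that rejects sqlc.catalog in the spark-shell wrapper class.
class Context {
  protected lazy val catalog: String = "catalog-contents"
}

// A subclass may expose the protected member through a public method.
class TestContext extends Context {
  def catalogForTest: String = catalog // legal: TestContext extends Context
}

object ProtectedDemo {
  def main(args: Array[String]): Unit = {
    val ctx = new TestContext
    // ctx.catalog          // would not compile: catalog is protected
    println(ctx.catalogForTest)
  }
}
```

This suggests the snippet in the issue was run from code that lives inside the org.apache.spark.sql package (for example, Spark's own test suites), where the protected[sql] member is visible.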
> Caseness inconsistency between SQLContext and HiveContext
> ---------------------------------------------------------
>
> Key: SPARK-11186
> URL: https://issues.apache.org/jira/browse/SPARK-11186
> Project: Spark
> Issue Type: Bug
> Components: SQL
> Affects Versions: 1.5.1
> Reporter: Santiago M. Mola
> Priority: Minor
>
> Default catalog behaviour for caseness is different in {{SQLContext}} and {{HiveContext}}.
> {code}
> test("Catalog caseness (SQL)") {
>   val sqlc = new SQLContext(sc)
>   val relationName = "MyTable"
>   sqlc.catalog.registerTable(relationName :: Nil, LogicalRelation(new BaseRelation {
>     override def sqlContext: SQLContext = sqlc
>     override def schema: StructType = StructType(Nil)
>   }))
>   val tables = sqlc.tableNames()
>   assert(tables.contains(relationName))
> }
>
> test("Catalog caseness (Hive)") {
>   val sqlc = new HiveContext(sc)
>   val relationName = "MyTable"
>   sqlc.catalog.registerTable(relationName :: Nil, LogicalRelation(new BaseRelation {
>     override def sqlContext: SQLContext = sqlc
>     override def schema: StructType = StructType(Nil)
>   }))
>   val tables = sqlc.tableNames()
>   assert(tables.contains(relationName))
> }
> {code}
> Looking at {{HiveContext#SQLSession}}, I see this is the intended behaviour. But the reason it is needed seems to be undocumented (in both the manual and the source code comments).
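The asymmetry the two tests observe boils down to whether the catalog normalizes table identifiers before storing them. A hedged, Spark-free sketch of that behaviour (the SimpleCatalog class below is a hypothetical stand-in, assuming SQLContext's catalog is case-sensitive by default while HiveContext lowercases identifiers, as Hive does):

```scala
// Illustrative model of the two catalog behaviours; not Spark's real Catalog.
class SimpleCatalog(caseSensitive: Boolean) {
  private var tables = Map.empty[String, String]

  // A case-insensitive catalog stores (and looks up) a lowercased key.
  private def normalize(name: String): String =
    if (caseSensitive) name else name.toLowerCase

  def registerTable(name: String): Unit =
    tables += (normalize(name) -> name)

  def tableNames: Seq[String] = tables.keys.toSeq
}

object CasenessDemo {
  def main(args: Array[String]): Unit = {
    val sqlStyle = new SimpleCatalog(caseSensitive = true)
    sqlStyle.registerTable("MyTable")
    println(sqlStyle.tableNames.contains("MyTable")) // found as registered

    val hiveStyle = new SimpleCatalog(caseSensitive = false)
    hiveStyle.registerTable("MyTable")
    println(hiveStyle.tableNames.contains("MyTable")) // not found: stored as "mytable"
  }
}
```

Under this model, the first test's assertion passes while the second fails, matching the inconsistency the issue reports.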
--
This message was sent by Atlassian JIRA
(v6.3.4#6332)
---------------------------------------------------------------------
To unsubscribe, e-mail: issues-unsubscribe@spark.apache.org
For additional commands, e-mail: issues-help@spark.apache.org