Posted to issues@spark.apache.org by "Yuming Wang (Jira)" <ji...@apache.org> on 2019/10/17 09:28:00 UTC
[jira] [Commented] (SPARK-29498) CatalogTable to HiveTable should not change the table's ownership
[ https://issues.apache.org/jira/browse/SPARK-29498?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16953543#comment-16953543 ]
Yuming Wang commented on SPARK-29498:
-------------------------------------
I'm working on it.
> CatalogTable to HiveTable should not change the table's ownership
> -----------------------------------------------------------------
>
> Key: SPARK-29498
> URL: https://issues.apache.org/jira/browse/SPARK-29498
> Project: Spark
> Issue Type: Bug
> Components: SQL
> Affects Versions: 2.3.4, 2.4.4
> Reporter: Kent Yao
> Priority: Major
>
> How to reproduce:
> {code:scala}
>   test("CatalogTable to HiveTable should not change the table's ownership") {
>     val catalog = newBasicCatalog()
>     val identifier = TableIdentifier("test_table_owner", Some("default"))
>     val owner = "Apache Spark"
>     val newTable = CatalogTable(
>       identifier,
>       tableType = CatalogTableType.EXTERNAL,
>       storage = CatalogStorageFormat(
>         locationUri = None,
>         inputFormat = None,
>         outputFormat = None,
>         serde = None,
>         compressed = false,
>         properties = Map.empty),
>       owner = owner,
>       schema = new StructType().add("i", "int"),
>       provider = Some("hive"))
>     catalog.createTable(newTable, false)
>     assert(catalog.getTable("default", "test_table_owner").owner === owner)
>   }
> {code}
> {noformat}
> [info] - CatalogTable to HiveTable should not change the table's ownership *** FAILED *** (267 milliseconds)
> [info] "[yumwang]" did not equal "[Apache Spark]" (HiveExternalCatalogSuite.scala:136)
> {noformat}
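> The failure ("yumwang" instead of "Apache Spark") suggests the CatalogTable-to-HiveTable conversion substitutes the current session user for the owner recorded on the table. The following is a minimal sketch of that suspected behavior and the expected one, using hypothetical simplified case classes rather than Spark's actual {{CatalogTable}}/{{HiveTable}} types:
> {code:scala}
> // Hypothetical simplified types for illustration only; Spark's real
> // conversion lives in HiveClientImpl and uses the Hive metastore API.
> case class CatalogTable(name: String, owner: String)
> case class HiveTable(name: String, owner: String)
>
> object Conversion {
>   // Suspected buggy behavior: always stamps the session user,
>   // discarding the owner stored on the CatalogTable.
>   def toHiveTableBuggy(t: CatalogTable, sessionUser: String): HiveTable =
>     HiveTable(t.name, sessionUser)
>
>   // Expected behavior: preserve the table's recorded owner and fall
>   // back to the session user only when no owner was set.
>   def toHiveTableFixed(t: CatalogTable, sessionUser: String): HiveTable =
>     HiveTable(t.name, if (t.owner.nonEmpty) t.owner else sessionUser)
> }
> {code}
> With the fixed conversion, the test's assertion would hold: a table created with owner "Apache Spark" keeps that owner regardless of which user performs the conversion.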
--
This message was sent by Atlassian Jira
(v8.3.4#803005)
---------------------------------------------------------------------
To unsubscribe, e-mail: issues-unsubscribe@spark.apache.org
For additional commands, e-mail: issues-help@spark.apache.org