Posted to issues@spark.apache.org by "xubo245 (JIRA)" <ji...@apache.org> on 2018/01/11 07:46:00 UTC

[jira] [Created] (SPARK-23039) Fix the bug in alter table set location.

xubo245 created SPARK-23039:
-------------------------------

             Summary:  Fix the bug in alter table set location.
                 Key: SPARK-23039
                 URL: https://issues.apache.org/jira/browse/SPARK-23039
             Project: Spark
          Issue Type: Bug
          Components: SQL
    Affects Versions: 2.2.1
            Reporter: xubo245
            Priority: Critical


TODO: Fix the bug in alter table set location, which is flagged in
org.apache.spark.sql.execution.command.DDLSuite#testSetLocation:

{code:java}
// TODO(gatorsmile): fix the bug in alter table set location.
// if (isUsingHiveMetastore) {
//   assert(storageFormat.properties.get("path") === expected)
// }
{code}
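
For context, a minimal sketch of the scenario the test exercises (the table name and location are illustrative, not taken from the suite):

{code:java}
// Create a data source table, then point it at a new location.
sql("CREATE TABLE tbl(i INT) USING parquet")
sql("ALTER TABLE tbl SET LOCATION '/mnt/new_location'")
// With the Hive metastore, the disabled assertion above fails because the
// restored storageFormat.properties no longer contains a "path" entry.
{code}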

Analysis:

The path option is erased because org.apache.spark.sql.hive.HiveExternalCatalog#restoreDataSourceTable passes
{code:java}
newPath = None
{code}
when it restores the table and sets locationUri:

{code:java}
val storageWithLocation = {
  val tableLocation = getLocationFromStorageProps(table)
  // We pass None as `newPath` here, to remove the path option in storage properties.
  updateLocationInStorageProps(table, newPath = None).copy(
    locationUri = tableLocation.map(CatalogUtils.stringToURI(_)))
}
{code}

because " We pass None as `newPath` here, to remove the path option in storage properties."

Meanwhile, locationUri itself is obtained from the path entry in the storage properties:

{code:java}
private def getLocationFromStorageProps(table: CatalogTable): Option[String] = {
  CaseInsensitiveMap(table.storage.properties).get("path")
}
{code}

So the test can assert on locationUri instead of the removed path property.
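
A minimal sketch of the corresponding fix in DDLSuite#testSetLocation, assuming expected is the Option[String] the helper already computes:

{code:java}
// Assert on locationUri, which restoreDataSourceTable populates, instead of
// the "path" storage property, which it removes:
if (isUsingHiveMetastore) {
  assert(storageFormat.locationUri.map(CatalogUtils.URIToString) === expected)
}
{code}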

--
This message was sent by Atlassian JIRA
(v6.4.14#64029)
