Posted to commits@carbondata.apache.org by ku...@apache.org on 2019/09/11 05:12:13 UTC
[carbondata] branch master updated: [CARBONDATA-3507] Fix Create Table As Select Failure in Spark-2.3
This is an automated email from the ASF dual-hosted git repository.
kunalkapoor pushed a commit to branch master
in repository https://gitbox.apache.org/repos/asf/carbondata.git
The following commit(s) were added to refs/heads/master by this push:
new 18b151c [CARBONDATA-3507] Fix Create Table As Select Failure in Spark-2.3
18b151c is described below
commit 18b151c575da21b388c76e5451a1ec8a942d989a
Author: manishnalla1994 <ma...@gmail.com>
AuthorDate: Thu Aug 29 12:00:11 2019 +0530
[CARBONDATA-3507] Fix Create Table As Select Failure in Spark-2.3
Problem: Create Table As Select fails on Spark-2.3.
Cause: While building the table location path, the code strips the
"hdfs://" prefix from the URI before storing it, so in later stages
the file is treated as a local Carbon file instead of an HDFS file.
Solution: Use the original table path without stripping the scheme prefix.
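The cause can be seen with plain java.net.URI: getPath drops the scheme and authority, while toString preserves the full location. A minimal sketch (the HDFS address below is illustrative, not from this commit):

```scala
import java.net.URI

// Hypothetical HDFS table location, for illustration only.
val uri = new URI("hdfs://namenode:9000/user/warehouse/t1")

// getPath strips the scheme and authority, so downstream code can no
// longer tell this was an HDFS path and falls back to local-file handling:
val stripped = uri.getPath  // "/user/warehouse/t1"

// toString keeps the full location, which is what the fix relies on:
val full = uri.toString     // "hdfs://namenode:9000/user/warehouse/t1"
```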
This closes #3368
---
.../main/scala/org/apache/spark/sql/hive/CarbonFileMetastore.scala | 4 +---
1 file changed, 1 insertion(+), 3 deletions(-)
diff --git a/integration/spark2/src/main/scala/org/apache/spark/sql/hive/CarbonFileMetastore.scala b/integration/spark2/src/main/scala/org/apache/spark/sql/hive/CarbonFileMetastore.scala
index b19b11c..684bcbb 100644
--- a/integration/spark2/src/main/scala/org/apache/spark/sql/hive/CarbonFileMetastore.scala
+++ b/integration/spark2/src/main/scala/org/apache/spark/sql/hive/CarbonFileMetastore.scala
@@ -567,9 +567,7 @@ class CarbonFileMetastore extends CarbonMetaStore {
}
val tableLocation = catalogTable.storage.locationUri match {
case tableLoc@Some(uri) =>
- if (tableLoc.get.isInstanceOf[URI]) {
- FileFactory.getUpdatedFilePath(tableLoc.get.asInstanceOf[URI].getPath)
- }
+ FileFactory.getUpdatedFilePath(tableLoc.get.toString)
case None =>
CarbonEnv.getTablePath(tableIdentifier.database, tableIdentifier.table)(sparkSession)
}
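The shape of the fixed match can be sketched in isolation as follows; the function and parameter names here are illustrative, not CarbonData's actual API:

```scala
import java.net.URI

// Minimal sketch of the fixed resolution logic, assuming the catalog
// exposes the table location as an Option[URI].
def resolveTableLocation(locationUri: Option[URI], defaultPath: String): String =
  locationUri match {
    case Some(uri) => uri.toString  // keep the "hdfs://..." prefix intact
    case None      => defaultPath   // fall back to the default table path
  }
```

Calling uri.toString in the Some branch ensures both branches yield a String, whereas the removed code's if-without-else left one branch with no value.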