Posted to commits@carbondata.apache.org by ra...@apache.org on 2019/08/12 14:20:40 UTC

[carbondata] branch master updated: [CARBONDATA-3490] Fix concurrent data load failure with carbondata FileNotFound exception

This is an automated email from the ASF dual-hosted git repository.

ravipesala pushed a commit to branch master
in repository https://gitbox.apache.org/repos/asf/carbondata.git


The following commit(s) were added to refs/heads/master by this push:
     new a73cadd  [CARBONDATA-3490] Fix concurrent data load failure with carbondata FileNotFound exception
a73cadd is described below

commit a73cadda438de57713ffc5fd85a86b4fdb5442c7
Author: ajantha-bhat <aj...@gmail.com>
AuthorDate: Fri Aug 9 10:19:32 2019 +0530

    [CARBONDATA-3490] Fix concurrent data load failure with carbondata FileNotFound exception
    
    problem: When two loads run concurrently, one load cleans up the temp directory of the other load.
    
    cause: The temp directory that stores the carbon files is created using System.nanoTime(), so two loads can end up with the same store location. When one load completes, it cleans up its temp directory, causing a data load failure (FileNotFound) for the other load.
    
    solution:
    Use a UUID instead of nano time when creating the temp directory, so that each load gets a unique directory.
    
    This closes #3352
---
 .../main/scala/org/apache/carbondata/spark/util/CommonUtil.scala   | 7 +++++--
 1 file changed, 5 insertions(+), 2 deletions(-)

diff --git a/integration/spark-common/src/main/scala/org/apache/carbondata/spark/util/CommonUtil.scala b/integration/spark-common/src/main/scala/org/apache/carbondata/spark/util/CommonUtil.scala
index 7015279..8d6cdfb 100644
--- a/integration/spark-common/src/main/scala/org/apache/carbondata/spark/util/CommonUtil.scala
+++ b/integration/spark-common/src/main/scala/org/apache/carbondata/spark/util/CommonUtil.scala
@@ -21,6 +21,7 @@ import java.io.File
 import java.math.BigDecimal
 import java.text.SimpleDateFormat
 import java.util
+import java.util.UUID
 import java.util.regex.{Matcher, Pattern}
 
 import scala.collection.JavaConverters._
@@ -777,8 +778,10 @@ object CommonUtil {
     val isCarbonUseYarnLocalDir = CarbonProperties.getInstance().getProperty(
       CarbonCommonConstants.CARBON_LOADING_USE_YARN_LOCAL_DIR,
       CarbonCommonConstants.CARBON_LOADING_USE_YARN_LOCAL_DIR_DEFAULT).equalsIgnoreCase("true")
-    val tmpLocationSuffix =
-      s"${File.separator}carbon${System.nanoTime()}${CarbonCommonConstants.UNDERSCORE}$index"
+    val tmpLocationSuffix = s"${ File.separator }carbon${
+      UUID.randomUUID().toString
+        .replace("-", "")
+    }${ CarbonCommonConstants.UNDERSCORE }$index"
     if (isCarbonUseYarnLocalDir) {
       val yarnStoreLocations = Util.getConfiguredLocalDirs(SparkEnv.get.conf)
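The effect of the change can be illustrated with a minimal, standalone sketch (the object and method names below are hypothetical, not from the CarbonData codebase): a `System.nanoTime()`-based suffix can repeat when two loads start at nearly the same instant, whereas a random UUID suffix is unique per call, so each load gets its own temp directory.

```scala
import java.io.File
import java.util.UUID

object TmpSuffixSketch {
  // Old scheme: nanoTime-based suffix. Two loads starting at (nearly) the
  // same instant can produce the same value and end up sharing a directory.
  def nanoSuffix(index: Int): String =
    s"${File.separator}carbon${System.nanoTime()}_$index"

  // New scheme (as in the patched CommonUtil): a random UUID with dashes
  // stripped, so every load gets a distinct temp directory.
  def uuidSuffix(index: Int): String =
    s"${File.separator}carbon${UUID.randomUUID().toString.replace("-", "")}_$index"

  def main(args: Array[String]): Unit = {
    val a = uuidSuffix(0)
    val b = uuidSuffix(0)
    // Distinct per call, so one load cleaning up its directory
    // cannot delete files belonging to a concurrent load.
    assert(a != b)
    println(a)
    println(b)
  }
}
```

Note that the `$index` part alone does not disambiguate concurrent loads, since two loads can both use task index 0; only the per-load random component makes the paths unique.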