Posted to issues@carbondata.apache.org by sraghunandan <gi...@git.apache.org> on 2017/09/06 09:43:22 UTC

[GitHub] carbondata pull request #1332: [WIP] Regenerate hive saved data in case test c...

GitHub user sraghunandan opened a pull request:

    https://github.com/apache/carbondata/pull/1332

    [WIP] Regenerate hive saved data in case a test case fails

    

You can merge this pull request into a Git repository by running:

    $ git pull https://github.com/sraghunandan/carbondata-1 disable_hive_result_caching

Alternatively you can review and apply these changes as the patch at:

    https://github.com/apache/carbondata/pull/1332.patch

To close this pull request, make a commit to your master/trunk branch
with (at least) the following in the commit message:

    This closes #1332
    
----
commit 19950412b431cc96a346a5f02fe65f4dfd66c7c9
Author: sraghunandan <ca...@gmail.com>
Date:   2017-09-06T09:39:44Z

    Regenerate hive saved data in case a test case fails
    Reasons:
    1. The test case may have changed
    2. The input data may have changed
    3. The environment may have changed

----
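
For context, the cached-comparison entry point that this commit reworks is the three-argument checkAnswer(carbon, hive, uniqueIdentifier) shown in the diffs below. A hypothetical call from a suite extending QueryTest might look like the following sketch (the queries and the identifier are illustrative, not taken from the PR; the identifier names the cached result file under TestQueryExecutor.hiveresultpath):

    // Hypothetical usage inside a test suite extending QueryTest in the
    // spark-common-cluster-test module; queries and identifier are illustrative.
    test("aggregate query matches cached hive result") {
      checkAnswer(
        "select country, count(*) from carbon_table group by country",  // run against CarbonData
        "select country, count(*) from hive_table group by country",    // run against Hive, result cached to disk
        "AggregateQuery_country_count"                                   // uniqueIdentifier -> cache file name
      )
    }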


---

[GitHub] carbondata pull request #1332: [CARBONDATA-1456]Regenerate cached hive resul...

Posted by asfgit <gi...@git.apache.org>.
Github user asfgit closed the pull request at:

    https://github.com/apache/carbondata/pull/1332


---

[GitHub] carbondata pull request #1332: [CARBONDATA-1456]Regenerate cached hive resul...

Posted by ravipesala <gi...@git.apache.org>.
Github user ravipesala commented on a diff in the pull request:

    https://github.com/apache/carbondata/pull/1332#discussion_r137919083
  
    --- Diff: integration/spark-common-cluster-test/src/test/scala/org/apache/spark/sql/common/util/QueryTest.scala ---
    @@ -84,22 +82,34 @@ class QueryTest extends PlanTest with Suite {
         checkAnswer(df, expectedAnswer.collect())
       }
     
    -  protected def checkAnswer(carbon: String, hive: String, uniqueIdentifier:String): Unit = {
    -    val path = TestQueryExecutor.hiveresultpath + "/"+uniqueIdentifier
    +  protected def checkAnswer(carbon: String, hive: String, uniqueIdentifier: String): Unit = {
    +    val path = TestQueryExecutor.hiveresultpath + "/" + uniqueIdentifier
         if (FileFactory.isFileExist(path, FileFactory.getFileType(path))) {
    -      val objinp = new ObjectInputStream(FileFactory.getDataInputStream(path, FileFactory.getFileType(path)))
    +      val objinp = new ObjectInputStream(FileFactory
    +        .getDataInputStream(path, FileFactory.getFileType(path)))
           val rows = objinp.readObject().asInstanceOf[Array[Row]]
           objinp.close()
    -      checkAnswer(sql(carbon), rows)
    +      QueryTest.checkAnswer(sql(carbon), rows) match {
    +        case Some(errorMessage) => {
    +          FileFactory.deleteFile(path, FileFactory.getFileType(path))
    +          writeAndCheckAnswer(carbon, hive, path)
    --- End diff --
    
    Got it, my misunderstanding.


---

[GitHub] carbondata pull request #1332: [CARBONDATA-1456]Regenerate cached hive resul...

Posted by ravipesala <gi...@git.apache.org>.
Github user ravipesala commented on a diff in the pull request:

    https://github.com/apache/carbondata/pull/1332#discussion_r137439030
  
    --- Diff: integration/spark-common-cluster-test/src/test/scala/org/apache/spark/sql/common/util/QueryTest.scala ---
    @@ -84,22 +82,34 @@ class QueryTest extends PlanTest with Suite {
         checkAnswer(df, expectedAnswer.collect())
       }
     
    -  protected def checkAnswer(carbon: String, hive: String, uniqueIdentifier:String): Unit = {
    -    val path = TestQueryExecutor.hiveresultpath + "/"+uniqueIdentifier
    +  protected def checkAnswer(carbon: String, hive: String, uniqueIdentifier: String): Unit = {
    +    val path = TestQueryExecutor.hiveresultpath + "/" + uniqueIdentifier
         if (FileFactory.isFileExist(path, FileFactory.getFileType(path))) {
    -      val objinp = new ObjectInputStream(FileFactory.getDataInputStream(path, FileFactory.getFileType(path)))
    +      val objinp = new ObjectInputStream(FileFactory
    +        .getDataInputStream(path, FileFactory.getFileType(path)))
           val rows = objinp.readObject().asInstanceOf[Array[Row]]
           objinp.close()
    -      checkAnswer(sql(carbon), rows)
    +      QueryTest.checkAnswer(sql(carbon), rows) match {
    +        case Some(errorMessage) => {
    +          FileFactory.deleteFile(path, FileFactory.getFileType(path))
    +          writeAndCheckAnswer(carbon, hive, path)
    --- End diff --
    
    Doesn't it go into an endless loop when the test fails?


---

[GitHub] carbondata issue #1332: [CARBONDATA-1456]Regenerate cached hive results if c...

Posted by ravipesala <gi...@git.apache.org>.
Github user ravipesala commented on the issue:

    https://github.com/apache/carbondata/pull/1332
  
    LGTM


---

[GitHub] carbondata issue #1332: [CARBONDATA-1456]Regenerate cached hive results if c...

Posted by sraghunandan <gi...@git.apache.org>.
Github user sraghunandan commented on the issue:

    https://github.com/apache/carbondata/pull/1332
  
    ok to test


---

[GitHub] carbondata issue #1332: [CARBONDATA-1456]Regenerate cached hive results if c...

Posted by ravipesala <gi...@git.apache.org>.
Github user ravipesala commented on the issue:

    https://github.com/apache/carbondata/pull/1332
  
    SDV build failed, please check CI http://144.76.159.231:8080/job/ApacheSDVTests/577/



---

[GitHub] carbondata issue #1332: [WIP] Regenerate hive saved data in case test case fai...

Posted by ravipesala <gi...@git.apache.org>.
Github user ravipesala commented on the issue:

    https://github.com/apache/carbondata/pull/1332
  
    SDV build failed, please check CI http://144.76.159.231:8080/job/ApacheSDVTests/559/



---

[GitHub] carbondata pull request #1332: [CARBONDATA-1456]Regenerate cached hive resul...

Posted by sraghunandan <gi...@git.apache.org>.
Github user sraghunandan commented on a diff in the pull request:

    https://github.com/apache/carbondata/pull/1332#discussion_r137459712
  
    --- Diff: integration/spark-common-cluster-test/src/test/scala/org/apache/spark/sql/common/util/QueryTest.scala ---
    @@ -84,22 +82,34 @@ class QueryTest extends PlanTest with Suite {
         checkAnswer(df, expectedAnswer.collect())
       }
     
    -  protected def checkAnswer(carbon: String, hive: String, uniqueIdentifier:String): Unit = {
    -    val path = TestQueryExecutor.hiveresultpath + "/"+uniqueIdentifier
    +  protected def checkAnswer(carbon: String, hive: String, uniqueIdentifier: String): Unit = {
    +    val path = TestQueryExecutor.hiveresultpath + "/" + uniqueIdentifier
         if (FileFactory.isFileExist(path, FileFactory.getFileType(path))) {
    -      val objinp = new ObjectInputStream(FileFactory.getDataInputStream(path, FileFactory.getFileType(path)))
    +      val objinp = new ObjectInputStream(FileFactory
    +        .getDataInputStream(path, FileFactory.getFileType(path)))
           val rows = objinp.readObject().asInstanceOf[Array[Row]]
           objinp.close()
    -      checkAnswer(sql(carbon), rows)
    +      QueryTest.checkAnswer(sql(carbon), rows) match {
    +        case Some(errorMessage) => {
    +          FileFactory.deleteFile(path, FileFactory.getFileType(path))
    +          writeAndCheckAnswer(carbon, hive, path)
    --- End diff --
    
    I couldn't understand your comment. How would it go into an infinite loop?
    We are not making a recursive call.
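
To make the non-recursion point concrete: on a mismatch the stale cache file is deleted and writeAndCheckAnswer re-runs the Hive query, rewrites the cache, and compares exactly once; it never calls back into the cached branch, so a genuine failure simply fails the test. A simplified, self-contained sketch of that flow (plain java.io stands in for CarbonData's FileFactory, and the query runners are stubs, so this is an illustration rather than the PR's actual code):

    import java.io.{File, FileInputStream, FileOutputStream, ObjectInputStream, ObjectOutputStream}

    object CachedResultCheckSketch {
      // Stub query runners standing in for sql(carbon) and the Hive query in QueryTest.
      def runCarbon(query: String): Array[String] = Array("row1", "row2")
      def runHive(query: String): Array[String] = Array("row1", "row2")

      // Compare the carbon result against rows cached on disk; on mismatch,
      // drop the cache and rebuild it once via writeAndCheckAnswer.
      def checkAnswer(carbon: String, hive: String, path: String): Unit = {
        val cache = new File(path)
        if (cache.exists()) {
          val in = new ObjectInputStream(new FileInputStream(cache))
          val cached = in.readObject().asInstanceOf[Array[String]]
          in.close()
          if (!runCarbon(carbon).sameElements(cached)) {
            // Cached rows are stale (test case, input data or environment changed).
            cache.delete()
            writeAndCheckAnswer(carbon, hive, path)
          }
        } else {
          writeAndCheckAnswer(carbon, hive, path)
        }
      }

      // Run the Hive query, persist its rows as the new cache, then compare once.
      // Straight-line code: it never calls checkAnswer again, so there is no cycle.
      def writeAndCheckAnswer(carbon: String, hive: String, path: String): Unit = {
        val rows = runHive(hive)
        val out = new ObjectOutputStream(new FileOutputStream(path))
        out.writeObject(rows)
        out.close()
        assert(runCarbon(carbon).sameElements(rows), s"carbon and hive results differ for: $carbon")
      }
    }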


---

[GitHub] carbondata issue #1332: [CARBONDATA-1456]Regenerate cached hive results if c...

Posted by ravipesala <gi...@git.apache.org>.
Github user ravipesala commented on the issue:

    https://github.com/apache/carbondata/pull/1332
  
    SDV build failed, please check CI http://144.76.159.231:8080/job/ApacheSDVTests/565/



---

[GitHub] carbondata issue #1332: [CARBONDATA-1456]Regenerate cached hive results if c...

Posted by ravipesala <gi...@git.apache.org>.
Github user ravipesala commented on the issue:

    https://github.com/apache/carbondata/pull/1332
  
    SDV build failed, please check CI http://144.76.159.231:8080/job/ApacheSDVTests/585/



---

[GitHub] carbondata issue #1332: [CARBONDATA-1456]Regenerate cached hive results if c...

Posted by CarbonDataQA <gi...@git.apache.org>.
Github user CarbonDataQA commented on the issue:

    https://github.com/apache/carbondata/pull/1332
  
    Build succeeded with Spark 2.1.0, please check CI http://136.243.101.176:8080/job/ApacheCarbonPRBuilder/3434/



---