Posted to notifications@asterixdb.apache.org by "Till Westmann (Code Review)" <do...@asterixdb.incubator.apache.org> on 2016/08/30 07:40:10 UTC

Change in asterixdb[master]: Update SQL++ test harness to use new HTTP API

Till Westmann has uploaded a new change for review.

  https://asterix-gerrit.ics.uci.edu/1127

Change subject: Update SQL++ test harness to use new HTTP API
......................................................................

Update SQL++ test harness to use new HTTP API

Change-Id: Ia36967386183777c6d840634850a8f8fc2ea4411
---
M asterixdb/asterix-app/src/test/java/org/apache/asterix/test/runtime/SqlppExecutionTest.java
M asterixdb/asterix-app/src/test/resources/runtimets/testsuite_sqlpp.xml
M asterixdb/asterix-common/src/test/java/org/apache/asterix/test/aql/ResultExtractor.java
M asterixdb/asterix-common/src/test/java/org/apache/asterix/test/aql/TestExecutor.java
4 files changed, 59 insertions(+), 53 deletions(-)


  git pull ssh://asterix-gerrit.ics.uci.edu:29418/asterixdb refs/changes/27/1127/1

diff --git a/asterixdb/asterix-app/src/test/java/org/apache/asterix/test/runtime/SqlppExecutionTest.java b/asterixdb/asterix-app/src/test/java/org/apache/asterix/test/runtime/SqlppExecutionTest.java
index 84542d0..84e644b 100644
--- a/asterixdb/asterix-app/src/test/java/org/apache/asterix/test/runtime/SqlppExecutionTest.java
+++ b/asterixdb/asterix-app/src/test/java/org/apache/asterix/test/runtime/SqlppExecutionTest.java
@@ -86,7 +86,6 @@
             testArgs.add(new Object[] { ctx });
         }
         return testArgs;
-
     }
 
     protected TestCaseContext tcCtx;
diff --git a/asterixdb/asterix-app/src/test/resources/runtimets/testsuite_sqlpp.xml b/asterixdb/asterix-app/src/test/resources/runtimets/testsuite_sqlpp.xml
index b48a99c..01a036c 100644
--- a/asterixdb/asterix-app/src/test/resources/runtimets/testsuite_sqlpp.xml
+++ b/asterixdb/asterix-app/src/test/resources/runtimets/testsuite_sqlpp.xml
@@ -1660,7 +1660,7 @@
     <test-case FilePath="dml">
       <compilation-unit name="insert-duplicated-keys">
         <output-dir compare="Text">insert-duplicated-keys</output-dir>
-        <expected-error>org.apache.hyracks.storage.am.common.exceptions.TreeIndexDuplicateKeyException: Failed to insert key since key already exists</expected-error>
+        <expected-error>Failed to insert key since key already exists</expected-error>
       </compilation-unit>
     </test-case>
     <test-case FilePath="dml">
@@ -1683,7 +1683,7 @@
     <test-case FilePath="dml">
       <compilation-unit name="insert-with-autogenerated-pk_adm_02">
         <output-dir compare="Text">insert-with-autogenerated-pk_adm_02</output-dir>
-        <expected-error>org.apache.hyracks.algebricks.common.exceptions.AlgebricksException: Duplicate field id encountered</expected-error>
+        <expected-error>Duplicate field id encountered</expected-error>
       </compilation-unit>
     </test-case>
     <test-case FilePath="dml">
@@ -1704,13 +1704,13 @@
     <test-case FilePath="dml">
       <compilation-unit name="load-with-autogenerated-pk_adm_02">
         <output-dir compare="Text">load-with-autogenerated-pk_adm_02</output-dir>
-        <expected-error>org.apache.asterix.external.parser.ADMDataParser$ParseException</expected-error>
+        <expected-error>Parse error at (0, 5): This record is closed, you can not add extra fields! new field name: id</expected-error>
       </compilation-unit>
     </test-case>
     <test-case FilePath="dml">
       <compilation-unit name="load-with-autogenerated-pk_adm_03">
         <output-dir compare="Text">load-with-autogenerated-pk_adm_03</output-dir>
-        <expected-error>org.apache.asterix.external.parser.ADMDataParser$ParseException</expected-error>
+        <expected-error>Parse error at (0, 5): This record is closed, you can not add extra fields! new field name: id</expected-error>
       </compilation-unit>
     </test-case>
     <test-case FilePath="dml">
@@ -3121,7 +3121,7 @@
     </test-case>
   </test-group>
   <test-group name="open-index-enforced">
-    <test-group FilePath="open-index-enforced/error-checking">
+    <test-group name="open-index-enforced/error-checking">
       <test-case FilePath="open-index-enforced/error-checking">
         <compilation-unit name="enforced-field-name-collision">
           <output-dir compare="Text">enforced-field-name-collision</output-dir>
@@ -3131,13 +3131,13 @@
       <test-case FilePath="open-index-enforced/error-checking">
         <compilation-unit name="enforced-field-type-collision">
           <output-dir compare="Text">enforced-field-type-collision</output-dir>
-          <expected-error>Error: A field "[value]" is already defined with the type "STRING"</expected-error>
+          <expected-error>A field &quot;[value]&quot; is already defined with the type &quot;STRING&quot;</expected-error>
         </compilation-unit>
       </test-case>
       <test-case FilePath="open-index-enforced/error-checking">
         <compilation-unit name="missing-enforce-statement">
           <output-dir compare="Text">missing-enforce-statement</output-dir>
-          <expected-error>org.apache.hyracks.algebricks.common.exceptions.AlgebricksException: Cannot create typed index on "[value]" field without enforcing it's type</expected-error>
+          <expected-error>Cannot create typed index on "[value]" field without enforcing it's type</expected-error>
         </compilation-unit>
       </test-case>
       <test-case FilePath="open-index-enforced/error-checking">
@@ -3149,11 +3149,11 @@
       <test-case FilePath="open-index-enforced/error-checking">
         <compilation-unit name="index-on-closed-type">
           <output-dir compare="Text">index-on-closed-type</output-dir>
-          <expected-error>org.apache.hyracks.algebricks.common.exceptions.AlgebricksException: Typed index on "[value]" field could be created only for open datatype</expected-error>
+          <expected-error>Typed index on "[value]" field could be created only for open datatype</expected-error>
         </compilation-unit>
       </test-case>
     </test-group>
-    <test-group FilePath="open-index-enforced/index-join">
+    <test-group name="open-index-enforced/index-join">
       <test-case FilePath="open-index-enforced/index-join">
         <compilation-unit name="btree-secondary-equi-join">
           <output-dir compare="Text">btree-secondary-equi-join</output-dir>
@@ -3195,7 +3195,7 @@
         </compilation-unit>
       </test-case>
     </test-group>
-    <test-group FilePath="open-index-enforced/index-leftouterjoin">
+    <test-group name="open-index-enforced/index-leftouterjoin">
       <test-case FilePath="open-index-enforced/index-leftouterjoin">
         <compilation-unit name="probe-pidx-with-join-btree-sidx1">
           <output-dir compare="Text">probe-pidx-with-join-btree-sidx1</output-dir>
@@ -3227,7 +3227,7 @@
         </compilation-unit>
       </test-case>
     </test-group>
-    <test-group FilePath="open-index-enforced/index-selection">
+    <test-group name="open-index-enforced/index-selection">
       <test-case FilePath="open-index-enforced/index-selection">
         <compilation-unit name="btree-index-composite-key">
           <output-dir compare="Text">btree-index-composite-key</output-dir>
@@ -3328,7 +3328,7 @@
     </test-group>
   </test-group>
   <test-group name="nested-open-index">
-    <test-group FilePath="nested-open-index/index-join">
+    <test-group name="nested-open-index/index-join">
       <test-case FilePath="nested-open-index/index-join">
         <compilation-unit name="btree-secondary-equi-join">
           <output-dir compare="Text">btree-secondary-equi-join</output-dir>
@@ -3370,7 +3370,7 @@
         </compilation-unit>
       </test-case>
     </test-group>
-    <test-group FilePath="nested-open-index/index-leftouterjoin">
+    <test-group name="nested-open-index/index-leftouterjoin">
       <test-case FilePath="nested-open-index/index-leftouterjoin">
         <compilation-unit name="probe-pidx-with-join-btree-sidx1">
           <output-dir compare="Text">probe-pidx-with-join-btree-sidx1</output-dir>
@@ -3402,7 +3402,7 @@
         </compilation-unit>
       </test-case>
     </test-group>
-    <test-group FilePath="nested-open-index/index-selection">
+    <test-group name="nested-open-index/index-selection">
       <test-case FilePath="nested-open-index/index-selection">
         <compilation-unit name="btree-index-composite-key">
           <output-dir compare="Text">btree-index-composite-key</output-dir>
@@ -3525,7 +3525,7 @@
     </test-group>
   </test-group>
   <test-group name="nested-index">
-    <test-group FilePath="nested-index/index-join">
+    <test-group name="nested-index/index-join">
       <test-case FilePath="nested-index/index-join">
         <compilation-unit name="btree-primary-equi-join">
           <output-dir compare="Text">btree-primary-equi-join</output-dir>
@@ -3572,7 +3572,7 @@
         </compilation-unit>
       </test-case>
     </test-group>
-    <test-group FilePath="nested-index/index-leftouterjoin">
+    <test-group name="nested-index/index-leftouterjoin">
       <test-case FilePath="nested-index/index-leftouterjoin">
         <compilation-unit name="probe-pidx-with-join-btree-sidx1">
           <output-dir compare="Text">probe-pidx-with-join-btree-sidx1</output-dir>
@@ -3604,7 +3604,7 @@
         </compilation-unit>
       </test-case>
     </test-group>
-    <test-group FilePath="nested-index/index-selection">
+    <test-group name="nested-index/index-selection">
       <test-case FilePath="nested-index/index-selection">
         <compilation-unit name="btree-index-composite-key">
           <output-dir compare="Text">btree-index-composite-key</output-dir>
@@ -4575,7 +4575,7 @@
     <test-case FilePath="open-closed">
       <compilation-unit name="query-issue410">
         <output-dir compare="Text">query-issue410</output-dir>
-        <expected-error>HyracksDataException: ASX0000: Field type DOUBLE can't be promoted to type STRING</expected-error>
+        <expected-error>ASX0000: Field type DOUBLE can't be promoted to type STRING</expected-error>
       </compilation-unit>
     </test-case>
     <test-case FilePath="open-closed">
@@ -7099,19 +7099,19 @@
     <test-case FilePath="load">
       <compilation-unit name="csv_05">
         <output-dir compare="Text">csv_05</output-dir>
-        <expected-error>java.io.IOException: At record:</expected-error>
+        <expected-error>At record: 1, field#: 4 - a quote enclosing a field needs to be placed in the beginning of that field</expected-error>
       </compilation-unit>
     </test-case>
     <test-case FilePath="load">
       <compilation-unit name="csv_06">
         <output-dir compare="Text">csv_06</output-dir>
-        <expected-error>java.io.IOException: At record:</expected-error>
+        <expected-error>At record: 1, field#: 3 - a quote enclosing a field needs to be placed in the beginning of that field</expected-error>
       </compilation-unit>
     </test-case>
     <test-case FilePath="load">
       <compilation-unit name="csv_07">
         <output-dir compare="Text">csv_07</output-dir>
-        <expected-error>java.io.IOException: At record:</expected-error>
+        <expected-error>At record: 1, field#: 3 -  A quote enclosing a field needs to be followed by the delimiter</expected-error>
       </compilation-unit>
     </test-case>
     <test-case FilePath="load">
@@ -7149,7 +7149,7 @@
     <test-case FilePath="load">
       <compilation-unit name="issue650_query">
         <output-dir compare="Text">none</output-dir>
-        <expected-error>org.apache.hyracks.algebricks.common.exceptions.AlgebricksException: Unable to load dataset Users since it does not exist</expected-error>
+        <expected-error>Unable to load dataset Users since it does not exist</expected-error>
       </compilation-unit>
     </test-case>
     <test-case FilePath="load">
@@ -7181,7 +7181,7 @@
     <test-case FilePath="load">
       <compilation-unit name="duplicate-key-error">
         <output-dir compare="Text">none</output-dir>
-        <expected-error>org.apache.hyracks.api.exceptions.HyracksException</expected-error>
+        <expected-error>Input stream given to BTree bulk load has duplicates</expected-error>
       </compilation-unit>
     </test-case>
     <test-case FilePath="load">
diff --git a/asterixdb/asterix-common/src/test/java/org/apache/asterix/test/aql/ResultExtractor.java b/asterixdb/asterix-common/src/test/java/org/apache/asterix/test/aql/ResultExtractor.java
index efadc8d..d99602f 100644
--- a/asterixdb/asterix-common/src/test/java/org/apache/asterix/test/aql/ResultExtractor.java
+++ b/asterixdb/asterix-common/src/test/java/org/apache/asterix/test/aql/ResultExtractor.java
@@ -48,10 +48,13 @@
         tokener.next('{');
         String name;
         String type = null;
+        String status = null;
         String results = "";
         while ((name = getFieldName(tokener)) != null) {
-            if ("requestID".equals(name) || "signature".equals(name) || "status".equals(name)) {
+            if ("requestID".equals(name) || "signature".equals(name)) {
                 getStringField(tokener);
+            } else if ("status".equals(name)) {
+                status = getStringField(tokener);
             } else if ("type".equals(name)) {
                 type = getStringField(tokener);
             } else if ("metrics".equals(name)) {
@@ -72,6 +75,9 @@
             // skip along
         }
         tokener.next('}');
+        if (! "success".equals(status)) {
+            throw new Exception("Unexpected status: '" + status + "'");
+        }
         return IOUtils.toInputStream(results);
     }
 
diff --git a/asterixdb/asterix-common/src/test/java/org/apache/asterix/test/aql/TestExecutor.java b/asterixdb/asterix-common/src/test/java/org/apache/asterix/test/aql/TestExecutor.java
index 47290e7..1a785eb 100644
--- a/asterixdb/asterix-common/src/test/java/org/apache/asterix/test/aql/TestExecutor.java
+++ b/asterixdb/asterix-common/src/test/java/org/apache/asterix/test/aql/TestExecutor.java
@@ -30,6 +30,7 @@
 import java.lang.reflect.InvocationTargetException;
 import java.lang.reflect.Method;
 import java.nio.charset.StandardCharsets;
+import java.util.ArrayList;
 import java.util.Arrays;
 import java.util.HashMap;
 import java.util.HashSet;
@@ -270,8 +271,8 @@
             String errorBody = EntityUtils.toString(httpResponse.getEntity());
             try {
                 JSONObject result = new JSONObject(errorBody);
-                String[] errors = {result.getJSONArray("error-code").getString(0), result.getString("summary"),
-                        result.getString("stacktrace")};
+                String[] errors = { result.getJSONArray("error-code").getString(0), result.getString("summary"),
+                        result.getString("stacktrace") };
                 GlobalConfig.ASTERIX_LOGGER.log(Level.SEVERE, errors[2]);
                 String exceptionMsg = "HTTP operation failed: " + errors[0]
                         + "\nSTATUS LINE: " + httpResponse.getStatusLine()
@@ -295,6 +296,10 @@
         method.setHeader("Accept", fmt.mimeType());
         HttpResponse response = executeHttpRequest(method);
         return response.getEntity().getContent();
+    }
+
+    public InputStream executeQueryService(String str, String url) throws Exception {
+        return executeQueryService(str, OutputFormat.CLEAN_JSON, url, new ArrayList<>());
     }
 
     public InputStream executeQueryService(String str, OutputFormat fmt, String url,
@@ -399,7 +404,7 @@
     }
 
     private InputStream getHandleResult(String handle, OutputFormat fmt) throws Exception {
-        final String url = "http://" + host + ":" + port + getPath(Servlets.QUERY_RESULT);
+        final String url = getEndpoint(Servlets.QUERY_RESULT);
 
         // Create a method instance.
         HttpUriRequest request = RequestBuilder.get(url)
@@ -514,7 +519,6 @@
         executeTest(actualPath, testCaseCtx, pb, isDmlRecoveryTest, null);
     }
 
-
     public void executeTest(TestCaseContext testCaseCtx, TestFileContext ctx, String statement,
             boolean isDmlRecoveryTest, ProcessBuilder pb, CompilationUnit cUnit, MutableInt queryCount,
             List<TestFileContext> expectedResultFileCtxs, File testFile, String actualPath) throws Exception {
@@ -524,9 +528,10 @@
         switch (ctx.getType()) {
             case "ddl":
                 if (ctx.getFile().getName().endsWith("aql")) {
-                    executeDDL(statement, "http://" + host + ":" + port + getPath(Servlets.AQL_DDL));
+                    executeDDL(statement, getEndpoint(Servlets.AQL_DDL));
                 } else {
-                    executeDDL(statement, "http://" + host + ":" + port + getPath(Servlets.SQLPP_DDL));
+                    InputStream resultStream = executeQueryService(statement, getEndpoint(Servlets.QUERY_SERVICE));
+                    ResultExtractor.extract(resultStream);
                 }
                 break;
             case "update":
@@ -535,9 +540,10 @@
                     statement = statement.replaceAll("nc1://", "127.0.0.1://../../../../../../asterix-app/");
                 }
                 if (ctx.getFile().getName().endsWith("aql")) {
-                    executeUpdate(statement, "http://" + host + ":" + port + getPath(Servlets.AQL_UPDATE));
+                    executeUpdate(statement, getEndpoint(Servlets.AQL_UPDATE));
                 } else {
-                    executeUpdate(statement, "http://" + host + ":" + port + getPath(Servlets.SQLPP_UPDATE));
+                    InputStream resultStream = executeQueryService(statement, getEndpoint(Servlets.QUERY_SERVICE));
+                    ResultExtractor.extract(resultStream);
                 }
                 break;
             case "query":
@@ -554,26 +560,22 @@
                 OutputFormat fmt = OutputFormat.forCompilationUnit(cUnit);
                 if (ctx.getFile().getName().endsWith("aql")) {
                     if (ctx.getType().equalsIgnoreCase("query")) {
-                        resultStream = executeQuery(statement, fmt,
-                                "http://" + host + ":" + port + getPath(Servlets.AQL_QUERY), cUnit.getParameter());
+                        resultStream = executeQuery(statement, fmt, getEndpoint(Servlets.AQL_QUERY),
+                                cUnit.getParameter());
                     } else if (ctx.getType().equalsIgnoreCase("async")) {
-                        resultStream = executeAnyAQLAsync(statement, false, fmt,
-                                "http://" + host + ":" + port + getPath(Servlets.AQL));
+                        resultStream = executeAnyAQLAsync(statement, false, fmt, getEndpoint(Servlets.AQL));
                     } else if (ctx.getType().equalsIgnoreCase("asyncdefer")) {
-                        resultStream = executeAnyAQLAsync(statement, true, fmt,
-                                "http://" + host + ":" + port + getPath(Servlets.AQL));
+                        resultStream = executeAnyAQLAsync(statement, true, fmt, getEndpoint(Servlets.AQL));
                     }
                 } else {
                     if (ctx.getType().equalsIgnoreCase("query")) {
-                        resultStream = executeQueryService(statement, fmt,
-                                "http://" + host + ":" + port + getPath(Servlets.QUERY_SERVICE), cUnit.getParameter());
+                        resultStream = executeQueryService(statement, fmt, getEndpoint(Servlets.QUERY_SERVICE),
+                                cUnit.getParameter());
                         resultStream = ResultExtractor.extract(resultStream);
                     } else if (ctx.getType().equalsIgnoreCase("async")) {
-                        resultStream = executeAnyAQLAsync(statement, false, fmt,
-                                "http://" + host + ":" + port + getPath(Servlets.SQLPP));
+                        resultStream = executeAnyAQLAsync(statement, false, fmt, getEndpoint(Servlets.SQLPP));
                     } else if (ctx.getType().equalsIgnoreCase("asyncdefer")) {
-                        resultStream = executeAnyAQLAsync(statement, true, fmt,
-                                "http://" + host + ":" + port + getPath(Servlets.SQLPP));
+                        resultStream = executeAnyAQLAsync(statement, true, fmt, getEndpoint(Servlets.SQLPP));
                     }
                 }
                 if (queryCount.intValue() >= expectedResultFileCtxs.size()) {
@@ -599,14 +601,14 @@
                 break;
             case "txnqbc": // qbc represents query before crash
                 resultStream = executeQuery(statement, OutputFormat.forCompilationUnit(cUnit),
-                        "http://" + host + ":" + port + getPath(Servlets.AQL_QUERY), cUnit.getParameter());
+                        getEndpoint(Servlets.AQL_QUERY), cUnit.getParameter());
                 qbcFile = getTestCaseQueryBeforeCrashFile(actualPath, testCaseCtx, cUnit);
                 qbcFile.getParentFile().mkdirs();
                 writeOutputToFile(qbcFile, resultStream);
                 break;
             case "txnqar": // qar represents query after recovery
                 resultStream = executeQuery(statement, OutputFormat.forCompilationUnit(cUnit),
-                        "http://" + host + ":" + port + getPath(Servlets.AQL_QUERY), cUnit.getParameter());
+                        getEndpoint(Servlets.AQL_QUERY), cUnit.getParameter());
                 File qarFile = new File(actualPath + File.separator
                         + testCaseCtx.getTestCase().getFilePath().replace(File.separator, "_") + "_" + cUnit.getName()
                         + "_qar.adm");
@@ -617,7 +619,7 @@
                 break;
             case "txneu": // eu represents erroneous update
                 try {
-                    executeUpdate(statement, "http://" + host + ":" + port + getPath(Servlets.AQL_UPDATE));
+                    executeUpdate(statement, getEndpoint(Servlets.AQL_UPDATE));
                 } catch (Exception e) {
                     // An exception is expected.
                     failed = true;
@@ -645,7 +647,7 @@
                 break;
             case "errddl": // a ddlquery that expects error
                 try {
-                    executeDDL(statement, "http://" + host + ":" + port + getPath(Servlets.AQL_DDL));
+                    executeDDL(statement, getEndpoint(Servlets.AQL_DDL));
                 } catch (Exception e) {
                     // expected error happens
                     failed = true;
@@ -685,8 +687,7 @@
             case "cstate": // cluster state query
                 try {
                     fmt = OutputFormat.forCompilationUnit(cUnit);
-                    resultStream = executeClusterStateQuery(fmt,
-                            "http://" + host + ":" + port + getPath(Servlets.CLUSTER_STATE));
+                    resultStream = executeClusterStateQuery(fmt, getEndpoint(Servlets.CLUSTER_STATE));
                     expectedResultFile = expectedResultFileCtxs.get(queryCount.intValue()).getFile();
                     actualResultFile = testCaseCtx.getActualResultFile(cUnit, expectedResultFile, new File(actualPath));
                     actualResultFile.getParentFile().mkdirs();
@@ -850,7 +851,7 @@
                         + cUnit.getName() + "_qbc.adm");
     }
 
-    protected String getPath(Servlets servlet) {
-        return servlet.getPath();
+    protected String getEndpoint(Servlets servlet) {
+        return "http://" + host + ":" + port + servlet.getPath();
     }
 }
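
For readers skimming the patch, the behavioral core is the new guard in ResultExtractor: a response from the query service is only passed through when its "status" field equals "success". The standalone sketch below (not part of the patch) mirrors that guard against a hand-written response body. Only the field names ("requestID", "signature", "status", "results") are taken from the diff above; the envelope layout and the use of org.json here are illustrative assumptions.

    import org.json.JSONObject;

    // Minimal sketch (not from the patch) of the status guard added to
    // ResultExtractor.extract(...). The sample envelope is hand-written;
    // only the field names come from the diff above.
    public class StatusCheckSketch {

        static String extractResults(String body) throws Exception {
            JSONObject response = new JSONObject(body);
            String status = response.optString("status", null);
            if (!"success".equals(status)) {
                // Same failure mode as the patched extractor: surface the status
                // instead of returning an empty result stream.
                throw new Exception("Unexpected status: '" + status + "'");
            }
            return response.optString("results", "");
        }

        public static void main(String[] args) throws Exception {
            String ok = "{\"requestID\":\"1\",\"signature\":\"*\",\"status\":\"success\",\"results\":\"[ 1 ]\"}";
            System.out.println(extractResults(ok));
            // A body whose "status" is anything other than "success" would make
            // extractResults throw, which is what turns a failed SQL++ DDL or
            // update statement routed through the query service into a test failure.
        }
    }

With that guard in place, the DDL and update branches above that now call executeQueryService(...) followed by ResultExtractor.extract(...) fail fast on a non-success response instead of silently passing.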

-- 
To view, visit https://asterix-gerrit.ics.uci.edu/1127
To unsubscribe, visit https://asterix-gerrit.ics.uci.edu/settings

Gerrit-MessageType: newchange
Gerrit-Change-Id: Ia36967386183777c6d840634850a8f8fc2ea4411
Gerrit-PatchSet: 1
Gerrit-Project: asterixdb
Gerrit-Branch: master
Gerrit-Owner: Till Westmann <ti...@apache.org>

Change in asterixdb[master]: Update SQL++ test harness to use new HTTP API

Posted by "Jenkins (Code Review)" <do...@asterixdb.incubator.apache.org>.
Jenkins has posted comments on this change.

Change subject: Update SQL++ test harness to use new HTTP API
......................................................................


Patch Set 1: Integration-Tests-1

Integration Tests Failed

https://asterix-jenkins.ics.uci.edu/job/asterix-gerrit-integration-tests/503/ : UNSTABLE

-- 
To view, visit https://asterix-gerrit.ics.uci.edu/1127
To unsubscribe, visit https://asterix-gerrit.ics.uci.edu/settings

Gerrit-MessageType: comment
Gerrit-Change-Id: Ia36967386183777c6d840634850a8f8fc2ea4411
Gerrit-PatchSet: 1
Gerrit-Project: asterixdb
Gerrit-Branch: master
Gerrit-Owner: Till Westmann <ti...@apache.org>
Gerrit-Reviewer: Jenkins <je...@fulliautomatix.ics.uci.edu>
Gerrit-HasComments: No

Change in asterixdb[master]: Update SQL++ test harness to use new HTTP API

Posted by "Yingyi Bu (Code Review)" <do...@asterixdb.incubator.apache.org>.
Yingyi Bu has posted comments on this change.

Change subject: Update SQL++ test harness to use new HTTP API
......................................................................


Patch Set 2: Code-Review+2

-- 
To view, visit https://asterix-gerrit.ics.uci.edu/1127
To unsubscribe, visit https://asterix-gerrit.ics.uci.edu/settings

Gerrit-MessageType: comment
Gerrit-Change-Id: Ia36967386183777c6d840634850a8f8fc2ea4411
Gerrit-PatchSet: 2
Gerrit-Project: asterixdb
Gerrit-Branch: master
Gerrit-Owner: Till Westmann <ti...@apache.org>
Gerrit-Reviewer: Jenkins <je...@fulliautomatix.ics.uci.edu>
Gerrit-Reviewer: Yingyi Bu <bu...@gmail.com>
Gerrit-HasComments: No

Change in asterixdb[master]: Update SQL++ test harness to use new HTTP API

Posted by "Till Westmann (Code Review)" <do...@asterixdb.incubator.apache.org>.
Till Westmann has submitted this change and it was merged.

Change subject: Update SQL++ test harness to use new HTTP API
......................................................................


Update SQL++ test harness to use new HTTP API

Change-Id: Ia36967386183777c6d840634850a8f8fc2ea4411
Reviewed-on: https://asterix-gerrit.ics.uci.edu/1127
Sonar-Qube: Jenkins <je...@fulliautomatix.ics.uci.edu>
Tested-by: Jenkins <je...@fulliautomatix.ics.uci.edu>
Integration-Tests: Jenkins <je...@fulliautomatix.ics.uci.edu>
Reviewed-by: Yingyi Bu <bu...@gmail.com>
---
M asterixdb/asterix-app/src/test/java/org/apache/asterix/test/runtime/SqlppExecutionTest.java
M asterixdb/asterix-app/src/test/resources/runtimets/testsuite_sqlpp.xml
M asterixdb/asterix-common/src/test/java/org/apache/asterix/test/aql/ResultExtractor.java
M asterixdb/asterix-common/src/test/java/org/apache/asterix/test/aql/TestExecutor.java
4 files changed, 57 insertions(+), 50 deletions(-)

Approvals:
  Yingyi Bu: Looks good to me, approved
  Jenkins: Verified; No violations found; Verified



diff --git a/asterixdb/asterix-app/src/test/java/org/apache/asterix/test/runtime/SqlppExecutionTest.java b/asterixdb/asterix-app/src/test/java/org/apache/asterix/test/runtime/SqlppExecutionTest.java
index 84542d0..84e644b 100644
--- a/asterixdb/asterix-app/src/test/java/org/apache/asterix/test/runtime/SqlppExecutionTest.java
+++ b/asterixdb/asterix-app/src/test/java/org/apache/asterix/test/runtime/SqlppExecutionTest.java
@@ -86,7 +86,6 @@
             testArgs.add(new Object[] { ctx });
         }
         return testArgs;
-
     }
 
     protected TestCaseContext tcCtx;
diff --git a/asterixdb/asterix-app/src/test/resources/runtimets/testsuite_sqlpp.xml b/asterixdb/asterix-app/src/test/resources/runtimets/testsuite_sqlpp.xml
index b48a99c..01a036c 100644
--- a/asterixdb/asterix-app/src/test/resources/runtimets/testsuite_sqlpp.xml
+++ b/asterixdb/asterix-app/src/test/resources/runtimets/testsuite_sqlpp.xml
@@ -1660,7 +1660,7 @@
     <test-case FilePath="dml">
       <compilation-unit name="insert-duplicated-keys">
         <output-dir compare="Text">insert-duplicated-keys</output-dir>
-        <expected-error>org.apache.hyracks.storage.am.common.exceptions.TreeIndexDuplicateKeyException: Failed to insert key since key already exists</expected-error>
+        <expected-error>Failed to insert key since key already exists</expected-error>
       </compilation-unit>
     </test-case>
     <test-case FilePath="dml">
@@ -1683,7 +1683,7 @@
     <test-case FilePath="dml">
       <compilation-unit name="insert-with-autogenerated-pk_adm_02">
         <output-dir compare="Text">insert-with-autogenerated-pk_adm_02</output-dir>
-        <expected-error>org.apache.hyracks.algebricks.common.exceptions.AlgebricksException: Duplicate field id encountered</expected-error>
+        <expected-error>Duplicate field id encountered</expected-error>
       </compilation-unit>
     </test-case>
     <test-case FilePath="dml">
@@ -1704,13 +1704,13 @@
     <test-case FilePath="dml">
       <compilation-unit name="load-with-autogenerated-pk_adm_02">
         <output-dir compare="Text">load-with-autogenerated-pk_adm_02</output-dir>
-        <expected-error>org.apache.asterix.external.parser.ADMDataParser$ParseException</expected-error>
+        <expected-error>Parse error at (0, 5): This record is closed, you can not add extra fields! new field name: id</expected-error>
       </compilation-unit>
     </test-case>
     <test-case FilePath="dml">
       <compilation-unit name="load-with-autogenerated-pk_adm_03">
         <output-dir compare="Text">load-with-autogenerated-pk_adm_03</output-dir>
-        <expected-error>org.apache.asterix.external.parser.ADMDataParser$ParseException</expected-error>
+        <expected-error>Parse error at (0, 5): This record is closed, you can not add extra fields! new field name: id</expected-error>
       </compilation-unit>
     </test-case>
     <test-case FilePath="dml">
@@ -3121,7 +3121,7 @@
     </test-case>
   </test-group>
   <test-group name="open-index-enforced">
-    <test-group FilePath="open-index-enforced/error-checking">
+    <test-group name="open-index-enforced/error-checking">
       <test-case FilePath="open-index-enforced/error-checking">
         <compilation-unit name="enforced-field-name-collision">
           <output-dir compare="Text">enforced-field-name-collision</output-dir>
@@ -3131,13 +3131,13 @@
       <test-case FilePath="open-index-enforced/error-checking">
         <compilation-unit name="enforced-field-type-collision">
           <output-dir compare="Text">enforced-field-type-collision</output-dir>
-          <expected-error>Error: A field "[value]" is already defined with the type "STRING"</expected-error>
+          <expected-error>A field &quot;[value]&quot; is already defined with the type &quot;STRING&quot;</expected-error>
         </compilation-unit>
       </test-case>
       <test-case FilePath="open-index-enforced/error-checking">
         <compilation-unit name="missing-enforce-statement">
           <output-dir compare="Text">missing-enforce-statement</output-dir>
-          <expected-error>org.apache.hyracks.algebricks.common.exceptions.AlgebricksException: Cannot create typed index on "[value]" field without enforcing it's type</expected-error>
+          <expected-error>Cannot create typed index on "[value]" field without enforcing it's type</expected-error>
         </compilation-unit>
       </test-case>
       <test-case FilePath="open-index-enforced/error-checking">
@@ -3149,11 +3149,11 @@
       <test-case FilePath="open-index-enforced/error-checking">
         <compilation-unit name="index-on-closed-type">
           <output-dir compare="Text">index-on-closed-type</output-dir>
-          <expected-error>org.apache.hyracks.algebricks.common.exceptions.AlgebricksException: Typed index on "[value]" field could be created only for open datatype</expected-error>
+          <expected-error>Typed index on "[value]" field could be created only for open datatype</expected-error>
         </compilation-unit>
       </test-case>
     </test-group>
-    <test-group FilePath="open-index-enforced/index-join">
+    <test-group name="open-index-enforced/index-join">
       <test-case FilePath="open-index-enforced/index-join">
         <compilation-unit name="btree-secondary-equi-join">
           <output-dir compare="Text">btree-secondary-equi-join</output-dir>
@@ -3195,7 +3195,7 @@
         </compilation-unit>
       </test-case>
     </test-group>
-    <test-group FilePath="open-index-enforced/index-leftouterjoin">
+    <test-group name="open-index-enforced/index-leftouterjoin">
       <test-case FilePath="open-index-enforced/index-leftouterjoin">
         <compilation-unit name="probe-pidx-with-join-btree-sidx1">
           <output-dir compare="Text">probe-pidx-with-join-btree-sidx1</output-dir>
@@ -3227,7 +3227,7 @@
         </compilation-unit>
       </test-case>
     </test-group>
-    <test-group FilePath="open-index-enforced/index-selection">
+    <test-group name="open-index-enforced/index-selection">
       <test-case FilePath="open-index-enforced/index-selection">
         <compilation-unit name="btree-index-composite-key">
           <output-dir compare="Text">btree-index-composite-key</output-dir>
@@ -3328,7 +3328,7 @@
     </test-group>
   </test-group>
   <test-group name="nested-open-index">
-    <test-group FilePath="nested-open-index/index-join">
+    <test-group name="nested-open-index/index-join">
       <test-case FilePath="nested-open-index/index-join">
         <compilation-unit name="btree-secondary-equi-join">
           <output-dir compare="Text">btree-secondary-equi-join</output-dir>
@@ -3370,7 +3370,7 @@
         </compilation-unit>
       </test-case>
     </test-group>
-    <test-group FilePath="nested-open-index/index-leftouterjoin">
+    <test-group name="nested-open-index/index-leftouterjoin">
       <test-case FilePath="nested-open-index/index-leftouterjoin">
         <compilation-unit name="probe-pidx-with-join-btree-sidx1">
           <output-dir compare="Text">probe-pidx-with-join-btree-sidx1</output-dir>
@@ -3402,7 +3402,7 @@
         </compilation-unit>
       </test-case>
     </test-group>
-    <test-group FilePath="nested-open-index/index-selection">
+    <test-group name="nested-open-index/index-selection">
       <test-case FilePath="nested-open-index/index-selection">
         <compilation-unit name="btree-index-composite-key">
           <output-dir compare="Text">btree-index-composite-key</output-dir>
@@ -3525,7 +3525,7 @@
     </test-group>
   </test-group>
   <test-group name="nested-index">
-    <test-group FilePath="nested-index/index-join">
+    <test-group name="nested-index/index-join">
       <test-case FilePath="nested-index/index-join">
         <compilation-unit name="btree-primary-equi-join">
           <output-dir compare="Text">btree-primary-equi-join</output-dir>
@@ -3572,7 +3572,7 @@
         </compilation-unit>
       </test-case>
     </test-group>
-    <test-group FilePath="nested-index/index-leftouterjoin">
+    <test-group name="nested-index/index-leftouterjoin">
       <test-case FilePath="nested-index/index-leftouterjoin">
         <compilation-unit name="probe-pidx-with-join-btree-sidx1">
           <output-dir compare="Text">probe-pidx-with-join-btree-sidx1</output-dir>
@@ -3604,7 +3604,7 @@
         </compilation-unit>
       </test-case>
     </test-group>
-    <test-group FilePath="nested-index/index-selection">
+    <test-group name="nested-index/index-selection">
       <test-case FilePath="nested-index/index-selection">
         <compilation-unit name="btree-index-composite-key">
           <output-dir compare="Text">btree-index-composite-key</output-dir>
@@ -4575,7 +4575,7 @@
     <test-case FilePath="open-closed">
       <compilation-unit name="query-issue410">
         <output-dir compare="Text">query-issue410</output-dir>
-        <expected-error>HyracksDataException: ASX0000: Field type DOUBLE can't be promoted to type STRING</expected-error>
+        <expected-error>ASX0000: Field type DOUBLE can't be promoted to type STRING</expected-error>
       </compilation-unit>
     </test-case>
     <test-case FilePath="open-closed">
@@ -7099,19 +7099,19 @@
     <test-case FilePath="load">
       <compilation-unit name="csv_05">
         <output-dir compare="Text">csv_05</output-dir>
-        <expected-error>java.io.IOException: At record:</expected-error>
+        <expected-error>At record: 1, field#: 4 - a quote enclosing a field needs to be placed in the beginning of that field</expected-error>
       </compilation-unit>
     </test-case>
     <test-case FilePath="load">
       <compilation-unit name="csv_06">
         <output-dir compare="Text">csv_06</output-dir>
-        <expected-error>java.io.IOException: At record:</expected-error>
+        <expected-error>At record: 1, field#: 3 - a quote enclosing a field needs to be placed in the beginning of that field</expected-error>
       </compilation-unit>
     </test-case>
     <test-case FilePath="load">
       <compilation-unit name="csv_07">
         <output-dir compare="Text">csv_07</output-dir>
-        <expected-error>java.io.IOException: At record:</expected-error>
+        <expected-error>At record: 1, field#: 3 -  A quote enclosing a field needs to be followed by the delimiter</expected-error>
       </compilation-unit>
     </test-case>
     <test-case FilePath="load">
@@ -7149,7 +7149,7 @@
     <test-case FilePath="load">
       <compilation-unit name="issue650_query">
         <output-dir compare="Text">none</output-dir>
-        <expected-error>org.apache.hyracks.algebricks.common.exceptions.AlgebricksException: Unable to load dataset Users since it does not exist</expected-error>
+        <expected-error>Unable to load dataset Users since it does not exist</expected-error>
       </compilation-unit>
     </test-case>
     <test-case FilePath="load">
@@ -7181,7 +7181,7 @@
     <test-case FilePath="load">
       <compilation-unit name="duplicate-key-error">
         <output-dir compare="Text">none</output-dir>
-        <expected-error>org.apache.hyracks.api.exceptions.HyracksException</expected-error>
+        <expected-error>Input stream given to BTree bulk load has duplicates</expected-error>
       </compilation-unit>
     </test-case>
     <test-case FilePath="load">
diff --git a/asterixdb/asterix-common/src/test/java/org/apache/asterix/test/aql/ResultExtractor.java b/asterixdb/asterix-common/src/test/java/org/apache/asterix/test/aql/ResultExtractor.java
index efadc8d..d99602f 100644
--- a/asterixdb/asterix-common/src/test/java/org/apache/asterix/test/aql/ResultExtractor.java
+++ b/asterixdb/asterix-common/src/test/java/org/apache/asterix/test/aql/ResultExtractor.java
@@ -48,10 +48,13 @@
         tokener.next('{');
         String name;
         String type = null;
+        String status = null;
         String results = "";
         while ((name = getFieldName(tokener)) != null) {
-            if ("requestID".equals(name) || "signature".equals(name) || "status".equals(name)) {
+            if ("requestID".equals(name) || "signature".equals(name)) {
                 getStringField(tokener);
+            } else if ("status".equals(name)) {
+                status = getStringField(tokener);
             } else if ("type".equals(name)) {
                 type = getStringField(tokener);
             } else if ("metrics".equals(name)) {
@@ -72,6 +75,9 @@
             // skip along
         }
         tokener.next('}');
+        if (! "success".equals(status)) {
+            throw new Exception("Unexpected status: '" + status + "'");
+        }
         return IOUtils.toInputStream(results);
     }
 
diff --git a/asterixdb/asterix-common/src/test/java/org/apache/asterix/test/aql/TestExecutor.java b/asterixdb/asterix-common/src/test/java/org/apache/asterix/test/aql/TestExecutor.java
index 4dc206c..90b9d7e 100644
--- a/asterixdb/asterix-common/src/test/java/org/apache/asterix/test/aql/TestExecutor.java
+++ b/asterixdb/asterix-common/src/test/java/org/apache/asterix/test/aql/TestExecutor.java
@@ -30,6 +30,7 @@
 import java.lang.reflect.InvocationTargetException;
 import java.lang.reflect.Method;
 import java.nio.charset.StandardCharsets;
+import java.util.ArrayList;
 import java.util.Arrays;
 import java.util.HashMap;
 import java.util.HashSet;
@@ -297,6 +298,10 @@
         return response.getEntity().getContent();
     }
 
+    public InputStream executeQueryService(String str, String url) throws Exception {
+        return executeQueryService(str, OutputFormat.CLEAN_JSON, url, new ArrayList<>());
+    }
+
     public InputStream executeQueryService(String str, OutputFormat fmt, String url,
             List<CompilationUnit.Parameter> params) throws Exception {
         setFormatParam(params, fmt);
@@ -399,7 +404,7 @@
     }
 
     private InputStream getHandleResult(String handle, OutputFormat fmt) throws Exception {
-        final String url = "http://" + host + ":" + port + getPath(Servlets.QUERY_RESULT);
+        final String url = getEndpoint(Servlets.QUERY_RESULT);
 
         // Create a method instance.
         HttpUriRequest request = RequestBuilder.get(url)
@@ -523,9 +528,10 @@
         switch (ctx.getType()) {
             case "ddl":
                 if (ctx.getFile().getName().endsWith("aql")) {
-                    executeDDL(statement, "http://" + host + ":" + port + getPath(Servlets.AQL_DDL));
+                    executeDDL(statement, getEndpoint(Servlets.AQL_DDL));
                 } else {
-                    executeDDL(statement, "http://" + host + ":" + port + getPath(Servlets.SQLPP_DDL));
+                    InputStream resultStream = executeQueryService(statement, getEndpoint(Servlets.QUERY_SERVICE));
+                    ResultExtractor.extract(resultStream);
                 }
                 break;
             case "update":
@@ -534,9 +540,10 @@
                     statement = statement.replaceAll("nc1://", "127.0.0.1://../../../../../../asterix-app/");
                 }
                 if (ctx.getFile().getName().endsWith("aql")) {
-                    executeUpdate(statement, "http://" + host + ":" + port + getPath(Servlets.AQL_UPDATE));
+                    executeUpdate(statement, getEndpoint(Servlets.AQL_UPDATE));
                 } else {
-                    executeUpdate(statement, "http://" + host + ":" + port + getPath(Servlets.SQLPP_UPDATE));
+                    InputStream resultStream = executeQueryService(statement, getEndpoint(Servlets.QUERY_SERVICE));
+                    ResultExtractor.extract(resultStream);
                 }
                 break;
             case "query":
@@ -553,26 +560,22 @@
                 OutputFormat fmt = OutputFormat.forCompilationUnit(cUnit);
                 if (ctx.getFile().getName().endsWith("aql")) {
                     if (ctx.getType().equalsIgnoreCase("query")) {
-                        resultStream = executeQuery(statement, fmt,
-                                "http://" + host + ":" + port + getPath(Servlets.AQL_QUERY), cUnit.getParameter());
+                        resultStream = executeQuery(statement, fmt, getEndpoint(Servlets.AQL_QUERY),
+                                cUnit.getParameter());
                     } else if (ctx.getType().equalsIgnoreCase("async")) {
-                        resultStream = executeAnyAQLAsync(statement, false, fmt,
-                                "http://" + host + ":" + port + getPath(Servlets.AQL));
+                        resultStream = executeAnyAQLAsync(statement, false, fmt, getEndpoint(Servlets.AQL));
                     } else if (ctx.getType().equalsIgnoreCase("asyncdefer")) {
-                        resultStream = executeAnyAQLAsync(statement, true, fmt,
-                                "http://" + host + ":" + port + getPath(Servlets.AQL));
+                        resultStream = executeAnyAQLAsync(statement, true, fmt, getEndpoint(Servlets.AQL));
                     }
                 } else {
                     if (ctx.getType().equalsIgnoreCase("query")) {
-                        resultStream = executeQueryService(statement, fmt,
-                                "http://" + host + ":" + port + getPath(Servlets.QUERY_SERVICE), cUnit.getParameter());
+                        resultStream = executeQueryService(statement, fmt, getEndpoint(Servlets.QUERY_SERVICE),
+                                cUnit.getParameter());
                         resultStream = ResultExtractor.extract(resultStream);
                     } else if (ctx.getType().equalsIgnoreCase("async")) {
-                        resultStream = executeAnyAQLAsync(statement, false, fmt,
-                                "http://" + host + ":" + port + getPath(Servlets.SQLPP));
+                        resultStream = executeAnyAQLAsync(statement, false, fmt, getEndpoint(Servlets.SQLPP));
                     } else if (ctx.getType().equalsIgnoreCase("asyncdefer")) {
-                        resultStream = executeAnyAQLAsync(statement, true, fmt,
-                                "http://" + host + ":" + port + getPath(Servlets.SQLPP));
+                        resultStream = executeAnyAQLAsync(statement, true, fmt, getEndpoint(Servlets.SQLPP));
                     }
                 }
                 if (queryCount.intValue() >= expectedResultFileCtxs.size()) {
@@ -598,14 +601,14 @@
                 break;
             case "txnqbc": // qbc represents query before crash
                 resultStream = executeQuery(statement, OutputFormat.forCompilationUnit(cUnit),
-                        "http://" + host + ":" + port + getPath(Servlets.AQL_QUERY), cUnit.getParameter());
+                        getEndpoint(Servlets.AQL_QUERY), cUnit.getParameter());
                 qbcFile = getTestCaseQueryBeforeCrashFile(actualPath, testCaseCtx, cUnit);
                 qbcFile.getParentFile().mkdirs();
                 writeOutputToFile(qbcFile, resultStream);
                 break;
             case "txnqar": // qar represents query after recovery
                 resultStream = executeQuery(statement, OutputFormat.forCompilationUnit(cUnit),
-                        "http://" + host + ":" + port + getPath(Servlets.AQL_QUERY), cUnit.getParameter());
+                        getEndpoint(Servlets.AQL_QUERY), cUnit.getParameter());
                 File qarFile = new File(actualPath + File.separator
                         + testCaseCtx.getTestCase().getFilePath().replace(File.separator, "_") + "_" + cUnit.getName()
                         + "_qar.adm");
@@ -616,7 +619,7 @@
                 break;
             case "txneu": // eu represents erroneous update
                 try {
-                    executeUpdate(statement, "http://" + host + ":" + port + getPath(Servlets.AQL_UPDATE));
+                    executeUpdate(statement, getEndpoint(Servlets.AQL_UPDATE));
                 } catch (Exception e) {
                     // An exception is expected.
                     failed = true;
@@ -644,7 +647,7 @@
                 break;
             case "errddl": // a ddlquery that expects error
                 try {
-                    executeDDL(statement, "http://" + host + ":" + port + getPath(Servlets.AQL_DDL));
+                    executeDDL(statement, getEndpoint(Servlets.AQL_DDL));
                 } catch (Exception e) {
                     // expected error happens
                     failed = true;
@@ -684,8 +687,7 @@
             case "cstate": // cluster state query
                 try {
                     fmt = OutputFormat.forCompilationUnit(cUnit);
-                    resultStream = executeClusterStateQuery(fmt,
-                            "http://" + host + ":" + port + getPath(Servlets.CLUSTER_STATE));
+                    resultStream = executeClusterStateQuery(fmt, getEndpoint(Servlets.CLUSTER_STATE));
                     expectedResultFile = expectedResultFileCtxs.get(queryCount.intValue()).getFile();
                     actualResultFile = testCaseCtx.getActualResultFile(cUnit, expectedResultFile, new File(actualPath));
                     actualResultFile.getParentFile().mkdirs();
@@ -849,7 +851,7 @@
                         + cUnit.getName() + "_qbc.adm");
     }
 
-    protected String getPath(Servlets servlet) {
-        return servlet.getPath();
+    protected String getEndpoint(Servlets servlet) {
+        return "http://" + host + ":" + port + servlet.getPath();
     }
 }
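
The other mechanical change in the merged patch is the getEndpoint(Servlets) helper, which folds the repeated "http://" + host + ":" + port + getPath(...) concatenations into one place. The standalone sketch below illustrates that construction only; the host, port, and path values are placeholders for illustration (in the patch the path comes from servlet.getPath() on the Servlets enum), not values taken from the project's configuration.

    // Illustrative sketch of the URL construction consolidated by getEndpoint(...).
    // Host, port, and servlet path are placeholder values, not project settings.
    public class EndpointSketch {

        static String getEndpoint(String host, int port, String servletPath) {
            return "http://" + host + ":" + port + servletPath;
        }

        public static void main(String[] args) {
            // A query-service style path; the real path constant lives in the Servlets enum.
            System.out.println(getEndpoint("127.0.0.1", 19002, "/query/service"));
        }
    }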

-- 
To view, visit https://asterix-gerrit.ics.uci.edu/1127
To unsubscribe, visit https://asterix-gerrit.ics.uci.edu/settings

Gerrit-MessageType: merged
Gerrit-Change-Id: Ia36967386183777c6d840634850a8f8fc2ea4411
Gerrit-PatchSet: 3
Gerrit-Project: asterixdb
Gerrit-Branch: master
Gerrit-Owner: Till Westmann <ti...@apache.org>
Gerrit-Reviewer: Jenkins <je...@fulliautomatix.ics.uci.edu>
Gerrit-Reviewer: Till Westmann <ti...@apache.org>
Gerrit-Reviewer: Yingyi Bu <bu...@gmail.com>

Change in asterixdb[master]: Update SQL++ test harness to use new HTTP API

Posted by "Jenkins (Code Review)" <do...@asterixdb.incubator.apache.org>.
Jenkins has posted comments on this change.

Change subject: Update SQL++ test harness to use new HTTP API
......................................................................


Patch Set 1:

Build Started https://asterix-jenkins.ics.uci.edu/job/asterix-gerrit-notopic/2445/

-- 
To view, visit https://asterix-gerrit.ics.uci.edu/1127
To unsubscribe, visit https://asterix-gerrit.ics.uci.edu/settings

Gerrit-MessageType: comment
Gerrit-Change-Id: Ia36967386183777c6d840634850a8f8fc2ea4411
Gerrit-PatchSet: 1
Gerrit-Project: asterixdb
Gerrit-Branch: master
Gerrit-Owner: Till Westmann <ti...@apache.org>
Gerrit-Reviewer: Jenkins <je...@fulliautomatix.ics.uci.edu>
Gerrit-HasComments: No

Change in asterixdb[master]: Update SQL++ test harness to use new HTTP API

Posted by "Jenkins (Code Review)" <do...@asterixdb.incubator.apache.org>.
Jenkins has posted comments on this change.

Change subject: Update SQL++ test harness to use new HTTP API
......................................................................


Patch Set 2:

Build Started https://asterix-jenkins.ics.uci.edu/job/asterix-gerrit-notopic/2449/

-- 
To view, visit https://asterix-gerrit.ics.uci.edu/1127
To unsubscribe, visit https://asterix-gerrit.ics.uci.edu/settings

Gerrit-MessageType: comment
Gerrit-Change-Id: Ia36967386183777c6d840634850a8f8fc2ea4411
Gerrit-PatchSet: 2
Gerrit-Project: asterixdb
Gerrit-Branch: master
Gerrit-Owner: Till Westmann <ti...@apache.org>
Gerrit-Reviewer: Jenkins <je...@fulliautomatix.ics.uci.edu>
Gerrit-HasComments: No

Change in asterixdb[master]: Update SQL++ test harness to use new HTTP API

Posted by "Jenkins (Code Review)" <do...@asterixdb.incubator.apache.org>.
Jenkins has posted comments on this change.

Change subject: Update SQL++ test harness to use new HTTP API
......................................................................


Patch Set 2: Integration-Tests+1

Integration Tests Successful

https://asterix-jenkins.ics.uci.edu/job/asterix-gerrit-integration-tests/506/ : SUCCESS

-- 
To view, visit https://asterix-gerrit.ics.uci.edu/1127
To unsubscribe, visit https://asterix-gerrit.ics.uci.edu/settings

Gerrit-MessageType: comment
Gerrit-Change-Id: Ia36967386183777c6d840634850a8f8fc2ea4411
Gerrit-PatchSet: 2
Gerrit-Project: asterixdb
Gerrit-Branch: master
Gerrit-Owner: Till Westmann <ti...@apache.org>
Gerrit-Reviewer: Jenkins <je...@fulliautomatix.ics.uci.edu>
Gerrit-HasComments: No

Change in asterixdb[master]: Update SQL++ test harness to use new HTTP API

Posted by "Till Westmann (Code Review)" <do...@asterixdb.incubator.apache.org>.
Hello Jenkins,

I'd like you to reexamine a change.  Please visit

    https://asterix-gerrit.ics.uci.edu/1127

to look at the new patch set (#2).

Change subject: Update SQL++ test harness to use new HTTP API
......................................................................

Update SQL++ test harness to use new HTTP API

Change-Id: Ia36967386183777c6d840634850a8f8fc2ea4411
---
M asterixdb/asterix-app/src/test/java/org/apache/asterix/test/runtime/SqlppExecutionTest.java
M asterixdb/asterix-app/src/test/resources/runtimets/testsuite_sqlpp.xml
M asterixdb/asterix-common/src/test/java/org/apache/asterix/test/aql/ResultExtractor.java
M asterixdb/asterix-common/src/test/java/org/apache/asterix/test/aql/TestExecutor.java
4 files changed, 57 insertions(+), 50 deletions(-)


  git pull ssh://asterix-gerrit.ics.uci.edu:29418/asterixdb refs/changes/27/1127/2
-- 
To view, visit https://asterix-gerrit.ics.uci.edu/1127
To unsubscribe, visit https://asterix-gerrit.ics.uci.edu/settings

Gerrit-MessageType: newpatchset
Gerrit-Change-Id: Ia36967386183777c6d840634850a8f8fc2ea4411
Gerrit-PatchSet: 2
Gerrit-Project: asterixdb
Gerrit-Branch: master
Gerrit-Owner: Till Westmann <ti...@apache.org>
Gerrit-Reviewer: Jenkins <je...@fulliautomatix.ics.uci.edu>

Change in asterixdb[master]: Update SQL++ test harness to use new HTTP API

Posted by "Jenkins (Code Review)" <do...@asterixdb.incubator.apache.org>.
Jenkins has posted comments on this change.

Change subject: Update SQL++ test harness to use new HTTP API
......................................................................


Patch Set 2:

Integration Tests Started https://asterix-jenkins.ics.uci.edu/job/asterix-gerrit-integration-tests/506/

-- 
To view, visit https://asterix-gerrit.ics.uci.edu/1127
To unsubscribe, visit https://asterix-gerrit.ics.uci.edu/settings

Gerrit-MessageType: comment
Gerrit-Change-Id: Ia36967386183777c6d840634850a8f8fc2ea4411
Gerrit-PatchSet: 2
Gerrit-Project: asterixdb
Gerrit-Branch: master
Gerrit-Owner: Till Westmann <ti...@apache.org>
Gerrit-Reviewer: Jenkins <je...@fulliautomatix.ics.uci.edu>
Gerrit-HasComments: No

Change in asterixdb[master]: Update SQL++ test harness to use new HTTP API

Posted by "Jenkins (Code Review)" <do...@asterixdb.incubator.apache.org>.
Jenkins has posted comments on this change.

Change subject: Update SQL++ test harness to use new HTTP API
......................................................................


Patch Set 1:

Integration Tests Started https://asterix-jenkins.ics.uci.edu/job/asterix-gerrit-integration-tests/503/

-- 
To view, visit https://asterix-gerrit.ics.uci.edu/1127
To unsubscribe, visit https://asterix-gerrit.ics.uci.edu/settings

Gerrit-MessageType: comment
Gerrit-Change-Id: Ia36967386183777c6d840634850a8f8fc2ea4411
Gerrit-PatchSet: 1
Gerrit-Project: asterixdb
Gerrit-Branch: master
Gerrit-Owner: Till Westmann <ti...@apache.org>
Gerrit-Reviewer: Jenkins <je...@fulliautomatix.ics.uci.edu>
Gerrit-HasComments: No