Posted to commits@flink.apache.org by ch...@apache.org on 2019/05/09 12:31:44 UTC

[flink] branch travis_jdk9_test created (now 7d13faa)

This is an automated email from the ASF dual-hosted git repository.

chesnay pushed a change to branch travis_jdk9_test
in repository https://gitbox.apache.org/repos/asf/flink.git.


      at 7d13faa  disable container e2e tests

This branch includes the following new commits:

     new b3cc24c  Fully enable java 9
     new 5a37646  run all e2e tests
     new 4278094  disable kafka 0.10 test (FLINK-12224)
     new c930d25  disable kafka 0.11 e2e test
     new 69b0e41  disable kinesis e2e test
     new 293c550  retain existing MAVEN_OPTS
     new fe932c0  plus
     new b7f5a37  set hadoop version
     new 40ef7d3  bump zookeeper to Java9 compatible version
     new 6f30122  disable all container tests
     new 78a613b  deduplicate empty .out file check
     new 14b3ee3  [hotfix][tests] Remove forced step logging in AutoClosableProcess
     new 8b3106b  [hotfix][tests] Rework Process IO handling
     new 6e6f648  prevent concurrent checkpoints
     new 7d13faa  disable container e2e tests

The 15 revisions listed above as "new" are entirely new to this
repository and will be described in separate emails.  The revisions
listed as "add" were already present in the repository and have only
been added to this reference.



[flink] 03/15: disable kafka 0.10 test (FLINK-12224)


commit 4278094116bccbe32b1d56906666c3bb7937831b
Author: Chesnay Schepler <ch...@apache.org>
AuthorDate: Wed Apr 24 17:55:39 2019 +0200

    disable kafka 0.10 test (FLINK-12224)
---
 flink-end-to-end-tests/run-pre-commit-tests.sh | 2 +-
 1 file changed, 1 insertion(+), 1 deletion(-)

diff --git a/flink-end-to-end-tests/run-pre-commit-tests.sh b/flink-end-to-end-tests/run-pre-commit-tests.sh
index 3b81b3f..4a9881c 100755
--- a/flink-end-to-end-tests/run-pre-commit-tests.sh
+++ b/flink-end-to-end-tests/run-pre-commit-tests.sh
@@ -53,7 +53,7 @@ run_test "State Evolution end-to-end test" "$END_TO_END_DIR/test-scripts/test_st
 run_test "Batch Python Wordcount end-to-end test" "$END_TO_END_DIR/test-scripts/test_batch_python_wordcount.sh"
 run_test "Streaming Python Wordcount end-to-end test" "$END_TO_END_DIR/test-scripts/test_streaming_python_wordcount.sh"
 run_test "Wordcount end-to-end test" "$END_TO_END_DIR/test-scripts/test_batch_wordcount.sh"
-run_test "Kafka 0.10 end-to-end test" "$END_TO_END_DIR/test-scripts/test_streaming_kafka010.sh"
+#run_test "Kafka 0.10 end-to-end test" "$END_TO_END_DIR/test-scripts/test_streaming_kafka010.sh"
 run_test "Kafka 0.11 end-to-end test" "$END_TO_END_DIR/test-scripts/test_streaming_kafka011.sh"
 run_test "Modern Kafka end-to-end test" "$END_TO_END_DIR/test-scripts/test_streaming_kafka.sh"
 run_test "Kinesis end-to-end test" "$END_TO_END_DIR/test-scripts/test_streaming_kinesis.sh"


[flink] 11/15: deduplicate empty .out file check


commit 78a613b5c38899b0fe0cbf76acaf7cd405fd3847
Author: Chesnay Schepler <ch...@apache.org>
AuthorDate: Wed May 8 11:59:27 2019 +0200

    deduplicate empty .out file check
---
 flink-end-to-end-tests/test-scripts/common_ha.sh                  | 8 ++++----
 .../test-scripts/test_ha_per_job_cluster_datastream.sh            | 6 ++----
 2 files changed, 6 insertions(+), 8 deletions(-)

diff --git a/flink-end-to-end-tests/test-scripts/common_ha.sh b/flink-end-to-end-tests/test-scripts/common_ha.sh
index 6521cb4..154ddb9 100644
--- a/flink-end-to-end-tests/test-scripts/common_ha.sh
+++ b/flink-end-to-end-tests/test-scripts/common_ha.sh
@@ -18,6 +18,8 @@
 # limitations under the License.
 ################################################################################
 
+source "${END_TO_END_DIR}"/test-scripts/common.sh
+
 # flag indicating if we have already cleared up things after a test
 CLEARED=0
 
@@ -59,10 +61,8 @@ function verify_logs() {
     local VERIFY_CHECKPOINTS=$2
 
     # verify that we have no alerts
-    if ! [ `cat ${OUTPUT} | wc -l` -eq 0 ]; then
-        echo "FAILURE: Alerts found at the general purpose job."
-        EXIT_CODE=1
-    fi
+    check_logs_for_non_empty_out_files
+    EXIT_CODE=$(($EXIT_CODE+$?))
 
     # checks that all apart from the first JM recover the failed jobgraph.
     if ! verify_num_occurences_in_logs 'standalonesession' 'Recovered SubmittedJobGraph' ${JM_FAILURES}; then
diff --git a/flink-end-to-end-tests/test-scripts/test_ha_per_job_cluster_datastream.sh b/flink-end-to-end-tests/test-scripts/test_ha_per_job_cluster_datastream.sh
index 61dd8bc..66a6381 100755
--- a/flink-end-to-end-tests/test-scripts/test_ha_per_job_cluster_datastream.sh
+++ b/flink-end-to-end-tests/test-scripts/test_ha_per_job_cluster_datastream.sh
@@ -69,10 +69,8 @@ function verify_logs_per_job() {
     local EXIT_CODE=0
 
     # verify that we have no alerts
-    if ! [ `cat ${OUTPUT} | wc -l` -eq 0 ]; then
-        echo "FAILURE: Alerts found at the general purpose job."
-        EXIT_CODE=1
-    fi
+    check_logs_for_non_empty_out_files
+    EXIT_CODE=$(($EXIT_CODE+$?))
 
     # checks that all apart from the first JM recover the failed jobgraph.
     if ! verify_num_occurences_in_logs 'standalonejob' 'Found 0 checkpoints in ZooKeeper' 1; then
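
The hunks above replace an inline line-count check with a shared helper sourced from common.sh, and fold the helper's exit status into EXIT_CODE. A minimal shell sketch of that accumulation pattern (the helper body here is a stand-in that simply simulates a failing check):

```shell
#!/usr/bin/env bash
# Stand-in for the shared helper from common.sh; here it just
# simulates a failed check by returning a non-zero status.
check_logs_for_non_empty_out_files() { return 1; }

EXIT_CODE=0
# Run the check, then fold its exit status into the accumulated code,
# exactly as the patched verify_logs functions do.
check_logs_for_non_empty_out_files
EXIT_CODE=$(($EXIT_CODE+$?))
echo "EXIT_CODE=$EXIT_CODE"
```

One property of this pattern: EXIT_CODE ends up non-zero if any check failed, while still letting the remaining checks run.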


[flink] 15/15: disable container e2e tests


commit 7d13faaa23b5bfb82839733b15af6d0207284553
Author: Chesnay Schepler <ch...@apache.org>
AuthorDate: Thu May 9 12:08:10 2019 +0200

    disable container e2e tests
---
 .travis.yml | 3 ---
 1 file changed, 3 deletions(-)

diff --git a/.travis.yml b/.travis.yml
index 82d205c..7a03168 100644
--- a/.travis.yml
+++ b/.travis.yml
@@ -119,8 +119,5 @@ jobs:
       script: ./tools/travis/nightly.sh split_checkpoints.sh
       name: checkpoints
     - env: PROFILE="-Dinclude-kinesis -Djdk9 -Dhadoop.version=2.8.3"
-      script: ./tools/travis/nightly.sh split_container.sh
-      name: container
-    - env: PROFILE="-Dinclude-kinesis -Djdk9 -Dhadoop.version=2.8.3"
       script: ./tools/travis/nightly.sh split_heavy.sh
       name: heavy


[flink] 08/15: set hadoop version


commit b7f5a377466f2ecd0f2ab504d1381fb79e27e559
Author: Chesnay Schepler <ch...@apache.org>
AuthorDate: Thu May 2 14:15:03 2019 +0200

    set hadoop version
---
 .travis.yml | 12 ++++++------
 1 file changed, 6 insertions(+), 6 deletions(-)

diff --git a/.travis.yml b/.travis.yml
index 87bda64..82d205c 100644
--- a/.travis.yml
+++ b/.travis.yml
@@ -106,21 +106,21 @@ jobs:
       env: PROFILE="-Dhadoop.version=2.8.3 -Pinclude-kinesis -Dinclude_hadoop_aws -Dscala-2.11 -Djdk9"
       name: misc - jdk 9
     # E2E profile
-    - env: PROFILE="-De2e-metrics -Dinclude-kinesis -Djdk9"
+    - env: PROFILE="-De2e-metrics -Dinclude-kinesis -Djdk9 -Dhadoop.version=2.8.3"
       script: ./tools/travis/nightly.sh split_misc_hadoopfree.sh
       name: misc
-    - env: PROFILE="-Dinclude-kinesis -Djdk9"
+    - env: PROFILE="-Dinclude-kinesis -Djdk9 -Dhadoop.version=2.8.3"
       script: ./tools/travis/nightly.sh split_ha.sh
       name: ha
-    - env: PROFILE="-Dinclude-kinesis -Djdk9"
+    - env: PROFILE="-Dinclude-kinesis -Djdk9 -Dhadoop.version=2.8.3"
       script: ./tools/travis/nightly.sh split_sticky.sh
       name: sticky
-    - env: PROFILE="-Dinclude-kinesis -Djdk9"
+    - env: PROFILE="-Dinclude-kinesis -Djdk9 -Dhadoop.version=2.8.3"
       script: ./tools/travis/nightly.sh split_checkpoints.sh
       name: checkpoints
-    - env: PROFILE="-Dinclude-kinesis -Djdk9"
+    - env: PROFILE="-Dinclude-kinesis -Djdk9 -Dhadoop.version=2.8.3"
       script: ./tools/travis/nightly.sh split_container.sh
       name: container
-    - env: PROFILE="-Dinclude-kinesis -Djdk9"
+    - env: PROFILE="-Dinclude-kinesis -Djdk9 -Dhadoop.version=2.8.3"
       script: ./tools/travis/nightly.sh split_heavy.sh
       name: heavy


[flink] 01/15: Fully enable java 9


commit b3cc24c55f3acb2ed662318adc1afbcaee050a33
Author: Chesnay Schepler <ch...@apache.org>
AuthorDate: Thu Mar 28 14:12:42 2019 +0100

    Fully enable java 9
---
 .travis.yml | 165 +++++-------------------------------------------------------
 1 file changed, 13 insertions(+), 152 deletions(-)

diff --git a/.travis.yml b/.travis.yml
index f8243b4..a4bbb64 100644
--- a/.travis.yml
+++ b/.travis.yml
@@ -72,201 +72,62 @@ stages:
 jdk: "oraclejdk8"
 jobs:
   include:
-    # main profile
-    - if: type in (pull_request, push)
-      stage: compile
-      script: ./tools/travis_controller.sh compile
-      env: PROFILE="-Dhadoop.version=2.8.3 -Pinclude-kinesis -Dinclude_hadoop_aws -Dscala-2.11"
-      name: compile
-    - if: type in (pull_request, push)
-      stage: test
-      script: ./tools/travis_controller.sh core
-      env: PROFILE="-Dhadoop.version=2.8.3 -Pinclude-kinesis -Dinclude_hadoop_aws -Dscala-2.11"
-      name: core
-    - if: type in (pull_request, push)
-      script: ./tools/travis_controller.sh libraries
-      env: PROFILE="-Dhadoop.version=2.8.3 -Pinclude-kinesis -Dinclude_hadoop_aws -Dscala-2.11"
-      name: libraries
-    - if: type in (pull_request, push)
-      script: ./tools/travis_controller.sh connectors
-      env: PROFILE="-Dhadoop.version=2.8.3 -Pinclude-kinesis -Dinclude_hadoop_aws -Dscala-2.11"
-      name: connectors
-    - if: type in (pull_request, push)
-      script: ./tools/travis_controller.sh tests
-      env: PROFILE="-Dhadoop.version=2.8.3 -Pinclude-kinesis -Dinclude_hadoop_aws -Dscala-2.11"
-      name: tests
-    - if: type in (pull_request, push)
-      script: ./tools/travis_controller.sh misc
-      env: PROFILE="-Dhadoop.version=2.8.3 -Pinclude-kinesis -Dinclude_hadoop_aws -Dscala-2.11"
-      name: misc
-    - if: type in (pull_request, push)
-      stage: cleanup
-      script: ./tools/travis_controller.sh cleanup
-      env: PROFILE="-Dhadoop.version=2.8.3 -Pinclude-kinesis -Dinclude_hadoop_aws -Dscala-2.11"
-      name: cleanup
-    # hadoop 2.4.1 profile
-    - if: type = cron
-      stage: compile
-      script: ./tools/travis_controller.sh compile
-      env: PROFILE="-Dhadoop.version=2.4.1 -Pinclude-kinesis"
-      name: compile - hadoop 2.4.1
-    - if: type = cron
-      stage: test
-      script: ./tools/travis_controller.sh core
-      env: PROFILE="-Dhadoop.version=2.4.1 -Pinclude-kinesis"
-      name: core - hadoop 2.4.1
-    - if: type = cron
-      script: ./tools/travis_controller.sh libraries
-      env: PROFILE="-Dhadoop.version=2.4.1 -Pinclude-kinesis"
-      name: libraries - hadoop 2.4.1
-    - if: type = cron
-      script: ./tools/travis_controller.sh connectors
-      env: PROFILE="-Dhadoop.version=2.4.1 -Pinclude-kinesis"
-      name: connectors - hadoop 2.4.1
-    - if: type = cron
-      script: ./tools/travis_controller.sh tests
-      env: PROFILE="-Dhadoop.version=2.4.1 -Pinclude-kinesis"
-      name: tests - hadoop 2.4.1
-    - if: type = cron
-      script: ./tools/travis_controller.sh misc
-      env: PROFILE="-Dhadoop.version=2.4.1 -Pinclude-kinesis"
-      name: misc - hadoop 2.4.1
-    - if: type = cron
-      stage: cleanup
-      script: ./tools/travis_controller.sh cleanup
-      env: PROFILE="-Dhadoop.version=2.4.1 -Pinclude-kinesis"
-      name: cleanup - hadoop 2.4.1
-    # scala 2.12 profile
-    - if: type = cron
-      stage: compile
-      script: ./tools/travis_controller.sh compile
-      env: PROFILE="-Dhadoop.version=2.8.3 -Pinclude-kinesis -Dinclude_hadoop_aws -Dscala-2.12"
-      name: compile - scala 2.12
-    - if: type = cron
-      stage: test
-      script: ./tools/travis_controller.sh core
-      env: PROFILE="-Dhadoop.version=2.8.3 -Pinclude-kinesis -Dinclude_hadoop_aws -Dscala-2.12"
-      name: core - scala 2.12
-    - if: type = cron
-      script: ./tools/travis_controller.sh libraries
-      env: PROFILE="-Dhadoop.version=2.8.3 -Pinclude-kinesis -Dinclude_hadoop_aws -Dscala-2.12"
-      name: libraries - scala 2.12
-    - if: type = cron
-      script: ./tools/travis_controller.sh connectors
-      env: PROFILE="-Dhadoop.version=2.8.3 -Pinclude-kinesis -Dinclude_hadoop_aws -Dscala-2.12"
-      name: connectors - scala 2.12
-    - if: type = cron
-      script: ./tools/travis_controller.sh tests
-      env: PROFILE="-Dhadoop.version=2.8.3 -Pinclude-kinesis -Dinclude_hadoop_aws -Dscala-2.12"
-      name: tests - scala 2.12
-    - if: type = cron
-      script: ./tools/travis_controller.sh misc
-      env: PROFILE="-Dhadoop.version=2.8.3 -Pinclude-kinesis -Dinclude_hadoop_aws -Dscala-2.12"
-      name: misc - scala 2.12
-    - if: type = cron
-      stage: cleanup
-      script: ./tools/travis_controller.sh cleanup
-      env: PROFILE="-Dhadoop.version=2.8.3 -Pinclude-kinesis -Dinclude_hadoop_aws -Dscala-2.12"
-      name: cleanup - scala 2.12
     # JDK9 profile
-    - if: type = cron
+    - if: type = push
       jdk: "openjdk9"
       stage: compile
       script: ./tools/travis_controller.sh compile
       env: PROFILE="-Dhadoop.version=2.8.3 -Pinclude-kinesis -Dinclude_hadoop_aws -Dscala-2.11 -Djdk9"
       name: compile - jdk 9
-    - if: type = cron
+    - if: type = push
       jdk: "openjdk9"
       stage: test
       script: ./tools/travis_controller.sh core
       env: PROFILE="-Dhadoop.version=2.8.3 -Pinclude-kinesis -Dinclude_hadoop_aws -Dscala-2.11 -Djdk9"
       name: core - jdk 9
-    - if: type = cron
+    - if: type = push
       jdk: "openjdk9"
       script: ./tools/travis_controller.sh libraries
       env: PROFILE="-Dhadoop.version=2.8.3 -Pinclude-kinesis -Dinclude_hadoop_aws -Dscala-2.11 -Djdk9"
       name: libraries - jdk 9
-    - if: type = cron
+    - if: type = push
       jdk: "openjdk9"
       script: ./tools/travis_controller.sh connectors
       env: PROFILE="-Dhadoop.version=2.8.3 -Pinclude-kinesis -Dinclude_hadoop_aws -Dscala-2.11 -Djdk9"
       name: connectors - jdk 9
-    - if: type = cron
+    - if: type = push
       jdk: "openjdk9"
       script: ./tools/travis_controller.sh tests
       env: PROFILE="-Dhadoop.version=2.8.3 -Pinclude-kinesis -Dinclude_hadoop_aws -Dscala-2.11 -Djdk9"
       name: tests - jdk 9
-    - if: type = cron
+    - if: type = push
       jdk: "openjdk9"
       script: ./tools/travis_controller.sh misc
       env: PROFILE="-Dhadoop.version=2.8.3 -Pinclude-kinesis -Dinclude_hadoop_aws -Dscala-2.11 -Djdk9"
       name: misc
-    - if: type = cron
+    - if: type = push
       jdk: "openjdk9"
       stage: cleanup
       script: ./tools/travis_controller.sh cleanup
       env: PROFILE="-Dhadoop.version=2.8.3 -Pinclude-kinesis -Dinclude_hadoop_aws -Dscala-2.11 -Djdk9"
       name: cleanup - jdk 9
-    # Documentation 404 check
-    - if: type = cron
-      stage: compile
-      script: ./tools/travis/docs.sh
-      language: ruby
-      rvm: 2.4.0
-      name: Documentation links check
     # E2E profile
     - stage: E2E
-      env: PROFILE="-Dinclude-hadoop -Dhadoop.version=2.8.3 -De2e-metrics -Dinclude-kinesis"
-      script: ./tools/travis/nightly.sh split_misc.sh
-      name: misc - hadoop 2.8
-    - env: PROFILE="-Dinclude-hadoop -Dhadoop.version=2.8.3 -Dinclude-kinesis"
-      script: ./tools/travis/nightly.sh split_ha.sh
-      name: ha - hadoop 2.8
-    - env: PROFILE="-Dinclude-hadoop -Dhadoop.version=2.8.3 -Dinclude-kinesis"
-      script: ./tools/travis/nightly.sh split_sticky.sh
-      name: sticky - hadoop 2.8
-    - env: PROFILE="-Dinclude-hadoop -Dhadoop.version=2.8.3 -Dinclude-kinesis"
-      script: ./tools/travis/nightly.sh split_checkpoints.sh
-      name: checkpoints - hadoop 2.8
-    - env: PROFILE="-Dinclude-hadoop -Dhadoop.version=2.8.3 -Dinclude-kinesis"
-      script: ./tools/travis/nightly.sh split_container.sh
-      name: container - hadoop 2.8
-    - env: PROFILE="-Dinclude-hadoop -Dhadoop.version=2.8.3 -Dinclude-kinesis"
-      script: ./tools/travis/nightly.sh split_heavy.sh
-      name: heavy - hadoop 2.8
-    - env: PROFILE="-Dinclude-hadoop -Dhadoop.version=2.8.3 -Dscala-2.12 -De2e-metrics -Dinclude-kinesis"
-      script: ./tools/travis/nightly.sh split_misc.sh
-      name: misc - scala 2.12
-    - env: PROFILE="-Dinclude-hadoop -Dhadoop.version=2.8.3 -Dscala-2.12 -Dinclude-kinesis"
-      script: ./tools/travis/nightly.sh split_ha.sh
-      name: ha - scala 2.12
-    - env: PROFILE="-Dinclude-hadoop -Dhadoop.version=2.8.3 -Dscala-2.12 -Dinclude-kinesis"
-      script: ./tools/travis/nightly.sh split_sticky.sh
-      name: sticky - scala 2.12
-    - env: PROFILE="-Dinclude-hadoop -Dhadoop.version=2.8.3 -Dscala-2.12 -Dinclude-kinesis"
-      script: ./tools/travis/nightly.sh split_checkpoints.sh
-      name: checkpoints - scala 2.12
-    - env: PROFILE="-Dinclude-hadoop -Dhadoop.version=2.8.3 -Dscala-2.12 -Dinclude-kinesis"
-      script: ./tools/travis/nightly.sh split_container.sh
-      name: container - scala 2.12
-    - env: PROFILE="-Dinclude-hadoop -Dhadoop.version=2.8.3 -Dscala-2.12 -Dinclude-kinesis"
-      script: ./tools/travis/nightly.sh split_heavy.sh
-      name: heavy - scala 2.12
-    - env: PROFILE="-De2e-metrics -Dinclude-kinesis"
+    - env: PROFILE="-De2e-metrics -Dinclude-kinesis -Djdk9"
       script: ./tools/travis/nightly.sh split_misc_hadoopfree.sh
       name: misc
-    - env: PROFILE="-Dinclude-kinesis"
+    - env: PROFILE="-Dinclude-kinesis -Djdk9"
       script: ./tools/travis/nightly.sh split_ha.sh
       name: ha
-    - env: PROFILE="-Dinclude-kinesis"
+    - env: PROFILE="-Dinclude-kinesis -Djdk9"
       script: ./tools/travis/nightly.sh split_sticky.sh
       name: sticky
-    - env: PROFILE="-Dinclude-kinesis"
+    - env: PROFILE="-Dinclude-kinesis -Djdk9"
       script: ./tools/travis/nightly.sh split_checkpoints.sh
       name: checkpoints
-    - env: PROFILE="-Dinclude-kinesis"
+    - env: PROFILE="-Dinclude-kinesis -Djdk9"
       script: ./tools/travis/nightly.sh split_container.sh
       name: container
-    - env: PROFILE="-Dinclude-kinesis"
+    - env: PROFILE="-Dinclude-kinesis -Djdk9"
       script: ./tools/travis/nightly.sh split_heavy.sh
       name: heavy


[flink] 10/15: disable all container tests


commit 6f30122563aa8d118ed1bd886ef9a64aaf96d13a
Author: Chesnay Schepler <ch...@apache.org>
AuthorDate: Fri May 3 11:42:34 2019 +0200

    disable all container tests
---
 tools/travis/splits/split_container.sh | 6 +++---
 1 file changed, 3 insertions(+), 3 deletions(-)

diff --git a/tools/travis/splits/split_container.sh b/tools/travis/splits/split_container.sh
index d34d55e..fbeb9a2 100755
--- a/tools/travis/splits/split_container.sh
+++ b/tools/travis/splits/split_container.sh
@@ -43,9 +43,9 @@ echo "Flink distribution directory: $FLINK_DIR"
 
 # run_test "<description>" "$END_TO_END_DIR/test-scripts/<script_name>"
 
-run_test "Wordcount on Docker test" "$END_TO_END_DIR/test-scripts/test_docker_embedded_job.sh"
-run_test "Running Kerberized YARN on Docker test " "$END_TO_END_DIR/test-scripts/test_yarn_kerberos_docker.sh"
-run_test "Run kubernetes test" "$END_TO_END_DIR/test-scripts/test_kubernetes_embedded_job.sh"
+#run_test "Wordcount on Docker test" "$END_TO_END_DIR/test-scripts/test_docker_embedded_job.sh"
+#run_test "Running Kerberized YARN on Docker test " "$END_TO_END_DIR/test-scripts/test_yarn_kerberos_docker.sh"
+#run_test "Run kubernetes test" "$END_TO_END_DIR/test-scripts/test_kubernetes_embedded_job.sh"
 
 printf "\n[PASS] All tests passed\n"
 exit 0


[flink] 06/15: retain existing MAVEN_OPTS


commit 293c550ac73dd4b7f87e276379d6bfc481d9f95c
Author: Chesnay Schepler <ch...@apache.org>
AuthorDate: Fri Apr 26 13:19:56 2019 +0200

    retain existing MAVEN_OPTS
---
 tools/travis/setup_maven.sh | 2 +-
 1 file changed, 1 insertion(+), 1 deletion(-)

diff --git a/tools/travis/setup_maven.sh b/tools/travis/setup_maven.sh
index 1de4bd9..5714762 100755
--- a/tools/travis/setup_maven.sh
+++ b/tools/travis/setup_maven.sh
@@ -29,7 +29,7 @@ fi
 
 export M2_HOME="${MAVEN_VERSIONED_DIR}"
 export PATH=${M2_HOME}/bin:${PATH}
-export MAVEN_OPTS="-Dorg.slf4j.simpleLogger.showDateTime=true -Dorg.slf4j.simpleLogger.dateTimeFormat=HH:mm:ss.SSS"
+export MAVEN_OPTS="${MAVEN_OPTS} -Dorg.slf4j.simpleLogger.showDateTime=true -Dorg.slf4j.simpleLogger.dateTimeFormat=HH:mm:ss.SSS"
 
 # just in case: clean up the .m2 home and remove invalid jar files
 if [ -d "${HOME}/.m2/repository/" ]; then
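
The one-line change above switches from overwriting MAVEN_OPTS to prepending the existing value. A small sketch of the difference, assuming the CI environment had already exported a memory flag:

```shell
#!/usr/bin/env bash
# Pretend the CI configuration already set a memory limit.
MAVEN_OPTS="-Xmx2g"

# New behavior: prepend the existing value so the caller's flags survive.
export MAVEN_OPTS="${MAVEN_OPTS} -Dorg.slf4j.simpleLogger.showDateTime=true"
echo "$MAVEN_OPTS"
```

With the old line, the pre-existing `-Xmx2g` setting would have been silently dropped.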


[flink] 05/15: disable kinesis e2e test


commit 69b0e4192255dce34792e5ecf4eb671d29c00e1e
Author: Chesnay Schepler <ch...@apache.org>
AuthorDate: Fri Apr 26 12:38:33 2019 +0200

    disable kinesis e2e test
---
 flink-end-to-end-tests/run-pre-commit-tests.sh | 2 +-
 1 file changed, 1 insertion(+), 1 deletion(-)

diff --git a/flink-end-to-end-tests/run-pre-commit-tests.sh b/flink-end-to-end-tests/run-pre-commit-tests.sh
index 57eb8d1..964238c 100755
--- a/flink-end-to-end-tests/run-pre-commit-tests.sh
+++ b/flink-end-to-end-tests/run-pre-commit-tests.sh
@@ -56,7 +56,7 @@ run_test "Wordcount end-to-end test" "$END_TO_END_DIR/test-scripts/test_batch_wo
 #run_test "Kafka 0.10 end-to-end test" "$END_TO_END_DIR/test-scripts/test_streaming_kafka010.sh"
 #run_test "Kafka 0.11 end-to-end test" "$END_TO_END_DIR/test-scripts/test_streaming_kafka011.sh"
 run_test "Modern Kafka end-to-end test" "$END_TO_END_DIR/test-scripts/test_streaming_kafka.sh"
-run_test "Kinesis end-to-end test" "$END_TO_END_DIR/test-scripts/test_streaming_kinesis.sh"
+#run_test "Kinesis end-to-end test" "$END_TO_END_DIR/test-scripts/test_streaming_kinesis.sh"
 run_test "class loading end-to-end test" "$END_TO_END_DIR/test-scripts/test_streaming_classloader.sh"
 run_test "Shaded Hadoop S3A end-to-end test" "$END_TO_END_DIR/test-scripts/test_shaded_hadoop_s3a.sh"
 run_test "Shaded Presto S3 end-to-end test" "$END_TO_END_DIR/test-scripts/test_shaded_presto_s3.sh"


[flink] 02/15: run all e2e tests


commit 5a37646c38c32d40afa9bb88d2433193851336d6
Author: Chesnay Schepler <ch...@apache.org>
AuthorDate: Wed Apr 24 17:55:07 2019 +0200

    run all e2e tests
---
 .travis.yml | 2 +-
 1 file changed, 1 insertion(+), 1 deletion(-)

diff --git a/.travis.yml b/.travis.yml
index a4bbb64..52c0788 100644
--- a/.travis.yml
+++ b/.travis.yml
@@ -66,7 +66,7 @@ stages:
   - name: compile
   - name: test
   - name: E2E
-    if: type = cron
+    if: type = push
   - name: cleanup
 
 jdk: "oraclejdk8"


[flink] 13/15: [hotfix][tests] Rework Process IO handling


commit 8b3106b426e4fc6246fa5a5e89eddb24ea741d9a
Author: zentol <ch...@apache.org>
AuthorDate: Thu Jan 31 10:20:27 2019 +0100

    [hotfix][tests] Rework Process IO handling
---
 .../flink/tests/util/AutoClosableProcess.java      | 110 +++++++++++++++++----
 .../tests/PrometheusReporterEndToEndITCase.java    |  13 +--
 2 files changed, 99 insertions(+), 24 deletions(-)

diff --git a/flink-end-to-end-tests/flink-end-to-end-tests-common/src/main/java/org/apache/flink/tests/util/AutoClosableProcess.java b/flink-end-to-end-tests/flink-end-to-end-tests-common/src/main/java/org/apache/flink/tests/util/AutoClosableProcess.java
index fbeeb1d..533ffd0 100644
--- a/flink-end-to-end-tests/flink-end-to-end-tests-common/src/main/java/org/apache/flink/tests/util/AutoClosableProcess.java
+++ b/flink-end-to-end-tests/flink-end-to-end-tests-common/src/main/java/org/apache/flink/tests/util/AutoClosableProcess.java
@@ -20,16 +20,28 @@ package org.apache.flink.tests.util;
 
 import org.apache.flink.util.Preconditions;
 
+import org.slf4j.Logger;
+import org.slf4j.LoggerFactory;
+
+import java.io.BufferedReader;
 import java.io.IOException;
+import java.io.InputStream;
+import java.io.InputStreamReader;
+import java.io.PrintWriter;
+import java.io.StringWriter;
+import java.nio.charset.StandardCharsets;
 import java.time.Duration;
 import java.util.concurrent.TimeUnit;
 import java.util.concurrent.TimeoutException;
+import java.util.function.Consumer;
 
 /**
  * Utility class to terminate a given {@link Process} when exiting a try-with-resources statement.
  */
 public class AutoClosableProcess implements AutoCloseable {
 
+	private static final Logger LOG = LoggerFactory.getLogger(AutoClosableProcess.class);
+
 	private final Process process;
 
 	public AutoClosableProcess(final Process process) {
@@ -42,35 +54,97 @@ public class AutoClosableProcess implements AutoCloseable {
 	}
 
 	public static AutoClosableProcess runNonBlocking(String... commands) throws IOException {
-		return runNonBlocking(commands);
+		return create(commands).runNonBlocking();
 	}
 
-	public static Process runBlocking(String... commands) throws IOException {
-		return runBlocking(Duration.ofSeconds(30), commands);
+	public static void runBlocking(String... commands) throws IOException {
+		create(commands).runBlocking();
 	}
 
-	public static Process runBlocking(Duration timeout, String... commands) throws IOException {
-		final Process process = createProcess(commands);
+	public static AutoClosableProcessBuilder create(String... commands) {
+		return new AutoClosableProcessBuilder(commands);
+	}
 
-		try (AutoClosableProcess autoProcess = new AutoClosableProcess(process)) {
-			final boolean success = process.waitFor(timeout.toMillis(), TimeUnit.MILLISECONDS);
-			if (!success) {
-				throw new TimeoutException("Process exceeded timeout of " + timeout.getSeconds() + "seconds.");
-			}
-			if (process.exitValue() != 0) {
-				throw new RuntimeException("Process execution failed due error.");
+	/**
+	 * Builder for most sophisticated processes.
+	 */
+	public static final class AutoClosableProcessBuilder {
+		private final String[] commands;
+		private Consumer<String> stdoutProcessor = line -> {
+		};
+		private Consumer<String> stderrProcessor = line -> {
+		};
+
+		AutoClosableProcessBuilder(final String... commands) {
+			this.commands = commands;
+		}
+
+		public AutoClosableProcessBuilder setStdoutProcessor(final Consumer<String> stdoutProcessor) {
+			this.stdoutProcessor = stdoutProcessor;
+			return this;
+		}
+
+		public AutoClosableProcessBuilder setStderrProcessor(final Consumer<String> stderrProcessor) {
+			this.stderrProcessor = stderrProcessor;
+			return this;
+		}
+
+		public void runBlocking() throws IOException {
+			runBlocking(Duration.ofSeconds(30));
+		}
+
+		public void runBlocking(final Duration timeout) throws IOException {
+			final StringWriter sw = new StringWriter();
+			try (final PrintWriter printer = new PrintWriter(sw)) {
+				final Process process = createProcess(commands, stdoutProcessor, line -> {
+					stderrProcessor.accept(line);
+					printer.println(line);
+				});
+
+				try (AutoClosableProcess autoProcess = new AutoClosableProcess(process)) {
+					final boolean success = process.waitFor(timeout.toMillis(), TimeUnit.MILLISECONDS);
+					if (!success) {
+						throw new TimeoutException("Process exceeded timeout of " + timeout.getSeconds() + "seconds.");
+					}
+
+					if (process.exitValue() != 0) {
+						throw new IOException("Process execution failed due error. Error output:" + sw);
+					}
+				} catch (TimeoutException | InterruptedException e) {
+					throw new IOException("Process failed due to timeout.");
+				}
 			}
-		} catch (TimeoutException | InterruptedException e) {
-			throw new RuntimeException("Process failed due to timeout.");
 		}
-		return process;
+
+		public AutoClosableProcess runNonBlocking() throws IOException {
+			return new AutoClosableProcess(createProcess(commands, stdoutProcessor, stderrProcessor));
+		}
 	}
 
-	private static Process createProcess(String... commands) throws IOException {
+	private static Process createProcess(final String[] commands, Consumer<String> stdoutProcessor, Consumer<String> stderrProcessor) throws IOException {
 		final ProcessBuilder processBuilder = new ProcessBuilder();
 		processBuilder.command(commands);
-		processBuilder.inheritIO();
-		return processBuilder.start();
+
+		final Process process = processBuilder.start();
+
+		processStream(process.getInputStream(), stdoutProcessor);
+		processStream(process.getErrorStream(), stderrProcessor);
+
+		return process;
+	}
+
+	private static void processStream(final InputStream stream, final Consumer<String> streamConsumer) {
+		new Thread(() -> {
+			try (BufferedReader bufferedReader = new BufferedReader(new InputStreamReader(stream, StandardCharsets.UTF_8))) {
+				String line;
+				while ((line = bufferedReader.readLine()) != null) {
+					streamConsumer.accept(line);
+				}
+			} catch (IOException e) {
+				LOG.error("Failure while processing process stdout/stderr.", e);
+			}
+		}
+		).start();
 	}
 
 	@Override
diff --git a/flink-end-to-end-tests/flink-metrics-reporter-prometheus-test/src/test/java/org/apache/flink/metrics/prometheus/tests/PrometheusReporterEndToEndITCase.java b/flink-end-to-end-tests/flink-metrics-reporter-prometheus-test/src/test/java/org/apache/flink/metrics/prometheus/tests/PrometheusReporterEndToEndITCase.java
index 5d189de..258d293 100644
--- a/flink-end-to-end-tests/flink-metrics-reporter-prometheus-test/src/test/java/org/apache/flink/metrics/prometheus/tests/PrometheusReporterEndToEndITCase.java
+++ b/flink-end-to-end-tests/flink-metrics-reporter-prometheus-test/src/test/java/org/apache/flink/metrics/prometheus/tests/PrometheusReporterEndToEndITCase.java
@@ -109,12 +109,13 @@ public class PrometheusReporterEndToEndITCase extends TestLogger {
 		Files.createDirectory(tmpPrometheusDir);
 
 		LOG.info("Downloading Prometheus.");
-		runBlocking(
-			Duration.ofMinutes(5),
-			CommandLineWrapper
-				.wget("https://github.com/prometheus/prometheus/releases/download/v" + PROMETHEUS_VERSION + '/' + prometheusArchive.getFileName())
-				.targetDir(tmpPrometheusDir)
-				.build());
+		AutoClosableProcess
+			.create(
+				CommandLineWrapper
+					.wget("https://github.com/prometheus/prometheus/releases/download/v" + PROMETHEUS_VERSION + '/' + prometheusArchive.getFileName())
+					.targetDir(tmpPrometheusDir)
+					.build())
+			.runBlocking(Duration.ofMinutes(5));
 
 		LOG.info("Unpacking Prometheus.");
 		runBlocking(
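
The reworked runBlocking above buffers stderr while the process runs and includes it in the failure message on a non-zero exit. A rough shell analogue of that behavior, with timeout handling omitted (run_blocking is a hypothetical name used only for this sketch, not part of the patch):

```shell
#!/usr/bin/env bash
# Rough analogue of the reworked Java runBlocking: capture stderr while
# the process runs, and surface it in the error message on failure.
run_blocking() {
    local err code
    # 2>&1 before 1>/dev/null captures only the child's stderr.
    err=$("$@" 2>&1 1>/dev/null)
    code=$?
    if [ "$code" -ne 0 ]; then
        echo "Process execution failed. Error output: $err" >&2
        return 1
    fi
}
```

`run_blocking true` succeeds silently, while `run_blocking sh -c 'echo boom >&2; exit 3'` fails and reports the buffered "boom" line.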


[flink] 07/15: plus


commit fe932c03f2feac29f75a0ce6b8db9a5d3e4b8c8d
Author: Chesnay Schepler <ch...@apache.org>
AuthorDate: Thu May 2 11:20:32 2019 +0200

    plus
---
 .travis.yml | 11 ++---------
 1 file changed, 2 insertions(+), 9 deletions(-)

diff --git a/.travis.yml b/.travis.yml
index 52c0788..87bda64 100644
--- a/.travis.yml
+++ b/.travis.yml
@@ -69,7 +69,7 @@ stages:
     if: type = push
   - name: cleanup
 
-jdk: "oraclejdk8"
+jdk: "openjdk9"
 jobs:
   include:
     # JDK9 profile
@@ -104,15 +104,8 @@ jobs:
       jdk: "openjdk9"
       script: ./tools/travis_controller.sh misc
       env: PROFILE="-Dhadoop.version=2.8.3 -Pinclude-kinesis -Dinclude_hadoop_aws -Dscala-2.11 -Djdk9"
-      name: misc
-    - if: type = push
-      jdk: "openjdk9"
-      stage: cleanup
-      script: ./tools/travis_controller.sh cleanup
-      env: PROFILE="-Dhadoop.version=2.8.3 -Pinclude-kinesis -Dinclude_hadoop_aws -Dscala-2.11 -Djdk9"
-      name: cleanup - jdk 9
+      name: misc - jdk 9
     # E2E profile
-    - stage: E2E
     - env: PROFILE="-De2e-metrics -Dinclude-kinesis -Djdk9"
       script: ./tools/travis/nightly.sh split_misc_hadoopfree.sh
       name: misc


[flink] 04/15: disable kafka 0.11 e2e test

Posted by ch...@apache.org.
This is an automated email from the ASF dual-hosted git repository.

chesnay pushed a commit to branch travis_jdk9_test
in repository https://gitbox.apache.org/repos/asf/flink.git

commit c930d2522f6f6f5743e66f78b01ebca894f31787
Author: Chesnay Schepler <ch...@apache.org>
AuthorDate: Fri Apr 26 10:23:06 2019 +0200

    disable kafka 0.11 e2e test
---
 flink-end-to-end-tests/run-pre-commit-tests.sh | 2 +-
 1 file changed, 1 insertion(+), 1 deletion(-)

diff --git a/flink-end-to-end-tests/run-pre-commit-tests.sh b/flink-end-to-end-tests/run-pre-commit-tests.sh
index 4a9881c..57eb8d1 100755
--- a/flink-end-to-end-tests/run-pre-commit-tests.sh
+++ b/flink-end-to-end-tests/run-pre-commit-tests.sh
@@ -54,7 +54,7 @@ run_test "Batch Python Wordcount end-to-end test" "$END_TO_END_DIR/test-scripts/
 run_test "Streaming Python Wordcount end-to-end test" "$END_TO_END_DIR/test-scripts/test_streaming_python_wordcount.sh"
 run_test "Wordcount end-to-end test" "$END_TO_END_DIR/test-scripts/test_batch_wordcount.sh"
 #run_test "Kafka 0.10 end-to-end test" "$END_TO_END_DIR/test-scripts/test_streaming_kafka010.sh"
-run_test "Kafka 0.11 end-to-end test" "$END_TO_END_DIR/test-scripts/test_streaming_kafka011.sh"
+#run_test "Kafka 0.11 end-to-end test" "$END_TO_END_DIR/test-scripts/test_streaming_kafka011.sh"
 run_test "Modern Kafka end-to-end test" "$END_TO_END_DIR/test-scripts/test_streaming_kafka.sh"
 run_test "Kinesis end-to-end test" "$END_TO_END_DIR/test-scripts/test_streaming_kinesis.sh"
 run_test "class loading end-to-end test" "$END_TO_END_DIR/test-scripts/test_streaming_classloader.sh"


[flink] 12/15: [hotfix][tests] Remove forced step logging in AutoClosableProcess

Posted by ch...@apache.org.
This is an automated email from the ASF dual-hosted git repository.

chesnay pushed a commit to branch travis_jdk9_test
in repository https://gitbox.apache.org/repos/asf/flink.git

commit 14b3ee3b514f080d57d772763ba93afa44197d37
Author: zentol <ch...@apache.org>
AuthorDate: Thu Jan 24 15:02:16 2019 +0100

    [hotfix][tests] Remove forced step logging in AutoClosableProcess
---
 .../flink/tests/util/AutoClosableProcess.java      | 46 +++++++++++-----------
 .../apache/flink/tests/util/FlinkDistribution.java |  6 ++-
 .../tests/PrometheusReporterEndToEndITCase.java    | 13 +++---
 3 files changed, 35 insertions(+), 30 deletions(-)

diff --git a/flink-end-to-end-tests/flink-end-to-end-tests-common/src/main/java/org/apache/flink/tests/util/AutoClosableProcess.java b/flink-end-to-end-tests/flink-end-to-end-tests-common/src/main/java/org/apache/flink/tests/util/AutoClosableProcess.java
index 0235930..fbeeb1d 100644
--- a/flink-end-to-end-tests/flink-end-to-end-tests-common/src/main/java/org/apache/flink/tests/util/AutoClosableProcess.java
+++ b/flink-end-to-end-tests/flink-end-to-end-tests-common/src/main/java/org/apache/flink/tests/util/AutoClosableProcess.java
@@ -20,9 +20,6 @@ package org.apache.flink.tests.util;
 
 import org.apache.flink.util.Preconditions;
 
-import org.slf4j.Logger;
-import org.slf4j.LoggerFactory;
-
 import java.io.IOException;
 import java.time.Duration;
 import java.util.concurrent.TimeUnit;
@@ -33,8 +30,6 @@ import java.util.concurrent.TimeoutException;
  */
 public class AutoClosableProcess implements AutoCloseable {
 
-	private static final Logger LOG = LoggerFactory.getLogger(AutoClosableProcess.class);
-
 	private final Process process;
 
 	public AutoClosableProcess(final Process process) {
@@ -42,35 +37,40 @@ public class AutoClosableProcess implements AutoCloseable {
 		this.process = process;
 	}
 
-	public static AutoClosableProcess runNonBlocking(String step, String... commands) throws IOException {
-		LOG.info("Step Started: " + step);
-		Process process = new ProcessBuilder()
-			.command(commands)
-			.inheritIO()
-			.start();
-		return new AutoClosableProcess(process);
+	public Process getProcess() {
+		return process;
+	}
+
+	public static AutoClosableProcess runNonBlocking(String... commands) throws IOException {
+		return new AutoClosableProcess(createProcess(commands));
 	}
 
-	public static void runBlocking(String step, String... commands) throws IOException {
-		runBlocking(step, Duration.ofSeconds(30), commands);
+	public static Process runBlocking(String... commands) throws IOException {
+		return runBlocking(Duration.ofSeconds(30), commands);
 	}
 
-	public static void runBlocking(String step, Duration timeout, String... commands) throws IOException {
-		LOG.info("Step started: " + step);
-		Process process = new ProcessBuilder()
-			.command(commands)
-			.inheritIO()
-			.start();
+	public static Process runBlocking(Duration timeout, String... commands) throws IOException {
+		final Process process = createProcess(commands);
 
 		try (AutoClosableProcess autoProcess = new AutoClosableProcess(process)) {
 			final boolean success = process.waitFor(timeout.toMillis(), TimeUnit.MILLISECONDS);
 			if (!success) {
-				throw new TimeoutException();
+				throw new TimeoutException("Process exceeded timeout of " + timeout.getSeconds() + " seconds.");
+			}
+			if (process.exitValue() != 0) {
+				throw new RuntimeException("Process execution failed due to an error.");
 			}
 		} catch (TimeoutException | InterruptedException e) {
-			throw new RuntimeException(step + " failed due to timeout.");
+			throw new RuntimeException("Process failed due to timeout.", e);
 		}
-		LOG.info("Step complete: " + step);
+		return process;
+	}
+
+	private static Process createProcess(String... commands) throws IOException {
+		final ProcessBuilder processBuilder = new ProcessBuilder();
+		processBuilder.command(commands);
+		processBuilder.inheritIO();
+		return processBuilder.start();
 	}
 
 	@Override
diff --git a/flink-end-to-end-tests/flink-end-to-end-tests-common/src/main/java/org/apache/flink/tests/util/FlinkDistribution.java b/flink-end-to-end-tests/flink-end-to-end-tests-common/src/main/java/org/apache/flink/tests/util/FlinkDistribution.java
index 8c4a39c..5192bf2 100644
--- a/flink-end-to-end-tests/flink-end-to-end-tests-common/src/main/java/org/apache/flink/tests/util/FlinkDistribution.java
+++ b/flink-end-to-end-tests/flink-end-to-end-tests-common/src/main/java/org/apache/flink/tests/util/FlinkDistribution.java
@@ -125,7 +125,8 @@ public final class FlinkDistribution extends ExternalResource {
 	}
 
 	public void startFlinkCluster() throws IOException {
-		AutoClosableProcess.runBlocking("Start Flink cluster", bin.resolve("start-cluster.sh").toAbsolutePath().toString());
+		LOG.info("Starting Flink cluster.");
+		AutoClosableProcess.runBlocking(bin.resolve("start-cluster.sh").toAbsolutePath().toString());
 
 		final OkHttpClient client = new OkHttpClient();
 
@@ -163,7 +164,8 @@ public final class FlinkDistribution extends ExternalResource {
 	}
 
 	public void stopFlinkCluster() throws IOException {
-		AutoClosableProcess.runBlocking("Stop Flink Cluster", bin.resolve("stop-cluster.sh").toAbsolutePath().toString());
+		LOG.info("Stopping Flink cluster.");
+		AutoClosableProcess.runBlocking(bin.resolve("stop-cluster.sh").toAbsolutePath().toString());
 	}
 
 	public void copyOptJarsToLib(String jarNamePrefix) throws FileNotFoundException, IOException {
diff --git a/flink-end-to-end-tests/flink-metrics-reporter-prometheus-test/src/test/java/org/apache/flink/metrics/prometheus/tests/PrometheusReporterEndToEndITCase.java b/flink-end-to-end-tests/flink-metrics-reporter-prometheus-test/src/test/java/org/apache/flink/metrics/prometheus/tests/PrometheusReporterEndToEndITCase.java
index 269754e..5d189de 100644
--- a/flink-end-to-end-tests/flink-metrics-reporter-prometheus-test/src/test/java/org/apache/flink/metrics/prometheus/tests/PrometheusReporterEndToEndITCase.java
+++ b/flink-end-to-end-tests/flink-metrics-reporter-prometheus-test/src/test/java/org/apache/flink/metrics/prometheus/tests/PrometheusReporterEndToEndITCase.java
@@ -108,15 +108,16 @@ public class PrometheusReporterEndToEndITCase extends TestLogger {
 		final Path prometheusBinary = prometheusBinDir.resolve("prometheus");
 		Files.createDirectory(tmpPrometheusDir);
 
+		LOG.info("Downloading Prometheus.");
 		runBlocking(
-			"Download of Prometheus",
 			Duration.ofMinutes(5),
 			CommandLineWrapper
 				.wget("https://github.com/prometheus/prometheus/releases/download/v" + PROMETHEUS_VERSION + '/' + prometheusArchive.getFileName())
 				.targetDir(tmpPrometheusDir)
 				.build());
 
-		runBlocking("Extraction of Prometheus archive",
+		LOG.info("Unpacking Prometheus.");
+		runBlocking(
 			CommandLineWrapper
 				.tar(prometheusArchive)
 				.extract()
@@ -124,7 +125,8 @@ public class PrometheusReporterEndToEndITCase extends TestLogger {
 				.targetDir(tmpPrometheusDir)
 				.build());
 
-		runBlocking("Set Prometheus scrape interval",
+		LOG.info("Setting Prometheus scrape interval.");
+		runBlocking(
 			CommandLineWrapper
 				.sed("s/\\(scrape_interval:\\).*/\\1 1s/", prometheusConfig)
 				.inPlace()
@@ -141,14 +143,15 @@ public class PrometheusReporterEndToEndITCase extends TestLogger {
 			.map(port -> "'localhost:" + port + "'")
 			.collect(Collectors.joining(", "));
 
-		runBlocking("Set Prometheus scrape targets to (" + scrapeTargets + ")",
+		LOG.info("Setting Prometheus scrape targets to {}.", scrapeTargets);
+		runBlocking(
 			CommandLineWrapper
 				.sed("s/\\(targets:\\).*/\\1 [" + scrapeTargets + "]/", prometheusConfig)
 				.inPlace()
 				.build());
 
+		LOG.info("Starting Prometheus server.");
 		try (AutoClosableProcess prometheus = runNonBlocking(
-			"Start Prometheus server",
 			prometheusBinary.toAbsolutePath().toString(),
 			"--config.file=" + prometheusConfig.toAbsolutePath(),
 			"--storage.tsdb.path=" + prometheusBinDir.resolve("data").toAbsolutePath())) {


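The reworked runBlocking in the commit above waits for the process up to a timeout and then fails loudly on a non-zero exit code. A hedged, self-contained sketch of that contract follows (the destroyForcibly call on timeout, the interrupt handling, and the use of the POSIX true/false commands are illustrative choices, not taken verbatim from the Flink sources):

```java
import java.io.IOException;
import java.time.Duration;
import java.util.concurrent.TimeUnit;
import java.util.concurrent.TimeoutException;

public class RunBlockingSketch {

	// Run a command, wait at most `timeout`, and fail on timeout or non-zero exit.
	static Process runBlocking(Duration timeout, String... commands) throws IOException {
		Process process = new ProcessBuilder(commands).inheritIO().start();
		try {
			if (!process.waitFor(timeout.toMillis(), TimeUnit.MILLISECONDS)) {
				process.destroyForcibly();
				throw new RuntimeException(new TimeoutException(
					"Process exceeded timeout of " + timeout.getSeconds() + " seconds."));
			}
			if (process.exitValue() != 0) {
				throw new RuntimeException(
					"Process exited with non-zero code " + process.exitValue() + ".");
			}
		} catch (InterruptedException e) {
			Thread.currentThread().interrupt();
			throw new RuntimeException("Interrupted while waiting for process.", e);
		}
		return process;
	}

	public static void main(String[] args) throws IOException {
		// "true" is assumed to exist on the host and exits with code 0.
		Process p = runBlocking(Duration.ofSeconds(30), "true");
		System.out.println("exit code: " + p.exitValue());
	}
}
```

Checking exitValue() only after waitFor has returned true is important: calling it while the process is still alive throws IllegalThreadStateException.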
[flink] 09/15: bump zookeeper to Java9 compatible version

Posted by ch...@apache.org.
This is an automated email from the ASF dual-hosted git repository.

chesnay pushed a commit to branch travis_jdk9_test
in repository https://gitbox.apache.org/repos/asf/flink.git

commit 40ef7d35549b3e1c38cff26a132ab674e00360f6
Author: Chesnay Schepler <ch...@apache.org>
AuthorDate: Fri May 3 11:42:21 2019 +0200

    bump zookeeper to Java9 compatible version
---
 pom.xml | 2 +-
 1 file changed, 1 insertion(+), 1 deletion(-)

diff --git a/pom.xml b/pom.xml
index 539e9d0..a1a612f 100644
--- a/pom.xml
+++ b/pom.xml
@@ -118,7 +118,7 @@ under the License.
 		<scala.version>2.11.12</scala.version>
 		<scala.binary.version>2.11</scala.binary.version>
 		<chill.version>0.7.6</chill.version>
-		<zookeeper.version>3.4.10</zookeeper.version>
+		<zookeeper.version>3.4.11</zookeeper.version>
 		<curator.version>2.12.0</curator.version>
 		<jackson.version>2.7.9</jackson.version>
 		<metrics.version>3.1.5</metrics.version>


[flink] 14/15: prevent concurrent checkpoints

Posted by ch...@apache.org.
This is an automated email from the ASF dual-hosted git repository.

chesnay pushed a commit to branch travis_jdk9_test
in repository https://gitbox.apache.org/repos/asf/flink.git

commit 6e6f6485bc16c426b5c753ad8579d7c710300dff
Author: Chesnay Schepler <ch...@apache.org>
AuthorDate: Thu May 9 11:50:57 2019 +0200

    prevent concurrent checkpoints
---
 .../apache/flink/streaming/tests/DataStreamAllroundTestJobFactory.java  | 2 ++
 1 file changed, 2 insertions(+)

diff --git a/flink-end-to-end-tests/flink-datastream-allround-test/src/main/java/org/apache/flink/streaming/tests/DataStreamAllroundTestJobFactory.java b/flink-end-to-end-tests/flink-datastream-allround-test/src/main/java/org/apache/flink/streaming/tests/DataStreamAllroundTestJobFactory.java
index 913d030..dac811b 100644
--- a/flink-end-to-end-tests/flink-datastream-allround-test/src/main/java/org/apache/flink/streaming/tests/DataStreamAllroundTestJobFactory.java
+++ b/flink-end-to-end-tests/flink-datastream-allround-test/src/main/java/org/apache/flink/streaming/tests/DataStreamAllroundTestJobFactory.java
@@ -251,6 +251,8 @@ public class DataStreamAllroundTestJobFactory {
 
 		env.enableCheckpointing(checkpointInterval, checkpointingMode);
 
+		env.getCheckpointConfig().setMinPauseBetweenCheckpoints(50);
+
 		boolean enableExternalizedCheckpoints = pt.getBoolean(
 			ENVIRONMENT_EXTERNALIZE_CHECKPOINT.key(),
 			ENVIRONMENT_EXTERNALIZE_CHECKPOINT.defaultValue());
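The commit above adds a 50 ms minimum pause between checkpoints. The scheduling effect can be sketched without any Flink dependency: the next checkpoint may not trigger before the configured interval has elapsed since the last trigger, nor before the minimum pause has elapsed since the last checkpoint completed. The method and parameter names below are illustrative, not Flink API:

```java
public class CheckpointPauseSketch {

	// Earliest time the next checkpoint may trigger, given the previous
	// checkpoint's trigger and completion times (all values in milliseconds).
	static long nextTriggerTime(long lastTriggerTime, long lastCompletionTime,
								long intervalMillis, long minPauseMillis) {
		long byInterval = lastTriggerTime + intervalMillis;
		long byPause = lastCompletionTime + minPauseMillis;
		// Whichever constraint is later wins; the pause constraint also implies
		// at most one checkpoint in flight, since a new one must wait for the
		// previous one to finish before the pause can even start.
		return Math.max(byInterval, byPause);
	}

	public static void main(String[] args) {
		// Checkpoint triggered at t=1000, completed at t=1990, interval 1000 ms,
		// min pause 50 ms: the pause constraint (2040) dominates the interval (2000).
		System.out.println(nextTriggerTime(1000, 1990, 1000, 50));
	}
}
```

With a fast checkpoint (completion at t=1005 in the same scenario) the interval constraint dominates instead, and the next trigger stays at t=2000.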