Posted to commits@zeppelin.apache.org by mo...@apache.org on 2017/03/20 16:35:28 UTC
zeppelin git commit: [HOTFIX] [ZEPPELIN-2286] Fix CI and split some test matrix that often exceeds time limits (50min)
Repository: zeppelin
Updated Branches:
refs/heads/master 0828e6fb1 -> 641863d56
[HOTFIX] [ZEPPELIN-2286] Fix CI and split some test matrix that often exceeds time limits (50min)
### What is this PR for?
This PR optimizes the CI test matrix while keeping the same test coverage.
From:
1. RAT (1~2min)
2. All modules with -Pbuild-distr flag, Spark 2.1 and Scala 2.11 (40~50min)
3. All modules with -Pbuild-distr flag, Spark 2.0 and Scala 2.11 (40~50min)
4. Spark 1.6 and Scala 2.10 (7~9min)
5. Spark 1.6 and Scala 2.11 (7~8min)
6. Selenium (20~23min)
7. Python 2 (6~7min)
8. Python 3 (6~7min)
Total: 128~156min
To:
1. RAT (1~2min)
2. Core modules without interpreters (14~15min)
3. Selenium (20~23min)
4. All interpreters except spark and livy (9~11min)
5. Spark 2.1 and Scala 2.11, livy (7~9min)
6. Spark 2.0 and Scala 2.11 (7~8min)
7. Spark 1.6 and Scala 2.10 (7~8min)
8. Spark 1.6 and Scala 2.11 (7~8min)
9. Python 2 (6~7min)
10. Python 3 (6~7min)
Total: 84~98min
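The split works by selecting module subsets with Maven's `-pl` flag: the diff below defines a single `INTERPRETERS` exclusion list (a leading `!` tells `-pl` to skip a module) and reuses it for both the core-module job and the interpreter job. A minimal sketch of that trick, with the list and variable names copied from the `.travis.yml` change:

```shell
# Exclusion list from .travis.yml: a leading '!' tells Maven's -pl flag
# to skip that module.
INTERPRETERS='!hbase,!pig,!jdbc,!file,!flink,!ignite,!kylin,!python,!lens,!cassandra,!elasticsearch,!bigquery,!alluxio,!scio,!livy'

# Core-module job: build everything *except* the interpreters.
CORE_MODULES="-pl ${INTERPRETERS}"

# Interpreter job: strip the '!' prefixes, so the very same list now
# *includes* exactly the modules the core job skipped.
INTERPRETER_MODULES="-pl $(echo .,zeppelin-interpreter,${INTERPRETERS} | sed 's/!//g')"

echo "$INTERPRETER_MODULES"
```

Because both jobs derive their module sets from one variable, the partition stays complementary when a new interpreter is added to the list.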
### What type of PR is it?
Improvement | Hot fix.
### Todos
* [x] - Optimize CI test matrix
### What is the Jira issue?
https://issues.apache.org/jira/browse/ZEPPELIN-2286
### How should this be tested?
CI green
### Questions:
* Do the license files need updating? No
* Are there breaking changes for older versions? No
* Does this need documentation? No
Author: Lee moon soo <mo...@apache.org>
Closes #2162 from Leemoonsoo/split_ci_metrics and squashes the following commits:
08fb8ea [Lee moon soo] restore zeppelin-server/pom.xml
1b61b2c [Lee moon soo] adjust order of test considering travis scheduling
5234bfa [Lee moon soo] Livy 0.2 test does not work
0e3040a [Lee moon soo] remove explicit LIVY_VER
ec7af74 [Lee moon soo] add -DfailIfNoTests=false
957443f [Lee moon soo] try exclude test in different way
867b877 [Lee moon soo] set livy 0.3 explicitly
4ac3097 [Lee moon soo] other way to exclude spark from core module test
9958f78 [Lee moon soo] exclude spark test from core module test
04eebcb [Lee moon soo] fix profiles
39b7b65 [Lee moon soo] fix option
abe195a [Lee moon soo] add missing env
2e994a6 [Lee moon soo] fix travis.yml
abb54f9 [Lee moon soo] add test profile that test interpretes
08fcddc [Lee moon soo] try differnt way pass params
a80c94e [Lee moon soo] try differnt way set global env
57ffb38 [Lee moon soo] exclude interpreters does not reqruied by zeppelin-server integration test
05bf826 [Lee moon soo] Revert "assume spark interpreter may not exists in certain test metrics"
5d8d15c [Lee moon soo] include root pom in -pl
27da1cb [Lee moon soo] assume spark interpreter may not exists in certain test metrics
78784a8 [Lee moon soo] configure surefire plugin for zeppelin-server
939e0c7 [Lee moon soo] try set scala.version
d5340d0 [Lee moon soo] set fork count 1
76ee8fa [Lee moon soo] Define scala.binary.version
0654623 [Lee moon soo] Prevent download spark distribution when unnecessary
4c8ffd2 [Lee moon soo] Move out spark and livy test to separate test metrics
Project: http://git-wip-us.apache.org/repos/asf/zeppelin/repo
Commit: http://git-wip-us.apache.org/repos/asf/zeppelin/commit/641863d5
Tree: http://git-wip-us.apache.org/repos/asf/zeppelin/tree/641863d5
Diff: http://git-wip-us.apache.org/repos/asf/zeppelin/diff/641863d5
Branch: refs/heads/master
Commit: 641863d5631598fb279ddc39ce490a98f8bc6026
Parents: 0828e6f
Author: Lee moon soo <mo...@apache.org>
Authored: Sun Mar 19 21:24:01 2017 -0700
Committer: Lee moon soo <mo...@apache.org>
Committed: Mon Mar 20 09:35:25 2017 -0700
----------------------------------------------------------------------
.travis.yml | 45 ++++++++++++++++++++++++++++-----------------
1 file changed, 28 insertions(+), 17 deletions(-)
----------------------------------------------------------------------
http://git-wip-us.apache.org/repos/asf/zeppelin/blob/641863d5/.travis.yml
----------------------------------------------------------------------
diff --git a/.travis.yml b/.travis.yml
index ab4e8f6..45ed079 100644
--- a/.travis.yml
+++ b/.travis.yml
@@ -33,41 +33,52 @@ addons:
packages:
- r-base-dev
+env:
+ global:
+ # Interpreters does not required by zeppelin-server integration tests
+ - INTERPRETERS='!hbase,!pig,!jdbc,!file,!flink,!ignite,!kylin,!python,!lens,!cassandra,!elasticsearch,!bigquery,!alluxio,!scio,!livy'
+
matrix:
include:
# Test License compliance using RAT tool
- jdk: "oraclejdk7"
env: SCALA_VER="2.11" SPARK_VER="2.0.2" HADOOP_VER="2.6" PROFILE="-Prat" BUILD_FLAG="clean" TEST_FLAG="org.apache.rat:apache-rat-plugin:check" TEST_PROJECTS=""
- # Test all modules with spark 2.1.0 and scala 2.11
- - sudo: required
- jdk: "oraclejdk7"
- env: SCALA_VER="2.11" SPARK_VER="2.1.0" HADOOP_VER="2.6" LIVY_VER="0.3.0" PROFILE="-Pspark-2.1 -Phadoop-2.6 -Ppyspark -Psparkr -Pscalding -Phelium-dev -Pexamples -Pscala-2.11 -Plivy-0.3" BUILD_FLAG="package -Pbuild-distr -DskipRat" TEST_FLAG="verify -Pusing-packaged-distr -DskipRat" TEST_PROJECTS=""
+ # Test core modules
+ - jdk: "oraclejdk7"
+ env: SCALA_VER="2.11" SPARK_VER="2.1.0" HADOOP_VER="2.6" PROFILE="-Pscalding -Phelium-dev -Pexamples -Pscala-2.11" BUILD_FLAG="package -Pbuild-distr -DskipRat" TEST_FLAG="verify -Pusing-packaged-distr -DskipRat" MODULES="-pl ${INTERPRETERS}" TEST_PROJECTS="-Dtest='!ZeppelinSparkClusterTest,!org.apache.zeppelin.spark.*' -DfailIfNoTests=false"
- # Test all modules with spark 2.0.2 and scala 2.11
+ # Test selenium with spark module for 1.6.3
- jdk: "oraclejdk7"
- env: SCALA_VER="2.11" SPARK_VER="2.0.2" HADOOP_VER="2.6" PROFILE="-Pspark-2.0 -Phadoop-2.6 -Ppyspark -Psparkr -Pscalding -Phelium-dev -Pexamples -Pscala-2.11" BUILD_FLAG="package -Pbuild-distr -DskipRat" TEST_FLAG="verify -Pusing-packaged-distr -DskipRat" TEST_PROJECTS=""
+ env: TEST_SELENIUM="true" SCALA_VER="2.10" SPARK_VER="1.6.3" HADOOP_VER="2.6" PROFILE="-Pspark-1.6 -Phadoop-2.6 -Ppyspark -Phelium-dev -Pexamples" BUILD_FLAG="package -DskipTests -DskipRat" TEST_FLAG="verify -DskipRat" TEST_PROJECTS="-pl .,zeppelin-interpreter,zeppelin-zengine,zeppelin-server,zeppelin-display,spark-dependencies,spark -Dtest=org.apache.zeppelin.AbstractFunctionalSuite -DfailIfNoTests=false"
- # Test spark module for 1.6.3 with scala 2.10
- - sudo: required
- jdk: "oraclejdk7"
- env: SCALA_VER="2.10" SPARK_VER="1.6.3" HADOOP_VER="2.6" LIVY_VER="0.2.0" PROFILE="-Pspark-1.6 -Phadoop-2.6 -Ppyspark -Psparkr -Pscala-2.10 -Plivy-0.2" BUILD_FLAG="package -DskipTests -DskipRat" TEST_FLAG="test -DskipRat" MODULES="-pl zeppelin-interpreter,zeppelin-zengine,zeppelin-server,zeppelin-display,spark-dependencies,spark" TEST_PROJECTS="-Dtest=ZeppelinSparkClusterTest,org.apache.zeppelin.spark.* -DfailIfNoTests=false"
+ # Test interpreter modules
+ - jdk: "oraclejdk7"
+ env: SCALA_VER="2.10" PROFILE="-Pscalding" BUILD_FLAG="package -DskipTests -DskipRat" TEST_FLAG="test -DskipRat" MODULES="-pl $(echo .,zeppelin-interpreter,${INTERPRETERS} | sed 's/!//g')" TEST_PROJECTS=""
- # Test spark module for 1.6.3 with scala 2.11
+ # Test spark module for 2.1.0 with scala 2.11, livy
- jdk: "oraclejdk7"
- env: SCALA_VER="2.11" SPARK_VER="1.6.3" HADOOP_VER="2.6" PROFILE="-Pspark-1.6 -Phadoop-2.6 -Ppyspark -Psparkr -Pscala-2.11 -Dscala.version=2.11.7 -Dscala.binary.version=2.11" BUILD_FLAG="package -DskipTests -DskipRat" TEST_FLAG="test -DskipRat" MODULES="-pl zeppelin-interpreter,zeppelin-zengine,zeppelin-server,zeppelin-display,spark-dependencies,spark" TEST_PROJECTS="-Dtest=ZeppelinSparkClusterTest,org.apache.zeppelin.spark.* -DfailIfNoTests=false"
+ env: SCALA_VER="2.11" SPARK_VER="2.1.0" HADOOP_VER="2.6" PROFILE="-Pspark-2.1 -Phadoop-2.6 -Ppyspark -Psparkr -Pscala-2.11" BUILD_FLAG="package -DskipTests -DskipRat" TEST_FLAG="test -DskipRat" MODULES="-pl .,zeppelin-interpreter,zeppelin-zengine,zeppelin-server,zeppelin-display,spark-dependencies,spark,livy" TEST_PROJECTS="-Dtest=ZeppelinSparkClusterTest,org.apache.zeppelin.spark.*,org.apache.zeppelin.livy.* -DfailIfNoTests=false"
- # Test selenium with spark module for 1.6.3
+ # Test spark module for 2.0.2 with scala 2.11
+ - jdk: "oraclejdk7"
+ env: SCALA_VER="2.11" SPARK_VER="2.0.2" HADOOP_VER="2.6" PROFILE="-Pspark-2.0 -Phadoop-2.6 -Ppyspark -Psparkr -Pscala-2.11" BUILD_FLAG="package -DskipTests -DskipRat" TEST_FLAG="test -DskipRat" MODULES="-pl .,zeppelin-interpreter,zeppelin-zengine,zeppelin-server,zeppelin-display,spark-dependencies,spark" TEST_PROJECTS="-Dtest=ZeppelinSparkClusterTest,org.apache.zeppelin.spark.* -DfailIfNoTests=false"
+
+ # Test spark module for 1.6.3 with scala 2.10
+ - jdk: "oraclejdk7"
+ env: SCALA_VER="2.10" SPARK_VER="1.6.3" HADOOP_VER="2.6" PROFILE="-Pspark-1.6 -Phadoop-2.6 -Ppyspark -Psparkr -Pscala-2.10" BUILD_FLAG="package -DskipTests -DskipRat" TEST_FLAG="test -DskipRat" MODULES="-pl .,zeppelin-interpreter,zeppelin-zengine,zeppelin-server,zeppelin-display,spark-dependencies,spark" TEST_PROJECTS="-Dtest=ZeppelinSparkClusterTest,org.apache.zeppelin.spark.*,org.apache.zeppelin.spark.* -DfailIfNoTests=false"
+
+ # Test spark module for 1.6.3 with scala 2.11
- jdk: "oraclejdk7"
- env: TEST_SELENIUM="true" SCALA_VER="2.10" SPARK_VER="1.6.3" HADOOP_VER="2.6" PROFILE="-Pspark-1.6 -Phadoop-2.6 -Ppyspark -Phelium-dev -Pexamples" BUILD_FLAG="package -DskipTests -DskipRat" TEST_FLAG="verify -DskipRat" TEST_PROJECTS="-pl zeppelin-interpreter,zeppelin-zengine,zeppelin-server,zeppelin-display,spark-dependencies,spark -Dtest=org.apache.zeppelin.AbstractFunctionalSuite -DfailIfNoTests=false"
+ env: SCALA_VER="2.11" SPARK_VER="1.6.3" HADOOP_VER="2.6" PROFILE="-Pspark-1.6 -Phadoop-2.6 -Ppyspark -Psparkr -Pscala-2.11" BUILD_FLAG="package -DskipTests -DskipRat" TEST_FLAG="test -DskipRat" MODULES="-pl .,zeppelin-interpreter,zeppelin-zengine,zeppelin-server,zeppelin-display,spark-dependencies,spark" TEST_PROJECTS="-Dtest=ZeppelinSparkClusterTest,org.apache.zeppelin.spark.* -DfailIfNoTests=false"
# Test python/pyspark with python 2
- jdk: "oraclejdk7"
- env: PYTHON="2" SCALA_VER="2.10" SPARK_VER="1.6.1" HADOOP_VER="2.6" PROFILE="-Pspark-1.6 -Phadoop-2.6 -Ppyspark" BUILD_FLAG="package -pl spark,python -am -DskipTests -DskipRat" TEST_FLAG="test -DskipRat" MODULES="-pl zeppelin-interpreter,zeppelin-display,spark-dependencies,spark,python" TEST_PROJECTS="-Dtest=org.apache.zeppelin.spark.PySpark*Test,org.apache.zeppelin.python.* -Dpyspark.test.exclude='' -DfailIfNoTests=false"
+ env: PYTHON="2" SCALA_VER="2.10" SPARK_VER="1.6.1" HADOOP_VER="2.6" PROFILE="-Pspark-1.6 -Phadoop-2.6 -Ppyspark" BUILD_FLAG="package -am -DskipTests -DskipRat" TEST_FLAG="test -DskipRat" MODULES="-pl .,zeppelin-interpreter,zeppelin-display,spark-dependencies,spark,python" TEST_PROJECTS="-Dtest=org.apache.zeppelin.spark.PySpark*Test,org.apache.zeppelin.python.* -Dpyspark.test.exclude='' -DfailIfNoTests=false"
# Test python/pyspark with python 3
- jdk: "oraclejdk7"
- env: PYTHON="3" SCALA_VER="2.11" SPARK_VER="2.0.0" HADOOP_VER="2.6" PROFILE="-Pspark-2.0 -Phadoop-2.6 -Ppyspark -Pscala-2.11" BUILD_FLAG="package -pl spark,python -am -DskipTests -DskipRat" TEST_FLAG="test -DskipRat" MODULES="-pl zeppelin-interpreter,zeppelin-display,spark-dependencies,spark,python" TEST_PROJECTS="-Dtest=org.apache.zeppelin.spark.PySpark*Test,org.apache.zeppelin.python.* -Dpyspark.test.exclude='' -DfailIfNoTests=false"
+ env: PYTHON="3" SCALA_VER="2.11" SPARK_VER="2.0.0" HADOOP_VER="2.6" PROFILE="-Pspark-2.0 -Phadoop-2.6 -Ppyspark -Pscala-2.11" BUILD_FLAG="package -am -DskipTests -DskipRat" TEST_FLAG="test -DskipRat" MODULES="-pl .,zeppelin-interpreter,zeppelin-display,spark-dependencies,spark,python" TEST_PROJECTS="-Dtest=org.apache.zeppelin.spark.PySpark*Test,org.apache.zeppelin.python.* -Dpyspark.test.exclude='' -DfailIfNoTests=false"
before_install:
- echo "MAVEN_OPTS='-Xms1024M -Xmx2048M -XX:MaxPermSize=1024m -XX:-UseGCOverheadLimit -Dorg.slf4j.simpleLogger.defaultLogLevel=warn'" >> ~/.mavenrc
@@ -82,7 +93,7 @@ install:
- mvn $BUILD_FLAG $MODULES $PROFILE -B
before_script:
- - travis_retry ./testing/downloadSpark.sh $SPARK_VER $HADOOP_VER
+ - if [[ -n $SPARK_VER ]]; then travis_retry ./testing/downloadSpark.sh $SPARK_VER $HADOOP_VER; fi
- if [[ -n $LIVY_VER ]]; then ./testing/downloadLivy.sh $LIVY_VER; fi
- if [[ -n $LIVY_VER ]]; then export LIVY_HOME=`pwd`/livy-server-$LIVY_VER; fi
- if [[ -n $LIVY_VER ]]; then export SPARK_HOME=`pwd`/spark-$SPARK_VER-bin-hadoop$HADOOP_VER; fi
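The `before_script` changes above guard each download behind its version variable, so jobs that never exercise Spark or Livy skip those downloads entirely. The pattern, sketched standalone (the `maybe_download` helper is illustrative, not part of the commit):

```shell
#!/bin/bash
# Illustrative helper mirroring the `if [[ -n $SPARK_VER ]]` guards above:
# run a download step only when its version variable is non-empty.
maybe_download() {
  local ver="$1" name="$2"
  if [[ -n "$ver" ]]; then
    echo "downloading $name $ver"
  else
    echo "skipping $name"
  fi
}

maybe_download "${SPARK_VER:-}" spark   # SPARK_VER is unset in the interpreter-only job
maybe_download "${LIVY_VER:-}" livy     # LIVY_VER is set only for the Spark 2.1 + livy job
```

Keeping the guard in `before_script` rather than in each job's env means a matrix entry opts in simply by defining the variable.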