Posted to commits@zeppelin.apache.org by pd...@apache.org on 2020/12/09 11:30:35 UTC
[zeppelin] branch branch-0.9 updated: [ZEPPELIN-4385] Migrate to GitHub actions
This is an automated email from the ASF dual-hosted git repository.
pdallig pushed a commit to branch branch-0.9
in repository https://gitbox.apache.org/repos/asf/zeppelin.git
The following commit(s) were added to refs/heads/branch-0.9 by this push:
new 795f58b [ZEPPELIN-4385] Migrate to GitHub actions
795f58b is described below
commit 795f58b19e22656862772e2570669955d70ae72d
Author: Philipp Dallig <ph...@gmail.com>
AuthorDate: Wed Dec 2 15:57:43 2020 +0100
[ZEPPELIN-4385] Migrate to GitHub actions
This PR changes the CI system from [Travis](https://travis-ci.com/) to [GitHub Actions](https://github.com/features/actions).
**Advantages:**
- [High usage limits](https://docs.github.com/en/free-pro-team@latest/actions/reference/usage-limits-billing-and-administration#usage-limits)
- No CI setup is required from contributors
- Good integration with GitHub pull requests
**Disadvantage:**
- At the moment we have some flaky tests and external dependencies. Either can cause a failed build, but we cannot restart a single job, only the entire workflow.
**Miscellaneous:**
- CI is triggered on every push event
- CI is triggered when a pull request is opened or synchronized
- The Selenium tests are updated to Python 3
- The Livy Python 2 test is dropped because Python 2 has reached [the end of its life](https://www.python.org/doc/sunset-python-2/)
- Improvements
* [ ] Feedback for job splitting
* [x] What is the purpose of the Jenkins job?
* https://issues.apache.org/jira/browse/ZEPPELIN-4385
* Strongly recommended: add automated unit tests for any new or changed behavior
* Outline any manual steps to test the PR here.
* Do the license files need an update? No
* Are there breaking changes for older versions? No
* Does this need documentation? No
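The trigger rules mentioned above (every push; pull requests opened or synchronized against `master` and `branch-*`) come from the new workflow files in this commit. A small bash sketch of the branch-pattern semantics, for illustration only (GitHub's own filter implementation may differ in edge cases):

```shell
# Bash case-globbing mirrors the workflow's pull_request branch filter:
#   branches: [master, branch-*]
matches() { case "$1" in master|branch-*) echo yes ;; *) echo no ;; esac; }
matches master       # yes
matches branch-0.9   # yes
matches feature/foo  # no
```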
Author: Philipp Dallig <ph...@gmail.com>
Closes #3986 from Reamer/github_actions and squashes the following commits:
eb13433a8 [Philipp Dallig] Goodbye travis
3de27c699 [Philipp Dallig] Adjust README
8302f5349 [Philipp Dallig] Adjust template for pull request
76d9bb22e [Philipp Dallig] Adjustments documentation
08155783e [Philipp Dallig] Remove travis_check.py
e865f2fc1 [Philipp Dallig] Remove "travis" in source code
6790b6e42 [Philipp Dallig] Add core.yml
27330791f [Philipp Dallig] Rewrite selenium tests to python3
1772024fc [Philipp Dallig] Add frontend tests
c81f48492 [Philipp Dallig] Add github actions workflow (rat)
18d5dbecb [Philipp Dallig] some cleanup
(cherry picked from commit 9e3b4eb9ae537e5304baf202d761bcad46ced5b4)
Signed-off-by: Philipp Dallig <ph...@gmail.com>
---
.github/PULL_REQUEST_TEMPLATE | 1 -
.github/workflows/core.yml | 366 +++++++++++++++++++++
.github/workflows/frontend.yml | 97 ++++++
.github/workflows/rat.yml | 23 ++
README.md | 4 +-
.../contribution/how_to_contribute_code.md | 2 +-
docs/development/writing_zeppelin_interpreter.md | 2 +-
livy/README.md | 6 +-
pom.xml | 2 +-
.../apache/zeppelin/python/IPythonInterpreter.java | 11 +-
.../org/apache/zeppelin/python/PythonUtils.java | 4 +-
.../zeppelin/python/IPythonInterpreterTest.java | 4 +-
.../apache/zeppelin/r/ShinyInterpreterTest.java | 13 +-
.../zeppelin/shell/TerminalInterpreterTest.java | 8 +-
testing/env_python_3 with_flink_1_10.yml | 25 ++
testing/env_python_3 with_flink_1_11.yml | 25 ++
testing/env_python_3.yml | 23 ++
testing/env_python_3_with_R.yml | 33 ++
testing/env_python_3_with_R_and_tensorflow.yml | 35 ++
testing/install_external_dependencies.sh | 66 ----
travis_check.py | 175 ----------
.../integration/InterpreterModeActionsIT.java | 12 +-
.../zeppelin/integration/SparkParagraphIT.java | 2 +-
.../zeppelin/integration/JdbcIntegrationTest.java | 1 +
.../integration/ZeppelinSparkClusterTest.java | 2 +-
.../K8sStandardInterpreterLauncherTest.java | 4 +-
26 files changed, 664 insertions(+), 282 deletions(-)
diff --git a/.github/PULL_REQUEST_TEMPLATE b/.github/PULL_REQUEST_TEMPLATE
index 25b263a..addaf21 100644
--- a/.github/PULL_REQUEST_TEMPLATE
+++ b/.github/PULL_REQUEST_TEMPLATE
@@ -14,7 +14,6 @@ First time? Check out the contributing guide - https://zeppelin.apache.org/contr
* Put link here, and add [ZEPPELIN-*Jira number*] in PR title, eg. [ZEPPELIN-533]
### How should this be tested?
-* First time? Setup Travis CI as described on https://zeppelin.apache.org/contribution/contributions.html#continuous-integration
* Strongly recommended: add automated unit tests for any new or changed behavior
* Outline any manual steps to test the PR here.
diff --git a/.github/workflows/core.yml b/.github/workflows/core.yml
new file mode 100644
index 0000000..4faf67b
--- /dev/null
+++ b/.github/workflows/core.yml
@@ -0,0 +1,366 @@
+name: core
+on:
+ push:
+ pull_request:
+ branches:
+ - master
+ - branch-*
+ types: [opened, synchronize]
+
+env:
+ MAVEN_OPTS: "-Xms1024M -Xmx2048M -XX:MaxMetaspaceSize=1024m -XX:-UseGCOverheadLimit -Dorg.slf4j.simpleLogger.log.org.apache.maven.cli.transfer.Slf4jMavenTransferListener=warn"
+ ZEPPELIN_HELIUM_REGISTRY: helium
+ SPARK_PRINT_LAUNCH_COMMAND: "true"
+# Use the bash login, because we are using miniconda
+defaults:
+ run:
+ shell: bash -l {0}
+
+jobs:
+ test-core-modules:
+ runs-on: ubuntu-18.04
+ strategy:
+ matrix:
+ hadoop: [hadoop2, hadoop3]
+ steps:
+ - name: Checkout
+ uses: actions/checkout@v2
+ - name: Set up JDK 8
+ uses: actions/setup-java@v1
+ with:
+ java-version: 8
+ - name: Cache local Maven repository
+ uses: actions/cache@v2
+ with:
+ path: |
+ ~/.m2/repository
+ !~/.m2/repository/org/apache/zeppelin/
+ key: ${{ runner.os }}-zeppelin-${{ hashFiles('**/pom.xml') }}
+ restore-keys: |
+ ${{ runner.os }}-zeppelin-
+ - name: Setup conda environment with python 3.7 and R
+ uses: conda-incubator/setup-miniconda@v2
+ with:
+ activate-environment: python_3_with_R
+ environment-file: testing/env_python_3_with_R.yml
+ python-version: 3.7
+ auto-activate-base: false
+ channel-priority: strict
+ - name: Make IRkernel available to Jupyter
+ run: |
+ R -e "IRkernel::installspec()"
+ conda list
+ conda info
+ - name: install plugins and interpreter
+ run: |
+ mvn install -Pbuild-distr -DskipRat -DskipTests -pl zeppelin-server,zeppelin-web,spark/spark-dependencies,markdown,angular,shell -am -Phelium-dev -Pexamples -P${{ matrix.hadoop }} -B
+ mvn package -DskipRat -T 2C -pl zeppelin-plugins -amd -DskipTests -B
+ - name: run tests with ${{ matrix.hadoop }}
+ run: mvn verify -Pusing-packaged-distr -DskipRat -pl zeppelin-server,zeppelin-web,spark/spark-dependencies,markdown,angular,shell -am -Phelium-dev -Pexamples -P${{ matrix.hadoop }} -Dtests.to.exclude=**/org/apache/zeppelin/spark/* -DfailIfNoTests=false
+ test-interpreter-modules:
+ runs-on: ubuntu-18.04
+ env:
+ INTERPRETERS: 'beam,hbase,pig,jdbc,file,flink,ignite,kylin,lens,cassandra,elasticsearch,bigquery,alluxio,scio,livy,groovy,sap,java,geode,neo4j,hazelcastjet,submarine,sparql,mongodb'
+ steps:
+ - name: Checkout
+ uses: actions/checkout@v2
+ - name: Set up JDK 8
+ uses: actions/setup-java@v1
+ with:
+ java-version: 8
+ - name: Cache local Maven repository
+ uses: actions/cache@v2
+ with:
+ path: |
+ ~/.m2/repository
+ !~/.m2/repository/org/apache/zeppelin/
+ key: ${{ runner.os }}-zeppelin-${{ hashFiles('**/pom.xml') }}
+ restore-keys: |
+ ${{ runner.os }}-zeppelin-
+ - name: Setup conda environment with python 3.7 and R
+ uses: conda-incubator/setup-miniconda@v2
+ with:
+ activate-environment: python_3_with_R_and_tensorflow
+ environment-file: testing/env_python_3_with_R_and_tensorflow.yml
+ python-version: 3.7
+ auto-activate-base: false
+ - name: Make IRkernel available to Jupyter
+ run: |
+ R -e "IRkernel::installspec()"
+ - name: verify interpreter
+ run: mvn verify -DskipRat -am -pl .,zeppelin-interpreter,zeppelin-interpreter-shaded,${INTERPRETERS} -Pscala-2.10 -B
+ test-zeppelin-client-integration-test:
+ runs-on: ubuntu-18.04
+ steps:
+ - name: Checkout
+ uses: actions/checkout@v2
+ - name: Set up JDK 8
+ uses: actions/setup-java@v1
+ with:
+ java-version: 8
+ - name: Cache local Maven repository
+ uses: actions/cache@v2
+ with:
+ path: |
+ ~/.m2/repository
+ !~/.m2/repository/org/apache/zeppelin/
+ key: ${{ runner.os }}-zeppelin-${{ hashFiles('**/pom.xml') }}
+ restore-keys: |
+ ${{ runner.os }}-zeppelin-
+ - name: Setup conda environment with python 3.7 and R
+ uses: conda-incubator/setup-miniconda@v2
+ with:
+ activate-environment: python_3_with_R
+ environment-file: testing/env_python_3_with_R.yml
+ python-version: 3.7
+ auto-activate-base: false
+ - name: Make IRkernel available to Jupyter
+ run: |
+ R -e "IRkernel::installspec()"
+ - name: install environment
+ run: |
+ mvn install -DskipTests -DskipRat -Pintegration -pl zeppelin-interpreter-integration,zeppelin-web,spark/spark-dependencies,markdown,flink/interpreter,jdbc,shell -am
+ mvn package -DskipRat -T 2C -pl zeppelin-plugins -amd -DskipTests -B
+ - name: run tests
+ run: mvn test -DskipRat -pl zeppelin-interpreter-integration -Pintegration -Dtest=ZeppelinClientIntegrationTest,ZeppelinClientWithAuthIntegrationTest,ZSessionIntegrationTest
+ test-flink-and-flink-integration-test:
+ runs-on: ubuntu-18.04
+ strategy:
+ matrix:
+ flink: [ flink_1_10, flink_1_11]
+ steps:
+ - name: Checkout
+ uses: actions/checkout@v2
+ - name: Set up JDK 8
+ uses: actions/setup-java@v1
+ with:
+ java-version: 8
+ - name: Cache local Maven repository
+ uses: actions/cache@v2
+ with:
+ path: |
+ ~/.m2/repository
+ !~/.m2/repository/org/apache/zeppelin/
+ key: ${{ runner.os }}-zeppelin-${{ hashFiles('**/pom.xml') }}
+ restore-keys: |
+ ${{ runner.os }}-zeppelin-
+ - name: Setup conda environment with python 3.7 and
+ uses: conda-incubator/setup-miniconda@v2
+ with:
+ activate-environment: python_3_with_flink
+ environment-file: testing/env_python_3 with_${{ matrix.flink }}.yml
+ python-version: 3.7
+ auto-activate-base: false
+ - name: install environment
+ run: |
+ mvn install -DskipTests -DskipRat -am -pl flink/interpreter,zeppelin-interpreter-integration -Pflink-1.10 -Pintegration -B
+ mvn clean package -T 2C -pl zeppelin-plugins -amd -DskipTests -B
+ - name: run tests
+ run: mvn test -DskipRat -pl flink/interpreter,zeppelin-interpreter-integration -Pflink-1.10 -Pintegration -B -Dtest=org.apache.zeppelin.flink.*,FlinkIntegrationTest110,ZeppelinFlinkClusterTest110
+ run-spark-intergration-test:
+ runs-on: ubuntu-18.04
+ steps:
+ - name: Checkout
+ uses: actions/checkout@v2
+ - name: Set up JDK 8
+ uses: actions/setup-java@v1
+ with:
+ java-version: 8
+ - name: Cache local Maven repository
+ uses: actions/cache@v2
+ with:
+ path: |
+ ~/.m2/repository
+ !~/.m2/repository/org/apache/zeppelin/
+ key: ${{ runner.os }}-zeppelin-${{ hashFiles('**/pom.xml') }}
+ restore-keys: |
+ ${{ runner.os }}-zeppelin-
+ - name: Setup conda environment with python 3.7 and R
+ uses: conda-incubator/setup-miniconda@v2
+ with:
+ activate-environment: python_3_with_R
+ environment-file: testing/env_python_3_with_R.yml
+ python-version: 3.7
+ auto-activate-base: false
+ - name: Make IRkernel available to Jupyter
+ run: |
+ R -e "IRkernel::installspec()"
+ - name: install environment
+ run: |
+ mvn install -DskipTests -DskipRat -pl zeppelin-interpreter-integration,zeppelin-web,spark/spark-dependencies,markdown -am -Phadoop2 -Pintegration -B
+ mvn clean package -T 2C -pl zeppelin-plugins -amd -DskipTests -B
+ - name: run tests
+ run: mvn test -DskipRat -pl zeppelin-interpreter-integration,zeppelin-web,spark/spark-dependencies,markdown -am -Phadoop2 -Pintegration -B -Dtest=ZeppelinSparkClusterTest24,SparkIntegrationTest24,ZeppelinSparkClusterTest23,SparkIntegrationTest23,ZeppelinSparkClusterTest22,SparkIntegrationTest22,ZeppelinSparkClusterTest30,SparkIntegrationTest30 -DfailIfNoTests=false
+ jdbcIntegrationTest-and-unit-test-of-Spark-2-4-with-Scala-2-11:
+ runs-on: ubuntu-18.04
+ steps:
+ # user/password => root/root
+ - name: Start mysql
+ run: sudo systemctl start mysql.service
+ - name: Checkout
+ uses: actions/checkout@v2
+ - name: Set up JDK 8
+ uses: actions/setup-java@v1
+ with:
+ java-version: 8
+ - name: Cache local Maven repository
+ uses: actions/cache@v2
+ with:
+ path: |
+ ~/.m2/repository
+ !~/.m2/repository/org/apache/zeppelin/
+ key: ${{ runner.os }}-zeppelin-${{ hashFiles('**/pom.xml') }}
+ restore-keys: |
+ ${{ runner.os }}-zeppelin-
+ - name: Setup conda environment with python 3.7 and R
+ uses: conda-incubator/setup-miniconda@v2
+ with:
+ activate-environment: python_3_with_R
+ environment-file: testing/env_python_3_with_R.yml
+ python-version: 3.7
+ auto-activate-base: false
+ - name: Make IRkernel available to Jupyter
+ run: |
+ R -e "IRkernel::installspec()"
+ - name: install environment
+ run: |
+ mvn install -DskipTests -DskipRat -pl zeppelin-interpreter-integration,jdbc,zeppelin-web,spark/spark-dependencies,markdown -am -Pspark-2.4 -Pspark-scala-2.11 -Phadoop2 -Pintegration -B
+ mvn clean package -T 2C -pl zeppelin-plugins -amd -DskipTests -B
+ - name: run tests
+ run: mvn test -DskipRat -pl zeppelin-interpreter-integration,jdbc,zeppelin-web,spark/spark-dependencies,markdown -am -Pspark-2.4 -Pspark-scala-2.11 -Phadoop2 -Pintegration -B -Dtest=JdbcIntegrationTest,org.apache.zeppelin.spark.*,org.apache.zeppelin.kotlin.* -DfailIfNoTests=false
+
+ spark-2-4-and-scale-2-12:
+ runs-on: ubuntu-18.04
+ steps:
+ - name: Checkout
+ uses: actions/checkout@v2
+ - name: Set up JDK 8
+ uses: actions/setup-java@v1
+ with:
+ java-version: 8
+ - name: Cache local Maven repository
+ uses: actions/cache@v2
+ with:
+ path: |
+ ~/.m2/repository
+ !~/.m2/repository/org/apache/zeppelin/
+ key: ${{ runner.os }}-zeppelin-${{ hashFiles('**/pom.xml') }}
+ restore-keys: |
+ ${{ runner.os }}-zeppelin-
+ - name: Setup conda environment with python 3.7 and R
+ uses: conda-incubator/setup-miniconda@v2
+ with:
+ activate-environment: python_3_with_R
+ environment-file: testing/env_python_3_with_R.yml
+ python-version: 3.7
+ auto-activate-base: false
+ - name: Make IRkernel available to Jupyter
+ run: |
+ R -e "IRkernel::installspec()"
+ - name: install environment
+ run: |
+ mvn install -DskipTests -DskipRat -pl spark/spark-dependencies -am -Pspark-2.4 -Pspark-scala-2.12 -Phadoop2 -B
+ - name: run tests
+ run: mvn test -DskipRat -pl spark/spark-dependencies -am -Pspark-2.4 -Pspark-scala-2.12 -Phadoop2 -B -Dtest=org.apache.zeppelin.spark.*,org.apache.zeppelin.kotlin.* -DfailIfNoTests=false
+
+ spark-2-3-and-scale-2-11-and-other-interpreter:
+ runs-on: ubuntu-18.04
+ steps:
+ - name: Checkout
+ uses: actions/checkout@v2
+ - name: Set up JDK 8
+ uses: actions/setup-java@v1
+ with:
+ java-version: 8
+ - name: Cache local Maven repository
+ uses: actions/cache@v2
+ with:
+ path: |
+ ~/.m2/repository
+ !~/.m2/repository/org/apache/zeppelin/
+ key: ${{ runner.os }}-zeppelin-${{ hashFiles('**/pom.xml') }}
+ restore-keys: |
+ ${{ runner.os }}-zeppelin-
+ - name: Setup conda environment with python 3.7 and R
+ uses: conda-incubator/setup-miniconda@v2
+ with:
+ activate-environment: python_3_with_R
+ environment-file: testing/env_python_3_with_R.yml
+ python-version: 3.7
+ auto-activate-base: false
+ - name: Make IRkernel available to Jupyter
+ run: |
+ R -e "IRkernel::installspec()"
+ - name: install environment
+ run: |
+ mvn install -DskipTests -DskipRat -pl spark/spark-dependencies -am -Pspark-2.3 -Pspark-scala-2.11 -Phadoop2 -B
+ - name: run tests
+ run: mvn test -DskipRat -pl spark/spark-dependencies -am -Pspark-2.3 -Pspark-scala-2.11 -Phadoop2 -B -Dtest=org.apache.zeppelin.spark.*,apache.zeppelin.python.*,apache.zeppelin.jupyter.*,apache.zeppelin.r.* -DfailIfNoTests=false
+
+ spark-2-2-and-scale-2-10-and-other-interpreter:
+ runs-on: ubuntu-18.04
+ steps:
+ - name: Checkout
+ uses: actions/checkout@v2
+ - name: Set up JDK 8
+ uses: actions/setup-java@v1
+ with:
+ java-version: 8
+ - name: Cache local Maven repository
+ uses: actions/cache@v2
+ with:
+ path: |
+ ~/.m2/repository
+ !~/.m2/repository/org/apache/zeppelin/
+ key: ${{ runner.os }}-zeppelin-${{ hashFiles('**/pom.xml') }}
+ restore-keys: |
+ ${{ runner.os }}-zeppelin-
+ - name: Setup conda environment with python 3.7 and R
+ uses: conda-incubator/setup-miniconda@v2
+ with:
+ activate-environment: python_3_with_R
+ environment-file: testing/env_python_3_with_R.yml
+ python-version: 3.7
+ auto-activate-base: false
+ - name: Make IRkernel available to Jupyter
+ run: |
+ R -e "IRkernel::installspec()"
+ - name: install environment
+ run: mvn install -DskipTests -DskipRat -pl spark/spark-dependencies -am -Pspark-2.2 -Pspark-scala-2.10 -Phadoop2 -B
+ - name: run tests
+ run: mvn test -DskipRat -pl spark/spark-dependencies -am -Pspark-2.2 -Pspark-scala-2.10 -Phadoop2 -B -Dtest=org.apache.zeppelin.spark.*,apache.zeppelin.python.*,apache.zeppelin.jupyter.*,apache.zeppelin.r.* -DfailIfNoTests=false
+ test-livy-0-5-with-spark-2-2-0-under-python3:
+ runs-on: ubuntu-18.04
+ steps:
+ - name: Checkout
+ uses: actions/checkout@v2
+ - name: Set up JDK 8
+ uses: actions/setup-java@v1
+ with:
+ java-version: 8
+ - name: Cache local Maven repository
+ uses: actions/cache@v2
+ with:
+ path: |
+ ~/.m2/repository
+ !~/.m2/repository/org/apache/zeppelin/
+ key: ${{ runner.os }}-zeppelin-${{ hashFiles('**/pom.xml') }}
+ restore-keys: |
+ ${{ runner.os }}-zeppelin-
+ - name: Setup conda environment with python 3.7 and R
+ uses: conda-incubator/setup-miniconda@v2
+ with:
+ activate-environment: python_3_with_R
+ environment-file: testing/env_python_3_with_R.yml
+ python-version: 3.7
+ auto-activate-base: false
+ - name: Make IRkernel available to Jupyter
+ run: |
+ R -e "IRkernel::installspec()"
+ - name: install environment
+ run: |
+ mvn install -DskipTests -DskipRat -pl livy -am -B
+ ./testing/downloadSpark.sh "2.2.0" "2.6"
+ ./testing/downloadLivy.sh "0.5.0-incubating"
+ - name: run tests
+ run: mvn verify -DskipRat -pl livy -am -B
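Every job in core.yml restores the local Maven repository with a cache key built from `hashFiles('**/pom.xml')`. A rough local equivalent of that key, as a sketch only (assumes GNU coreutils; GitHub's `hashFiles` computes per-file SHA-256 digests and hashes the combination, so this is not byte-identical — but it changes exactly when any pom.xml content changes, which is the property the cache relies on):

```shell
#!/usr/bin/env bash
# Approximate "${{ runner.os }}-zeppelin-${{ hashFiles('**/pom.xml') }}"
# by hashing all pom.xml files in a stable order.
os="Linux"
hash=$(find . -name pom.xml -type f -print0 | sort -z | xargs -0 cat | sha256sum | cut -d' ' -f1)
echo "${os}-zeppelin-${hash}"
```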
diff --git a/.github/workflows/frontend.yml b/.github/workflows/frontend.yml
new file mode 100644
index 0000000..0c95575
--- /dev/null
+++ b/.github/workflows/frontend.yml
@@ -0,0 +1,97 @@
+name: frontend
+on:
+ push:
+ pull_request:
+ branches:
+ - master
+ - branch-*
+ types: [opened, synchronize]
+
+env:
+ MAVEN_OPTS: "-Xms1024M -Xmx2048M -XX:MaxMetaspaceSize=1024m -XX:-UseGCOverheadLimit -Dorg.slf4j.simpleLogger.log.org.apache.maven.cli.transfer.Slf4jMavenTransferListener=warn"
+ ZEPPELIN_HELIUM_REGISTRY: helium
+ SPARK_PRINT_LAUNCH_COMMAND: "true"
+ INTERPRETERS: '!beam,!hbase,!pig,!jdbc,!file,!flink,!ignite,!kylin,!lens,!cassandra,!elasticsearch,!bigquery,!alluxio,!scio,!livy,!groovy,!sap,!java,!geode,!neo4j,!hazelcastjet,!submarine,!sparql,!mongodb'
+
+jobs:
+ run-e2e-tests-in-zeppelin-web:
+ runs-on: ubuntu-18.04
+ steps:
+ - name: Checkout
+ uses: actions/checkout@v2
+ - name: Set up JDK 8
+ uses: actions/setup-java@v1
+ with:
+ java-version: 8
+ - name: Cache local Maven repository
+ uses: actions/cache@v2
+ with:
+ path: |
+ ~/.m2/repository
+ !~/.m2/repository/org/apache/zeppelin/
+ key: ${{ runner.os }}-zeppelin-${{ hashFiles('**/pom.xml') }}
+ restore-keys: |
+ ${{ runner.os }}-zeppelin-
+ - name: Install application
+ run: mvn -B install -DskipTests -DskipRat -pl ${INTERPRETERS} -Phadoop2 -Pscala-2.11
+ - name: Run headless test
+ run: xvfb-run --auto-servernum --server-args="-screen 0 1024x768x24" mvn verify -DskipRat -pl zeppelin-web -Phadoop2 -Pscala-2.11 -Pweb-e2e -B
+ run-tests-in-zeppelin-web-angular:
+ runs-on: ubuntu-18.04
+ steps:
+ - name: Checkout
+ uses: actions/checkout@v2
+ - name: Set up JDK 8
+ uses: actions/setup-java@v1
+ with:
+ java-version: 8
+ - name: Cache local Maven repository
+ uses: actions/cache@v2
+ with:
+ path: |
+ ~/.m2/repository
+ !~/.m2/repository/org/apache/zeppelin/
+ key: ${{ runner.os }}-zeppelin-${{ hashFiles('**/pom.xml') }}
+ restore-keys: |
+ ${{ runner.os }}-zeppelin-
+ - name: Run headless test
+ run: xvfb-run --auto-servernum --server-args="-screen 0 1024x768x24" mvn package -DskipRat -pl zeppelin-web-angular -Pweb-angular -B
+
+ test-selenium-with-spark-module-for-spark-2-3:
+ runs-on: ubuntu-18.04
+ defaults:
+ run:
+ shell: bash -l {0}
+ steps:
+ - name: Checkout
+ uses: actions/checkout@v2
+ - name: Set up JDK 8
+ uses: actions/setup-java@v1
+ with:
+ java-version: 8
+ - name: Cache local Maven repository
+ uses: actions/cache@v2
+ with:
+ path: |
+ ~/.m2/repository
+ !~/.m2/repository/org/apache/zeppelin/
+ key: ${{ runner.os }}-zeppelin-${{ hashFiles('**/pom.xml') }}
+ restore-keys: |
+ ${{ runner.os }}-zeppelin-
+ - name: Setup conda environment with python 3.7 and R
+ uses: conda-incubator/setup-miniconda@v2
+ with:
+ activate-environment: python_3_with_R
+ environment-file: testing/env_python_3_with_R.yml
+ python-version: 3.7
+ auto-activate-base: false
+ - name: Make IRkernel available to Jupyter
+ run: |
+ R -e "IRkernel::installspec()"
+ - name: install environment
+ run: |
+ mvn clean install -DskipTests -DskipRat -pl ${INTERPRETERS} -Pspark-2.3 -Phadoop2 -Phelium-dev -Pexamples -Pintegration -Pspark-scala-2.11 -B
+ mvn clean package -T 2C -pl zeppelin-plugins -amd -B
+ ./testing/downloadSpark.sh "2.3.2" "2.6"
+ - name: run tests
+ run: xvfb-run --auto-servernum --server-args="-screen 0 1600x1024x16" mvn verify -DskipRat -Pspark-2.3 -Phadoop2 -Phelium-dev -Pexamples -Pintegration -Pspark-scala-2.11 -B -pl zeppelin-integration -DfailIfNoTests=false
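Both core.yml and the Selenium job above default their steps to `shell: bash -l {0}`, i.e. a login shell, because conda installs its activation hook into the login-shell profile. A minimal illustration of the difference (plain bash, no conda required):

```shell
# A login shell sources profile files (where conda's activation hook lives);
# a plain non-interactive shell does not. shopt exposes which mode we are in:
bash -lc 'shopt -q login_shell && echo login-shell'
bash -c  'shopt -q login_shell || echo not-a-login-shell'
```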
diff --git a/.github/workflows/rat.yml b/.github/workflows/rat.yml
new file mode 100644
index 0000000..bbd310c
--- /dev/null
+++ b/.github/workflows/rat.yml
@@ -0,0 +1,23 @@
+name: rat
+on:
+ push:
+ pull_request:
+ branches:
+ - master
+ - branch-*
+ types: [opened, synchronize]
+
+jobs:
+ license-check:
+ runs-on: ubuntu-18.04
+ env:
+ MAVEN_OPTS: "-Dorg.slf4j.simpleLogger.log.org.apache.maven.cli.transfer.Slf4jMavenTransferListener=warn"
+ steps:
+ - name: Checkout
+ uses: actions/checkout@v2
+ - name: Set up JDK 8
+ uses: actions/setup-java@v1
+ with:
+ java-version: 8
+ - name: Check Rat
+ run: mvn apache-rat:check -Prat -B
diff --git a/README.md b/README.md
index acf57c3..8b4f22a 100644
--- a/README.md
+++ b/README.md
@@ -2,7 +2,7 @@
**Documentation:** [User Guide](https://zeppelin.apache.org/docs/latest/index.html)<br/>
**Mailing Lists:** [User and Dev mailing list](https://zeppelin.apache.org/community.html)<br/>
-**Continuous Integration:** [![Build Status](https://travis-ci.org/apache/zeppelin.svg?branch=master)](https://travis-ci.org/apache/zeppelin) <br/>
+**Continuous Integration:** ![core](https://github.com/apache/zeppelin/workflows/core/badge.svg) ![frontend](https://github.com/apache/zeppelin/workflows/frontend/badge.svg) ![rat](https://github.com/apache/zeppelin/workflows/rat/badge.svg) <br/>
**Contributing:** [Contribution Guide](https://zeppelin.apache.org/contribution/contributions.html)<br/>
**Issue Tracker:** [Jira](https://issues.apache.org/jira/browse/ZEPPELIN)<br/>
**License:** [Apache 2.0](https://github.com/apache/zeppelin/blob/master/LICENSE)
@@ -25,5 +25,3 @@ Please go to [install](https://zeppelin.apache.org/docs/latest/quickstart/instal
### Build from source
Please check [Build from source](https://zeppelin.apache.org/docs/latest/setup/basics/how_to_build.html) to build Zeppelin from source.
-
-
diff --git a/docs/development/contribution/how_to_contribute_code.md b/docs/development/contribution/how_to_contribute_code.md
index e6d1151..3683428 100644
--- a/docs/development/contribution/how_to_contribute_code.md
+++ b/docs/development/contribution/how_to_contribute_code.md
@@ -151,7 +151,7 @@ TEST_SELENIUM=true mvn test -Dtest=ParagraphActionsIT -DfailIfNoTests=false \
-pl 'zeppelin-interpreter,zeppelin-zengine,zeppelin-server'
```
-You'll need Firefox web browser installed in your development environment. While CI server uses [Firefox 31.0](https://ftp.mozilla.org/pub/firefox/releases/31.0/) to run selenium test, it is good idea to install the same version (disable auto update to keep the version).
+You'll need Firefox web browser installed in your development environment.
## Where to Start
diff --git a/docs/development/writing_zeppelin_interpreter.md b/docs/development/writing_zeppelin_interpreter.md
index 27843ac..33ecee1 100644
--- a/docs/development/writing_zeppelin_interpreter.md
+++ b/docs/development/writing_zeppelin_interpreter.md
@@ -309,7 +309,7 @@ We welcome contribution to a new interpreter. Please follow these few steps:
- First, check out the general contribution guide [here](https://zeppelin.apache.org/contribution/contributions.html).
- Follow the steps in [Make your own Interpreter](#make-your-own-interpreter) section and [Editor setting for Interpreter](#editor-setting-for-interpreter) above.
- Add your interpreter as in the [Configure your interpreter](#configure-your-interpreter) section above; also add it to the example template [zeppelin-site.xml.template](https://github.com/apache/zeppelin/blob/master/conf/zeppelin-site.xml.template).
- - Add tests! They are run by [Travis](https://travis-ci.org/apache/zeppelin) for all changes and it is important that they are self-contained.
+ - Add tests! They are run for all changes and it is important that they are self-contained.
- Include your interpreter as a module in [`pom.xml`](https://github.com/apache/zeppelin/blob/master/pom.xml).
- Add documentation on how to use your interpreter under `docs/interpreter/`. Follow the Markdown style as this [example](https://github.com/apache/zeppelin/blob/master/docs/interpreter/elasticsearch.md). Make sure you list config settings and provide working examples on using your interpreter in code boxes in Markdown. Link to images as appropriate (images should go to `docs/assets/themes/zeppelin/img/docs-img/`). And add a link to your documentation in the navigation menu (`docs/_incl [...]
- Most importantly, ensure licenses of the transitive closure of all dependencies are list in [license file](https://github.com/apache/zeppelin/blob/master/zeppelin-distribution/src/bin_license/LICENSE).
diff --git a/livy/README.md b/livy/README.md
index e0908a8..403c0d6 100644
--- a/livy/README.md
+++ b/livy/README.md
@@ -5,13 +5,11 @@ Livy interpreter for Apache Zeppelin
You can follow the instructions at [Livy Quick Start](http://livy.io/quickstart.html) to set up livy.
# Run Integration Tests
-You can add integration test to [LivyInterpreter.java](https://github.com/apache/zeppelin/blob/master/livy/src/test/java/org/apache/zeppelin/livy/LivyInterpreterIT.java)
-Either you can run the integration test on travis where enviroment will be setup or you can run it in local. You need to download livy-0.2 and spark-1.5.2 to local, then use the following
-script to run the integration test.
+You can add integration test to [LivyInterpreter.java](https://github.com/apache/zeppelin/blob/master/livy/src/test/java/org/apache/zeppelin/livy/LivyInterpreterIT.java) and run the integration test either via the CI environment or locally. You need to download livy-0.2 and spark-1.5.2 to local, then use the following script to run the integration test.
```bash
#!/usr/bin/env bash
export LIVY_HOME=<path_of_livy_0.2.0>
export SPARK_HOME=<path_of_spark-1.5.2>
mvn clean verify -pl livy -DfailIfNoTests=false -DskipRat
-```
\ No newline at end of file
+```
diff --git a/pom.xml b/pom.xml
index 5a035f6..3f20b41 100644
--- a/pom.xml
+++ b/pom.xml
@@ -1705,7 +1705,7 @@
<exclude>**/.idea/</exclude>
<exclude>**/*.iml</exclude>
<exclude>.git/</exclude>
- <exclude>.github/*</exclude>
+ <exclude>.github/</exclude>
<exclude>.gitignore</exclude>
<exclude>git.properties</exclude>
<exclude>.repository/</exclude>
diff --git a/python/src/main/java/org/apache/zeppelin/python/IPythonInterpreter.java b/python/src/main/java/org/apache/zeppelin/python/IPythonInterpreter.java
index 8cd7fba..e482585 100644
--- a/python/src/main/java/org/apache/zeppelin/python/IPythonInterpreter.java
+++ b/python/src/main/java/org/apache/zeppelin/python/IPythonInterpreter.java
@@ -34,6 +34,7 @@ import py4j.GatewayServer;
import java.io.File;
import java.io.IOException;
import java.io.InputStream;
+import java.nio.charset.StandardCharsets;
import java.util.List;
import java.util.Map;
import java.util.Properties;
@@ -127,7 +128,7 @@ public class IPythonInterpreter extends JupyterKernelInterpreter {
private void initPythonInterpreter(String gatewayHost, int gatewayPort) throws IOException {
InputStream input =
getClass().getClassLoader().getResourceAsStream("python/zeppelin_ipython.py");
- List<String> lines = IOUtils.readLines(input);
+ List<String> lines = IOUtils.readLines(input, StandardCharsets.UTF_8);
ExecuteResponse response = jupyterKernelClient.block_execute(ExecuteRequest.newBuilder()
.setCode(StringUtils.join(lines, System.lineSeparator())
.replace("${JVM_GATEWAY_PORT}", gatewayPort + "")
@@ -138,7 +139,7 @@ public class IPythonInterpreter extends JupyterKernelInterpreter {
input =
getClass().getClassLoader().getResourceAsStream("python/zeppelin_context.py");
- lines = IOUtils.readLines(input);
+ lines = IOUtils.readLines(input, StandardCharsets.UTF_8);
response = jupyterKernelClient.block_execute(ExecuteRequest.newBuilder()
.setCode(StringUtils.join(lines, System.lineSeparator())).build());
if (response.getStatus() != ExecuteStatus.SUCCESS) {
@@ -154,13 +155,13 @@ public class IPythonInterpreter extends JupyterKernelInterpreter {
if (additionalPythonInitFile != null) {
input = getClass().getClassLoader().getResourceAsStream(additionalPythonInitFile);
- lines = IOUtils.readLines(input);
+ lines = IOUtils.readLines(input, StandardCharsets.UTF_8);
response = jupyterKernelClient.block_execute(ExecuteRequest.newBuilder()
.setCode(StringUtils.join(lines, System.lineSeparator())
.replace("${JVM_GATEWAY_PORT}", gatewayPort + "")
.replace("${JVM_GATEWAY_ADDRESS}", gatewayHost)).build());
if (response.getStatus() != ExecuteStatus.SUCCESS) {
- LOGGER.error("Fail to run additional Python init file\n" + response.getOutput());
+ LOGGER.error("Fail to run additional Python init file\n{}", response.getOutput());
throw new IOException("Fail to run additional Python init file: "
+ additionalPythonInitFile + "\n" + response.getOutput());
}
@@ -196,7 +197,7 @@ public class IPythonInterpreter extends JupyterKernelInterpreter {
if (usePy4JAuth) {
envs.put("PY4J_GATEWAY_SECRET", this.py4jGatewaySecret);
}
- LOGGER.info("PYTHONPATH:" + envs.get("PYTHONPATH"));
+ LOGGER.info("PYTHONPATH: {}", envs.get("PYTHONPATH"));
return envs;
}
diff --git a/python/src/main/java/org/apache/zeppelin/python/PythonUtils.java b/python/src/main/java/org/apache/zeppelin/python/PythonUtils.java
index 996518b..15b71f0 100644
--- a/python/src/main/java/org/apache/zeppelin/python/PythonUtils.java
+++ b/python/src/main/java/org/apache/zeppelin/python/PythonUtils.java
@@ -70,8 +70,8 @@ public class PythonUtils {
}
public static String getLocalIP(Properties properties) {
- // zeppelin.python.gatewayserver_address is only for unit test on travis.
- // Because the FQDN would fail unit test on travis ci.
+ // zeppelin.python.gatewayserver_address is only for unit test.
+ // Because the FQDN would fail unit test.
String gatewayserver_address =
properties.getProperty("zeppelin.python.gatewayserver_address");
if (gatewayserver_address != null) {
diff --git a/python/src/test/java/org/apache/zeppelin/python/IPythonInterpreterTest.java b/python/src/test/java/org/apache/zeppelin/python/IPythonInterpreterTest.java
index 08e0c8d..72ab25b 100644
--- a/python/src/test/java/org/apache/zeppelin/python/IPythonInterpreterTest.java
+++ b/python/src/test/java/org/apache/zeppelin/python/IPythonInterpreterTest.java
@@ -298,7 +298,7 @@ public class IPythonInterpreterTest extends BasePythonInterpreterTest {
assertEquals(context.out.toInterpreterResultMessage().get(0).getData(),
InterpreterResult.Code.SUCCESS, result.code());
interpreterResultMessages = context.out.toInterpreterResultMessage();
-
+
assertEquals(context.out.toString(), 5, interpreterResultMessages.size());
// the first message is the warning text message.
assertEquals(InterpreterResult.Type.HTML, interpreterResultMessages.get(1).getType());
@@ -377,7 +377,7 @@ public class IPythonInterpreterTest extends BasePythonInterpreterTest {
// We ensure that running and auto completion are not hanging.
InterpreterResult res = interpretFuture.get(20000, TimeUnit.MILLISECONDS);
List<InterpreterCompletion> autoRes = completionFuture.get(3000, TimeUnit.MILLISECONDS);
- assertTrue(res.code().name().equals("SUCCESS"));
+ assertEquals("SUCCESS", res.code().name());
assertTrue(autoRes.size() > 0);
}
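The change above replaces `assertTrue(res.code().name().equals("SUCCESS"))` with `assertEquals("SUCCESS", res.code().name())`. A sketch of the reasoning, using hypothetical stand-ins for the JUnit helpers (not the real JUnit API): `assertTrue` can only report that a condition was false, while `assertEquals` reports both the expected and the actual value, which makes CI failure logs far more useful.

```java
// Hypothetical assertion helpers, for illustration only -- not the real JUnit API.
public class AssertDemo {
    // assertTrue can only say "the condition was false" on failure.
    static void assertTrue(boolean condition) {
        if (!condition) {
            throw new AssertionError("expected condition to be true");
        }
    }

    // assertEquals can report both values, so a failed CI run shows what went wrong.
    static void assertEquals(Object expected, Object actual) {
        if (!expected.equals(actual)) {
            throw new AssertionError("expected <" + expected + "> but was <" + actual + ">");
        }
    }

    public static void main(String[] args) {
        String code = "ERROR"; // pretend the interpreter returned an error code
        try {
            assertEquals("SUCCESS", code);
        } catch (AssertionError e) {
            System.out.println(e.getMessage());
        }
    }
}
```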
diff --git a/rlang/src/test/java/org/apache/zeppelin/r/ShinyInterpreterTest.java b/rlang/src/test/java/org/apache/zeppelin/r/ShinyInterpreterTest.java
index 2819ca5..f9c33e8 100644
--- a/rlang/src/test/java/org/apache/zeppelin/r/ShinyInterpreterTest.java
+++ b/rlang/src/test/java/org/apache/zeppelin/r/ShinyInterpreterTest.java
@@ -34,6 +34,7 @@ import org.junit.Before;
import org.junit.Test;
import java.io.IOException;
+import java.nio.charset.StandardCharsets;
import java.util.HashMap;
import java.util.List;
import java.util.Properties;
@@ -78,12 +79,12 @@ public class ShinyInterpreterTest {
InterpreterContext context = getInterpreterContext();
context.getLocalProperties().put("type", "ui");
InterpreterResult result =
- interpreter.interpret(IOUtils.toString(getClass().getResource("/ui.R")), context);
+ interpreter.interpret(IOUtils.toString(getClass().getResource("/ui.R"), StandardCharsets.UTF_8), context);
assertEquals(InterpreterResult.Code.SUCCESS, result.code());
context = getInterpreterContext();
context.getLocalProperties().put("type", "server");
- result = interpreter.interpret(IOUtils.toString(getClass().getResource("/server.R")), context);
+ result = interpreter.interpret(IOUtils.toString(getClass().getResource("/server.R"), StandardCharsets.UTF_8), context);
assertEquals(InterpreterResult.Code.SUCCESS, result.code());
final InterpreterContext context2 = getInterpreterContext();
@@ -121,13 +122,13 @@ public class ShinyInterpreterTest {
context.getLocalProperties().put("type", "ui");
context.getLocalProperties().put("app", "app2");
result =
- interpreter.interpret(IOUtils.toString(getClass().getResource("/ui.R")), context);
+ interpreter.interpret(IOUtils.toString(getClass().getResource("/ui.R"), StandardCharsets.UTF_8), context);
assertEquals(InterpreterResult.Code.SUCCESS, result.code());
context = getInterpreterContext();
context.getLocalProperties().put("type", "server");
context.getLocalProperties().put("app", "app2");
- result = interpreter.interpret(IOUtils.toString(getClass().getResource("/server.R")), context);
+ result = interpreter.interpret(IOUtils.toString(getClass().getResource("/server.R"), StandardCharsets.UTF_8), context);
assertEquals(InterpreterResult.Code.SUCCESS, result.code());
final InterpreterContext context3 = getInterpreterContext();
@@ -183,12 +184,12 @@ public class ShinyInterpreterTest {
InterpreterContext context = getInterpreterContext();
context.getLocalProperties().put("type", "ui");
InterpreterResult result =
- interpreter.interpret(IOUtils.toString(getClass().getResource("/invalid_ui.R")), context);
+ interpreter.interpret(IOUtils.toString(getClass().getResource("/invalid_ui.R"), StandardCharsets.UTF_8), context);
assertEquals(InterpreterResult.Code.SUCCESS, result.code());
context = getInterpreterContext();
context.getLocalProperties().put("type", "server");
- result = interpreter.interpret(IOUtils.toString(getClass().getResource("/server.R")), context);
+ result = interpreter.interpret(IOUtils.toString(getClass().getResource("/server.R"), StandardCharsets.UTF_8), context);
assertEquals(InterpreterResult.Code.SUCCESS, result.code());
final InterpreterContext context2 = getInterpreterContext();
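The hunks above pass `StandardCharsets.UTF_8` explicitly instead of relying on the overload of `IOUtils.toString` that uses the platform default charset. A minimal sketch of the underlying issue: decoding bytes with the default charset makes test output depend on the CI runner's locale, while an explicit charset is deterministic on every machine.

```java
import java.nio.charset.StandardCharsets;

public class CharsetDemo {
    public static void main(String[] args) {
        // A UTF-8 encoded resource containing a non-ASCII character.
        byte[] utf8 = "ggplot2 \u00fcber".getBytes(StandardCharsets.UTF_8);

        // Explicit charset: identical result on every CI runner.
        String stable = new String(utf8, StandardCharsets.UTF_8);

        // new String(utf8) without a charset would use the platform default
        // (e.g. US-ASCII on a minimal container) and silently corrupt the text.
        System.out.println(stable);
    }
}
```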
diff --git a/shell/src/test/java/org/apache/zeppelin/shell/TerminalInterpreterTest.java b/shell/src/test/java/org/apache/zeppelin/shell/TerminalInterpreterTest.java
index 8dda4de..ac9a3f5 100644
--- a/shell/src/test/java/org/apache/zeppelin/shell/TerminalInterpreterTest.java
+++ b/shell/src/test/java/org/apache/zeppelin/shell/TerminalInterpreterTest.java
@@ -74,13 +74,13 @@ public class TerminalInterpreterTest extends BaseInterpreterTest {
try {
// mock connect terminal
boolean running = terminal.terminalThreadIsRunning();
- assertEquals(running, true);
+ assertTrue(running);
URI uri = URI.create("ws://localhost:" + terminal.getTerminalPort() + "/terminal/");
webSocketContainer = ContainerProvider.getWebSocketContainer();
// Attempt Connect
- session = (Session) webSocketContainer.connectToServer(TerminalSocketTest.class, uri);
+ session = webSocketContainer.connectToServer(TerminalSocketTest.class, uri);
// Send Start terminal service message
String terminalReadyCmd = String.format("{\"type\":\"TERMINAL_READY\"," +
@@ -154,13 +154,13 @@ public class TerminalInterpreterTest extends BaseInterpreterTest {
try {
// mock connect terminal
boolean running = terminal.terminalThreadIsRunning();
- assertEquals(running, true);
+ assertTrue(running);
URI uri = URI.create("ws://localhost:" + terminal.getTerminalPort() + "/terminal/");
webSocketContainer = ContainerProvider.getWebSocketContainer();
// Attempt Connect
- session = (Session) webSocketContainer.connectToServer(TerminalSocketTest.class, uri);
+ session = webSocketContainer.connectToServer(TerminalSocketTest.class, uri);
// Send Start terminal service message
String terminalReadyCmd = String.format("{\"type\":\"TERMINAL_READY\"," +
diff --git a/testing/env_python_3 with_flink_1_10.yml b/testing/env_python_3 with_flink_1_10.yml
new file mode 100644
index 0000000..06d505d
--- /dev/null
+++ b/testing/env_python_3 with_flink_1_10.yml
@@ -0,0 +1,25 @@
+name: python_3_with_flink
+channels:
+ - conda-forge
+ - defaults
+dependencies:
+ - pycodestyle
+ - numpy=1
+ - pandas=0.25
+ - scipy=1
+ - grpcio
+ - hvplot
+ - protobuf=3
+ - pandasql=0.7.3
+ - ipython=7
+ - matplotlib=3
+ - ipykernel=5
+ - jupyter_client=5
+ - bokeh=1.3.4
+ - panel
+ - holoviews
+ - pyyaml=3
+ - pip
+ - pip:
+ - bkzep==0.6.1
+ - apache-flink==1.10.1
diff --git a/testing/env_python_3 with_flink_1_11.yml b/testing/env_python_3 with_flink_1_11.yml
new file mode 100644
index 0000000..e23bedb
--- /dev/null
+++ b/testing/env_python_3 with_flink_1_11.yml
@@ -0,0 +1,25 @@
+name: python_3_with_flink
+channels:
+ - conda-forge
+ - defaults
+dependencies:
+ - pycodestyle
+ - numpy=1
+ - pandas=0.25
+ - scipy=1
+ - grpcio
+ - hvplot
+ - protobuf=3
+ - pandasql=0.7.3
+ - ipython=7
+ - matplotlib=3
+ - ipykernel=5
+ - jupyter_client=5
+ - bokeh=1.3.4
+ - panel
+ - holoviews
+ - pyyaml=3
+ - pip
+ - pip:
+ - bkzep==0.6.1
+ - apache-flink==1.11.1
diff --git a/testing/env_python_3.yml b/testing/env_python_3.yml
new file mode 100644
index 0000000..84a6c71
--- /dev/null
+++ b/testing/env_python_3.yml
@@ -0,0 +1,23 @@
+name: python_3
+channels:
+ - conda-forge
+ - defaults
+dependencies:
+ - pycodestyle=2.5.0
+ - numpy=1.17.3
+ - pandas=0.25.0
+ - scipy=1.3.1
+ - grpcio=1.22.0
+ - hvplot=0.5.2
+ - protobuf=3.10.0
+ - pandasql=0.7.3
+ - ipython=7.8.0
+ - matplotlib=3.0.3
+ - ipykernel=5.1.2
+ - jupyter_client=5.3.4
+ - bokeh=1.3.4
+ - panel=0.6.0
+ - holoviews=1.12.3
+ - pip
+ - pip:
+ - bkzep==0.6.1
diff --git a/testing/env_python_3_with_R.yml b/testing/env_python_3_with_R.yml
new file mode 100644
index 0000000..17ca6b8
--- /dev/null
+++ b/testing/env_python_3_with_R.yml
@@ -0,0 +1,33 @@
+name: python_3_with_R
+channels:
+ - conda-forge
+ - defaults
+dependencies:
+ - pycodestyle
+ - numpy=1
+ - pandas=0.25
+ - scipy=1
+ - grpcio
+ - hvplot
+ - protobuf=3
+ - pandasql=0.7.3
+ - ipython=7
+ - matplotlib=3
+ - ipykernel=5
+ - jupyter_client=5
+ - bokeh=1.3.4
+ - panel
+ - holoviews
+ - pyyaml=3
+ - pip
+ - pip:
+ - bkzep==0.6.1
+
+ - r-base=3
+ - r-evaluate
+ - r-base64enc
+ - r-knitr
+ - r-ggplot2
+ - r-irkernel
+ - r-shiny
+ - r-googlevis
diff --git a/testing/env_python_3_with_R_and_tensorflow.yml b/testing/env_python_3_with_R_and_tensorflow.yml
new file mode 100644
index 0000000..47e5d34
--- /dev/null
+++ b/testing/env_python_3_with_R_and_tensorflow.yml
@@ -0,0 +1,35 @@
+name: python_3_with_R_and_tensorflow
+channels:
+ - conda-forge
+ - defaults
+dependencies:
+ - pycodestyle
+ - numpy=1
+ - pandas=0.25
+ - scipy=1
+ - grpcio
+ - hvplot
+ - protobuf=3
+ - pandasql=0.7.3
+ - ipython=7
+ - matplotlib=3
+ - ipykernel=5
+ - jupyter_client=5
+ - bokeh=1.3.4
+ - panel
+ - holoviews
+ - pyyaml=3
+ - pip
+ - pip:
+ - bkzep==0.6.1
+
+ - r-base=3
+ - r-evaluate
+ - r-base64enc
+ - r-knitr
+ - r-ggplot2
+ - r-irkernel
+ - r-shiny
+ - r-googlevis
+
+ - tensorflow=1.13
diff --git a/testing/install_external_dependencies.sh b/testing/install_external_dependencies.sh
deleted file mode 100755
index b155d1e..0000000
--- a/testing/install_external_dependencies.sh
+++ /dev/null
@@ -1,66 +0,0 @@
-#!/bin/bash
-
-# Licensed to the Apache Software Foundation (ASF) under one or more
-# contributor license agreements. See the NOTICE file distributed with
-# this work for additional information regarding copyright ownership.
-# The ASF licenses this file to You under the Apache License, Version 2.0
-# (the "License"); you may not use this file except in compliance with
-# the License. You may obtain a copy of the License at
-#
-# http://www.apache.org/licenses/LICENSE-2.0
-#
-# Unless required by applicable law or agreed to in writing, software
-# distributed under the License is distributed on an "AS IS" BASIS,
-# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
-# See the License for the specific language governing permissions and
-# limitations under the License.
-#
-
-# Script for installing R / Python dependencies for Travis CI
-set -ev
-touch ~/.environ
-
-# Install conda for Python and R dependencies
-if [[ -n "$PYTHON" ]] ; then
- wget "https://repo.continuum.io/miniconda/Miniconda${PYTHON}-4.6.14-Linux-x86_64.sh" -O miniconda.sh
-else
- wget "https://repo.continuum.io/miniconda/Miniconda3-4.6.14-Linux-x86_64.sh" -O miniconda.sh
-fi
-bash miniconda.sh -b -p "$HOME/miniconda"
-rm -fv miniconda.sh
-echo "export PATH='$HOME/miniconda/bin:$PATH'" >> ~/.environ
-source ~/.environ
-hash -r
-conda config --set always_yes yes --set changeps1 no
-conda update -q conda
-conda info -a
-conda config --add channels conda-forge
-
-if [[ -n "$PYTHON" ]] ; then
- if [[ "$PYTHON" == "2" ]] ; then
- pip install -q numpy==1.14.5 pandas==0.21.1 matplotlib==2.1.1 scipy==1.2.1 grpcio==1.19.0 bkzep==0.6.1 hvplot==0.5.2 \
- protobuf==3.7.0 pandasql==0.7.3 ipython==5.8.0 ipykernel==4.10.0 bokeh==1.3.4 panel==0.6.0 holoviews==1.12.3
- else
- pip install -q pycodestyle==2.5.0
- pip install -q numpy==1.17.3 pandas==0.25.0 scipy==1.3.1 grpcio==1.19.0 bkzep==0.6.1 hvplot==0.5.2 protobuf==3.10.0 \
- pandasql==0.7.3 ipython==7.8.0 matplotlib==3.0.3 ipykernel==5.1.2 jupyter_client==5.3.4 bokeh==1.3.4 panel==0.6.0 holoviews==1.12.3 pycodestyle==2.5.0
- fi
-
- if [[ -n "$TENSORFLOW" ]] ; then
- check_results=$(conda search -c conda-forge tensorflow)
- echo "search tensorflow = $check_results"
- pip install -q "tensorflow==${TENSORFLOW}"
- fi
-
- if [[ -n "${FLINK}" ]]; then
- pip install -q "apache-flink==${FLINK}"
- fi
-fi
-
-# Install R dependencies if R is true
-if [[ "$R" == "true" ]] ; then
- conda install -y --quiet r-base=3 r-evaluate r-base64enc r-knitr r-ggplot2 r-irkernel r-shiny=1.5.0 r-googlevis
- R -e "IRkernel::installspec()"
- echo "R_LIBS=~/miniconda/lib/R/library" > ~/.Renviron
- echo "export R_LIBS=~/miniconda/lib/R/library" >> ~/.environ
-fi
diff --git a/travis_check.py b/travis_check.py
deleted file mode 100644
index 009ba9a..0000000
--- a/travis_check.py
+++ /dev/null
@@ -1,175 +0,0 @@
-#
-# Licensed to the Apache Software Foundation (ASF) under one or more
-# contributor license agreements. See the NOTICE file distributed with
-# this work for additional information regarding copyright ownership.
-# The ASF licenses this file to You under the Apache License, Version 2.0
-# (the "License"); you may not use this file except in compliance with
-# the License. You may obtain a copy of the License at
-#
-# http://www.apache.org/licenses/LICENSE-2.0
-#
-# Unless required by applicable law or agreed to in writing, software
-# distributed under the License is distributed on an "AS IS" BASIS,
-# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
-# See the License for the specific language governing permissions and
-# limitations under the License.
-
-#
-# This script checks build status of given pullrequest identified by author and commit hash.
-#
-# usage)
-# python travis_check.py [author] [commit hash] [check interval (optional)]
-#
-# example)
-# # full hash
-# python travis_check.py Leemoonsoo 1f2549a38f440ebfbfe2d32a041684e3e39b496c
-#
-# # with short hash
-# python travis_check.py Leemoonsoo 1f2549a
-#
-# # with custom check interval
-# python travis_check.py Leemoonsoo 1f2549a 5,60,60
-
-import os, sys, getopt, traceback, json, requests, time, urllib3, re
-
-# disable SNIMissingWarning. see https://urllib3.readthedocs.io/en/latest/advanced-usage.html#ssl-warnings
-urllib3.disable_warnings()
-
-author = sys.argv[1]
-commit = sys.argv[2]
-
-# check interval in sec
-check = [5, 60, 300, 300, 300, 300, 300, 300, 300, 300, 300, 300, 600, 600, 600, 600, 600, 600]
-
-if len(sys.argv) > 3:
- check = map(lambda x: int(x), sys.argv[3].split(","))
-
-def info(msg):
- print("[" + time.strftime("%Y-%m-%d %H:%M:%S") + "] " + msg)
- sys.stdout.flush()
-
-info("Author: " + author + ", commit: " + commit)
-
-
-def getBuildStatus(author, commit):
- travisApi = "https://api.travis-ci.org/"
-
- # get latest 25 builds
- resp = requests.get(url=travisApi + "/repos/" + author + "/zeppelin/builds")
- data = json.loads(resp.text)
- build = None
-
- if len(data) == 0:
- return build
-
- for b in data:
- if b["commit"][:len(commit)] == commit:
- resp = requests.get(url=travisApi + "/repos/" + author + "/zeppelin/builds/" + str(b["id"]))
- build = json.loads(resp.text)
- break
-
- return build
-
-def status(index, msg, jobId):
- return '{:20}'.format("[" + str(index+1) + "] " + msg) + "https://travis-ci.org/" + author + "/zeppelin/jobs/" + str(jobId)
-
-
-# load full build log and summarize
-def logSummary(url):
- # test start pattern "Running org.apache.zeppelin.scheduler.ParallelSchedulerTest"
- testStartPattern = re.compile("^Running[ ](.*)")
- # test end pattern "Tests run: 2, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0.554 sec - in org.apache.zeppelin.scheduler.JobTest"
- testEndPattern = re.compile("^Tests [^0-9]*([0-9]+)[^0-9]*([0-9]+)[^0-9]*([0-9]+)[^0-9]*([0-9]+)[^-]*[-][ ]in[ ](.*)")
-
- tests = {}
- resp = requests.get(url=url)
- lines = resp.text.splitlines()
- lastNonEmptyLine = ""
- indent = '{:10}'.format("")
-
- for line in lines:
- if not len(line.strip()) == 0:
- lastNonEmptyLine = line
-
- mStart = testStartPattern.match(line)
- if mStart:
- testName = mStart.group(1)
- tests[testName] = {
- "start": mStart
- }
- continue
-
- mEnd = testEndPattern.match(line)
- if mEnd:
- testName = mEnd.group(5)
- tests[testName]["end"] = mEnd
- continue
-
- for testName, test in tests.items():
- if not "end" in test:
- print(indent + "Test " + testName + " never finished")
- else:
- failures = int(test["end"].group(2))
- errors = int(test["end"].group(3))
- if failures > 0 or errors > 0:
- print(indent + test["end"].group(0))
-
- if not lastNonEmptyLine.startswith("Done"):
- print(indent + lastNonEmptyLine)
- print(indent + "Please check full log at " + url)
-
-def printBuildStatus(build):
- failure = 0
- running = 0
-
- for index, job in enumerate(build["matrix"]):
- result = job["result"]
- jobId = job["id"]
-
- if job["started_at"] == None and result == None:
- print(status(index, "Not started", jobId))
- running = running + 1
- elif job["started_at"] != None and job["finished_at"] == None:
- print(status(index, "Running ...", jobId))
- running = running + 1
- elif job["started_at"] != None and job["finished_at"] != None:
- if result == None:
- print(status(index, "Not completed", jobId))
- failure = failure + 1
- logSummary("https://api.travis-ci.org/v3/job/" + str(jobId) + "/log.txt")
- elif result == 0:
- print(status(index, "OK", jobId))
- else:
- print(status(index, "Error " + str(result), jobId))
- failure = failure + 1
- logSummary("https://api.travis-ci.org/v3/job/" + str(jobId) + "/log.txt")
- else:
- print(status(index, "Unknown state", jobId))
- failure = failure + 1
-
- return failure, running
-
-
-for sleep in check:
- info("--------------------------------")
- time.sleep(sleep)
- info("Get build status ...")
- build = getBuildStatus(author, commit)
- if build == None:
- info("Can't find build for commit " + commit + " from " + author)
- sys.exit(2)
-
- print("Build https://travis-ci.org/" + author + "/zeppelin/builds/" + str(build["id"]))
- failure, running = printBuildStatus(build)
-
- print(str(failure) + " job(s) failed, " + str(running) + " job(s) running/pending")
-
- if failure != 0:
- sys.exit(1)
-
- if failure == 0 and running == 0:
- info("CI Green!")
- sys.exit(0)
-
-info("Timeout")
-sys.exit(1)
diff --git a/zeppelin-integration/src/test/java/org/apache/zeppelin/integration/InterpreterModeActionsIT.java b/zeppelin-integration/src/test/java/org/apache/zeppelin/integration/InterpreterModeActionsIT.java
index 026a8ae..716c7df 100644
--- a/zeppelin-integration/src/test/java/org/apache/zeppelin/integration/InterpreterModeActionsIT.java
+++ b/zeppelin-integration/src/test/java/org/apache/zeppelin/integration/InterpreterModeActionsIT.java
@@ -206,7 +206,7 @@ public class InterpreterModeActionsIT extends AbstractZeppelinIT {
waitForParagraph(1, "READY");
interpreterModeActionsIT.setPythonParagraph(1, "user=\"user1\"");
waitForParagraph(2, "READY");
- interpreterModeActionsIT.setPythonParagraph(2, "print user");
+ interpreterModeActionsIT.setPythonParagraph(2, "print(user)");
collector.checkThat("The output field paragraph contains",
driver.findElement(By.xpath(
getParagraphXPath(2) + "//div[contains(@class, 'text plainTextContent')]")).getText(),
@@ -237,7 +237,7 @@ public class InterpreterModeActionsIT extends AbstractZeppelinIT {
waitForParagraph(1, "READY");
interpreterModeActionsIT.setPythonParagraph(1, "user=\"user2\"");
waitForParagraph(2, "READY");
- interpreterModeActionsIT.setPythonParagraph(2, "print user");
+ interpreterModeActionsIT.setPythonParagraph(2, "print(user)");
collector.checkThat("The output field paragraph contains",
driver.findElement(By.xpath(
getParagraphXPath(2) + "//div[contains(@class, 'text plainTextContent')]")).getText(),
@@ -363,7 +363,7 @@ public class InterpreterModeActionsIT extends AbstractZeppelinIT {
waitForParagraph(1, "READY");
interpreterModeActionsIT.setPythonParagraph(1, "user=\"user1\"");
waitForParagraph(2, "READY");
- interpreterModeActionsIT.setPythonParagraph(2, "print user");
+ interpreterModeActionsIT.setPythonParagraph(2, "print(user)");
collector.checkThat("The output field paragraph contains",
driver.findElement(By.xpath(
@@ -396,7 +396,7 @@ public class InterpreterModeActionsIT extends AbstractZeppelinIT {
waitForParagraph(1, "READY");
interpreterModeActionsIT.setPythonParagraph(1, "user=\"user2\"");
waitForParagraph(2, "READY");
- interpreterModeActionsIT.setPythonParagraph(2, "print user");
+ interpreterModeActionsIT.setPythonParagraph(2, "print(user)");
collector.checkThat("The output field paragraph contains",
driver.findElement(By.xpath(
getParagraphXPath(2) + "//div[contains(@class, 'text plainTextContent')]")).getText(),
@@ -640,7 +640,7 @@ public class InterpreterModeActionsIT extends AbstractZeppelinIT {
waitForParagraph(1, "READY");
interpreterModeActionsIT.setPythonParagraph(1, "user=\"user1\"");
waitForParagraph(2, "READY");
- interpreterModeActionsIT.setPythonParagraph(2, "print user");
+ interpreterModeActionsIT.setPythonParagraph(2, "print(user)");
collector.checkThat("The output field paragraph contains",
driver.findElement(By.xpath(
@@ -673,7 +673,7 @@ public class InterpreterModeActionsIT extends AbstractZeppelinIT {
waitForParagraph(1, "READY");
interpreterModeActionsIT.setPythonParagraph(1, "user=\"user2\"");
waitForParagraph(2, "READY");
- interpreterModeActionsIT.setPythonParagraph(2, "print user");
+ interpreterModeActionsIT.setPythonParagraph(2, "print(user)");
collector.checkThat("The output field paragraph contains",
driver.findElement(By.xpath(
diff --git a/zeppelin-integration/src/test/java/org/apache/zeppelin/integration/SparkParagraphIT.java b/zeppelin-integration/src/test/java/org/apache/zeppelin/integration/SparkParagraphIT.java
index 1804fc4..90ebec5 100644
--- a/zeppelin-integration/src/test/java/org/apache/zeppelin/integration/SparkParagraphIT.java
+++ b/zeppelin-integration/src/test/java/org/apache/zeppelin/integration/SparkParagraphIT.java
@@ -110,7 +110,7 @@ public class SparkParagraphIT extends AbstractZeppelinIT {
try {
setTextOfParagraph(1, "%pyspark\\n" +
"for x in range(0, 3):\\n" +
- " print \"test loop %d\" % (x)");
+ " print(\"test loop %d\" % (x))");
runParagraph(1);
diff --git a/zeppelin-interpreter-integration/src/test/java/org/apache/zeppelin/integration/JdbcIntegrationTest.java b/zeppelin-interpreter-integration/src/test/java/org/apache/zeppelin/integration/JdbcIntegrationTest.java
index 81ad481..8b7ab29 100644
--- a/zeppelin-interpreter-integration/src/test/java/org/apache/zeppelin/integration/JdbcIntegrationTest.java
+++ b/zeppelin-interpreter-integration/src/test/java/org/apache/zeppelin/integration/JdbcIntegrationTest.java
@@ -65,6 +65,7 @@ public class JdbcIntegrationTest {
interpreterSetting.setProperty("default.driver", "com.mysql.jdbc.Driver");
interpreterSetting.setProperty("default.url", "jdbc:mysql://localhost:3306/");
interpreterSetting.setProperty("default.user", "root");
+ interpreterSetting.setProperty("default.password", "root");
Dependency dependency = new Dependency("mysql:mysql-connector-java:5.1.46");
interpreterSetting.setDependencies(Lists.newArrayList(dependency));
diff --git a/zeppelin-interpreter-integration/src/test/java/org/apache/zeppelin/integration/ZeppelinSparkClusterTest.java b/zeppelin-interpreter-integration/src/test/java/org/apache/zeppelin/integration/ZeppelinSparkClusterTest.java
index e1c2931..765d7d8 100644
--- a/zeppelin-interpreter-integration/src/test/java/org/apache/zeppelin/integration/ZeppelinSparkClusterTest.java
+++ b/zeppelin-interpreter-integration/src/test/java/org/apache/zeppelin/integration/ZeppelinSparkClusterTest.java
@@ -68,7 +68,7 @@ public abstract class ZeppelinSparkClusterTest extends AbstractTestRestApi {
public static final String SPARK_MASTER_PROPERTY_NAME = "spark.master";
//This is for only run setupSparkInterpreter one time for each spark version, otherwise
- //each test method will run setupSparkInterpreter which will cost a long time and may cause travis
+ //each test method will run setupSparkInterpreter which will cost a long time and may cause a
//ci timeout.
//TODO(zjffdu) remove this after we upgrade it to junit 4.13 (ZEPPELIN-3341)
private static Set<String> verifiedSparkVersions = new HashSet<>();
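The comment in the hunk above describes a one-time-setup pattern: a static `Set` remembers which Spark versions have already been initialised so that each test method does not repeat the expensive interpreter setup. A sketch of that pattern with illustrative names (not the actual Zeppelin test code):

```java
import java.util.HashSet;
import java.util.Set;

// Illustrative sketch of the one-time-setup pattern; names are hypothetical.
public class OneTimeSetup {
    private static final Set<String> verifiedVersions = new HashSet<>();

    // Returns true only when the expensive setup actually ran.
    static boolean setUpOnce(String sparkVersion) {
        // Set.add returns true only the first time a value is inserted,
        // so the expensive work runs at most once per version.
        if (verifiedVersions.add(sparkVersion)) {
            // ... expensive interpreter setup would happen here ...
            return true;
        }
        return false;
    }

    public static void main(String[] args) {
        System.out.println(setUpOnce("2.4.5")); // first call: setup runs
        System.out.println(setUpOnce("2.4.5")); // second call: skipped
    }
}
```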
diff --git a/zeppelin-plugins/launcher/k8s-standard/src/test/java/org/apache/zeppelin/interpreter/launcher/K8sStandardInterpreterLauncherTest.java b/zeppelin-plugins/launcher/k8s-standard/src/test/java/org/apache/zeppelin/interpreter/launcher/K8sStandardInterpreterLauncherTest.java
index 25b7800..4afeb0f 100644
--- a/zeppelin-plugins/launcher/k8s-standard/src/test/java/org/apache/zeppelin/interpreter/launcher/K8sStandardInterpreterLauncherTest.java
+++ b/zeppelin-plugins/launcher/k8s-standard/src/test/java/org/apache/zeppelin/interpreter/launcher/K8sStandardInterpreterLauncherTest.java
@@ -29,9 +29,7 @@ import org.junit.Before;
import org.junit.Test;
/**
- * In the future, test may use minikube on travis for end-to-end test
- * https://github.com/LiliC/travis-minikube
- * https://blog.travis-ci.com/2017-10-26-running-kubernetes-on-travis-ci-with-minikube
+ * In the future, tests may use minikube for end-to-end testing
*/
public class K8sStandardInterpreterLauncherTest {
@Before