Posted to commits@spark.apache.org by do...@apache.org on 2023/06/08 03:08:23 UTC
[spark] branch master updated: Revert "[SPARK-43840][INFRA] Switch `scala-213` GitHub Action Job to `scala-212`"
This is an automated email from the ASF dual-hosted git repository.
dongjoon pushed a commit to branch master
in repository https://gitbox.apache.org/repos/asf/spark.git
The following commit(s) were added to refs/heads/master by this push:
new caf905d7b37 Revert "[SPARK-43840][INFRA] Switch `scala-213` GitHub Action Job to `scala-212`"
caf905d7b37 is described below
commit caf905d7b37198a40a299c17033063fe3dd3eb6a
Author: yangjie01 <ya...@baidu.com>
AuthorDate: Wed Jun 7 20:08:09 2023 -0700
Revert "[SPARK-43840][INFRA] Switch `scala-213` GitHub Action Job to `scala-212`"
### What changes were proposed in this pull request?
This PR reverts the change of SPARK-43840. Spark 3.5.0 still uses Scala 2.12 as the default, so we need a build check for Scala 2.13 on pull requests.
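For context, the `scala-213` job restored by this revert is gated by the `precondition` job visible in the diff: that job emits a JSON map of job-name to `"true"`/`"false"`, and each downstream job's `if:` expression reads its flag via `fromJson(...)`. A minimal local sketch of that gating (hypothetical flag values; `grep` stands in for GitHub's `fromJson()`):

```shell
# Sketch of the precondition gating in build_and_test.yml: the precondition
# job emits a JSON map of job-name -> "true"/"false"; each downstream job
# runs only if its flag is "true". Values here are hypothetical examples.
precondition='{"scala-213": "true", "tpcds-1g": "false"}'

# In CI, fromJson(needs.precondition.outputs.required).scala-213 == 'true'
# performs this check; grep approximates it for illustration.
if echo "$precondition" | grep -q '"scala-213": "true"'; then
  echo "scala-213 job runs"
fi
```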
### Why are the changes needed?
Restore the pipeline check for Scala 2.13.
### Does this PR introduce _any_ user-facing change?
No
### How was this patch tested?
Pass GitHub Actions
Closes #41506 from LuciferYang/r-43840.
Authored-by: yangjie01 <ya...@baidu.com>
Signed-off-by: Dongjoon Hyun <do...@apache.org>
---
.github/workflows/build_and_test.yml | 18 +++++++++---------
1 file changed, 9 insertions(+), 9 deletions(-)
diff --git a/.github/workflows/build_and_test.yml b/.github/workflows/build_and_test.yml
index 8aa0f42916e..a373b0e76e7 100644
--- a/.github/workflows/build_and_test.yml
+++ b/.github/workflows/build_and_test.yml
@@ -86,7 +86,7 @@ jobs:
sparkr=`./dev/is-changed.py -m sparkr`
tpcds=`./dev/is-changed.py -m sql`
docker=`./dev/is-changed.py -m docker-integration-tests`
- # 'build', 'scala-212', and 'java-11-17' are always true for now.
+ # 'build', 'scala-213', and 'java-11-17' are always true for now.
# It does not save significant time and most of PRs trigger the build.
precondition="
{
@@ -95,7 +95,7 @@ jobs:
\"sparkr\": \"$sparkr\",
\"tpcds-1g\": \"$tpcds\",
\"docker-integration-tests\": \"$docker\",
- \"scala-212\": \"true\",
+ \"scala-213\": \"true\",
\"java-11-17\": \"true\",
\"lint\" : \"true\",
\"k8s-integration-tests\" : \"true\",
@@ -728,10 +728,10 @@ jobs:
./build/mvn $MAVEN_CLI_OPTS -DskipTests -Pyarn -Pmesos -Pkubernetes -Pvolcano -Phive -Phive-thriftserver -Phadoop-cloud -Djava.version=${JAVA_VERSION/-ea} install
rm -rf ~/.m2/repository/org/apache/spark
- scala-212:
+ scala-213:
needs: precondition
- if: fromJson(needs.precondition.outputs.required).scala-212 == 'true'
- name: Scala 2.12 build with SBT
+ if: fromJson(needs.precondition.outputs.required).scala-213 == 'true'
+ name: Scala 2.13 build with SBT
runs-on: ubuntu-22.04
steps:
- name: Checkout Spark repository
@@ -761,9 +761,9 @@ jobs:
uses: actions/cache@v3
with:
path: ~/.cache/coursier
- key: scala-212-coursier-${{ hashFiles('**/pom.xml', '**/plugins.sbt') }}
+ key: scala-213-coursier-${{ hashFiles('**/pom.xml', '**/plugins.sbt') }}
restore-keys: |
- scala-212-coursier-
+ scala-213-coursier-
- name: Install Java 8
uses: actions/setup-java@v3
with:
@@ -771,8 +771,8 @@ jobs:
java-version: 8
- name: Build with SBT
run: |
- ./dev/change-scala-version.sh 2.12
- ./build/sbt -Pyarn -Pmesos -Pkubernetes -Pvolcano -Phive -Phive-thriftserver -Phadoop-cloud -Pkinesis-asl -Pdocker-integration-tests -Pkubernetes-integration-tests -Pspark-ganglia-lgpl -Pscala-2.12 compile Test/compile
+ ./dev/change-scala-version.sh 2.13
+ ./build/sbt -Pyarn -Pmesos -Pkubernetes -Pvolcano -Phive -Phive-thriftserver -Phadoop-cloud -Pkinesis-asl -Pdocker-integration-tests -Pkubernetes-integration-tests -Pspark-ganglia-lgpl -Pscala-2.13 compile Test/compile
# Any TPC-DS related updates on this job need to be applied to tpcds-1g-gen job of benchmark.yml as well
tpcds-1g:
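For reference, the Scala 2.13 compile check restored above can be reproduced locally with the same commands the workflow runs (a sketch taken from the diff; it assumes a Spark source checkout with Java 8 available):

```shell
# Sketch: run the restored scala-213 job's build step locally.
# Commands are copied from the workflow diff above; this requires
# a checkout of the Spark repository as the working directory.
./dev/change-scala-version.sh 2.13
./build/sbt -Pyarn -Pmesos -Pkubernetes -Pvolcano -Phive -Phive-thriftserver \
  -Phadoop-cloud -Pkinesis-asl -Pdocker-integration-tests \
  -Pkubernetes-integration-tests -Pspark-ganglia-lgpl -Pscala-2.13 \
  compile Test/compile
```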
---------------------------------------------------------------------
To unsubscribe, e-mail: commits-unsubscribe@spark.apache.org
For additional commands, e-mail: commits-help@spark.apache.org