Posted to reviews@spark.apache.org by GitBox <gi...@apache.org> on 2022/10/13 06:57:01 UTC

[GitHub] [spark-docker] Yikun opened a new pull request, #9: [SPARK-40783][INFRA] Enable Spark on K8s integration test

Yikun opened a new pull request, #9:
URL: https://github.com/apache/spark-docker/pull/9

   ### What changes were proposed in this pull request?
   This patch enables the Spark on K8s integration tests:
   
   - **scala2.12-java11-python3-ubuntu**: Run Scala / PySpark basic tests
   - **scala2.12-java11-ubuntu**: Run Scala basic test
   - **scala2.12-java11-r-ubuntu**: Run Scala / SparkR basic tests
   - **scala2.12-java11-python3-r-ubuntu**: Run all K8s integration tests
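   The variant matrix above could be expressed in a GitHub Actions build matrix roughly like this (an illustrative sketch, not the exact workflow in this PR):

   ```yaml
   # Hypothetical excerpt of .github/workflows/main.yml: one job per image flavor.
   strategy:
     matrix:
       spark_version: ["3.3.0"]
       scala_version: ["2.12"]
       java_version: ["11"]
       image_suffix:
         - python3-ubuntu     # Scala / PySpark basic tests
         - ubuntu             # Scala basic test
         - r-ubuntu           # Scala / SparkR basic tests
         - python3-r-ubuntu   # all K8s integration tests
   ```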
   
   ### Why are the changes needed?
   To ensure the quality of the official Dockerfiles.
   
   ### Does this PR introduce _any_ user-facing change?
   No
   
   
   ### How was this patch tested?
   CI passed
   


-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: reviews-unsubscribe@spark.apache.org

For queries about this service, please contact Infrastructure at:
users@infra.apache.org


---------------------------------------------------------------------
To unsubscribe, e-mail: reviews-unsubscribe@spark.apache.org
For additional commands, e-mail: reviews-help@spark.apache.org


[GitHub] [spark-docker] Yikun commented on pull request #9: [SPARK-40783][INFRA] Enable Spark on K8s integration test

Posted by GitBox <gi...@apache.org>.
Yikun commented on PR #9:
URL: https://github.com/apache/spark-docker/pull/9#issuecomment-1278446516

   @HyukjinKwon Thanks, will merge this!




[GitHub] [spark-docker] Yikun commented on a diff in pull request #9: [SPARK-40783][INFRA] Enable Spark on K8s integration test

Posted by GitBox <gi...@apache.org>.
Yikun commented on code in PR #9:
URL: https://github.com/apache/spark-docker/pull/9#discussion_r994517454


##########
.github/workflows/main.yml:
##########
@@ -55,50 +60,151 @@ jobs:
         uses: actions/checkout@v2
 
       - name: Set up QEMU
-        uses: docker/setup-qemu-action@v1
+        uses: docker/setup-qemu-action@v2
 
       - name: Set up Docker Buildx
-        uses: docker/setup-buildx-action@v1
-
-      - name: Login to GHCR
-        uses: docker/login-action@v2
-        with:
-          registry: ghcr.io
-          username: ${{ github.actor }}
-          password: ${{ secrets.GITHUB_TOKEN }}
+        uses: docker/setup-buildx-action@v2

Review Comment:
   ```suggestion
           uses: docker/setup-buildx-action@v2
           with:
               driver-opts: network=host
   ```
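   Context for the suggestion: the workflow switches TEST_REPO to localhost:5000, and with the default docker-container buildx driver the build runs inside a container whose localhost is not the runner, so a registry bound to the runner's port 5000 is unreachable unless the builder shares the host network. A sketch of the two pieces involved (assumed shape; the registry service itself is not shown in this diff):

   ```yaml
   # Hypothetical: a throwaway job-level registry service that the
   # multi-arch test image is pushed to.
   services:
     registry:
       image: registry:2
       ports:
         - 5000:5000

   # ... and, among the job's steps, a builder on the host network
   # so it can reach localhost:5000.
   - name: Set up Docker Buildx
     uses: docker/setup-buildx-action@v2
     with:
       driver-opts: network=host
   ```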
   





[GitHub] [spark-docker] Yikun commented on pull request #9: [SPARK-40783][INFRA] Enable Spark on K8s integration test

Posted by GitBox <gi...@apache.org>.
Yikun commented on PR #9:
URL: https://github.com/apache/spark-docker/pull/9#issuecomment-1277354305

   https://github.com/Yikun/spark-docker/pull/4
   
   Hmm, it passed in my local repo but failed in the upstream repo, a little bit weird.




[GitHub] [spark-docker] Yikun commented on a diff in pull request #9: [SPARK-40783][INFRA] Enable Spark on K8s integration test

Posted by GitBox <gi...@apache.org>.
Yikun commented on code in PR #9:
URL: https://github.com/apache/spark-docker/pull/9#discussion_r994503937


##########
.github/workflows/main.yml:
##########
@@ -55,50 +60,151 @@ jobs:
         uses: actions/checkout@v2
 
       - name: Set up QEMU
-        uses: docker/setup-qemu-action@v1
+        uses: docker/setup-qemu-action@v2
 
       - name: Set up Docker Buildx
-        uses: docker/setup-buildx-action@v1
-
-      - name: Login to GHCR
-        uses: docker/login-action@v2
-        with:
-          registry: ghcr.io
-          username: ${{ github.actor }}
-          password: ${{ secrets.GITHUB_TOKEN }}
+        uses: docker/setup-buildx-action@v2
 
       - name: Generate tags
         run: |
           TAG=scala${{ matrix.scala_version }}-java${{ matrix.java_version }}-${{ matrix.image_suffix }}
 
           REPO_OWNER=$(echo "${{ github.repository_owner }}" | tr '[:upper:]' '[:lower:]')
-          TEST_REPO=ghcr.io/$REPO_OWNER/spark-docker
+          TEST_REPO=localhost:5000/$REPO_OWNER/spark-docker
           IMAGE_NAME=spark
           IMAGE_PATH=${{ matrix.spark_version }}/$TAG
           UNIQUE_IMAGE_TAG=${{ matrix.spark_version }}-$TAG
+          IMAGE_URL=$TEST_REPO/$IMAGE_NAME:$UNIQUE_IMAGE_TAG
 
-          # Unique image tag in each version: scala2.12-java11-python3-ubuntu
+          # Unique image tag in each version: 3.3.0-scala2.12-java11-python3-ubuntu
           echo "UNIQUE_IMAGE_TAG=${UNIQUE_IMAGE_TAG}" >> $GITHUB_ENV
           # Test repo: ghcr.io/apache/spark-docker
           echo "TEST_REPO=${TEST_REPO}" >> $GITHUB_ENV
           # Image name: spark
           echo "IMAGE_NAME=${IMAGE_NAME}" >> $GITHUB_ENV
           # Image dockerfile path: 3.3.0/scala2.12-java11-python3-ubuntu
           echo "IMAGE_PATH=${IMAGE_PATH}" >> $GITHUB_ENV
+          # Image URL: ghcr.io/apache/spark-docker/spark:3.3.0-scala2.12-java11-python3-ubuntu
+          echo "IMAGE_URL=${IMAGE_URL}" >> $GITHUB_ENV
 
       - name: Print Image tags
         run: |
           echo "UNIQUE_IMAGE_TAG: "${UNIQUE_IMAGE_TAG}
           echo "TEST_REPO: "${TEST_REPO}
           echo "IMAGE_NAME: "${IMAGE_NAME}
           echo "IMAGE_PATH: "${IMAGE_PATH}
+          echo "IMAGE_URL: "${IMAGE_URL}
 
-      - name: Build and push test image
+      - name: Build image
         uses: docker/build-push-action@v2
         with:
           context: ${{ env.IMAGE_PATH }}
-          tags: ${{ env.TEST_REPO }}/${{ env.IMAGE_NAME }}:${{ env.UNIQUE_IMAGE_TAG }}
+          tags: ${{ env.IMAGE_URL }}
           platforms: linux/amd64,linux/arm64
+          load: true

Review Comment:
   ```suggestion
             push: true
   ```
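   The `push: true` suggestion follows from a buildx constraint: `load: true` exports the result into the local Docker daemon, and the daemon's image store cannot hold a multi-platform (linux/amd64,linux/arm64) manifest, so the image is pushed to the local registry instead. The resulting step would look roughly like this (assumed shape, based on the diff above):

   ```yaml
   # Hypothetical: push the multi-arch test image to the local registry;
   # `load: true` would fail for a linux/amd64,linux/arm64 build.
   - name: Build image
     uses: docker/build-push-action@v2
     with:
       context: ${{ env.IMAGE_PATH }}
       tags: ${{ env.IMAGE_URL }}
       platforms: linux/amd64,linux/arm64
       push: true
   ```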
   



##########
.github/workflows/main.yml:
##########
@@ -55,50 +60,151 @@ jobs:
         uses: actions/checkout@v2
 
       - name: Set up QEMU
-        uses: docker/setup-qemu-action@v1
+        uses: docker/setup-qemu-action@v2
 
       - name: Set up Docker Buildx
-        uses: docker/setup-buildx-action@v1
-
-      - name: Login to GHCR
-        uses: docker/login-action@v2
-        with:
-          registry: ghcr.io
-          username: ${{ github.actor }}
-          password: ${{ secrets.GITHUB_TOKEN }}
+        uses: docker/setup-buildx-action@v2

Review Comment:
   ```suggestion
           uses: docker/setup-buildx-action@v2
           with:
               driver-opts: network=host
   ```
   





[GitHub] [spark-docker] martin-g commented on a diff in pull request #9: [SPARK-40783][INFRA] Enable Spark on K8s integration test

Posted by GitBox <gi...@apache.org>.
martin-g commented on code in PR #9:
URL: https://github.com/apache/spark-docker/pull/9#discussion_r994270823


##########
.github/workflows/main.yml:
##########
@@ -76,29 +69,136 @@ jobs:
           IMAGE_NAME=spark
           IMAGE_PATH=${{ matrix.spark_version }}/$TAG
           UNIQUE_IMAGE_TAG=${{ matrix.spark_version }}-$TAG
+          IMAGE_URL=$TEST_REPO/$IMAGE_NAME:$UNIQUE_IMAGE_TAG
 
-          # Unique image tag in each version: scala2.12-java11-python3-ubuntu
+          # Unique image tag in each version: 3.3.0-scala2.12-java11-python3-ubuntu
           echo "UNIQUE_IMAGE_TAG=${UNIQUE_IMAGE_TAG}" >> $GITHUB_ENV
           # Test repo: ghcr.io/apache/spark-docker
           echo "TEST_REPO=${TEST_REPO}" >> $GITHUB_ENV
           # Image name: spark
           echo "IMAGE_NAME=${IMAGE_NAME}" >> $GITHUB_ENV
           # Image dockerfile path: 3.3.0/scala2.12-java11-python3-ubuntu
           echo "IMAGE_PATH=${IMAGE_PATH}" >> $GITHUB_ENV
+          # Image URL: ghcr.io/apache/spark-docker/spark:3.3.0-scala2.12-java11-python3-ubuntu
+          echo "IMAGE_URL=${IMAGE_URL}" >> $GITHUB_ENV
 
       - name: Print Image tags
         run: |
           echo "UNIQUE_IMAGE_TAG: "${UNIQUE_IMAGE_TAG}
           echo "TEST_REPO: "${TEST_REPO}
           echo "IMAGE_NAME: "${IMAGE_NAME}
           echo "IMAGE_PATH: "${IMAGE_PATH}
+          echo "IMAGE_URL: "${IMAGE_URL}
 
-      - name: Build and push test image
+      - name: Build image
         uses: docker/build-push-action@v2
         with:
           context: ${{ env.IMAGE_PATH }}
-          tags: ${{ env.TEST_REPO }}/${{ env.IMAGE_NAME }}:${{ env.UNIQUE_IMAGE_TAG }}
+          tags: ${{ env.IMAGE_URL }}
           platforms: linux/amd64,linux/arm64
 
-      - name: Image digest
-        run: echo ${{ steps.docker_build.outputs.digest }}
+      - name: Test - Checkout Spark repository
+        uses: actions/checkout@v2
+        with:
+          fetch-depth: 0
+          repository: apache/spark
+          ref: v${{ matrix.spark_version }}
+          path: ${{ github.workspace }}/spark
+
+      - name: Test - Cherry pick commits
+        # Apache Spark enable resource limited k8s IT since v3.3.1, cherrpick patches for old release

Review Comment:
   ```suggestion
           # Apache Spark enable resource limited k8s IT since v3.3.1, cherry-pick patches for old release
   ```



##########
.github/workflows/main.yml:
##########
@@ -76,29 +69,136 @@ jobs:
           IMAGE_NAME=spark
           IMAGE_PATH=${{ matrix.spark_version }}/$TAG
           UNIQUE_IMAGE_TAG=${{ matrix.spark_version }}-$TAG
+          IMAGE_URL=$TEST_REPO/$IMAGE_NAME:$UNIQUE_IMAGE_TAG
 
-          # Unique image tag in each version: scala2.12-java11-python3-ubuntu
+          # Unique image tag in each version: 3.3.0-scala2.12-java11-python3-ubuntu
           echo "UNIQUE_IMAGE_TAG=${UNIQUE_IMAGE_TAG}" >> $GITHUB_ENV
           # Test repo: ghcr.io/apache/spark-docker
           echo "TEST_REPO=${TEST_REPO}" >> $GITHUB_ENV
           # Image name: spark
           echo "IMAGE_NAME=${IMAGE_NAME}" >> $GITHUB_ENV
           # Image dockerfile path: 3.3.0/scala2.12-java11-python3-ubuntu
           echo "IMAGE_PATH=${IMAGE_PATH}" >> $GITHUB_ENV
+          # Image URL: ghcr.io/apache/spark-docker/spark:3.3.0-scala2.12-java11-python3-ubuntu
+          echo "IMAGE_URL=${IMAGE_URL}" >> $GITHUB_ENV
 
       - name: Print Image tags
         run: |
           echo "UNIQUE_IMAGE_TAG: "${UNIQUE_IMAGE_TAG}
           echo "TEST_REPO: "${TEST_REPO}
           echo "IMAGE_NAME: "${IMAGE_NAME}
           echo "IMAGE_PATH: "${IMAGE_PATH}
+          echo "IMAGE_URL: "${IMAGE_URL}
 
-      - name: Build and push test image
+      - name: Build image
         uses: docker/build-push-action@v2
         with:
           context: ${{ env.IMAGE_PATH }}
-          tags: ${{ env.TEST_REPO }}/${{ env.IMAGE_NAME }}:${{ env.UNIQUE_IMAGE_TAG }}
+          tags: ${{ env.IMAGE_URL }}
           platforms: linux/amd64,linux/arm64
 
-      - name: Image digest
-        run: echo ${{ steps.docker_build.outputs.digest }}
+      - name: Test - Checkout Spark repository
+        uses: actions/checkout@v2

Review Comment:
   there is a newer version - v3



##########
.github/workflows/main.yml:
##########
@@ -76,29 +69,136 @@ jobs:
           IMAGE_NAME=spark
           IMAGE_PATH=${{ matrix.spark_version }}/$TAG
           UNIQUE_IMAGE_TAG=${{ matrix.spark_version }}-$TAG
+          IMAGE_URL=$TEST_REPO/$IMAGE_NAME:$UNIQUE_IMAGE_TAG
 
-          # Unique image tag in each version: scala2.12-java11-python3-ubuntu
+          # Unique image tag in each version: 3.3.0-scala2.12-java11-python3-ubuntu
           echo "UNIQUE_IMAGE_TAG=${UNIQUE_IMAGE_TAG}" >> $GITHUB_ENV
           # Test repo: ghcr.io/apache/spark-docker
           echo "TEST_REPO=${TEST_REPO}" >> $GITHUB_ENV
           # Image name: spark
           echo "IMAGE_NAME=${IMAGE_NAME}" >> $GITHUB_ENV
           # Image dockerfile path: 3.3.0/scala2.12-java11-python3-ubuntu
           echo "IMAGE_PATH=${IMAGE_PATH}" >> $GITHUB_ENV
+          # Image URL: ghcr.io/apache/spark-docker/spark:3.3.0-scala2.12-java11-python3-ubuntu
+          echo "IMAGE_URL=${IMAGE_URL}" >> $GITHUB_ENV
 
       - name: Print Image tags
         run: |
           echo "UNIQUE_IMAGE_TAG: "${UNIQUE_IMAGE_TAG}
           echo "TEST_REPO: "${TEST_REPO}
           echo "IMAGE_NAME: "${IMAGE_NAME}
           echo "IMAGE_PATH: "${IMAGE_PATH}
+          echo "IMAGE_URL: "${IMAGE_URL}
 
-      - name: Build and push test image
+      - name: Build image
         uses: docker/build-push-action@v2
         with:
           context: ${{ env.IMAGE_PATH }}
-          tags: ${{ env.TEST_REPO }}/${{ env.IMAGE_NAME }}:${{ env.UNIQUE_IMAGE_TAG }}
+          tags: ${{ env.IMAGE_URL }}
           platforms: linux/amd64,linux/arm64
 
-      - name: Image digest
-        run: echo ${{ steps.docker_build.outputs.digest }}
+      - name: Test - Checkout Spark repository
+        uses: actions/checkout@v2
+        with:
+          fetch-depth: 0
+          repository: apache/spark
+          ref: v${{ matrix.spark_version }}
+          path: ${{ github.workspace }}/spark
+
+      - name: Test - Cherry pick commits
+        # Apache Spark enable resource limited k8s IT since v3.3.1, cherrpick patches for old release
+        # https://github.com/apache/spark/pull/36087#issuecomment-1251756266
+        if: matrix.spark_version == '3.3.0'
+        working-directory: ${{ github.workspace }}/spark
+        run: |
+          # SPARK-38802: Add driverRequestCores/executorRequestCores supported
+          # https://github.com/apache/spark/commit/83963828b54bffe99527a004057272bc584cbc26
+          git -c user.name='Apache Spark Test Account' -c user.email='sparktestacc@gmail.com' cherry-pick 83963828b54bffe99527a004057272bc584cbc26
+          # SPARK-38803: Lower minio cpu to 250m
+          # https://github.com/apache/spark/commit/5ea2b386eb866e20540660cdb6ed43792cb29969
+          git -c user.name='Apache Spark Test Account' -c user.email='sparktestacc@gmail.com' cherry-pick 5ea2b386eb866e20540660cdb6ed43792cb29969
+
+      - name: Test - Install Java ${{ inputs.java }}
+        uses: actions/setup-java@v1
+        with:
+          java-version: ${{ matrix.java_version }}
+
+      - name: Test - Cache Scala, SBT and Maven
+        uses: actions/cache@v2

Review Comment:
   v3



##########
.github/workflows/main.yml:
##########
@@ -76,29 +69,136 @@ jobs:
           IMAGE_NAME=spark
           IMAGE_PATH=${{ matrix.spark_version }}/$TAG
           UNIQUE_IMAGE_TAG=${{ matrix.spark_version }}-$TAG
+          IMAGE_URL=$TEST_REPO/$IMAGE_NAME:$UNIQUE_IMAGE_TAG
 
-          # Unique image tag in each version: scala2.12-java11-python3-ubuntu
+          # Unique image tag in each version: 3.3.0-scala2.12-java11-python3-ubuntu
           echo "UNIQUE_IMAGE_TAG=${UNIQUE_IMAGE_TAG}" >> $GITHUB_ENV
           # Test repo: ghcr.io/apache/spark-docker
           echo "TEST_REPO=${TEST_REPO}" >> $GITHUB_ENV
           # Image name: spark
           echo "IMAGE_NAME=${IMAGE_NAME}" >> $GITHUB_ENV
           # Image dockerfile path: 3.3.0/scala2.12-java11-python3-ubuntu
           echo "IMAGE_PATH=${IMAGE_PATH}" >> $GITHUB_ENV
+          # Image URL: ghcr.io/apache/spark-docker/spark:3.3.0-scala2.12-java11-python3-ubuntu
+          echo "IMAGE_URL=${IMAGE_URL}" >> $GITHUB_ENV
 
       - name: Print Image tags
         run: |
           echo "UNIQUE_IMAGE_TAG: "${UNIQUE_IMAGE_TAG}
           echo "TEST_REPO: "${TEST_REPO}
           echo "IMAGE_NAME: "${IMAGE_NAME}
           echo "IMAGE_PATH: "${IMAGE_PATH}
+          echo "IMAGE_URL: "${IMAGE_URL}
 
-      - name: Build and push test image
+      - name: Build image
         uses: docker/build-push-action@v2
         with:
           context: ${{ env.IMAGE_PATH }}
-          tags: ${{ env.TEST_REPO }}/${{ env.IMAGE_NAME }}:${{ env.UNIQUE_IMAGE_TAG }}
+          tags: ${{ env.IMAGE_URL }}
           platforms: linux/amd64,linux/arm64
 
-      - name: Image digest
-        run: echo ${{ steps.docker_build.outputs.digest }}
+      - name: Test - Checkout Spark repository
+        uses: actions/checkout@v2
+        with:
+          fetch-depth: 0
+          repository: apache/spark
+          ref: v${{ matrix.spark_version }}
+          path: ${{ github.workspace }}/spark
+
+      - name: Test - Cherry pick commits
+        # Apache Spark enable resource limited k8s IT since v3.3.1, cherrpick patches for old release
+        # https://github.com/apache/spark/pull/36087#issuecomment-1251756266
+        if: matrix.spark_version == '3.3.0'
+        working-directory: ${{ github.workspace }}/spark
+        run: |
+          # SPARK-38802: Add driverRequestCores/executorRequestCores supported
+          # https://github.com/apache/spark/commit/83963828b54bffe99527a004057272bc584cbc26
+          git -c user.name='Apache Spark Test Account' -c user.email='sparktestacc@gmail.com' cherry-pick 83963828b54bffe99527a004057272bc584cbc26
+          # SPARK-38803: Lower minio cpu to 250m
+          # https://github.com/apache/spark/commit/5ea2b386eb866e20540660cdb6ed43792cb29969
+          git -c user.name='Apache Spark Test Account' -c user.email='sparktestacc@gmail.com' cherry-pick 5ea2b386eb866e20540660cdb6ed43792cb29969
+
+      - name: Test - Install Java ${{ inputs.java }}
+        uses: actions/setup-java@v1

Review Comment:
   v3 ?



##########
.github/workflows/main.yml:
##########
@@ -76,29 +69,136 @@ jobs:
           IMAGE_NAME=spark
           IMAGE_PATH=${{ matrix.spark_version }}/$TAG
           UNIQUE_IMAGE_TAG=${{ matrix.spark_version }}-$TAG
+          IMAGE_URL=$TEST_REPO/$IMAGE_NAME:$UNIQUE_IMAGE_TAG
 
-          # Unique image tag in each version: scala2.12-java11-python3-ubuntu
+          # Unique image tag in each version: 3.3.0-scala2.12-java11-python3-ubuntu
           echo "UNIQUE_IMAGE_TAG=${UNIQUE_IMAGE_TAG}" >> $GITHUB_ENV
           # Test repo: ghcr.io/apache/spark-docker
           echo "TEST_REPO=${TEST_REPO}" >> $GITHUB_ENV
           # Image name: spark
           echo "IMAGE_NAME=${IMAGE_NAME}" >> $GITHUB_ENV
           # Image dockerfile path: 3.3.0/scala2.12-java11-python3-ubuntu
           echo "IMAGE_PATH=${IMAGE_PATH}" >> $GITHUB_ENV
+          # Image URL: ghcr.io/apache/spark-docker/spark:3.3.0-scala2.12-java11-python3-ubuntu
+          echo "IMAGE_URL=${IMAGE_URL}" >> $GITHUB_ENV
 
       - name: Print Image tags
         run: |
           echo "UNIQUE_IMAGE_TAG: "${UNIQUE_IMAGE_TAG}
           echo "TEST_REPO: "${TEST_REPO}
           echo "IMAGE_NAME: "${IMAGE_NAME}
           echo "IMAGE_PATH: "${IMAGE_PATH}
+          echo "IMAGE_URL: "${IMAGE_URL}
 
-      - name: Build and push test image
+      - name: Build image
         uses: docker/build-push-action@v2
         with:
           context: ${{ env.IMAGE_PATH }}
-          tags: ${{ env.TEST_REPO }}/${{ env.IMAGE_NAME }}:${{ env.UNIQUE_IMAGE_TAG }}
+          tags: ${{ env.IMAGE_URL }}
           platforms: linux/amd64,linux/arm64
 
-      - name: Image digest
-        run: echo ${{ steps.docker_build.outputs.digest }}
+      - name: Test - Checkout Spark repository
+        uses: actions/checkout@v2
+        with:
+          fetch-depth: 0
+          repository: apache/spark
+          ref: v${{ matrix.spark_version }}
+          path: ${{ github.workspace }}/spark
+
+      - name: Test - Cherry pick commits
+        # Apache Spark enable resource limited k8s IT since v3.3.1, cherrpick patches for old release
+        # https://github.com/apache/spark/pull/36087#issuecomment-1251756266
+        if: matrix.spark_version == '3.3.0'

Review Comment:
   ```suggestion
           if: startsWith(matrix.spark_version, '3.3.')
   ```
   to make it future-proof?
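   For reference, `startsWith` is a built-in GitHub Actions expression function, so a version-prefix guard like the suggested one is valid as written; an illustrative use:

   ```yaml
   # Hypothetical: run the patch step for any 3.3.x release.
   - name: Test - Cherry pick commits
     if: startsWith(matrix.spark_version, '3.3.')
     run: echo "patching ${{ matrix.spark_version }}"
   ```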







[GitHub] [spark-docker] Yikun commented on a diff in pull request #9: [SPARK-40783][INFRA] Enable Spark on K8s integration test

Posted by GitBox <gi...@apache.org>.
Yikun commented on code in PR #9:
URL: https://github.com/apache/spark-docker/pull/9#discussion_r994426545


##########
.github/workflows/main.yml:
##########
@@ -76,29 +69,136 @@ jobs:
           IMAGE_NAME=spark
           IMAGE_PATH=${{ matrix.spark_version }}/$TAG
           UNIQUE_IMAGE_TAG=${{ matrix.spark_version }}-$TAG
+          IMAGE_URL=$TEST_REPO/$IMAGE_NAME:$UNIQUE_IMAGE_TAG
 
-          # Unique image tag in each version: scala2.12-java11-python3-ubuntu
+          # Unique image tag in each version: 3.3.0-scala2.12-java11-python3-ubuntu
           echo "UNIQUE_IMAGE_TAG=${UNIQUE_IMAGE_TAG}" >> $GITHUB_ENV
           # Test repo: ghcr.io/apache/spark-docker
           echo "TEST_REPO=${TEST_REPO}" >> $GITHUB_ENV
           # Image name: spark
           echo "IMAGE_NAME=${IMAGE_NAME}" >> $GITHUB_ENV
           # Image dockerfile path: 3.3.0/scala2.12-java11-python3-ubuntu
           echo "IMAGE_PATH=${IMAGE_PATH}" >> $GITHUB_ENV
+          # Image URL: ghcr.io/apache/spark-docker/spark:3.3.0-scala2.12-java11-python3-ubuntu
+          echo "IMAGE_URL=${IMAGE_URL}" >> $GITHUB_ENV
 
       - name: Print Image tags
         run: |
           echo "UNIQUE_IMAGE_TAG: "${UNIQUE_IMAGE_TAG}
           echo "TEST_REPO: "${TEST_REPO}
           echo "IMAGE_NAME: "${IMAGE_NAME}
           echo "IMAGE_PATH: "${IMAGE_PATH}
+          echo "IMAGE_URL: "${IMAGE_URL}
 
-      - name: Build and push test image
+      - name: Build image
         uses: docker/build-push-action@v2
         with:
           context: ${{ env.IMAGE_PATH }}
-          tags: ${{ env.TEST_REPO }}/${{ env.IMAGE_NAME }}:${{ env.UNIQUE_IMAGE_TAG }}
+          tags: ${{ env.IMAGE_URL }}
           platforms: linux/amd64,linux/arm64
 
-      - name: Image digest
-        run: echo ${{ steps.docker_build.outputs.digest }}
+      - name: Test - Checkout Spark repository
+        uses: actions/checkout@v2
+        with:
+          fetch-depth: 0
+          repository: apache/spark
+          ref: v${{ matrix.spark_version }}
+          path: ${{ github.workspace }}/spark
+
+      - name: Test - Cherry pick commits
+        # Apache Spark enable resource limited k8s IT since v3.3.1, cherrpick patches for old release
+        # https://github.com/apache/spark/pull/36087#issuecomment-1251756266
+        if: matrix.spark_version == '3.3.0'

Review Comment:
   Nope, it is already in 3.3.1+ and 3.4.0+, so only 3.3.0 is enough!





[GitHub] [spark-docker] Yikun closed pull request #9: [SPARK-40783][INFRA] Enable Spark on K8s integration test

Posted by GitBox <gi...@apache.org>.
Yikun closed pull request #9: [SPARK-40783][INFRA] Enable Spark on K8s integration test
URL: https://github.com/apache/spark-docker/pull/9

