Posted to reviews@spark.apache.org by "Yikun (via GitHub)" <gi...@apache.org> on 2023/10/13 14:42:18 UTC

Re: [PR] [WIP] Add support for java 17 and explicit Python versions from spark 3.5.0 onwards [spark-docker]

Yikun commented on code in PR #56:
URL: https://github.com/apache/spark-docker/pull/56#discussion_r1358359041


##########
add-dockerfiles.sh:
##########
@@ -44,12 +48,20 @@ for TAG in $TAGS; do
     if echo $TAG | grep -q "r-"; then
         OPTS+=" --sparkr"
     fi
+    
+    if echo $TAG | grep -q "java17"; then
+        OPTS+=" --java-version 17 --image eclipse-temurin:17-jre-jammy"
+    fi
+    if echo $TAG | grep -q "java11"; then

Review Comment:
   elif?
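   
   Since a tag matches at most one Java version, an elif chain would make the exclusivity explicit and skip the second grep once a match is found. A minimal sketch (the java11 branch is truncated in the diff above, so its body is assumed by analogy with the java17 one):
   
       # java17 and java11 tags are mutually exclusive, so chain the checks
       # rather than testing the tag twice (illustrative sketch only)
       if echo $TAG | grep -q "java17"; then
           OPTS+=" --java-version 17 --image eclipse-temurin:17-jre-jammy"
       elif echo $TAG | grep -q "java11"; then
           # body assumed by analogy; the actual options are not shown in the diff
           OPTS+=" --java-version 11 --image eclipse-temurin:11-jre-jammy"
       fi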



##########
add-dockerfiles.sh:
##########
@@ -44,12 +48,20 @@ for TAG in $TAGS; do
     if echo $TAG | grep -q "r-"; then
         OPTS+=" --sparkr"
     fi
+    
+    if echo $TAG | grep -q "java17"; then
+        OPTS+=" --java-version 17 --image eclipse-temurin:17-jre-jammy"

Review Comment:
   Great!



##########
3.5.0/scala2.12-java11-python3-r-ubuntu/Dockerfile:
##########
@@ -20,7 +20,10 @@ USER root
 
 RUN set -ex; \
     apt-get update; \
-    apt-get install -y python3 python3-pip; \
+    apt install -y software-properties-common; \
+    add-apt-repository ppa:deadsnakes/ppa; \
+    apt install python3.10; \

Review Comment:
   Is there any special reason why we use Python 3.10? I would prefer to use the OS default python3 version, from a maintenance cost point of view.
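   
   For comparison, a sketch of the OS-default route (essentially the line this diff removes). On a jammy-based image the distro python3 is already 3.10, so the deadsnakes PPA would be redundant there; note the apt-list cleanup at the end is conventional Dockerfile hygiene, not part of the original diff:
   
       # Install the distro-maintained Python; security updates then come
       # from the Ubuntu archive rather than a third-party PPA
       RUN set -ex; \
           apt-get update; \
           apt-get install -y python3 python3-pip; \
           rm -rf /var/lib/apt/lists/*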



-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

For queries about this service, please contact Infrastructure at:
users@infra.apache.org


---------------------------------------------------------------------
To unsubscribe, e-mail: reviews-unsubscribe@spark.apache.org
For additional commands, e-mail: reviews-help@spark.apache.org