Posted to commits@hudi.apache.org by "ASF GitHub Bot (Jira)" <ji...@apache.org> on 2023/02/11 15:35:00 UTC

[jira] [Updated] (HUDI-5578) Upgrade base docker image for java 8

     [ https://issues.apache.org/jira/browse/HUDI-5578?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]

ASF GitHub Bot updated HUDI-5578:
---------------------------------
    Labels: pull-request-available  (was: )

> Upgrade base docker image for java 8
> ------------------------------------
>
>                 Key: HUDI-5578
>                 URL: https://issues.apache.org/jira/browse/HUDI-5578
>             Project: Apache Hudi
>          Issue Type: Improvement
>          Components: dev-experience
>            Reporter: kazdy
>            Assignee: kazdy
>            Priority: Major
>              Labels: pull-request-available
>             Fix For: 1.0.0
>
>
> The existing Hudi docker image is based on a fairly old Debian distribution (stretch), so it ships with Python 3.5.3. When used with Spark 3.2.1, running the "pyspark" command in the container shell produces the following error:
> {code}
> root@e9fb3f81bdc9:/opt# pyspark
> Python 3.5.3 (default, Nov  4 2021, 15:29:10) 
> [GCC 6.3.0 20170516] on linux
> Type "help", "copyright", "credits" or "license" for more information.
> Traceback (most recent call last):
>   File "/opt/spark/python/pyspark/shell.py", line 29, in <module>
>     from pyspark.context import SparkContext
>   File "/opt/spark/python/pyspark/__init__.py", line 53, in <module>
>     from pyspark.rdd import RDD, RDDBarrier
>   File "/opt/spark/python/pyspark/rdd.py", line 48, in <module>
>     from pyspark.traceback_utils import SCCallSiteSync
>   File "/opt/spark/python/pyspark/traceback_utils.py", line 23, in <module>
>     CallSite = namedtuple("CallSite", "function file linenum")
>   File "/opt/spark/python/pyspark/serializers.py", line 390, in namedtuple
>     for k, v in _old_namedtuple_kwdefaults.items():
> AttributeError: 'NoneType' object has no attribute 'items'
> {code}
> The image I used was: [https://hub.docker.com/r/apachehudi/hudi-hadoop_3.1.0-hive_3.1.2-sparkadhoc_3.2.1/tags]
> Spark 3.2.1 requires Python 3.6+, and Spark 3.3 requires Python 3.7+.
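> A quick way to confirm which interpreter a published image ships (an illustrative sketch; it assumes python3 is on the image's PATH, that the image's entrypoint permits a command override, and that a default tag resolves):
> {code:bash}
> # Print the Python version baked into the published adhoc image
> docker run --rm apachehudi/hudi-hadoop_3.1.0-hive_3.1.2-sparkadhoc_3.2.1 python3 --version
> {code}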
> The base image for Java 8 currently uses openjdk:8u212-jdk-slim-stretch.
> The goal is to upgrade it to [8u342-jdk-slim-bullseye|https://hub.docker.com/layers/library/openjdk/8u342-jdk-slim-bullseye/images/sha256-ecb89bb055c1ee4db9da38713b953f6daafefe575c77c6439eabbb85e3168402?context=explore].
> This also aligns the distro with the existing Java 11 base image.
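> A minimal sketch of the intended change as a Dockerfile excerpt (the actual Dockerfile location and surrounding build args in the Hudi repo are not shown here and may differ):
> {code}
> # Before: Debian 9 (stretch) base, which ships Python 3.5.3
> # FROM openjdk:8u212-jdk-slim-stretch
>
> # After: Debian 11 (bullseye) base, which ships Python 3.9,
> # satisfying both Spark 3.2 (3.6+) and Spark 3.3 (3.7+)
> FROM openjdk:8u342-jdk-slim-bullseye
> {code}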



--
This message was sent by Atlassian Jira
(v8.20.10#820010)