Posted to issues@spark.apache.org by "Thomas Graves (JIRA)" <ji...@apache.org> on 2018/11/30 18:51:00 UTC

[jira] [Resolved] (SPARK-26201) python broadcast.value on driver fails with disk encryption enabled

     [ https://issues.apache.org/jira/browse/SPARK-26201?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]

Thomas Graves resolved SPARK-26201.
-----------------------------------
       Resolution: Fixed
         Assignee: Sanket Chintapalli
    Fix Version/s: 3.0.0
                   2.4.1
                   2.3.3

> python broadcast.value on driver fails with disk encryption enabled
> -------------------------------------------------------------------
>
>                 Key: SPARK-26201
>                 URL: https://issues.apache.org/jira/browse/SPARK-26201
>             Project: Spark
>          Issue Type: Bug
>          Components: PySpark
>    Affects Versions: 2.3.2
>            Reporter: Thomas Graves
>            Assignee: Sanket Chintapalli
>            Priority: Major
>             Fix For: 2.3.3, 2.4.1, 3.0.0
>
>
> I was testing Python with RPC and disk encryption enabled. When I created a Python broadcast variable and simply read its value back on the driver side, the job failed with:
>  
> Traceback (most recent call last):
>   File "broadcast.py", line 37, in <module>
>     words_new.value
>   File "/pyspark.zip/pyspark/broadcast.py", line 137, in value
>   File "pyspark.zip/pyspark/broadcast.py", line 122, in load_from_path
>   File "pyspark.zip/pyspark/broadcast.py", line 128, in load
> EOFError: Ran out of input
> To reproduce, use the configs: --conf spark.network.crypto.enabled=true --conf spark.io.encryption.enabled=true
>  
> Code:
> words_new = sc.broadcast(["scala", "java", "hadoop", "spark", "akka"])
> words_new.value
> print(words_new.value)
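
A minimal, self-contained version of the reproduction might look like the sketch below. The script name, app name, and SparkConf construction are illustrative additions, not part of the original report; only the two encryption configs and the broadcast/read steps come from the issue.

    # repro_spark_26201.py -- hypothetical filename
    # Run with: spark-submit repro_spark_26201.py
    from pyspark import SparkConf, SparkContext

    # Enable RPC and I/O (disk) encryption, matching the repro configs above.
    conf = (SparkConf()
            .setAppName("SPARK-26201-repro")
            .set("spark.network.crypto.enabled", "true")
            .set("spark.io.encryption.enabled", "true"))
    sc = SparkContext(conf=conf)

    words_new = sc.broadcast(["scala", "java", "hadoop", "spark", "akka"])
    # Before the fix, reading the broadcast value back on the driver
    # raised "EOFError: Ran out of input" with encryption enabled.
    print(words_new.value)

    sc.stop()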



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)

---------------------------------------------------------------------
To unsubscribe, e-mail: issues-unsubscribe@spark.apache.org
For additional commands, e-mail: issues-help@spark.apache.org