Posted to issues@spark.apache.org by "Edwin Biemond (JIRA)" <ji...@apache.org> on 2019/06/03 10:54:00 UTC

[jira] [Created] (SPARK-27927) driver pod hangs with pyspark 2.4.3 and master on kubernetes

Edwin Biemond created SPARK-27927:
-------------------------------------

             Summary: driver pod hangs with pyspark 2.4.3 and master on kubernetes
                 Key: SPARK-27927
                 URL: https://issues.apache.org/jira/browse/SPARK-27927
             Project: Spark
          Issue Type: Bug
          Components: Kubernetes
    Affects Versions: 2.4.3, 3.0.0
         Environment: k8s 1.11.9

Spark 2.4.3 and the master branch.
            Reporter: Edwin Biemond


When we run a simple PySpark job on Spark 2.4.3 or 3.0.0, the driver pod hangs and never calls the shutdown hook.
{code:python}
#!/usr/bin/env python

from __future__ import print_function

import os
import os.path
import sys

# Are we really in Spark?
from pyspark.sql import SparkSession
spark = SparkSession.builder.appName('hello_world').getOrCreate()
print('Our Spark version is {}'.format(spark.version))
print('Spark context information: {} parallelism={} python version={}'.format(
    str(spark.sparkContext),
    spark.sparkContext.defaultParallelism,
    spark.sparkContext.pythonVer
))
{code}
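For reference, a submission along these lines reproduces the setup; the container image, script path and executor count below are placeholders (our assumptions, not details from this report):
{noformat}
spark-submit \
  --master k8s://https://kubernetes.default.svc:443 \
  --deploy-mode cluster \
  --name hello_world \
  --conf spark.executor.instances=2 \
  --conf spark.kubernetes.container.image=<spark-py-image> \
  local:///opt/spark/work-dir/hello_world.py
{noformat}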
When we run this on Kubernetes, the driver and executor pods just hang. We do see the expected output of the Python script:
{noformat}
bash-4.2# cat stdout.log
Our Spark version is 2.4.3
Spark context information: <SparkContext master=k8s://https://kubernetes.default.svc:443 appName=hello_world> parallelism=2 python version=3.6
{noformat}
What works:
 * a simple Python script with only a print statement works fine on 2.4.3 and 3.0.0
 * the same setup works on 2.4.0
 * running the above PySpark script with spark-submit on 2.4.3 works
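A minimal sketch of a possible mitigation, assuming the hang is the driver waiting on the JVM shutdown hook (an assumption on our side, not verified in this report): stop the session explicitly before the interpreter exits.
{code:python}
#!/usr/bin/env python
from __future__ import print_function

from pyspark.sql import SparkSession

spark = SparkSession.builder.appName('hello_world').getOrCreate()
try:
    print('Our Spark version is {}'.format(spark.version))
finally:
    # Hypothetical workaround: tear down the session explicitly, since the
    # shutdown hook never fires in the hanging case described above.
    spark.stop()
{code}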