Posted to issues@spark.apache.org by "alex (Jira)" <ji...@apache.org> on 2019/09/13 20:22:00 UTC

[jira] [Updated] (SPARK-29077) submitting a SparkSession job fails on a spark://localhost:7077 url on Mac

     [ https://issues.apache.org/jira/browse/SPARK-29077?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]

alex updated SPARK-29077:
-------------------------
    Description: 
When creating a Spark context against a master running on the local host, the connection is refused; however, using the actual host name, for example "spark://myhostname.local:7077", works.
{code:java}
performance-meter {
    spark {
        appname = "test-harness"
        master = "spark://localhost:7077"
    }
} {code}
{code:java}
import com.typesafe.config.ConfigFactory
import org.apache.spark.sql.SparkSession

val conf = ConfigFactory.load()
val configRoot = "performance-meter"
val sparkSession = SparkSession.builder
  .appName(conf.getString(s"${configRoot}.spark.appname"))
  .master(conf.getString(s"${configRoot}.spark.master"))
  .getOrCreate() {code}
 

This appears to be because some Macs have multiple network interfaces; at least that is the case on my Mac. A recommended fix that seems to work locally:

In the files /usr/local/Cellar/apache-spark/2.4.3/libexec/sbin/start-master.sh and /usr/local/Cellar/apache-spark/2.4.3/libexec/sbin/start-slaves.sh, add a case for "Darwin":
{code:java}
if [ "$SPARK_MASTER_HOST" = "" ]; then
  case `uname` in
    (SunOS)
      SPARK_MASTER_HOST="`/usr/sbin/check-hostname | awk '{print $NF}'`"
      ;;
    (Darwin)
      # 13-Sep-2019 alexshagiev: add a Mac (Darwin) case so Spark binds
      # to the loopback interface instead of an external interface.
      SPARK_MASTER_HOST="localhost"
      ;;
    (*)
      SPARK_MASTER_HOST="`hostname -f`"
      ;;
  esac
fi
{code}
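Pulled out into a function, the host-selection logic above can be exercised on any platform; this is only a sketch for checking the branch behavior (the function name and the fake-`uname` argument are mine, not Spark's):

```shell
#!/bin/sh
# Mirror of the host-selection `case` from start-master.sh. The OS
# name is passed in as $1 so each branch can be checked without
# actually running on that platform.
default_master_host() {
  case "$1" in
    (SunOS)
      /usr/sbin/check-hostname | awk '{print $NF}'
      ;;
    (Darwin)
      # Proposed macOS branch: pin the master to the loopback interface.
      echo "localhost"
      ;;
    (*)
      hostname -f
      ;;
  esac
}

default_master_host SomeOS   # prints whatever `hostname -f` returns here
default_master_host Darwin   # prints "localhost"
```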

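A patch to the Homebrew-managed scripts under /usr/local/Cellar may be undone by an upgrade, so as an alternative (my suggestion, not part of the fix above) the same pinning can usually be done in conf/spark-env.sh, which the start scripts source; SPARK_MASTER_HOST and SPARK_LOCAL_IP are both documented in Spark's standalone-mode configuration:

```shell
# conf/spark-env.sh -- sourced by start-master.sh / start-slaves.sh.
# Pin the standalone master and local workers to the loopback
# interface so the spark://localhost:7077 URL matches the address
# the master actually binds.
SPARK_MASTER_HOST=localhost
SPARK_LOCAL_IP=127.0.0.1
```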
  was:
When creating a Spark context against a master running on the local host, the connection is refused; however, using the actual host name, for example "spark://myhostname.local:7077", works.
{code:java}
performance-meter {
    spark {
        appname = "test-harness"
        master = "spark://localhost:7077"
    }
} {code}
{code:java}
val configRoot = "performance-meter"
val sparkSession = SparkSession.builder
  .appName(conf.getString(s"${configRoot}.spark.appname"))
  .master(conf.getString(s"${configRoot}.spark.master")) {code}
 

This appears to be because some Macs have multiple network interfaces; at least that is the case on my Mac. A recommended fix that seems to work locally:

In the files /usr/local/Cellar/apache-spark/2.4.3/libexec/sbin/start-master.sh and /usr/local/Cellar/apache-spark/2.4.3/libexec/sbin/start-slaves.sh, add a section


> submitting a SparkSession job fails on a spark://localhost:7077 url on Mac
> ---------------------------------------------------------------------------
>
>                 Key: SPARK-29077
>                 URL: https://issues.apache.org/jira/browse/SPARK-29077
>             Project: Spark
>          Issue Type: Bug
>          Components: Spark Submit
>    Affects Versions: 2.4.3
>         Environment: Darwin <hostname.local> 18.7.0 Darwin Kernel Version 18.7.0: Thu Jun 20 
>  PDT 2019; root:xnu-4903.270.47~4/RELEASE_X86_64 x86_64
>            Reporter: alex
>            Priority: Major
>
> When creating a Spark context against a master running on the local host, the connection is refused; however, using the actual host name, for example "spark://myhostname.local:7077", works.
> {code:java}
> performance-meter {
>     spark {
>         appname = "test-harness"
>         master = "spark://localhost:7077"
>     }
> } {code}
> {code:java}
> val configRoot = "performance-meter"
> val sparkSession = SparkSession.builder
>   .appName(conf.getString(s"${configRoot}.spark.appname"))
>   .master(conf.getString(s"${configRoot}.spark.master")) {code}
>  
> This appears to be because some Macs have multiple network interfaces; at least that is the case on my Mac. A recommended fix that seems to work locally:
> In the files /usr/local/Cellar/apache-spark/2.4.3/libexec/sbin/start-master.sh and /usr/local/Cellar/apache-spark/2.4.3/libexec/sbin/start-slaves.sh, add a case for "Darwin":
> {code:java}
> if [ "$SPARK_MASTER_HOST" = "" ]; then
>   case `uname` in
>       (SunOS)
>     SPARK_MASTER_HOST="`/usr/sbin/check-hostname | awk '{print $NF}'`"
>     ;;
>      (Darwin)
>     SPARK_MASTER_HOST="localhost" # 13-Sep-2019 alexshagiev: add a Mac (Darwin) case so Spark binds to the loopback interface instead of an external interface.
>     ;;
>       (*)
>     SPARK_MASTER_HOST="`hostname -f`"
>     ;;
>   esac
> fi
> {code}



--
This message was sent by Atlassian Jira
(v8.3.2#803003)

---------------------------------------------------------------------
To unsubscribe, e-mail: issues-unsubscribe@spark.apache.org
For additional commands, e-mail: issues-help@spark.apache.org