Posted to issues@spark.apache.org by "YSMAL Vincent (JIRA)" <ji...@apache.org> on 2016/09/09 12:24:21 UTC

[jira] [Commented] (SPARK-6680) Be able to specify IP for spark-shell (spark driver), a blocker for Docker integration

    [ https://issues.apache.org/jira/browse/SPARK-6680?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=15476947#comment-15476947 ] 

YSMAL Vincent commented on SPARK-6680:
--------------------------------------

Hi, with Docker you can get rid of this hostname alias by using the {code}--hostname spark-master{code} option when starting the container.
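
For example, a minimal sketch of running the master and a worker this way (the image name, the start script paths, and the use of {code}--link{code} are assumptions for illustration, not taken from this issue):

{code}
# Give the container a stable, resolvable hostname instead of the random alias.
# "my-spark-image" and the script paths are placeholders.
docker run -d --name spark-master --hostname spark-master \
  -p 7077:7077 \
  my-spark-image /opt/spark/sbin/start-master.sh

# A worker can then reach the master by that name:
docker run -d --name spark-worker --hostname spark-worker \
  --link spark-master:spark-master \
  my-spark-image /opt/spark/sbin/start-slave.sh spark://spark-master:7077
{code}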


> Be able to specify IP for spark-shell (spark driver), a blocker for Docker integration
> ---------------------------------------------------------------------------------------
>
>                 Key: SPARK-6680
>                 URL: https://issues.apache.org/jira/browse/SPARK-6680
>             Project: Spark
>          Issue Type: New Feature
>          Components: Deploy
>    Affects Versions: 1.3.0
>         Environment: Docker.
>            Reporter: Egor Pakhomov
>            Priority: Minor
>              Labels: core, deploy, docker
>
> Suppose I have 3 Docker containers: spark_master, spark_worker, and spark_shell. In Docker, the container's public IP gets an alias like "fgsdfg454534" that is only visible inside that container. When Spark uses it for communication, the other containers receive this alias and don't know what to do with it. That's why I used SPARK_LOCAL_IP for the master and the worker, but it doesn't work for the Spark driver (tested with spark-shell; I haven't tried other driver types). The driver advertises the "fgsdfg454534" alias to everyone, and then nobody can address it. I've worked around it in https://github.com/epahomov/docker-spark, but it would be better if it were solved at the Spark code level.
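
A hedged sketch of the configuration the report describes (the IP values and master URL are placeholders; whether the 1.3 driver honours this inside Docker is exactly what the issue disputes):

{code}
# On the master and worker containers, this pins the address Spark binds
# to and advertises, which is what the reporter says works for them:
export SPARK_LOCAL_IP=172.17.0.2        # placeholder container IP

# For the driver, the nominally corresponding knob is spark.driver.host,
# but per this report the driver still advertises the container's alias:
spark-shell --master spark://spark-master:7077 \
  --conf spark.driver.host=172.17.0.4   # placeholder IP
{code}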



--
This message was sent by Atlassian JIRA
(v6.3.4#6332)

---------------------------------------------------------------------
To unsubscribe, e-mail: issues-unsubscribe@spark.apache.org
For additional commands, e-mail: issues-help@spark.apache.org