Posted to issues@spark.apache.org by "Oz Ben-Ami (JIRA)" <ji...@apache.org> on 2018/01/15 16:03:00 UTC

[jira] [Comment Edited] (SPARK-23078) Allow Submitting Spark Thrift Server in Cluster Mode

    [ https://issues.apache.org/jira/browse/SPARK-23078?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16326366#comment-16326366 ] 

Oz Ben-Ami edited comment on SPARK-23078 at 1/15/18 4:02 PM:
-------------------------------------------------------------

[~mgaido] In Kubernetes you can just create a Service (think in-cluster DNS plus load balancing) that automatically routes connections to the driver, whichever node it lands on
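
For illustration, a minimal Service manifest of the kind described might look like the sketch below. The name, label selector, and namespace are assumptions, not taken from any actual deployment; 10000 is the Thrift Server's default listen port.

```yaml
# Hypothetical Service fronting a Thrift Server driver pod.
# Names and labels are illustrative only.
apiVersion: v1
kind: Service
metadata:
  name: spark-thriftserver
spec:
  selector:
    app: spark-thriftserver   # must match the driver pod's labels
  ports:
    - name: thrift
      port: 10000        # port the Service exposes in-cluster
      targetPort: 10000  # port the Thrift Server listens on (default)
```

In-cluster JDBC clients could then reach the server through the Service's stable DNS name, e.g. jdbc:hive2://spark-thriftserver:10000, regardless of which node the driver pod is scheduled on.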


was (Author: ozzieba):
[~mgaido] In Kubernetes you can just create a Service which automatically connects to it, whichever node it's on

> Allow Submitting Spark Thrift Server in Cluster Mode
> ----------------------------------------------------
>
>                 Key: SPARK-23078
>                 URL: https://issues.apache.org/jira/browse/SPARK-23078
>             Project: Spark
>          Issue Type: Improvement
>          Components: Kubernetes, Spark Submit, SQL
>    Affects Versions: 2.2.0, 2.2.1, 2.3.0
>            Reporter: Oz Ben-Ami
>            Priority: Minor
>
> Since SPARK-5176, SparkSubmit has blacklisted the Thrift Server from running in cluster mode, since at the time it could not do so successfully. I have confirmed that the Spark Thrift Server can run in cluster mode on Kubernetes by commenting out [https://github.com/apache-spark-on-k8s/spark/blob/master/core/src/main/scala/org/apache/spark/deploy/SparkSubmit.scala#L331]. I have not had a chance to test against YARN. Since Kubernetes does not support client mode, this change is necessary to run the Spark Thrift Server on Kubernetes.
> [~foxish] [~coderanger]
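
For context, the restriction referenced above is a deploy-mode validation inside SparkSubmit. A simplified, self-contained sketch of that kind of check follows; the object name, method, and error message are illustrative, not the actual Spark source.

```scala
// Hypothetical sketch of the kind of validation SPARK-5176 added to
// SparkSubmit: a specific main class is rejected when cluster deploy
// mode is requested. Not the actual Spark code.
object DeployModeCheck {
  // Fully qualified main class of the Spark Thrift Server.
  val ThriftServerClass =
    "org.apache.spark.sql.hive.thriftserver.HiveThriftServer2"

  // Returns an error message when the combination is disallowed,
  // or None when submission may proceed.
  def validate(mainClass: String, deployMode: String): Option[String] =
    if (deployMode == "cluster" && mainClass == ThriftServerClass)
      Some("Cluster deploy mode is not applicable to the Spark Thrift server.")
    else
      None
}
```

Removing (or relaxing) a check of this shape is what the reporter describes as "commenting out" the line in SparkSubmit.scala.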



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)

---------------------------------------------------------------------
To unsubscribe, e-mail: issues-unsubscribe@spark.apache.org
For additional commands, e-mail: issues-help@spark.apache.org