Posted to user@spark.apache.org by Nan Zhu <zh...@gmail.com> on 2017/05/02 15:43:25 UTC

--jars does not take remote jar?

Hi, all

For a particular use case, I tried to pass an HDFS path to the --jars
option of spark-submit.

According to the documentation,
http://spark.apache.org/docs/latest/submitting-applications.html#advanced-dependency-management,
--jars should accept a remote path.

However, looking at the implementation,
https://github.com/apache/spark/blob/c622a87c44e0621e1b3024fdca9b2aa3c508615b/core/src/main/scala/org/apache/spark/deploy/SparkSubmit.scala#L757,
that does not appear to be the case.

Did I miss anything?

Best,

Nan
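
For reference, the kind of invocation in question looks like the sketch
below. The paths, class name, and jar names are hypothetical placeholders,
not taken from the thread; the replies below explain what actually happens
to the remote jar in each deploy mode.

  spark-submit \
    --master yarn \
    --deploy-mode client \
    --class com.example.Main \
    --jars hdfs:///user/example/libs/helper.jar \
    target/app.jar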

Re: --jars does not take remote jar?

Posted by Nan Zhu <zh...@gmail.com>.
I see.....Thanks!

On Tue, May 2, 2017 at 9:12 AM, Marcelo Vanzin <va...@cloudera.com> wrote:

> On Tue, May 2, 2017 at 9:07 AM, Nan Zhu <zh...@gmail.com> wrote:
> > I have no easy way to pass a jar path to those forked Spark
> > applications? (Except by downloading the jar from a remote path to a
> > local temp dir, after resolving some permission issues, etc.?)
>
> Yes, that's the only way currently in client mode.
>
> --
> Marcelo
>

Re: --jars does not take remote jar?

Posted by Marcelo Vanzin <va...@cloudera.com>.
On Tue, May 2, 2017 at 9:07 AM, Nan Zhu <zh...@gmail.com> wrote:
> I have no easy way to pass a jar path to those forked Spark
> applications? (Except by downloading the jar from a remote path to a
> local temp dir, after resolving some permission issues, etc.?)

Yes, that's the only way currently in client mode.

-- 
Marcelo
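
Concretely, the workaround discussed above (downloading the jar from a
remote path to a local temp dir before forking spark-submit) could look
like the following sketch in Scala, using the Hadoop FileSystem API. The
object and method names are hypothetical, and it assumes the caller
already has read permission on the remote path:

  import java.nio.file.Files
  import org.apache.hadoop.conf.Configuration
  import org.apache.hadoop.fs.Path

  object JarLocalizer {
    // Copy a remote jar (e.g. an hdfs:// URL) into a fresh local temp
    // dir and return the local path, suitable for passing to --jars.
    def localize(remoteJar: String): String = {
      val src = new Path(remoteJar)
      val fs  = src.getFileSystem(new Configuration()) // honors HADOOP_CONF_DIR
      val dir = Files.createTempDirectory("spark-jars")
      val dst = new Path(dir.resolve(src.getName).toString)
      fs.copyToLocalFile(src, dst)
      dst.toString
    }
  }

The forked spark-submit process would then be launched with --jars
pointing at the returned local path.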



Re: --jars does not take remote jar?

Posted by Nan Zhu <zh...@gmail.com>.
Thanks for the reply! If I have an application master that starts some
Spark applications by forking processes (in yarn-client mode), then
essentially I have no easy way to pass a jar path to those forked Spark
applications? (Except by downloading the jar from a remote path to a
local temp dir, after resolving some permission issues, etc.?)

On Tue, May 2, 2017 at 9:00 AM, Marcelo Vanzin <va...@cloudera.com> wrote:

> Remote jars are added to the executors' classpaths, but not the driver's.
> In YARN cluster mode, they would also be added to the driver's classpath.
>
> On Tue, May 2, 2017 at 8:43 AM, Nan Zhu <zh...@gmail.com> wrote:
> > Hi, all
> >
> > For a particular use case, I tried to pass an HDFS path to the --jars
> > option of spark-submit.
> >
> > According to the documentation,
> > http://spark.apache.org/docs/latest/submitting-applications.html#advanced-dependency-management,
> > --jars should accept a remote path.
> >
> > However, looking at the implementation,
> > https://github.com/apache/spark/blob/c622a87c44e0621e1b3024fdca9b2aa3c508615b/core/src/main/scala/org/apache/spark/deploy/SparkSubmit.scala#L757,
> > that does not appear to be the case.
> >
> > Did I miss anything?
> >
> > Best,
> >
> > Nan
>
>
>
> --
> Marcelo
>

Re: --jars does not take remote jar?

Posted by Marcelo Vanzin <va...@cloudera.com>.
Remote jars are added to the executors' classpaths, but not the driver's.
In YARN cluster mode, they would also be added to the driver's classpath.

On Tue, May 2, 2017 at 8:43 AM, Nan Zhu <zh...@gmail.com> wrote:
> Hi, all
>
> For a particular use case, I tried to pass an HDFS path to the --jars
> option of spark-submit.
>
> According to the documentation,
> http://spark.apache.org/docs/latest/submitting-applications.html#advanced-dependency-management,
> --jars should accept a remote path.
>
> However, looking at the implementation,
> https://github.com/apache/spark/blob/c622a87c44e0621e1b3024fdca9b2aa3c508615b/core/src/main/scala/org/apache/spark/deploy/SparkSubmit.scala#L757,
> that does not appear to be the case.
>
> Did I miss anything?
>
> Best,
>
> Nan



-- 
Marcelo
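
In yarn-cluster mode, by contrast, no local copy is needed, because jars
listed under --jars are also added to the driver's classpath. A
hypothetical invocation (placeholder paths and class name) would be:

  spark-submit \
    --master yarn \
    --deploy-mode cluster \
    --class com.example.Main \
    --jars hdfs:///user/example/libs/helper.jar \
    hdfs:///user/example/apps/app.jar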
