Posted to user@spark.apache.org by Rares Vernica <rv...@gmail.com> on 2015/08/05 21:29:59 UTC

Set Job Descriptions for Scala application

Hello,

My Spark application is written in Scala and submitted to a Spark cluster
in standalone mode. The Spark Jobs for my application are listed in the
Spark UI like this:

Job Id     Description ...
6          saveAsTextFile at Foo.scala:202
5          saveAsTextFile at Foo.scala:201
4          count at Foo.scala:188
3          collect at Foo.scala:182
2          count at Foo.scala:162
1          count at Foo.scala:152
0          collect at Foo.scala:142


Is it possible to assign Job Descriptions to all these jobs in my Scala
code?

Thanks!
Rares

Re: Set Job Descriptions for Scala application

Posted by Mark Hamstra <ma...@clearstorydata.com>.
Use SparkContext#setJobDescription or SparkContext#setJobGroup.
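
A minimal sketch of both calls, assuming an existing SparkContext named sc
and an RDD named lines (hypothetical names, not from the original code):

  // Label the job(s) started from this thread; the string replaces the
  // default "count at Foo.scala:NNN" in the UI's Description column.
  sc.setJobDescription("Count all input lines")
  val total = lines.count()

  // Or tag a group of jobs with one group id and description; every job
  // triggered from this thread until clearJobGroup() inherits the label.
  sc.setJobGroup("ingest", "Load and validate input")
  val nonEmpty = lines.filter(_.nonEmpty).count()
  val empty    = lines.filter(_.isEmpty).count()
  sc.clearJobGroup()

Both settings are stored as thread-local properties, so set them on the
same thread that triggers the actions.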

On Wed, Aug 5, 2015 at 12:29 PM, Rares Vernica <rv...@gmail.com> wrote:

> Hello,
>
> My Spark application is written in Scala and submitted to a Spark cluster
> in standalone mode. The Spark Jobs for my application are listed in the
> Spark UI like this:
>
> Job Id     Description ...
> 6          saveAsTextFile at Foo.scala:202
> 5          saveAsTextFile at Foo.scala:201
> 4          count at Foo.scala:188
> 3          collect at Foo.scala:182
> 2          count at Foo.scala:162
> 1          count at Foo.scala:152
> 0          collect at Foo.scala:142
>
>
> Is it possible to assign Job Descriptions to all these jobs in my Scala
> code?
>
> Thanks!
> Rares