Posted to user@spark.apache.org by Deepesh Maheshwari <de...@gmail.com> on 2015/08/05 10:42:05 UTC

Debugging Spark job in Eclipse

Hi,

A Spark job is only executed when you call the start() method of
JavaStreamingContext. All the transformations such as map and flatMap are
defined earlier, but even if you put breakpoints inside those functions,
the breakpoints are never hit. How can I debug the Spark jobs?

JavaDStream<String> words = lines.flatMap(new FlatMapFunction<String, String>() {
    private static final long serialVersionUID = -2042174881679341118L;

    @Override
    public Iterable<String> call(String t) throws Exception {
        // Mark debug point here; execution doesn't stop here.
        return Lists.newArrayList(SPACE.split(t));
    }
});

Please suggest how I can inspect the intermediate data values.

Regards,
Deepesh
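
[Editor's note: the laziness the question runs into is not specific to
Spark. The sketch below reproduces it with plain java.util.stream (names
like LazySplit and splitAll are illustrative, not from the thread):
intermediate operations such as flatMap only describe the pipeline, and
nothing inside them runs until a terminal operation pulls the data, which
is why a breakpoint inside the lambda is not hit when the pipeline is
built.]

```java
import java.util.Arrays;
import java.util.List;
import java.util.stream.Collectors;
import java.util.stream.Stream;

public class LazySplit {
    // Counts how many elements actually flow through the pipeline.
    static int seen = 0;

    static List<String> splitAll(List<String> lines) {
        Stream<String> words = lines.stream()
                .flatMap(line -> Arrays.stream(line.split(" ")))
                .peek(w -> seen++);   // lazy: a breakpoint here would not fire yet
        // At this point `seen` is still 0 -- the pipeline is only a description.
        // The terminal operation below is what triggers execution.
        return words.collect(Collectors.toList());
    }

    public static void main(String[] args) {
        List<String> out = splitAll(Arrays.asList("hello world", "spark streaming"));
        System.out.println(out + " (seen=" + seen + ")");
    }
}
```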

Re: Debugging Spark job in Eclipse

Posted by Eugene Morozov <fa...@list.ru>.
Deepesh, 

you have to call an action to start actual processing.
words.count() would do the trick.
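
[Editor's note: one caveat for the streaming case: on a DStream,
count() is itself a transformation, so it is typically chained with an
output operation such as print() to actually trigger processing. A
minimal sketch, assuming an already configured JavaStreamingContext
named jssc running with a local[*] master so tasks execute in the same
JVM the Eclipse debugger is attached to:]

```java
// Sketch only: assumes an existing JavaStreamingContext `jssc`
// created with a local[*] master.
JavaDStream<String> lines = jssc.socketTextStream("localhost", 9999);
JavaDStream<String> words = lines.flatMap(/* FlatMapFunction as above */);
words.print();            // output operation: forces flatMap to run each batch
jssc.start();             // streaming begins; breakpoints in call() now fire
jssc.awaitTermination();
```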



Eugene Morozov
fathersson@list.ru