Posted to user@spark.apache.org by spr <sp...@yarcdata.com> on 2014/11/03 22:12:53 UTC

With Spark Streaming spark-submit, don't see output after ssc.start()

I have a Spark Streaming program that works fine if I execute it via 

sbt "runMain com.cray.examples.spark.streaming.cyber.StatefulDhcpServerHisto
-f /Users/spr/Documents/<...>/tmp/ -t 10"

but if I start it via

$S/bin/spark-submit --master local[12] --class StatefulNewDhcpServers 
target/scala-2.10/newd*jar -f /Users/spr/Documents/<...>/tmp/ -t 10

(where $S points to the base of the Spark installation), it prints the
output of print statements before the ssc.start() but nothing after that.
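For reference, a minimal driver of the shape involved here looks roughly like the sketch below (the object name and watched path are hypothetical, not from the original program). One thing worth checking under spark-submit: without ssc.awaitTermination(), main() returns right after ssc.start() and the driver process can exit before any batch is processed.

```scala
import org.apache.spark.SparkConf
import org.apache.spark.streaming.{Seconds, StreamingContext}

object MinimalStreamingApp {
  def main(args: Array[String]): Unit = {
    val conf = new SparkConf().setAppName("MinimalStreamingApp")
    val ssc  = new StreamingContext(conf, Seconds(10))

    // Watch a directory for newly arriving text files (path is a placeholder).
    val lines = ssc.textFileStream("/path/to/watched/dir")

    // At least one output operation is required, or no job is ever scheduled.
    lines.print()

    ssc.start()
    ssc.awaitTermination()  // block the main thread; without this the driver may exit
  }
}
```

Under spark-submit, --class must then name this object's fully qualified name.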

I might well have screwed something up, but I'm getting no output anywhere
AFAICT.  I have set spark.eventLog.enabled to true in my spark-defaults.conf
file.  The Spark History Server at localhost:18080 says "no completed
applications found".  There must be some log output somewhere.  Any ideas?



--
View this message in context: http://apache-spark-user-list.1001560.n3.nabble.com/with-SparkStreeaming-spark-submit-don-t-see-output-after-ssc-start-tp17989.html
Sent from the Apache Spark User List mailing list archive at Nabble.com.

---------------------------------------------------------------------
To unsubscribe, e-mail: user-unsubscribe@spark.apache.org
For additional commands, e-mail: user-help@spark.apache.org


Re: With Spark Streaming spark-submit, don't see output after ssc.start()

Posted by spr <sp...@yarcdata.com>.
This problem turned out to be a cockpit error.  I had the same class name
defined in a couple different files, and didn't realize SBT was compiling
them all together, and then executing the "wrong" one.  Mea culpa.
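As an illustration of how that can happen (object and file names here are hypothetical), sbt compiles every source file under src/main/scala into the same classpath, so a stray second main class silently joins the build:

```scala
// Two hypothetical source files, shown together here for brevity.

// src/main/scala/JobA.scala
object JobA {
  def message: String = "running JobA"
  def main(args: Array[String]): Unit = println(message)
}

// src/main/scala/JobB.scala
object JobB {
  def message: String = "running JobB"
  def main(args: Array[String]): Unit = println(message)
}
```

With both objects on the classpath, `sbt run` has to pick among the discovered main classes, while `sbt "runMain JobB"` pins the class explicitly; the same ambiguity applies to the `--class` argument of spark-submit.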





Re: With Spark Streaming spark-submit, don't see output after ssc.start()

Posted by spr <sp...@yarcdata.com>.
P.S.  I believe I am creating output from the Spark Streaming app, and thus
not falling into the "no-output, no-execution" pitfall, as at the end I have 

newServers.print()                               // output operation: print the first elements of each batch
newServers.saveAsTextFiles("newServers", "out")  // output operation: save each batch with this prefix/suffix





Re: With Spark Streaming spark-submit, don't see output after ssc.start()

Posted by Steve Reinhardt <sp...@yarcdata.com>.
From: Tobias Pfeiffer <tg...@preferred.jp>

Am I right that you are actually executing two different classes here?

Yes, I realized after I posted that I was calling two different classes, though they are in the same JAR.  I went back and tried it again with the same class in both cases, and it failed the same way.  I thought perhaps having two classes in a JAR was an issue, but commenting out one of the classes did not seem to make a difference.




Re: With Spark Streaming spark-submit, don't see output after ssc.start()

Posted by Tobias Pfeiffer <tg...@preferred.jp>.
Hi,

On Tue, Nov 4, 2014 at 6:12 AM, spr <sp...@yarcdata.com> wrote:
>
> sbt "runMain
> com.cray.examples.spark.streaming.cyber.StatefulDhcpServerHisto
> -f /Users/spr/Documents/<...>/tmp/ -t 10"
>
> [...]
>
> $S/bin/spark-submit --master local[12] --class StatefulNewDhcpServers
> target/scala-2.10/newd*jar -f /Users/spr/Documents/<...>/tmp/ -t 10
>

Am I right that you are actually executing two different classes here?

Tobias