Posted to user@ignite.apache.org by ApacheUser <bh...@gmail.com> on 2018/08/12 02:12:45 UTC

Running Spark Job in Background

Hello Ignite Team,

I have a Spark job that streams live data into an Ignite cache. The job gets
closed as soon as I close the window (Linux shell). My other Spark streaming
jobs I run with "&" at the end of the spark-submit command, and they run for
a very long time until I stop them or they crash due to other factors.

Is there any way I can run the Spark-Ignite job continuously?

This is my spark submit:

spark-submit --packages org.apache.spark:spark-sql-kafka-0-10_2.11:2.3.0
--master spark://<IP>:7077  --executor-cores x --total-executor-cores x
--executor-memory Xg --conf spark.driver.maxResultSize=Xg --driver-memory Xg
--conf spark.default.parallelism=XX --conf
spark.serializer=org.apache.spark.serializer.KryoSerializer   --class
com.yyyy.yyyy.dataload <path to Jar>.jar  &


Thanks





Re: Running Spark Job in Background

Posted by Ilya Kasnacheev <il...@gmail.com>.
Hello!

You can invoke `disown` after launching the process with &.

Note that & and nohup are very different; it is very strange if the result
is the same. nohup jobs don't even use the same terminal.
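
For example, a minimal sketch assuming a bash shell (the spark-submit line
is abbreviated from the original post):

    # Start the job in the background, then remove it from the shell's
    # job table so it is not sent SIGHUP when the terminal closes:
    spark-submit --class com.yyyy.yyyy.dataload <path to Jar>.jar &
    disown

Since the job's output is still attached to the terminal, you may also want
to redirect stdout/stderr to a file before detaching.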

Regards,

-- 
Ilya Kasnacheev

2018-08-13 14:02 GMT+03:00 ApacheUser <bh...@gmail.com>:

> Thanks Denis,
>
> When I submit a Spark job that connects to the Ignite cluster, it creates
> an Ignite client. The Ignite client gets disconnected when I close the
> window (Linux shell).
> Regular Spark jobs run fine with & or nohup, but in the Spark/Ignite case
> the clients are getting killed and the Spark job no longer runs.
>
> Is there any way I can run the Spark/Ignite job continuously even after
> closing the Linux shell?
>
> thanks
>

Re: Running Spark Job in Background

Posted by ApacheUser <bh...@gmail.com>.
Thanks Denis,

When I submit a Spark job that connects to the Ignite cluster, it creates
an Ignite client. The Ignite client gets disconnected when I close the
window (Linux shell).
Regular Spark jobs run fine with & or nohup, but in the Spark/Ignite case
the clients are getting killed and the Spark job no longer runs.

Is there any way I can run the Spark/Ignite job continuously even after
closing the Linux shell?

thanks




Re: Running Spark Job in Background

Posted by Denis Mekhanikov <dm...@gmail.com>.
This is not really an Ignite question. Try asking it on the Spark user list:
http://apache-spark-user-list.1001560.n3.nabble.com/

Running commands with & is a valid approach though.
You can also try using nohup <https://linux.die.net/man/1/nohup>.
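
For example, a minimal sketch assuming bash (the spark-submit line is
abbreviated from the original post, and dataload.log is just an
illustrative file name):

    # nohup makes the process immune to SIGHUP and detaches it from the
    # terminal; redirecting output keeps it off the soon-to-close tty:
    nohup spark-submit --class com.yyyy.yyyy.dataload <path to Jar>.jar \
        > dataload.log 2>&1 &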

Denis

Sun, Aug 12, 2018 at 5:12, ApacheUser <bh...@gmail.com>:

> Hello Ignite Team,
>
> I have a Spark job that streams live data into an Ignite cache. The job
> gets closed as soon as I close the window (Linux shell). My other Spark
> streaming jobs I run with "&" at the end of the spark-submit command, and
> they run for a very long time until I stop them or they crash due to
> other factors.
>
> Is there any way I can run the Spark-Ignite job continuously?
>
> This is my spark submit:
>
> spark-submit --packages org.apache.spark:spark-sql-kafka-0-10_2.11:2.3.0
> --master spark://<IP>:7077  --executor-cores x --total-executor-cores x
> --executor-memory Xg --conf spark.driver.maxResultSize=Xg --driver-memory
> Xg
> --conf spark.default.parallelism=XX --conf
> spark.serializer=org.apache.spark.serializer.KryoSerializer   --class
> com.yyyy.yyyy.dataload <path to Jar>.jar  &
>
>
> Thanks
>