Posted to user@spark.apache.org by 965 <10...@qq.com> on 2018/11/30 14:31:51 UTC
Re: Do we need to kill a spark job every time we change and deploy it?
I think that if your job is already running and you deploy a new jar containing a new version of it, Spark will treat the new jar as a separate job, since jobs are distinguished by their job ID. So if you want to replace the jar, you have to kill the running job every time.
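To illustrate the kill-then-resubmit cycle described above, here is a rough sketch for a standalone cluster in cluster deploy mode. The master URL, driver/submission ID, class name, and jar path are all placeholders, not values from this thread:

```shell
# Kill the currently running driver by its submission ID
# (the ID shown when the job was first submitted, or in the master UI):
spark-submit --master spark://master:7077 --kill driver-20181130123456-0001

# Resubmit with the new version of the jar:
spark-submit --master spark://master:7077 \
  --deploy-mode cluster \
  --class com.example.MyApp \
  /path/to/myapp-2.0.jar
```

On YARN the equivalent kill step would be `yarn application -kill <applicationId>`. Either way, Spark does not hot-swap the jar of a running application; the new submission is a fresh job with its own ID.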
------------------ Original Message ------------------
From: "Mina Aslani"<as...@gmail.com>;
Sent: Thursday, November 29, 2018, 2:44 AM
To: "user @spark"<us...@spark.apache.org>;
Subject: Do we need to kill a spark job every time we change and deploy it?
Hi,
I have a question.
Do we need to kill a spark job every time we change and deploy it to cluster? Or, is there a way for Spark to automatically pick up the recent jar version?
Best regards,
Mina