Posted to user@spark.apache.org by nagaraj <na...@gmail.com> on 2016/09/10 16:22:28 UTC

Spark using my last job resources and jar files

I am new to Spark. The job runs fine in client mode as long as I keep the same paths for the jar and other resource files. However, after killing the running application with the YARN command and resubmitting the job with an updated jar and new file locations, the job still uses the old paths. Only after rebooting the system does the job pick up the new paths. The spark-submit command is:

spark-submit \
  --class export.streaming.DataExportStreaming \
  --jars /usr/hdp/current/spark-client/lib/postgresql-9.4.1209.jar \
  --driver-class-path /usr/hdp/current/spark-client/lib/postgresql-9.4.1209.jar \
  --conf spark.driver.extraClassPath=/usr/hdp/current/spark-client/lib/postgresql-9.4.1209.jar \
  --conf spark.executor.extraClassPath=/usr/hdp/current/spark-client/lib/postgresql-9.4.1209.jar \
  --master yarn \
  --deploy-mode client \
  --files /usr/lib/firebet-spark/52.0.2-1/data-export/resources/rollup_selfservice_session_daily.sql,/usr/lib/firebet-spark/52.0.2-1/data-export/resources/rollup_selfservice_session_device.sql,/usr/lib/firebet-spark/52.0.2-1/data-export/resources/rollup_selfservice_session_workflow.sql,/usr/lib/firebet-spark/52.0.2-1/data-export/resources/rollup_selfservice_session_workflow_step.sql,/usr/lib/firebet-spark/52.0.2-1/data-export/resources/rollup_session_assignment.sql,/usr/lib/firebet-spark/52.0.2-1/data-export/resources/rollup_session_daily.sql,/usr/lib/firebet-spark/52.0.2-1/data-export/resources/rollup_session_device.sql,/usr/lib/firebet-spark/52.0.2-1/data-export/resources/rollup_session_queue.sql,/usr/lib/firebet-spark/52.0.2-1/data-export/resources/rollup_session_workflow.sql,/usr/lib/firebet-spark/52.0.2-1/data-export/resources/rollup_session_workflow_step.sql,/usr/lib/firebet-spark/52.0.2-1/data-export/resources/rollup_user_login_session.sql \
  /usr/lib/firebet-spark/52.0.2-1/data-export/lib/data-export-assembly-52.0.2-1.jar \
  /usr/lib/firebet-spark/52.0.2-1/data-export/resources/application.conf
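
The kill-and-resubmit sequence I run before the spark-submit command above is roughly the following (the application id is a placeholder; I check the jar on disk to confirm which copy will be shipped):

```shell
# List running YARN applications to find the job's application id.
yarn application -list

# Kill the running application (placeholder id, taken from the list above).
yarn application -kill <application-id>

# Confirm which copy of the assembly jar is on disk before resubmitting.
ls -l /usr/lib/firebet-spark/52.0.2-1/data-export/lib/data-export-assembly-52.0.2-1.jar

# Then resubmit with the spark-submit command above, using the updated paths.
```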

How can I fix this problem? Is the spark-submit command correct? Which deploy mode is better for production, client or cluster?
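
For comparison, here is a sketch of how I understand the same job would be submitted in cluster mode (assuming the same paths on the submitting host; spark-submit uploads the local jar and `--files` entries to the YARN staging directory for the run):

```shell
# Sketch: same job in cluster mode. The driver then runs inside YARN, so
# application.conf is shipped via --files and referenced by basename
# (files listed in --files land in the container's working directory).
spark-submit \
  --class export.streaming.DataExportStreaming \
  --master yarn \
  --deploy-mode cluster \
  --jars /usr/hdp/current/spark-client/lib/postgresql-9.4.1209.jar \
  --files /usr/lib/firebet-spark/52.0.2-1/data-export/resources/application.conf \
  /usr/lib/firebet-spark/52.0.2-1/data-export/lib/data-export-assembly-52.0.2-1.jar \
  application.conf
```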



--
View this message in context: http://apache-spark-user-list.1001560.n3.nabble.com/Spark-using-my-last-job-resources-and-jar-files-tp27690.html
Sent from the Apache Spark User List mailing list archive at Nabble.com.
