Posted to issues@spark.apache.org by "Hyukjin Kwon (JIRA)" <ji...@apache.org> on 2018/11/13 00:09:00 UTC
[jira] [Resolved] (SPARK-26009) Unable to fetch jar from remote repo while running spark-submit on kubernetes
[ https://issues.apache.org/jira/browse/SPARK-26009?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]
Hyukjin Kwon resolved SPARK-26009.
----------------------------------
Resolution: Invalid
> Unable to fetch jar from remote repo while running spark-submit on kubernetes
> -----------------------------------------------------------------------------
>
> Key: SPARK-26009
> URL: https://issues.apache.org/jira/browse/SPARK-26009
> Project: Spark
> Issue Type: Question
> Components: Kubernetes
> Affects Versions: 2.3.2
> Reporter: Bala Bharath Reddy Resapu
> Priority: Major
>
> I am trying to run Spark on Kubernetes with a docker image. My requirement is to download the jar from an external repo while running spark-submit. I am able to download the jar using wget inside the container, but it doesn't work when I pass the URL to the spark-submit command. I am not packaging the jar with the docker image; it works fine when I put the jar file inside the docker image.
>
> ./bin/spark-submit \
> --master k8s://https://ip:port \
> --deploy-mode cluster \
> --name test3 \
> --class hello \
> --conf spark.kubernetes.container.image.pullSecrets=abcd \
> --conf spark.kubernetes.container.image=spark:h2.0 \
> [https://devops.com/artifactory/local/testing/testing_2.11/hello.jar|https://bala.bharath.reddy.resapu%40ibm.com:AKCp5bCBKTyKG2ti28sJu4GTEBSqwkG2MQKaf9w6g5rdbo3iWrWX7qb1M5doKgD54HDRU2AXr@na.artifactory.swg-devops.com/artifactory/txo-cedp-garage-artifacts-sbt-local/testing/testing_2.11/arithmetic.jar]
>
>
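For context, the quoted command supplies the remote artifact as the application jar. A minimal sketch of the shape such an invocation usually takes; the hostnames, repository path, and jar URL below are placeholders, not the reporter's actual values:

```shell
./bin/spark-submit \
  --master k8s://https://<api-server-host>:<port> \
  --deploy-mode cluster \
  --name test3 \
  --class hello \
  --conf spark.kubernetes.container.image.pullSecrets=abcd \
  --conf spark.kubernetes.container.image=spark:h2.0 \
  https://<artifactory-host>/artifactory/<repo>/testing/testing_2.11/hello.jar
```

Note that in Spark 2.3.x the Kubernetes backend fetched remote dependencies via an init-container, so whether an https:// application jar is downloaded correctly depends on that mechanism being configured; an authenticated Artifactory URL adds a further variable.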
--
This message was sent by Atlassian JIRA
(v7.6.3#76005)
---------------------------------------------------------------------
To unsubscribe, e-mail: issues-unsubscribe@spark.apache.org
For additional commands, e-mail: issues-help@spark.apache.org