Posted to issues@spark.apache.org by "Oscar Bonilla (JIRA)" <ji...@apache.org> on 2018/10/16 11:04:00 UTC

[jira] [Created] (SPARK-25742) Is there a way to pass the Azure blob storage credentials to the spark for k8s init-container?

Oscar Bonilla created SPARK-25742:
-------------------------------------

             Summary: Is there a way to pass the Azure blob storage credentials to the spark for k8s init-container?
                 Key: SPARK-25742
                 URL: https://issues.apache.org/jira/browse/SPARK-25742
             Project: Spark
          Issue Type: Question
          Components: Kubernetes
    Affects Versions: 2.3.2
            Reporter: Oscar Bonilla


I'm trying to run Spark on a Kubernetes cluster in Azure. The idea is to store the Spark application jars and dependencies in a container in Azure Blob Storage.

I've tried this with a public container and it works fine, but with a private Blob Storage container, the spark-init init-container doesn't download the jars.

The equivalent for AWS S3 is as simple as adding the access key ID and secret as environment variables, but I don't see how to do the same for Azure Blob Storage.
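For illustration, a hedged sketch of the kind of configuration being asked about, assuming the hadoop-azure (WASB) connector is on the classpath. The Hadoop property name `fs.azure.account.key.<account>.blob.core.windows.net` is the connector's standard way of supplying an account key; everything in angle brackets is a placeholder, not a real value:

```shell
# Hypothetical spark-submit invocation (placeholders throughout).
# spark.hadoop.* properties are forwarded into the Hadoop configuration,
# where the WASB connector would look up the storage account key.
spark-submit \
  --master k8s://https://<k8s-apiserver-host>:443 \
  --deploy-mode cluster \
  --conf spark.kubernetes.container.image=<spark-image> \
  --conf spark.hadoop.fs.azure.account.key.<storage-account>.blob.core.windows.net=<account-key> \
  wasbs://<container>@<storage-account>.blob.core.windows.net/jars/app.jar
```

Whether the spark-init init-container in the 2.3.x line actually honors `spark.hadoop.*` settings when fetching remote dependencies is precisely the open question of this ticket.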



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)

---------------------------------------------------------------------
To unsubscribe, e-mail: issues-unsubscribe@spark.apache.org
For additional commands, e-mail: issues-help@spark.apache.org