Posted to user@spark.apache.org by Gnana Kumar <gn...@gmail.com> on 2022/04/04 07:44:23 UTC

Spark History Server in GCP

Hi There,

I have been able to start the Spark History Server in GKE Kubernetes
Cluster.

And I have created a Service Account in my Google Cloud project with the
Storage Admin, Storage Object Admin, and Owner roles.

Now, when I submit the job with spark-submit, I use the options below to
write the Spark job's event logs to the Spark History Server's Google
Cloud Storage bucket.

--conf spark.eventLog.enabled=true \
--conf spark.eventLog.dir=gs://spark-history-server-gnana/ \
--conf spark.eventLog.permissions=777 \
--conf spark.history.fs.logDirectory=gs://spark-history-server-gnana/ \
--conf spark.kubernetes.driver.secrets.sparklogs=/etc/secrets \
--conf spark.kubernetes.executor.secrets.sparklogs=/etc/secrets \

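For reference, my understanding is that the GCS connector also has to be
told how to authenticate as the service account; a sketch of what I believe
the extra options would look like, assuming the key JSON from the sparklogs
secret is mounted at /etc/secrets/key.json (the file name is my assumption):

--conf spark.hadoop.google.cloud.auth.service.account.enable=true \
--conf spark.hadoop.google.cloud.auth.service.account.json.keyfile=/etc/secrets/key.json \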
I'm getting the below exception.

com.google.cloud.hadoop.repackaged.gcs.com.google.api.client.googleapis.json.GoogleJsonResponseException:
403 Forbidden

Note: In my Spark image, I have added the GCS connector jar
"gcs-connector-hadoop3-latest.jar" to access the GCS bucket.

Please help me resolve this issue with the GCS connector and the Cloud
Storage bucket.

Thanks
Gnana