Posted to dev@dlab.apache.org by "Vira Vitanska (JIRA)" <ji...@apache.org> on 2019/02/18 18:09:00 UTC
[jira] [Resolved] (DLAB-321) [GCP][Spark Standalone cluster]: Playbook running fails using spark cluster kernel due to 'Class com.google.cloud.hadoop.fs.gcs.GoogleHadoopFileSystem not found'
[ https://issues.apache.org/jira/browse/DLAB-321?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]
Vira Vitanska resolved DLAB-321.
--------------------------------
Resolution: Cannot Reproduce
Commit ID f6d739dec2cf66dd2c2bb0aea747f5d15c828309
> [GCP][Spark Standalone cluster]: Playbook running fails using spark cluster kernel due to 'Class com.google.cloud.hadoop.fs.gcs.GoogleHadoopFileSystem not found'
> -----------------------------------------------------------------------------------------------------------------------------------------------------------------
>
> Key: DLAB-321
> URL: https://issues.apache.org/jira/browse/DLAB-321
> Project: Apache DLab
> Issue Type: Bug
> Reporter: Vira Vitanska
> Assignee: Demyan Mysakovets
> Priority: Critical
> Labels: DevOps
> Fix For: v.1.1
>
> Attachments: GCP.PNG, GCP_autotest.PNG
>
>
> *Steps to reproduce:*
> # Run the autotest on GCP for Data Engine, or create a Spark cluster on Jupyter/Zeppelin/RStudio and run a playbook manually
>
> *Actual result:*
> Playbook run fails with the following error:
> !GCP_autotest.PNG!
> [AutoTests_GCP/91/console|http://35.166.222.81/view/AutoTests/job/AutoTests_GCP/91/console]
> *Expected result:*
> The autotest runs successfully
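Since the issue was closed as "Cannot Reproduce", no fix is recorded in this thread. For anyone hitting the same ClassNotFoundException, a minimal sketch of the Spark settings that register the GCS filesystem may help; the config keys are the standard gcs-connector ones, and the jar path is a hypothetical placeholder, neither taken from this issue:

```python
# Hypothetical sketch: Spark properties that map the 'gs://' scheme to the
# GCS connector classes. The connector jar must also be on the driver and
# executor classpath (here via spark.jars; path is a placeholder).
gcs_conf = {
    # FileSystem implementation for gs:// paths -- the class the error
    # in this issue reports as missing.
    "spark.hadoop.fs.gs.impl":
        "com.google.cloud.hadoop.fs.gcs.GoogleHadoopFileSystem",
    # AbstractFileSystem implementation used by newer Hadoop APIs.
    "spark.hadoop.fs.AbstractFileSystem.gs.impl":
        "com.google.cloud.hadoop.fs.gcs.GoogleHadoopFS",
    # Ship the connector jar to the cluster (placeholder path/version).
    "spark.jars": "/opt/jars/gcs-connector-hadoop2-latest.jar",
}

# These pairs would typically be passed to SparkSession.builder.config()
# or as --conf key=value arguments to spark-submit.
for key, value in gcs_conf.items():
    print(f"--conf {key}={value}")
```

If the 'Class ... not found' error appears only on the standalone cluster kernel and not on the local kernel, the usual cause is that the jar is present on the notebook node but not distributed to the cluster workers.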
--
This message was sent by Atlassian JIRA
(v7.6.3#76005)
---------------------------------------------------------------------
To unsubscribe, e-mail: dev-unsubscribe@dlab.apache.org
For additional commands, e-mail: dev-help@dlab.apache.org