Posted to issues@spark.apache.org by "DB Tsai (JIRA)" <ji...@apache.org> on 2019/08/02 23:42:00 UTC
[jira] [Resolved] (SPARK-28606) Update CRAN key to recover docker image generation
[ https://issues.apache.org/jira/browse/SPARK-28606?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]
DB Tsai resolved SPARK-28606.
-----------------------------
Resolution: Fixed
Fix Version/s: 3.0.0
Issue resolved by pull request 25339
[https://github.com/apache/spark/pull/25339]
> Update CRAN key to recover docker image generation
> --------------------------------------------------
>
> Key: SPARK-28606
> URL: https://issues.apache.org/jira/browse/SPARK-28606
> Project: Spark
> Issue Type: Bug
> Components: Project Infra
> Affects Versions: 2.4.4, 3.0.0
> Reporter: Dongjoon Hyun
> Assignee: Dongjoon Hyun
> Priority: Blocker
> Fix For: 3.0.0
>
>
> CRAN repo changed the key.
> - https://cran.r-project.org/bin/linux/ubuntu/README.html
> {code}
> Err:1 https://cloud.r-project.org/bin/linux/ubuntu bionic-cran35/ InRelease
> The following signatures couldn't be verified because the public key is not available: NO_PUBKEY 51716619E084DAB9
> ...
> W: GPG error: https://cloud.r-project.org/bin/linux/ubuntu bionic-cran35/ InRelease: The following signatures couldn't be verified because the public key is not available: NO_PUBKEY 51716619E084DAB9
> E: The repository 'https://cloud.r-project.org/bin/linux/ubuntu bionic-cran35/ InRelease' is not signed.
> {code}
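> The fix is to import the new signing key before running `apt-get update`. A minimal sketch, assuming the standard `apt-key` workflow with Ubuntu's default keyserver; the key ID is taken from the NO_PUBKEY error above:
> {code}
> # Import the updated CRAN signing key (ID from the NO_PUBKEY error above),
> # then refresh the package index so the repository verifies again.
> apt-key adv --keyserver keyserver.ubuntu.com --recv-keys 51716619E084DAB9
> apt-get update
> {code}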
> Although they changed the key, the repository still reuses the `cran35` path for R 3.6.
> {code}
> Even though R has moved to version 3.6, for compatibility the sources.list entry still uses the cran3.5 designation.
> {code}
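> For reference, the sources.list entry therefore keeps the `cran35` path even for R 3.6. A sketch matching the repository URL in the error log above; the file name is hypothetical:
> {code}
> # /etc/apt/sources.list.d/cran.list (hypothetical location)
> deb https://cloud.r-project.org/bin/linux/ubuntu bionic-cran35/
> {code}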
> This issue aims to recover the docker image generation first. We will verify the R doc generation in a separate issue.