Posted to commits@spark.apache.org by gu...@apache.org on 2022/07/22 00:06:07 UTC
[spark] branch branch-3.2 updated: Revert "[SPARK-39831][BUILD] Fix R dependencies installation failure"
This is an automated email from the ASF dual-hosted git repository.
gurwls223 pushed a commit to branch branch-3.2
in repository https://gitbox.apache.org/repos/asf/spark.git
The following commit(s) were added to refs/heads/branch-3.2 by this push:
new f344bf97265 Revert "[SPARK-39831][BUILD] Fix R dependencies installation failure"
f344bf97265 is described below
commit f344bf97265306b50ab79e465535054e2d582877
Author: Hyukjin Kwon <gu...@apache.org>
AuthorDate: Fri Jul 22 09:05:53 2022 +0900
Revert "[SPARK-39831][BUILD] Fix R dependencies installation failure"
This reverts commit 29290306749f75eb96f51fc5b61114e9b8a3bf53.
---
.github/workflows/build_and_test.yml | 4 +---
1 file changed, 1 insertion(+), 3 deletions(-)
diff --git a/.github/workflows/build_and_test.yml b/.github/workflows/build_and_test.yml
index 1c8ff1b1d10..6aaaa8424dc 100644
--- a/.github/workflows/build_and_test.yml
+++ b/.github/workflows/build_and_test.yml
@@ -378,9 +378,7 @@ jobs:
python3.9 -m pip install flake8 pydata_sphinx_theme 'mypy==0.910' numpydoc 'jinja2<3.0.0' 'black==21.5b2'
- name: Install R linter dependencies and SparkR
run: |
- apt-get install -y libcurl4-openssl-dev libgit2-dev libssl-dev libxml2-dev \
- libfontconfig1-dev libharfbuzz-dev libfribidi-dev libfreetype6-dev libpng-dev \
- libtiff5-dev libjpeg-dev
+ apt-get install -y libcurl4-openssl-dev libgit2-dev libssl-dev libxml2-dev
Rscript -e "install.packages(c('devtools'), repos='https://cloud.r-project.org/')"
Rscript -e "devtools::install_version('lintr', version='2.0.1', repos='https://cloud.r-project.org')"
./R/install-dev.sh
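
For context, the reverted change had expanded the apt package list for the R linter job; after this revert, the workflow step returns to its earlier, shorter form. A sketch of the resulting step in `.github/workflows/build_and_test.yml` (reconstructed from the diff context above; surrounding step names are as shown in the hunk, everything else in the job is elided):

```yaml
# Reconstructed post-revert state of the R linter step (sketch, not the full job).
- name: Install R linter dependencies and SparkR
  run: |
    apt-get install -y libcurl4-openssl-dev libgit2-dev libssl-dev libxml2-dev
    Rscript -e "install.packages(c('devtools'), repos='https://cloud.r-project.org/')"
    Rscript -e "devtools::install_version('lintr', version='2.0.1', repos='https://cloud.r-project.org')"
    ./R/install-dev.sh
```

The revert removes the extra system libraries (libfontconfig1-dev, libharfbuzz-dev, libfribidi-dev, libfreetype6-dev, libpng-dev, libtiff5-dev, libjpeg-dev) that the original SPARK-39831 change had added to unblock R dependency installation.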
---------------------------------------------------------------------
To unsubscribe, e-mail: commits-unsubscribe@spark.apache.org
For additional commands, e-mail: commits-help@spark.apache.org