Posted to issues@spark.apache.org by "Yuming Wang (JIRA)" <ji...@apache.org> on 2019/04/16 15:03:00 UTC
[jira] [Commented] (SPARK-27475) dev/deps/spark-deps-hadoop-3.2 is incorrect
[ https://issues.apache.org/jira/browse/SPARK-27475?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16819124#comment-16819124 ]
Yuming Wang commented on SPARK-27475:
-------------------------------------
[~srowen]
One way to fix it is to change [this line|https://github.com/apache/spark/blob/9c0af746e5dda9f05e64f0a16a3dbe11a23024de/dev/test-dependencies.sh#L71] to {{$MVN $HADOOP2_MODULE_PROFILES -P$HADOOP_PROFILE clean install -DskipTests -q}}. But that is very expensive; I'm not sure whether there is another way.
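A minimal sketch of what the proposed change would look like in dev/test-dependencies.sh. The profile values below are placeholders for illustration, not the script's actual settings; only the {{clean install -DskipTests -q}} invocation comes from the suggestion above:

```shell
#!/usr/bin/env bash
# Placeholder values standing in for the variables the script defines elsewhere.
MVN="build/mvn"
HADOOP2_MODULE_PROFILES="-Phive-thriftserver -Pyarn -Phive"   # assumed example
HADOOP_PROFILE="hadoop-3.2"

# Proposed replacement for the dependency-resolution step: run a full
# "clean install" under the active Hadoop profile so the resolved jars
# match that profile's managed versions. Expensive, as noted above.
CMD="$MVN $HADOOP2_MODULE_PROFILES -P$HADOOP_PROFILE clean install -DskipTests -q"
echo "$CMD"
```

The {{-DskipTests}} flag compiles the test sources but skips running them, and {{-q}} keeps Maven's output quiet, so the cost is dominated by building and installing every module.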
> dev/deps/spark-deps-hadoop-3.2 is incorrect
> -------------------------------------------
>
> Key: SPARK-27475
> URL: https://issues.apache.org/jira/browse/SPARK-27475
> Project: Spark
> Issue Type: Bug
> Components: Build
> Affects Versions: 3.0.0
> Reporter: Yuming Wang
> Priority: Major
>
> parquet-hadoop-bundle-1.6.0.jar should be parquet-hadoop-bundle-1.8.1.jar.
--
This message was sent by Atlassian JIRA
(v7.6.3#76005)