Posted to reviews@spark.apache.org by GitBox <gi...@apache.org> on 2020/10/15 17:38:26 UTC

[GitHub] [spark] sunchao commented on pull request #30022: [SPARK-33090][BUILD][test-hadoop2.7] Upgrade Google Guava to 29.0-jre

sunchao commented on pull request #30022:
URL: https://github.com/apache/spark/pull/30022#issuecomment-709482519


   I'm not sure whether this works well with Hive 2.3.x either, since it is still on Guava 14.0.1.
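   
   A quick way to sanity-check the Guava side (just a sketch, assuming a spark-shell session with the usual Spark + Hive jars on the classpath; nothing here is from the PR itself) is to ask the JVM which jar the Guava classes are actually loaded from:
   
   ```scala
   // Sketch: report which jar Guava is loaded from, to spot conflicts between
   // Spark's Guava 29.0-jre and the Guava 14.0.1 that Hive 2.3.x still pins.
   val guavaClass = Class.forName("com.google.common.base.Preconditions")
   val source = Option(guavaClass.getProtectionDomain.getCodeSource)
     .map(_.getLocation.toString)
     .getOrElse("(code source unavailable)")
   println(s"Guava loaded from: $source")
   ```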
   
   > IMO, if Spark 3 with Hadoop 3.2 can work well in a Hadoop cluster (2.6/2.7/2.8, etc.), it's OK to just use the Hadoop 3.2 client.
   
   Yes, it's expected to work. There is an issue, [HDFS-15191](https://issues.apache.org/jira/browse/HDFS-15191), which potentially breaks compatibility between the Hadoop 3.2.1 client and 2.x servers, but it is fixed in 3.2.2 (which Spark is probably going to use).
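   
   On the Hadoop side, a quick check of which client version a given Spark build actually ships (again only a sketch, assuming spark-shell with the bundled Hadoop client jars) is:
   
   ```scala
   // Sketch: print the bundled Hadoop client version, to check whether the
   // build already includes the HDFS-15191 fix (first released in 3.2.2).
   import org.apache.hadoop.util.VersionInfo
   println(s"Hadoop client version: ${VersionInfo.getVersion}")
   println(s"Built from revision:   ${VersionInfo.getRevision}")
   ```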
   

