Posted to issues@spark.apache.org by "Dongjoon Hyun (Jira)" <ji...@apache.org> on 2019/10/23 07:16:00 UTC
[jira] [Resolved] (SPARK-29546) Recover jersey-guava test dependency in docker-integration-tests
[ https://issues.apache.org/jira/browse/SPARK-29546?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]
Dongjoon Hyun resolved SPARK-29546.
-----------------------------------
Fix Version/s: 3.0.0
Resolution: Fixed
Issue resolved by pull request 26203
[https://github.com/apache/spark/pull/26203]
> Recover jersey-guava test dependency in docker-integration-tests
> ----------------------------------------------------------------
>
> Key: SPARK-29546
> URL: https://issues.apache.org/jira/browse/SPARK-29546
> Project: Spark
> Issue Type: Sub-task
> Components: Tests
> Affects Versions: 3.0.0
> Reporter: Dongjoon Hyun
> Assignee: Dongjoon Hyun
> Priority: Major
> Fix For: 3.0.0
>
>
> After SPARK-28737 upgraded `Jersey` to 2.29, `docker-integration-tests` broke because `com.spotify.docker-client` depends on `jersey-guava`. The latest `com.spotify.docker-client` release still carries that dependency:
> - https://mvnrepository.com/artifact/com.spotify/docker-client/5.0.2
> -> https://mvnrepository.com/artifact/org.glassfish.jersey.core/jersey-client/2.19
> -> https://mvnrepository.com/artifact/org.glassfish.jersey.core/jersey-common/2.19
> -> https://mvnrepository.com/artifact/org.glassfish.jersey.bundles.repackaged/jersey-guava/2.19
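> One way to recover the dependency (a sketch only; the actual change is in the linked pull request) is to declare `jersey-guava` explicitly with test scope in the `docker-integration-tests` module, so it remains on the test classpath even though Jersey 2.29 no longer pulls it in transitively. The file location and version below are illustrative assumptions, not the exact patch:
> {code:xml}
> <!-- external/docker-integration-tests/pom.xml (hypothetical location) -->
> <dependency>
>   <groupId>org.glassfish.jersey.bundles.repackaged</groupId>
>   <artifactId>jersey-guava</artifactId>
>   <!-- assumed version: jersey-guava was last published for the 2.25.x line -->
>   <version>2.25.1</version>
>   <scope>test</scope>
> </dependency>
> {code}
> Test scope keeps the repackaged Guava bundle out of Spark's runtime and compile classpaths, limiting it to the integration-test run shown below.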
> **AFTER**
> {code}
> build/mvn -Pdocker-integration-tests -pl :spark-docker-integration-tests_2.12 -Dtest=none -DwildcardSuites=org.apache.spark.sql.jdbc.PostgresIntegrationSuite test
> Tests: succeeded 6, failed 0, canceled 0, ignored 0, pending 0
> All tests passed.
> {code}
--
This message was sent by Atlassian Jira
(v8.3.4#803005)
---------------------------------------------------------------------
To unsubscribe, e-mail: issues-unsubscribe@spark.apache.org
For additional commands, e-mail: issues-help@spark.apache.org