Posted to dev@spark.apache.org by Jacek Laskowski <ja...@japila.pl> on 2015/12/01 10:50:05 UTC

Re: Bringing up JDBC Tests to trunk

On Mon, Nov 30, 2015 at 10:53 PM, Josh Rosen <jo...@databricks.com> wrote:

> In SBT, these wind up on the Docker JDBC tests' classpath as a transitive
> dependency of the `spark-sql` test JAR. However, what we should be doing is
> adding them as explicit test dependencies of the `docker-integration-tests`
> subproject, since Maven handles transitive test JAR dependencies differently
> than SBT (see
> https://github.com/apache/spark/pull/9876#issuecomment-158593498 for some
> discussion). If you choose to make that fix as part of your PR, be sure to
> move the version handling to the root POM's <dependencyManagement> section
> so that the versions in both modules stay in sync. We might also be able to
> simply move the JDBC driver dependencies to docker-integration-tests' POM
> if it turns out that they're not used anywhere else (that's my hunch).

Hi Josh,

Could you elaborate a little more on what exactly is required and how
to verify the requested changes (or at least what is currently
failing, so that once it passes we know it's fixed)? The little magic
word "sbt" got me thinking I could help here and there. :-)
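
If I understand correctly, the fix would look roughly like this (just
a sketch on my part -- the driver coordinates and versions below are
examples, not necessarily the exact ones Spark uses). In the root POM,
pin the versions once under <dependencyManagement>:

    <dependencyManagement>
      <dependencies>
        <!-- example JDBC drivers; versions managed here only -->
        <dependency>
          <groupId>mysql</groupId>
          <artifactId>mysql-connector-java</artifactId>
          <version>5.1.37</version>
        </dependency>
        <dependency>
          <groupId>org.postgresql</groupId>
          <artifactId>postgresql</artifactId>
          <version>9.4-1201-jdbc41</version>
        </dependency>
      </dependencies>
    </dependencyManagement>

and then declare them explicitly (version-less, test scope) in the
docker-integration-tests POM, so that neither build relies on them
leaking in transitively from the spark-sql test JAR:

    <!-- explicit test dependencies; versions inherited from root POM -->
    <dependency>
      <groupId>mysql</groupId>
      <artifactId>mysql-connector-java</artifactId>
      <scope>test</scope>
    </dependency>
    <dependency>
      <groupId>org.postgresql</groupId>
      <artifactId>postgresql</artifactId>
      <scope>test</scope>
    </dependency>

Is that the idea?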

If there's a JIRA task for it, let me know. Thanks!
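
As for verification, I'd guess it boils down to running the docker
JDBC suites under both build systems and seeing them resolve the
drivers the same way, something along these lines (the module path and
project name are my guesses):

    # Maven
    build/mvn -pl external/docker-integration-tests test

    # SBT
    build/sbt docker-integration-tests/test

Please correct me if that's not the right way to exercise it.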

Jacek
