Posted to issues@spark.apache.org by "Sean Owen (JIRA)" <ji...@apache.org> on 2017/08/20 06:55:02 UTC
[jira] [Assigned] (SPARK-21790) Running Docker-based Integration Test Suites throws exception
[ https://issues.apache.org/jira/browse/SPARK-21790?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]
Sean Owen reassigned SPARK-21790:
---------------------------------
Assignee: Yuming Wang
> Running Docker-based Integration Test Suites throws exception
> -------------------------------------------------------------
>
> Key: SPARK-21790
> URL: https://issues.apache.org/jira/browse/SPARK-21790
> Project: Spark
> Issue Type: Test
> Components: Tests
> Affects Versions: 2.3.0
> Reporter: Yuming Wang
> Assignee: Yuming Wang
>
> {{mvn test -Pdocker-integration-tests -pl :spark-docker-integration-tests_2.11}} throws an exception:
> {noformat}
> [error] missing or invalid dependency detected while loading class file 'SQLTestUtils.class'.
> [error] Could not access type PlanTest in package org.apache.spark.sql.catalyst.plans,
> [error] because it (or its dependencies) are missing. Check your build definition for
> [error] missing or conflicting dependencies. (Re-run with `-Ylog-classpath` to see the problematic classpath.)
> [error] A full rebuild may help if 'SQLTestUtils.class' was compiled against an incompatible version of org.apache.spark.sql.catalyst.plans.
> [error] /root/create/spark/external/docker-integration-tests/src/test/scala/org/apache/spark/sql/jdbc/OracleIntegrationSuite.scala:258: value metadata is not a member of org.apache.spark.sql.execution.SparkPlan
> [error] val metadata = df.queryExecution.sparkPlan.metadata
> [error] ^
> [error] two errors found
> [error] Compile failed at Aug 19, 2017 11:00:57 PM [0.357s]
> {noformat}
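> The compiler output above suggests re-running with {{-Ylog-classpath}} to inspect the problematic classpath. One way to pass that flag through Maven is the {{addScalacArgs}} user property of scala-maven-plugin (a hedged sketch; whether this property is wired up in this build is an assumption, not confirmed by this report):
> {noformat}
> # Re-run the failing module with extra scalac diagnostics (assumes scala-maven-plugin honors addScalacArgs)
> mvn test -Pdocker-integration-tests -pl :spark-docker-integration-tests_2.11 -DaddScalacArgs=-Ylog-classpath
> {noformat}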
--
This message was sent by Atlassian JIRA
(v6.4.14#64029)
---------------------------------------------------------------------
To unsubscribe, e-mail: issues-unsubscribe@spark.apache.org
For additional commands, e-mail: issues-help@spark.apache.org