Posted to issues@spark.apache.org by "Hyukjin Kwon (Jira)" <ji...@apache.org> on 2023/04/19 00:31:00 UTC
[jira] [Assigned] (SPARK-43173) `write jdbc` in `ClientE2ETestSuite` fails without `-Phive`
[ https://issues.apache.org/jira/browse/SPARK-43173?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]
Hyukjin Kwon reassigned SPARK-43173:
------------------------------------
Assignee: Yang Jie
> `write jdbc` in `ClientE2ETestSuite` fails without `-Phive`
> -----------------------------------------------------------
>
> Key: SPARK-43173
> URL: https://issues.apache.org/jira/browse/SPARK-43173
> Project: Spark
> Issue Type: Improvement
> Components: Connect, Tests
> Affects Versions: 3.5.0
> Reporter: Yang Jie
> Assignee: Yang Jie
> Priority: Minor
>
> Both
> ```
> build/mvn clean install -Dtest=none -DwildcardSuites=org.apache.spark.sql.ClientE2ETestSuite
> ```
> and
> ```
> build/sbt "connect-client-jvm/testOnly *ClientE2ETestSuite"
> ```
>
> fail when run with Java 11 or 17:
>
> {code:java}
> - write jdbc *** FAILED ***
> io.grpc.StatusRuntimeException: INTERNAL: No suitable driver
> at io.grpc.Status.asRuntimeException(Status.java:535)
> at io.grpc.stub.ClientCalls$BlockingResponseStream.hasNext(ClientCalls.java:660)
> at scala.collection.convert.Wrappers$JIteratorWrapper.hasNext(Wrappers.scala:45)
> at scala.collection.Iterator.foreach(Iterator.scala:943)
> at scala.collection.Iterator.foreach$(Iterator.scala:943)
> at scala.collection.AbstractIterator.foreach(Iterator.scala:1431)
> at org.apache.spark.sql.SparkSession.execute(SparkSession.scala:458)
> at org.apache.spark.sql.DataFrameWriter.executeWriteOperation(DataFrameWriter.scala:257)
> at org.apache.spark.sql.DataFrameWriter.saveInternal(DataFrameWriter.scala:221)
> at org.apache.spark.sql.DataFrameWriter.save(DataFrameWriter.scala:218) {code}
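For reference, the underlying `No suitable driver` message is raised by `java.sql.DriverManager` whenever no registered driver accepts the JDBC URL; the issue title suggests that without `-Phive` the server-side test classpath simply lacks the driver the `write jdbc` test needs. A minimal, Spark-free sketch reproducing the same root error (the `jdbc:nosuchdb:` URL scheme is deliberately bogus so no driver can match it):

```java
import java.sql.DriverManager;
import java.sql.SQLException;

public class NoSuitableDriverDemo {
    public static void main(String[] args) {
        try {
            // No driver on the classpath accepts this made-up URL scheme,
            // so DriverManager cannot resolve a driver for it.
            DriverManager.getConnection("jdbc:nosuchdb:mem:test");
        } catch (SQLException e) {
            // Prints: No suitable driver found for jdbc:nosuchdb:mem:test
            System.out.println(e.getMessage());
        }
    }
}
```

In the failing test the same `SQLException` is thrown on the Connect server and surfaced to the client wrapped in an `io.grpc.StatusRuntimeException` with status `INTERNAL`, which is why the trace above starts in gRPC code.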
--
This message was sent by Atlassian Jira
(v8.20.10#820010)