Posted to issues@spark.apache.org by "Yang Jie (Jira)" <ji...@apache.org> on 2023/04/18 09:40:00 UTC
[jira] [Updated] (SPARK-43173) `write jdbc` in `ClientE2ETestSuite` fails without `-Phive`
[ https://issues.apache.org/jira/browse/SPARK-43173?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]
Yang Jie updated SPARK-43173:
-----------------------------
Description:
Both
```
build/mvn clean install -Dtest=none -DwildcardSuites=org.apache.spark.sql.ClientE2ETestSuite
```
and
```
build/sbt "connect-client-jvm/testOnly *ClientE2ETestSuite"
```
fail when run with Java 11 or 17:
{code:java}
- write jdbc *** FAILED ***
io.grpc.StatusRuntimeException: INTERNAL: No suitable driver
at io.grpc.Status.asRuntimeException(Status.java:535)
at io.grpc.stub.ClientCalls$BlockingResponseStream.hasNext(ClientCalls.java:660)
at scala.collection.convert.Wrappers$JIteratorWrapper.hasNext(Wrappers.scala:45)
at scala.collection.Iterator.foreach(Iterator.scala:943)
at scala.collection.Iterator.foreach$(Iterator.scala:943)
at scala.collection.AbstractIterator.foreach(Iterator.scala:1431)
at org.apache.spark.sql.SparkSession.execute(SparkSession.scala:458)
at org.apache.spark.sql.DataFrameWriter.executeWriteOperation(DataFrameWriter.scala:257)
at org.apache.spark.sql.DataFrameWriter.saveInternal(DataFrameWriter.scala:221)
at org.apache.spark.sql.DataFrameWriter.save(DataFrameWriter.scala:218) {code}
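The `No suitable driver` message is the standard `java.sql.SQLException` raised when `java.sql.DriverManager` finds no registered driver that accepts the JDBC URL. Spark's `-Phive` profile pulls Apache Derby onto the class path (Hive's embedded metastore uses it), and the `write jdbc` test relies on that transitively supplied driver. Below is a minimal sketch of the failing pattern; the in-memory Derby URL, table name, and session setup are illustrative assumptions, not the suite's exact code:
{code:scala}
import java.util.Properties

import org.apache.spark.sql.SparkSession

// Illustrative repro sketch: round-trips a small DataFrame through JDBC.
// It succeeds only when a Derby driver jar is on the class path (e.g. via -Phive).
object WriteJdbcRepro {
  def main(args: Array[String]): Unit = {
    val spark = SparkSession.builder().master("local[2]").getOrCreate()
    val url = "jdbc:derby:memory:repro;create=true" // hypothetical in-memory URL
    spark.range(10).write.jdbc(url, "t1", new Properties())
    spark.read.jdbc(url, "t1", new Properties()).show()
    spark.stop()
  }
}
{code}
Without Derby on the class path, the server-side `DriverManager.getConnection(url)` throws `SQLException: No suitable driver`, which Spark Connect surfaces to the client as the `INTERNAL` gRPC status shown above.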
was:
Both
```
build/mvn clean install -Dtest=none -DwildcardSuites=org.apache.spark.sql.ClientE2ETestSuite
```
and
```
build/sbt "connect-client-jvm/testOnly *ClientE2ETestSuite"
```
fail:
{code:java}
- write jdbc *** FAILED ***
io.grpc.StatusRuntimeException: INTERNAL: No suitable driver
at io.grpc.Status.asRuntimeException(Status.java:535)
at io.grpc.stub.ClientCalls$BlockingResponseStream.hasNext(ClientCalls.java:660)
at scala.collection.convert.Wrappers$JIteratorWrapper.hasNext(Wrappers.scala:45)
at scala.collection.Iterator.foreach(Iterator.scala:943)
at scala.collection.Iterator.foreach$(Iterator.scala:943)
at scala.collection.AbstractIterator.foreach(Iterator.scala:1431)
at org.apache.spark.sql.SparkSession.execute(SparkSession.scala:458)
at org.apache.spark.sql.DataFrameWriter.executeWriteOperation(DataFrameWriter.scala:257)
at org.apache.spark.sql.DataFrameWriter.saveInternal(DataFrameWriter.scala:221)
at org.apache.spark.sql.DataFrameWriter.save(DataFrameWriter.scala:218) {code}
> `write jdbc` in `ClientE2ETestSuite` fails without `-Phive`
> ------------------------------------------------------------
>
> Key: SPARK-43173
> URL: https://issues.apache.org/jira/browse/SPARK-43173
> Project: Spark
> Issue Type: Improvement
> Components: Connect, Tests
> Affects Versions: 3.5.0
> Reporter: Yang Jie
> Priority: Minor
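Given the title, the natural remedies are either to add a Derby test dependency to the connect client module or to skip the test when the driver is missing. A sketch of the skip-guard approach follows; the suite name, helper, and driver check are assumptions for illustration, not the actual SPARK-43173 change:
{code:scala}
import org.scalatest.funsuite.AnyFunSuite

// Sketch: skip "write jdbc" when the Derby driver (normally supplied by a
// -Phive build) is absent, instead of failing with "No suitable driver".
class WriteJdbcGuardSuite extends AnyFunSuite {
  private def derbyDriverPresent: Boolean =
    scala.util.Try(Class.forName("org.apache.derby.jdbc.EmbeddedDriver")).isSuccess

  test("write jdbc") {
    assume(derbyDriverPresent, "Derby JDBC driver not on the class path; build with -Phive")
    // ... the original write/read round trip would run here ...
  }
}
{code}
ScalaTest's `assume` turns a failed precondition into a canceled (skipped) test rather than a failure, so builds without `-Phive` stay green.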
--
This message was sent by Atlassian Jira
(v8.20.10#820010)