Posted to issues@spark.apache.org by "Venkata Sai Akhil Gudesa (Jira)" <ji...@apache.org> on 2023/04/19 16:08:00 UTC
[jira] [Updated] (SPARK-43198) Fix "Could not initialise class ammonite..." error when using filter
[ https://issues.apache.org/jira/browse/SPARK-43198?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]
Venkata Sai Akhil Gudesa updated SPARK-43198:
---------------------------------------------
Description:
When
{code:java}
spark.range(10).filter(n => n % 2 == 0).collectAsList(){code}
is run in the ammonite REPL (Spark Connect), the following error is thrown:
{noformat}
io.grpc.StatusRuntimeException: UNKNOWN: ammonite/repl/ReplBridge$
io.grpc.Status.asRuntimeException(Status.java:535)
io.grpc.stub.ClientCalls$BlockingResponseStream.hasNext(ClientCalls.java:660)
org.apache.spark.sql.connect.client.SparkResult.org$apache$spark$sql$connect$client$SparkResult$$processResponses(SparkResult.scala:62)
org.apache.spark.sql.connect.client.SparkResult.length(SparkResult.scala:114)
org.apache.spark.sql.connect.client.SparkResult.toArray(SparkResult.scala:131)
org.apache.spark.sql.Dataset.$anonfun$collect$1(Dataset.scala:2687)
org.apache.spark.sql.Dataset.withResult(Dataset.scala:3088)
org.apache.spark.sql.Dataset.collect(Dataset.scala:2686)
org.apache.spark.sql.Dataset.collectAsList(Dataset.scala:2700)
ammonite.$sess.cmd0$.<init>(cmd0.sc:1)
ammonite.$sess.cmd0$.<clinit>(cmd0.sc){noformat}
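For context, the failure mode here is that a lambda defined in a REPL command is compiled into the command's wrapper class, so its serialized form references REPL-only classes (e.g. {{ammonite.$sess.cmd0$}} and, transitively, {{ammonite/repl/ReplBridge$}}) that the Connect server cannot load. The sketch below illustrates the underlying mechanism in plain Scala, outside Spark: serializing a lambda that touches enclosing state embeds the enclosing class name in the byte stream. All names here ({{ClosureCapture}}, {{serializedNames}}, {{threshold}}) are illustrative only and do not appear in Spark or ammonite.
{code:java}
import java.io.{ByteArrayOutputStream, ObjectOutputStream}

object ClosureCapture {
  // Plays the role of state defined in an earlier REPL command.
  val threshold = 2

  // This lambda touches `threshold`, so the compiled lambda's capturing class
  // is the enclosing ClosureCapture$ module -- analogous to an ammonite lambda
  // whose capturing class is ammonite.$sess.cmdN$.
  val capturing: Long => Boolean = n => n % threshold == 0

  // Serialize a closure and return its raw bytes as a string, so we can
  // inspect which class names the serialized form mentions.
  def serializedNames(f: AnyRef): String = {
    val bos = new ByteArrayOutputStream()
    val oos = new ObjectOutputStream(bos)
    oos.writeObject(f)
    oos.close()
    new String(bos.toByteArray, "ISO-8859-1")
  }
}
{code}
The serialized bytes contain the string "ClosureCapture", which is exactly why a server that deserializes the closure must be able to resolve the REPL's wrapper classes, or fail with the {{NoClassDefFoundError}}-style error above.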
was:
When `spark.range(10).filter(n => n % 2 == 0).collectAsList()` is run in the ammonite REPL (Spark Connect), the following error is thrown:
```
io.grpc.StatusRuntimeException: UNKNOWN: ammonite/repl/ReplBridge$
io.grpc.Status.asRuntimeException(Status.java:535)
io.grpc.stub.ClientCalls$BlockingResponseStream.hasNext(ClientCalls.java:660)
org.apache.spark.sql.connect.client.SparkResult.org$apache$spark$sql$connect$client$SparkResult$$processResponses(SparkResult.scala:62)
org.apache.spark.sql.connect.client.SparkResult.length(SparkResult.scala:114)
org.apache.spark.sql.connect.client.SparkResult.toArray(SparkResult.scala:131)
org.apache.spark.sql.Dataset.$anonfun$collect$1(Dataset.scala:2687)
org.apache.spark.sql.Dataset.withResult(Dataset.scala:3088)
org.apache.spark.sql.Dataset.collect(Dataset.scala:2686)
org.apache.spark.sql.Dataset.collectAsList(Dataset.scala:2700)
ammonite.$sess.cmd0$.<init>(cmd0.sc:1)
ammonite.$sess.cmd0$.<clinit>(cmd0.sc)
```
> Fix "Could not initialise class ammonite..." error when using filter
> --------------------------------------------------------------------
>
> Key: SPARK-43198
> URL: https://issues.apache.org/jira/browse/SPARK-43198
> Project: Spark
> Issue Type: Bug
> Components: Connect
> Affects Versions: 3.5.0
> Reporter: Venkata Sai Akhil Gudesa
> Priority: Major
>
> When
> {code:java}
> spark.range(10).filter(n => n % 2 == 0).collectAsList(){code}
> is run in the ammonite REPL (Spark Connect), the following error is thrown:
> {noformat}
> io.grpc.StatusRuntimeException: UNKNOWN: ammonite/repl/ReplBridge$
> io.grpc.Status.asRuntimeException(Status.java:535)
> io.grpc.stub.ClientCalls$BlockingResponseStream.hasNext(ClientCalls.java:660)
> org.apache.spark.sql.connect.client.SparkResult.org$apache$spark$sql$connect$client$SparkResult$$processResponses(SparkResult.scala:62)
> org.apache.spark.sql.connect.client.SparkResult.length(SparkResult.scala:114)
> org.apache.spark.sql.connect.client.SparkResult.toArray(SparkResult.scala:131)
> org.apache.spark.sql.Dataset.$anonfun$collect$1(Dataset.scala:2687)
> org.apache.spark.sql.Dataset.withResult(Dataset.scala:3088)
> org.apache.spark.sql.Dataset.collect(Dataset.scala:2686)
> org.apache.spark.sql.Dataset.collectAsList(Dataset.scala:2700)
> ammonite.$sess.cmd0$.<init>(cmd0.sc:1)
> ammonite.$sess.cmd0$.<clinit>(cmd0.sc){noformat}
--
This message was sent by Atlassian Jira
(v8.20.10#820010)