Posted to issues@spark.apache.org by "Apache Spark (Jira)" <ji...@apache.org> on 2022/12/16 04:22:00 UTC

[jira] [Assigned] (SPARK-41542) Run Coverage report for Spark Connect

     [ https://issues.apache.org/jira/browse/SPARK-41542?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]

Apache Spark reassigned SPARK-41542:
------------------------------------

    Assignee:     (was: Apache Spark)

> Run Coverage report for Spark Connect
> -------------------------------------
>
>                 Key: SPARK-41542
>                 URL: https://issues.apache.org/jira/browse/SPARK-41542
>             Project: Spark
>          Issue Type: Sub-task
>          Components: Connect, Tests
>    Affects Versions: 3.4.0
>            Reporter: Hyukjin Kwon
>            Priority: Major
>
> Spark Connect does not report test coverage yet, and our scheduled coverage job fails as below:
> {code:java}
> ======================================================================
> ERROR [0.000s]: setUpClass (pyspark.sql.tests.connect.test_connect_function.SparkConnectFunctionTests)
> ----------------------------------------------------------------------
> Traceback (most recent call last):
>   File "/__w/spark/spark/python/pyspark/sql/tests/connect/test_connect_function.py", line 37, in setUpClass
>     ReusedPySparkTestCase.setUpClass()
>   File "/__w/spark/spark/python/pyspark/testing/utils.py", line 135, in setUpClass
>     cls.sc = SparkContext("local[4]", cls.__name__, conf=cls.conf())
>   File "/__w/spark/spark/python/pyspark/context.py", line 196, in __init__
>     self._do_init(
>   File "/__w/spark/spark/python/pyspark/context.py", line 283, in _do_init
>     self._jsc = jsc or self._initialize_context(self._conf._jconf)
>   File "/__w/spark/spark/python/pyspark/context.py", line 413, in _initialize_context
>     return self._jvm.JavaSparkContext(jconf)
>   File "/__w/spark/spark/python/lib/py4j-0.10.9.7-src.zip/py4j/java_gateway.py", line 1587, in __call__
>     return_value = get_return_value(
>   File "/__w/spark/spark/python/lib/py4j-0.10.9.7-src.zip/py4j/protocol.py", line 326, in get_return_value
>     raise Py4JJavaError(
> py4j.protocol.Py4JJavaError: An error occurred while calling None.org.apache.spark.api.java.JavaSparkContext.
> : java.io.IOException: Failed to bind to address 0.0.0.0/0.0.0.0:15002
> 	at org.sparkproject.connect.grpc.io.grpc.netty.NettyServer.start(NettyServer.java:328)
> 	at org.sparkproject.connect.grpc.io.grpc.internal.ServerImpl.start(ServerImpl.java:183)
> 	at org.sparkproject.connect.grpc.io.grpc.internal.ServerImpl.start(ServerImpl.java:92)
> 	at org.apache.spark.sql.connect.service.SparkConnectService$.startGRPCService(SparkConnectService.scala:217)
> 	at org.apache.spark.sql.connect.service.SparkConnectService$.start(SparkConnectService.scala:222)
> 	at org.apache.spark.sql.connect.SparkConnectPlugin$$anon$1.init(SparkConnectPlugin.scala:48)
> 	at org.apache.spark.internal.plugin.DriverPluginContainer.$anonfun$driverPlugins$1(PluginContainer.scala:53)
> 	at scala.collection.TraversableLike.$anonfun$flatMap$1(TraversableLike.scala:293)
> 	at scala.collection.mutable.ResizableArray.foreach(ResizableArray.scala:62)
> 	at scala.collection.mutable.ResizableArray.foreach$(ResizableArray.scala:55)
> 	at scala.collection.mutable.ArrayBuffer.foreach(ArrayBuffer.scala:49)
> 	at scala.collection.TraversableLike.flatMap(TraversableLike.scala:293)
> 	at scala.collection.TraversableLike.flatMap$(TraversableLike.scala:290)
> 	at scala.collection.AbstractTraversable.flatMap(Traversable.scala:108)
> 	at org.apache.spark.internal.plugin.DriverPluginContainer.<init>(PluginContainer.scala:46)
> 	at org.apache.spark.internal.plugin.PluginContainer$.apply(PluginContainer.scala:210)
> 	at org.apache.spark.internal.plugin.PluginContainer$.apply(PluginContainer.scala:193)
> 	at org.apache.spark.SparkContext.<init>(SparkContext.scala:565)
> 	at org.apache.spark.api.java.JavaSparkContext.<init>(JavaSparkContext.scala:58)
> 	at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
> 	at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62)
> 	at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
> 	at java.lang.reflect.Constructor.newInstance(Constructor.java:423)
> 	at py4j.reflection.MethodInvoker.invoke(MethodInvoker.java:247)
> 	at py4j.reflection.ReflectionEngine.invoke(ReflectionEngine.java:374)
> 	at py4j.Gateway.invoke(Gateway.java:238)
> 	at py4j.commands.ConstructorCommand.invokeConstructor(ConstructorCommand.java:80)
> 	at py4j.commands.ConstructorCommand.execute(ConstructorCommand.java:69)
> 	at py4j.ClientServerConnection.waitForCommands(ClientServerConnection.java:182)
> 	at py4j.ClientServerConnection.run(ClientServerConnection.java:106)
> 	at java.lang.Thread.run(Thread.java:750)
> Caused by: io.netty.channel.unix.Errors$NativeIoException: bind(..) failed: Address already in use
> {code}
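>
> The bind failure happens because the Spark Connect plugin starts its gRPC server on the fixed default port 15002, so a second driver (or a leftover server from a previous test run) cannot start. A minimal sketch of a workaround, assuming the {{spark.connect.grpc.binding.port}} configuration; the {{find_free_port}} helper below is hypothetical, for illustration only:
> {code:python}
> import socket
>
> from pyspark import SparkConf, SparkContext
>
>
> def find_free_port() -> int:
>     # Hypothetical helper: ask the OS for a free ephemeral port.
>     with socket.socket(socket.AF_INET, socket.SOCK_STREAM) as s:
>         s.bind(("", 0))
>         return s.getsockname()[1]
>
>
> conf = (
>     SparkConf()
>     .set("spark.plugins", "org.apache.spark.sql.connect.SparkConnectPlugin")
>     # Avoid "Address already in use" by not relying on the default port 15002.
>     .set("spark.connect.grpc.binding.port", str(find_free_port()))
> )
> sc = SparkContext("local[4]", "connect-coverage-test", conf=conf)
> {code}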
>  
> [https://github.com/apache/spark/actions/runs/3663716955/jobs/6193686439]
>  
> We should enable coverage reporting for Spark Connect tests, as sketched below.
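>
> A minimal sketch, assuming the standard coverage.py API rather than the project's actual CI wiring; the module and source names come from the log above:
> {code:python}
> import unittest
>
> import coverage
>
> # Measure coverage for the Spark Connect Python package while running
> # the test module that fails in the scheduled job.
> cov = coverage.Coverage(source=["pyspark.sql.connect"])
> cov.start()
>
> suite = unittest.defaultTestLoader.loadTestsFromName(
>     "pyspark.sql.tests.connect.test_connect_function"
> )
> unittest.TextTestRunner(verbosity=2).run(suite)
>
> cov.stop()
> cov.save()
> cov.report()
> {code}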



--
This message was sent by Atlassian Jira
(v8.20.10#820010)

---------------------------------------------------------------------
To unsubscribe, e-mail: issues-unsubscribe@spark.apache.org
For additional commands, e-mail: issues-help@spark.apache.org