Posted to issues@spark.apache.org by "Shixiong Zhu (Jira)" <ji...@apache.org> on 2023/01/04 17:12:00 UTC
[jira] [Created] (SPARK-41885) --packages may not work on Windows 11
Shixiong Zhu created SPARK-41885:
------------------------------------
Summary: --packages may not work on Windows 11
Key: SPARK-41885
URL: https://issues.apache.org/jira/browse/SPARK-41885
Project: Spark
Issue Type: Bug
Components: Spark Core
Affects Versions: 3.2.1
Environment: Hadoop 2.7 on Windows 11
Reporter: Shixiong Zhu
Gastón Ortiz reported an issue when using Spark 3.2.1 and Hadoop 2.7 on Windows 11. See [https://github.com/delta-io/delta/issues/1059]
It looks like the executor cannot fetch the jar files. See the critical part of the stack trace below (the full stack trace is in [https://github.com/delta-io/delta/issues/1059]):
{code:java}
org.apache.spark.rpc.netty.NettyRpcEnv.openChannel(NettyRpcEnv.scala:366)
	at org.apache.spark.util.Utils$.doFetchFile(Utils.scala:762)
	at org.apache.spark.util.Utils$.fetchFile(Utils.scala:549)
	at org.apache.spark.executor.Executor.$anonfun$updateDependencies$13(Executor.scala:962)
	at org.apache.spark.executor.Executor.$anonfun$updateDependencies$13$adapted(Executor.scala:954)
	at scala.collection.TraversableLike$WithFilter.$anonfun$foreach$1(TraversableLike.scala:985)
	at scala.collection.mutable.HashMap.$anonfun$foreach$1(HashMap.scala:149)
	at scala.collection.mutable.HashTable.foreachEntry(HashTable.scala:237)
	at scala.collection.mutable.HashTable.foreachEntry$(HashTable.scala:230)
	at scala.collection.mutable.HashMap.foreachEntry(HashMap.scala:44)
	at scala.collection.mutable.HashMap.foreach(HashMap.scala:149)
	at scala.collection.TraversableLike$WithFilter.foreach(TraversableLike.scala:984)
	at org.apache.spark.executor.Executor.org$apache$spark$executor$Executor$$updateDependencies(Executor.scala:954)
	at org.apache.spark.executor.Executor.<init>(Executor.scala:247)
{code}
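One plausible (but unconfirmed — this is only a hypothesis, not something the stack trace proves) failure mode on Windows is URI parsing of a drive-letter path: the executor resolves the fetch URL via URI parsing in NettyRpcEnv.openChannel, and a path like `C:\...` that reaches it un-normalized can have its drive letter misread as a URI scheme. The sketch below illustrates the parsing pitfall using Python's urllib as an analogy to Java's URI handling; the jar path shown is hypothetical.
{code:python}
# Hypothetical illustration, NOT taken from the report: a Windows
# drive-letter path misparsed as a URI. The drive letter "C" looks
# like a URI scheme, so generic URI parsing swallows it.
from urllib.parse import urlparse

windows_path = r"C:\spark\jars\kafka-clients-2.8.1.jar"  # hypothetical path
parsed = urlparse(windows_path)

print(parsed.scheme)  # -> "c"   (the drive letter, misread as a scheme)
print(parsed.path)    # -> "\spark\jars\kafka-clients-2.8.1.jar"
{code}
If something similar happens in the Java/Scala code path, the executor would try to open a channel for a scheme "c" URL instead of a local file, which would explain the fetch failure.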
This is not a Delta Lake issue, as it can also be reproduced by running `pyspark --packages org.apache.kafka:kafka-clients:2.8.1`.
I don't have a Windows 11 environment to debug in, so I helped Gastón Ortiz create this ticket. It would be great if anyone who has a Windows 11 environment could help investigate.
--
This message was sent by Atlassian Jira
(v8.20.10#820010)