Posted to dev@hive.apache.org by "Vihang Karajgaonkar (Jira)" <ji...@apache.org> on 2023/02/16 23:52:00 UTC

[jira] [Created] (HIVE-27087) Fix TestMiniSparkOnYarnCliDriver test failures on branch-3

Vihang Karajgaonkar created HIVE-27087:
------------------------------------------

             Summary: Fix TestMiniSparkOnYarnCliDriver test failures on branch-3
                 Key: HIVE-27087
                 URL: https://issues.apache.org/jira/browse/HIVE-27087
             Project: Hive
          Issue Type: Sub-task
            Reporter: Vihang Karajgaonkar
            Assignee: Vihang Karajgaonkar


The TestMiniSparkOnYarnCliDriver tests are failing with the error below:

[ERROR] 2023-02-16 14:13:08.991 [Driver] SparkContext - Error initializing SparkContext.
java.lang.RuntimeException: java.lang.NoSuchFieldException: DEFAULT_TINY_CACHE_SIZE
at org.apache.spark.network.util.NettyUtils.getPrivateStaticField(NettyUtils.java:131) ~[spark-network-common_2.11-2.3.0.jar:2.3.0]
at org.apache.spark.network.util.NettyUtils.createPooledByteBufAllocator(NettyUtils.java:118) ~[spark-network-common_2.11-2.3.0.jar:2.3.0]
at org.apache.spark.network.server.TransportServer.init(TransportServer.java:94) ~[spark-network-common_2.11-2.3.0.jar:2.3.0]
at org.apache.spark.network.server.TransportServer.<init>(TransportServer.java:73) ~[spark-network-common_2.11-2.3.0.jar:2.3.0]
at org.apache.spark.network.TransportContext.createServer(TransportContext.java:114) ~[spark-network-common_2.11-2.3.0.jar:2.3.0]
at org.apache.spark.rpc.netty.NettyRpcEnv.startServer(NettyRpcEnv.scala:119) ~[spark-core_2.11-2.3.0.jar:2.3.0]
at org.apache.spark.rpc.netty.NettyRpcEnvFactory$$anonfun$4.apply(NettyRpcEnv.scala:465) ~[spark-core_2.11-2.3.0.jar:2.3.0]
at org.apache.spark.rpc.netty.NettyRpcEnvFactory$$anonfun$4.apply(NettyRpcEnv.scala:464) ~[spark-core_2.11-2.3.0.jar:2.3.0]
at org.apache.spark.util.Utils$$anonfun$startServiceOnPort$1.apply$mcVI$sp(Utils.scala:2271) ~[spark-core_2.11-2.3.0.jar:2.3.0]
at scala.collection.immutable.Range.foreach$mVc$sp(Range.scala:160) ~[scala-library-2.11.8.jar:?]
at org.apache.spark.util.Utils$.startServiceOnPort(Utils.scala:2263) ~[spark-core_2.11-2.3.0.jar:2.3.0]
at org.apache.spark.rpc.netty.NettyRpcEnvFactory.create(NettyRpcEnv.scala:469) ~[spark-core_2.11-2.3.0.jar:2.3.0]
at org.apache.spark.rpc.RpcEnv$.create(RpcEnv.scala:57) ~[spark-core_2.11-2.3.0.jar:2.3.0]
at org.apache.spark.SparkEnv$.create(SparkEnv.scala:249) ~[spark-core_2.11-2.3.0.jar:2.3.0]
at org.apache.spark.SparkEnv$.createDriverEnv(SparkEnv.scala:175) ~[spark-core_2.11-2.3.0.jar:2.3.0]
at org.apache.spark.SparkContext.createSparkEnv(SparkContext.scala:256) [spark-core_2.11-2.3.0.jar:2.3.0]
at org.apache.spark.SparkContext.<init>(SparkContext.scala:423) [spark-core_2.11-2.3.0.jar:2.3.0]
at org.apache.spark.api.java.JavaSparkContext.<init>(JavaSparkContext.scala:58) [spark-core_2.11-2.3.0.jar:2.3.0]
at org.apache.hive.spark.client.RemoteDriver.<init>(RemoteDriver.java:161) [hive-exec-3.2.0-SNAPSHOT.jar:3.2.0-SNAPSHOT]
at org.apache.hive.spark.client.RemoteDriver.main(RemoteDriver.java:536) [hive-exec-3.2.0-SNAPSHOT.jar:3.2.0-SNAPSHOT]
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method) ~[?:1.8.0_322]
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62) ~[?:1.8.0_322]


The root cause of the problem is that we upgraded the netty library from 4.1.17.Final to 4.1.69.Final. The upgraded library no longer has the `DEFAULT_TINY_CACHE_SIZE` field (last present in 4.1.51.Final, [here|https://github.com/netty/netty/blob/netty-4.1.51.Final/buffer/src/main/java/io/netty/buffer/PooledByteBufAllocator.java#L46]); it was removed in 4.1.52.Final, but Spark 2.3.0's NettyUtils still reads it via reflection, which is why `getPrivateStaticField` throws `NoSuchFieldException`.
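
To make the failure mode concrete, here is a minimal standalone sketch (not Hive or Spark source; the class name `TinyCacheProbe` is illustrative) that performs the same reflective lookup that Spark 2.3.0's `NettyUtils.getPrivateStaticField` does. Run with Netty 4.1.51.Final or older on the classpath it prints the cache size; with 4.1.52.Final or newer it throws the `NoSuchFieldException` seen in the stack trace above:

{code:java}
import java.lang.reflect.Field;

public class TinyCacheProbe {
    public static void main(String[] args) throws Exception {
        // Load Netty's pooled allocator, as spark-network-common does.
        Class<?> allocator = Class.forName("io.netty.buffer.PooledByteBufAllocator");

        // Reflectively read the private static field. On Netty <= 4.1.51.Final
        // this succeeds; on 4.1.52.Final+ the field is gone and
        // getDeclaredField throws java.lang.NoSuchFieldException.
        Field field = allocator.getDeclaredField("DEFAULT_TINY_CACHE_SIZE");
        field.setAccessible(true);
        System.out.println("DEFAULT_TINY_CACHE_SIZE = " + field.get(null));
    }
}
{code}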



--
This message was sent by Atlassian Jira
(v8.20.10#820010)