Posted to reviews@spark.apache.org by "Midhunpottammal (via GitHub)" <gi...@apache.org> on 2024/03/05 09:52:06 UTC
Re: [PR] [SPARK-44259][CONNECT][TESTS] Make `connect-client-jvm` pass on Java 21 except `RemoteSparkSession`-based tests [spark]
Midhunpottammal commented on PR #41805:
URL: https://github.com/apache/spark/pull/41805#issuecomment-1978359967
> Merged to master for Apache Spark 3.5.0. Thank you, @LuciferYang , @yaooqinn , @HyukjinKwon .
With Java(TM) SE Runtime Environment (build 21.0.2+13-LTS-58), Spark 3.5.0 gives this error:
`Previous exception in task: sun.misc.Unsafe or java.nio.DirectByteBuffer.<init>(long, int) not available
org.apache.arrow.memory.util.MemoryUtil.directBuffer(MemoryUtil.java:174)
org.apache.arrow.memory.ArrowBuf.getDirectBuffer(ArrowBuf.java:229)
org.apache.arrow.memory.ArrowBuf.nioBuffer(ArrowBuf.java:224)
org.apache.arrow.vector.ipc.WriteChannel.write(WriteChannel.java:133)
org.apache.arrow.vector.ipc.message.MessageSerializer.writeBatchBuffers(MessageSerializer.java:303)
`
What is the solution for this? Can Arrow work with Apache Spark 3.5.0 on Java 21?
code:
`import time

import pandas as pd
from pyspark.sql import SparkSession

spark = SparkSession.builder \
    .appName("ArrowPySparkExample") \
    .getOrCreate()

# Enable Arrow-based columnar data transfers
# (Note: "io.netty.tryReflectionSetAccessible" is a JVM system property,
# not a Spark SQL conf -- it has to be passed with -D at JVM startup,
# so spark.conf.set("Dio.netty.tryReflectionSetAccessible", ...) has no effect.)
spark.conf.set("spark.sql.execution.arrow.pyspark.enabled", "true")

time.sleep(2000)

# Generate a pandas DataFrame
pdf = pd.DataFrame(["midhun"])

# Create a Spark DataFrame from a pandas DataFrame using Arrow
df = spark.createDataFrame(pdf)

# Convert the Spark DataFrame back to a pandas DataFrame using Arrow
result_pdf = df.select("*").toPandas()`
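The stack trace shows Arrow's `MemoryUtil.directBuffer` failing because, on newer JDKs, `sun.misc.Unsafe` and the `java.nio.DirectByteBuffer.<init>(long, int)` constructor are no longer reachable reflectively by default. A common workaround (a sketch only -- the flag names below are the standard JDK module-opens and Netty properties, and whether they are sufficient for this exact Spark 3.5.0 / Java 21 combination is an assumption) is to hand the options to the JVM when the session is created, since `spark.conf.set` after startup is too late. The helper name `arrow_java21_jvm_opts` is hypothetical:

```python
# Sketch: build SparkSession config entries that open java.nio to Arrow.
# Assumption: these flags address the reflective-access failure on Java 21;
# they are generic JVM/Netty options, not something Spark documents as the fix.
def arrow_java21_jvm_opts() -> dict:
    """Return config entries that must reach the JVM at launch time.

    Setting these through spark.conf.set() on a running session has no
    effect, because the JVM has already started.
    """
    jvm_opts = (
        "--add-opens=java.base/java.nio=ALL-UNNAMED "
        "-Dio.netty.tryReflectionSetAccessible=true"
    )
    return {
        "spark.driver.extraJavaOptions": jvm_opts,
        "spark.executor.extraJavaOptions": jvm_opts,
        "spark.sql.execution.arrow.pyspark.enabled": "true",
    }

# Hypothetical usage at builder time:
#   builder = SparkSession.builder.appName("ArrowPySparkExample")
#   for k, v in arrow_java21_jvm_opts().items():
#       builder = builder.config(k, v)
#   spark = builder.getOrCreate()
```

The same options can also be supplied on the command line via `spark-submit --driver-java-options ...` instead of session config.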
--
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
To unsubscribe, e-mail: reviews-unsubscribe@spark.apache.org
For queries about this service, please contact Infrastructure at:
users@infra.apache.org
---------------------------------------------------------------------
For additional commands, e-mail: reviews-help@spark.apache.org