Posted to reviews@spark.apache.org by GitBox <gi...@apache.org> on 2021/02/13 10:04:00 UTC

[GitHub] [spark] Nozziel commented on a change in pull request #31551: [SPARK-34430][Doc] Update index.md with a pyspark hint to avoid java.nio.DirectByteBuffer.(long, int) not available

Nozziel commented on a change in pull request #31551:
URL: https://github.com/apache/spark/pull/31551#discussion_r575648280



##########
File path: docs/index.md
##########
@@ -53,6 +53,11 @@ uses Scala {{site.SCALA_BINARY_VERSION}}. You will need to use a compatible Scal
 For Python 3.9, Arrow optimization and pandas UDFs might not work due to the supported Python versions in Apache Arrow. Please refer to the latest [Python Compatibility](https://arrow.apache.org/docs/python/install.html#python-compatibility) page.
 For Java 11, `-Dio.netty.tryReflectionSetAccessible=true` is required additionally for Apache Arrow library. This prevents `java.lang.UnsupportedOperationException: sun.misc.Unsafe or java.nio.DirectByteBuffer.(long, int) not available` when Apache Arrow uses Netty internally.
 
+```python
+# PySpark
SparkSession.builder.config('spark.driver.extraJavaOptions', '-Dio.netty.tryReflectionSetAccessible=true').getOrCreate()
+```
Review comment:
       Updated with the JIRA link.
   The PySpark example is intended for those who rarely, if ever, change the config: those who would not think of using `spark.driver.extraJavaOptions` in the first place.
   It should now be clear enough for everybody to adapt it to their preferred way of setting this config option.
   
   Hopefully the underlying issue gets resolved soon, making this all moot. Until then, I hope to save a few people a bit of time.
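   For readers who do set configs outside of application code, the same flag can be supplied at submit time or in the cluster defaults instead of via `SparkSession.builder`. A sketch of the two common alternatives (paths and option names as in the standard Spark distribution; note that on a real cluster the executors may also need `spark.executor.extraJavaOptions` if Arrow runs there):

   ```shell
   # Option 1: pass the JVM flag on the command line when submitting
   spark-submit \
     --conf "spark.driver.extraJavaOptions=-Dio.netty.tryReflectionSetAccessible=true" \
     your_app.py

   # Option 2: set it once in $SPARK_HOME/conf/spark-defaults.conf
   # spark.driver.extraJavaOptions  -Dio.netty.tryReflectionSetAccessible=true
   ```

   Whichever mechanism is used, the flag must reach the JVM before Arrow/Netty first touches direct buffers, which is why it belongs in `extraJavaOptions` rather than a runtime `spark.conf.set(...)` call.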




----------------------------------------------------------------
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

For queries about this service, please contact Infrastructure at:
users@infra.apache.org



---------------------------------------------------------------------
To unsubscribe, e-mail: reviews-unsubscribe@spark.apache.org
For additional commands, e-mail: reviews-help@spark.apache.org