Posted to issues@spark.apache.org by "Hyukjin Kwon (JIRA)" <ji...@apache.org> on 2019/04/30 16:54:00 UTC

[jira] [Resolved] (SPARK-27587) No such method error (sun.nio.ch.DirectBuffer.cleaner()) when reading big table from JDBC (with one slow query)

     [ https://issues.apache.org/jira/browse/SPARK-27587?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]

Hyukjin Kwon resolved SPARK-27587.
----------------------------------
    Resolution: Duplicate

> No such method error (sun.nio.ch.DirectBuffer.cleaner()) when reading big table from JDBC (with one slow query)
> ---------------------------------------------------------------------------------------------------------------
>
>                 Key: SPARK-27587
>                 URL: https://issues.apache.org/jira/browse/SPARK-27587
>             Project: Spark
>          Issue Type: Sub-task
>          Components: Spark Core, SQL
>    Affects Versions: 2.4.1, 2.4.2
>            Reporter: Mohsen Taheri
>            Priority: Major
>
> It throws the following error while reading big tables from a JDBC data source:
> > Code:
> sparkSession.read()
>  .option("numPartitions", data.numPartitions)
>  .option("partitionColumn", data.pk)
>  .option("lowerBound", data.min)
>  .option("upperBound", data.max)
>  .option("queryTimeout", 180)
>  .format("jdbc")
>  .jdbc(dbURL, tableName, props)
>  .repartition(10)
>  .write().mode(SaveMode.Overwrite)
>  .parquet(tableF.getAbsolutePath());
>  
> > Stacktrace:
> Job aborted due to stage failure: Task 0 in stage 0.0 failed 1 times, most recent failure: Lost task 0.0 in stage 0.0 (TID 0, localhost, executor driver): java.lang.NoSuchMethodError: sun.nio.ch.DirectBuffer.cleaner()Lsun/misc/Cleaner;
>  at org.apache.spark.storage.StorageUtils$.cleanDirectBuffer(StorageUtils.scala:212)
>  at org.apache.spark.storage.StorageUtils$.dispose(StorageUtils.scala:207)
>  at org.apache.spark.storage.StorageUtils.dispose(StorageUtils.scala)
>  at org.apache.spark.io.NioBufferedFileInputStream.close(NioBufferedFileInputStream.java:130)
>  at java.base/java.io.FilterInputStream.close(FilterInputStream.java:180)
>  at org.apache.spark.io.ReadAheadInputStream.close(ReadAheadInputStream.java:400)
>  at org.apache.spark.util.collection.unsafe.sort.UnsafeSorterSpillReader.close(UnsafeSorterSpillReader.java:151)
>  at org.apache.spark.util.collection.unsafe.sort.UnsafeSorterSpillReader.loadNext(UnsafeSorterSpillReader.java:123)
>  at org.apache.spark.util.collection.unsafe.sort.UnsafeSorterSpillMerger$1.loadNext(UnsafeSorterSpillMerger.java:82)
>  at org.apache.spark.sql.execution.UnsafeExternalRowSorter$1.next(UnsafeExternalRowSorter.java:187)
>  at org.apache.spark.sql.execution.UnsafeExternalRowSorter$1.next(UnsafeExternalRowSorter.java:174)
>  at scala.collection.Iterator$$anon$10.next(Iterator.scala:459)
>  at org.apache.spark.shuffle.sort.BypassMergeSortShuffleWriter.write(BypassMergeSortShuffleWriter.java:149)
>  at org.apache.spark.scheduler.ShuffleMapTask.runTask(ShuffleMapTask.scala:99)
>  at org.apache.spark.scheduler.ShuffleMapTask.runTask(ShuffleMapTask.scala:55)
>  at org.apache.spark.scheduler.Task.run(Task.scala:121)
>  at org.apache.spark.executor.Executor$TaskRunner.$anonfun$run$3(Executor.scala:411)
>  at org.apache.spark.util.Utils$.tryWithSafeFinally(Utils.scala:1360)
>  at org.apache.spark.executor.Executor$TaskRunner.run(Executor.scala:414)
>  at java.base/java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1128)
>  at java.base/java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:628)
>  at java.base/java.lang.Thread.run(Thread.java:834)
>  
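The Duplicate resolution is consistent with the stack trace: the "java.base" frames show the job running on Java 9 or later, where sun.nio.ch.DirectBuffer.cleaner() returns jdk.internal.ref.Cleaner instead of sun.misc.Cleaner. Spark 2.4 bytecode compiled against the Java 8 descriptor (cleaner()Lsun/misc/Cleaner;) therefore fails to link inside StorageUtils the first time a spilled buffer is disposed. A minimal probe, not part of the original report and written only for illustration, that surfaces the return-type change behind the NoSuchMethodError:

  // CleanerProbe.java -- hypothetical standalone example; assumes only the JDK itself.
  import java.lang.reflect.Method;

  public class CleanerProbe {
      public static void main(String[] args) throws Exception {
          // Load the same internal interface named in the error message.
          Class<?> directBuffer = Class.forName("sun.nio.ch.DirectBuffer");
          Method cleaner = directBuffer.getMethod("cleaner");
          // Prints "sun.misc.Cleaner" on Java 8 but "jdk.internal.ref.Cleaner" on Java 9+,
          // so bytecode expecting cleaner()Lsun/misc/Cleaner; no longer links at runtime.
          System.out.println(cleaner.getReturnType().getName());
      }
  }

Running the same job on a Java 8 runtime, or moving to a Spark version built with JDK 11 support, should avoid this failing code path.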



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)

---------------------------------------------------------------------
To unsubscribe, e-mail: issues-unsubscribe@spark.apache.org
For additional commands, e-mail: issues-help@spark.apache.org