Posted to issues@spark.apache.org by "Zamil Majdy (Jira)" <ji...@apache.org> on 2023/08/08 11:56:00 UTC

[jira] [Created] (SPARK-44718) High On-heap memory usage is detected while doing parquet-file reading with Off-Heap memory mode enabled on spark

Zamil Majdy created SPARK-44718:
-----------------------------------

             Summary: High On-heap memory usage is detected while doing parquet-file reading with Off-Heap memory mode enabled on spark
                 Key: SPARK-44718
                 URL: https://issues.apache.org/jira/browse/SPARK-44718
             Project: Spark
          Issue Type: Improvement
          Components: Spark Core, SQL
    Affects Versions: 3.4.1
            Reporter: Zamil Majdy


I see high on-heap memory usage during parquet file reading when the off-heap memory mode is enabled. This happens because the memory mode of the column vectors used by the vectorized reader is configured by a different flag, whose default is always On-Heap.

Conf to reproduce the issue:

{{spark.memory.offHeap.size 1000000}}
{{spark.memory.offHeap.enabled true}}

Enabling these configurations alone does not switch the memory mode used by the vectorized reader for parquet reading to Off-Heap.
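
As a workaround sketch (assuming {{spark.sql.columnVector.offheap.enabled}} is the separate flag that selects the column-vector memory mode for the vectorized reader), setting it explicitly alongside the off-heap configs should make the reader allocate column vectors off-heap:

{{spark.memory.offHeap.enabled true}}
{{spark.memory.offHeap.size 1000000}}
{{spark.sql.columnVector.offheap.enabled true}}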


Proposed PR: https://github.com/apache/spark/pull/42394



--
This message was sent by Atlassian Jira
(v8.20.10#820010)

---------------------------------------------------------------------
To unsubscribe, e-mail: issues-unsubscribe@spark.apache.org
For additional commands, e-mail: issues-help@spark.apache.org