Posted to issues@spark.apache.org by "Srini E (Jira)" <ji...@apache.org> on 2019/10/02 20:49:00 UTC

[jira] [Updated] (SPARK-29337) How to Cache a Table and Pin It in Memory So It Does Not Spill to Disk on the Thrift Server

     [ https://issues.apache.org/jira/browse/SPARK-29337?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]

Srini E updated SPARK-29337:
----------------------------
    Attachment: Cache+Image.png

> How to Cache a Table and Pin It in Memory So It Does Not Spill to Disk on the Thrift Server
> -------------------------------------------------------------------------------------------
>
>                 Key: SPARK-29337
>                 URL: https://issues.apache.org/jira/browse/SPARK-29337
>             Project: Spark
>          Issue Type: Question
>          Components: SQL
>    Affects Versions: 2.3.0
>            Reporter: Srini E
>            Priority: Major
>         Attachments: Cache+Image.png
>
>
> Hi Team,
> How can we pin a table in the cache so that it is not evicted from memory?
> Situation: We are using MicroStrategy BI reporting, and a semantic layer has been built.
> We wanted to cache heavily used tables with Spark SQL's CACHE TABLE <table_name>, and we
> cached them in the Spark (Thrift server) context. Please see the attached snapshot of a
> cached table that migrated to disk over time: initially it was entirely in memory, but now
> some of it is in memory and some is on disk. That disk is local disk, which can be
> relatively more expensive to read from than S3, so queries take longer and their run times
> are inconsistent from the user's perspective. When more queries run against the cached
> tables, copies of the cached table are made, and those copies do not stay in memory,
> causing reports to run longer. So how can we pin the tables so they do not spill to disk?
> Spark's memory management uses dynamic allocation; how can we keep these few tables pinned
> in memory?
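>
> A minimal sketch of one mitigation, assuming a hypothetical cached table named sales_fact
> registered with the Thrift server. Spark SQL's CACHE TABLE uses the MEMORY_AND_DISK storage
> level by default, which is why cached partitions spill to local disk; re-persisting at
> MEMORY_ONLY tells Spark to drop and recompute partitions under memory pressure instead of
> writing them out. Note that Spark has no true "pin" API: even MEMORY_ONLY blocks can be
> evicted, they are just never spilled to disk.
>
>     import org.apache.spark.sql.SparkSession
>     import org.apache.spark.storage.StorageLevel
>
>     // Sketch: re-cache a hot table at MEMORY_ONLY so that partitions evicted under
>     // memory pressure are recomputed rather than spilled to local disk.
>     // "sales_fact" is a hypothetical table name.
>     val spark = SparkSession.builder.getOrCreate()
>
>     spark.catalog.uncacheTable("sales_fact")   // drop the default MEMORY_AND_DISK copy
>     val df = spark.table("sales_fact")
>     df.persist(StorageLevel.MEMORY_ONLY)       // never spill; drop and recompute instead
>     df.count()                                 // run an action to materialize the cache
>
> Recent Spark releases also accept a storage level directly in SQL, e.g.
> CACHE TABLE sales_fact OPTIONS ('storageLevel' 'MEMORY_ONLY');
> check whether your 2.3.0 build supports the OPTIONS clause before relying on it.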



--
This message was sent by Atlassian Jira
(v8.3.4#803005)

---------------------------------------------------------------------
To unsubscribe, e-mail: issues-unsubscribe@spark.apache.org
For additional commands, e-mail: issues-help@spark.apache.org