Posted to dev@dlab.apache.org by "Vira Vitanska (Jira)" <ji...@apache.org> on 2020/04/21 15:42:00 UTC
[jira] [Updated] (DLAB-308) [Local spark]: Spark.executor.memory
should be allocated depending on notebook instance shape
[ https://issues.apache.org/jira/browse/DLAB-308?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]
Vira Vitanska updated DLAB-308:
-------------------------------
Labels: 2.3_old AWS AZURE Debian DevOps GCP Known_issues(release2.2) Known_issues(release2.3) RedHat (was: 2.3_old AWS AZURE Debian DevOps GCP Known_issues(release2.2) RedHat)
> [Local spark]: Spark.executor.memory should be allocated depending on notebook instance shape
> ---------------------------------------------------------------------------------------------
>
> Key: DLAB-308
> URL: https://issues.apache.org/jira/browse/DLAB-308
> Project: Apache DLab
> Issue Type: Bug
> Components: DLab Old
> Reporter: Vira Vitanska
> Priority: Major
> Labels: 2.3_old, AWS, AZURE, Debian, DevOps, GCP, Known_issues(release2.2), Known_issues(release2.3), RedHat
>
> *Preconditions:*
> Notebook is created
> *Steps to reproduce:*
> 1. Run a playbook on the local kernel
> *Actual result:*
> 1. The value of spark.executor.memory is less than 75% of total memory (if the instance shape has up to 8 GB)
> 2. The value of spark.executor.memory is less than the formula value: total memory minus 3500 MB (if the instance shape has more than 8 GB)
> *Expected result:*
> 1. The value of spark.executor.memory is equal to 75% of total memory (if the instance shape has up to 8 GB)
> 2. The value of spark.executor.memory is equal to the formula value: total memory minus 3500 MB (if the instance shape has more than 8 GB)
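The expected allocation rule above can be sketched as follows. This is an illustrative Python snippet, not DLab's actual provisioning code; the function name and units (MB) are assumptions made for clarity.

```python
def expected_spark_executor_memory_mb(total_mb: int) -> int:
    """Return the spark.executor.memory value (in MB) the issue expects.

    Instances with up to 8 GB of RAM should get 75% of total memory;
    larger instances should get total memory minus a 3500 MB reserve.
    (Hypothetical sketch of the rule described in this issue.)
    """
    if total_mb <= 8 * 1024:          # instance shape up to 8 GB
        return int(total_mb * 0.75)   # 75% of total
    return total_mb - 3500            # total minus 3500 MB

# e.g. a 4 GB shape -> 3072 MB; a 16 GB shape -> 12884 MB
```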
--
This message was sent by Atlassian Jira
(v8.3.4#803005)
---------------------------------------------------------------------
To unsubscribe, e-mail: dev-unsubscribe@dlab.apache.org
For additional commands, e-mail: dev-help@dlab.apache.org