Posted to dev@datalab.apache.org by "Vira Vitanska (Jira)" <ji...@apache.org> on 2021/04/01 08:54:00 UTC

[jira] [Updated] (DATALAB-308) [Local spark]: Spark.executor.memory should be allocated depending on notebook instance shape

     [ https://issues.apache.org/jira/browse/DATALAB-308?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]

Vira Vitanska updated DATALAB-308:
----------------------------------
    Description: 
*(i) {color:#de350b}This allocation is made obsolete by task https://issues.apache.org/jira/browse/DATALAB-1985{color}*

*Preconditions:*
 Notebook is created

*Steps to reproduce:*
 1. Run a playbook on the local kernel

*Actual result:*
 1. the value of spark.executor.memory is less than 75% of total memory (if the instance shape has up to 8 GB)
 2. the value of spark.executor.memory is less than the formula value: total memory minus 3500 MB (if the instance shape has more than 8 GB)

*Expected result:*
 1. the value of spark.executor.memory equals 75% of total memory (if the instance shape has up to 8 GB)
 2. the value of spark.executor.memory equals the formula value: total memory minus 3500 MB (if the instance shape has more than 8 GB)
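The expected allocation rule above can be sketched as a small helper. This is a hypothetical illustration of the rule as stated in the ticket, not the actual DataLab provisioning code; the function name and units (MB) are assumptions:

```python
def spark_executor_memory_mb(total_mb: int) -> int:
    """Illustrative sketch of the expected spark.executor.memory rule:
    up to 8 GB of total memory -> 75% of total;
    over 8 GB -> total minus 3500 MB."""
    if total_mb <= 8 * 1024:
        return int(total_mb * 0.75)
    return total_mb - 3500

# e.g. a 4 GB shape -> 3072 MB; a 16 GB shape -> 12884 MB
print(spark_executor_memory_mb(4096))
print(spark_executor_memory_mb(16384))
```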

  was:
*Preconditions:*
Notebook is created

*Steps to reproduce:*
1. Run a playbook on the local kernel

*Actual result:*
1. the value of spark.executor.memory is less than 75% of total memory (if the instance shape has up to 8 GB)
2. the value of spark.executor.memory is less than the formula value: total memory minus 3500 MB (if the instance shape has more than 8 GB)

*Expected result:*
1. the value of spark.executor.memory equals 75% of total memory (if the instance shape has up to 8 GB)
2. the value of spark.executor.memory equals the formula value: total memory minus 3500 MB (if the instance shape has more than 8 GB)


> [Local spark]: Spark.executor.memory should be allocated depending on notebook instance shape
> ---------------------------------------------------------------------------------------------
>
>                 Key: DATALAB-308
>                 URL: https://issues.apache.org/jira/browse/DATALAB-308
>             Project: Apache DataLab
>          Issue Type: Bug
>          Components: DataLab Old
>            Reporter: Vira Vitanska
>            Priority: Major
>              Labels: 2.3_old, AWS, AZURE, Debian, DevOps, GCP, Known_issues(release2.2), Known_issues(release2.3), Known_issues(release2.4), RedHat
>
> *(i) {color:#de350b}This allocation is made obsolete by task https://issues.apache.org/jira/browse/DATALAB-1985{color}*
> *Preconditions:*
>  Notebook is created
> *Steps to reproduce:*
>  1. Run a playbook on the local kernel
> *Actual result:*
>  1. the value of spark.executor.memory is less than 75% of total memory (if the instance shape has up to 8 GB)
>  2. the value of spark.executor.memory is less than the formula value: total memory minus 3500 MB (if the instance shape has more than 8 GB)
> *Expected result:*
>  1. the value of spark.executor.memory equals 75% of total memory (if the instance shape has up to 8 GB)
>  2. the value of spark.executor.memory equals the formula value: total memory minus 3500 MB (if the instance shape has more than 8 GB)



--
This message was sent by Atlassian Jira
(v8.3.4#803005)

---------------------------------------------------------------------
To unsubscribe, e-mail: dev-unsubscribe@datalab.apache.org
For additional commands, e-mail: dev-help@datalab.apache.org