Posted to dev@datalab.apache.org by "Vira Vitanska (Jira)" <ji...@apache.org> on 2021/04/01 08:46:00 UTC
[jira] [Updated] (DATALAB-953) [Notebook]: Investigate how memory is allocated for Spark
[ https://issues.apache.org/jira/browse/DATALAB-953?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]
Vira Vitanska updated DATALAB-953:
----------------------------------
Description:
{color:#de350b}*(i) This allocation is obsolete because of task https://issues.apache.org/jira/browse/DATALAB-1985*{color}
*Previously it was:*
1. the value of Spark memory equals 75% of total memory (if the instance shape has up to 8 GB);
2. the value of spark.executor.memory equals total memory minus 3.5 GB (if the instance shape has more than 8 GB).
*Currently it is:*
+Case 1:+ Rstudio [4 GB]
4*0.75*1024 = 3072 MB (by theory)
But in practice:
!image-2019-07-26-17-55-04-023.png!
+Case 2:+ Rstudio [122 GB]
(122 - 3.5)*1024 = 121344 MB (by theory)
But in practice:
!image-2019-07-26-17-54-18-544.png!
Please investigate how memory should be allocated.
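The two-tier rule above can be sketched as a small function; this is a hypothetical illustration of the documented formula, not DataLab's actual provisioning code (the function name and signature are assumptions):

```python
# Sketch of the spark.executor.memory rule described in this issue.
# Hypothetical helper for illustration only; not DataLab's real code.
def spark_executor_memory_mb(instance_ram_gb: float) -> int:
    """Return the spark.executor.memory value in MB per the documented rule."""
    if instance_ram_gb <= 8:
        # Shapes up to 8 GB: 75% of total RAM
        return int(instance_ram_gb * 0.75 * 1024)
    # Shapes over 8 GB: total RAM minus a fixed 3.5 GB overhead
    return int((instance_ram_gb - 3.5) * 1024)

print(spark_executor_memory_mb(4))    # Case 1: 3072
print(spark_executor_memory_mb(122))  # Case 2: 121344
```

Comparing these theoretical values against the values observed in the attached screenshots is the investigation this ticket asks for.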
was:
*Previously it was:*
1. the value of Spark memory equals 75% of total memory (if the instance shape has up to 8 GB);
2. the value of spark.executor.memory equals total memory minus 3.5 GB (if the instance shape has more than 8 GB).
*Currently it is:*
+Case 1:+ Rstudio [4 GB]
4*0.75*1024 = 3072 MB (by theory)
But in practice:
!image-2019-07-26-17-55-04-023.png!
+Case 2:+ Rstudio [122 GB]
(122 - 3.5)*1024 = 121344 MB (by theory)
But in practice:
!image-2019-07-26-17-54-18-544.png!
Please investigate how memory should be allocated.
> [Notebook]: Investigate how memory is allocated for Spark
> ------------------------------------------------------------
>
> Key: DATALAB-953
> URL: https://issues.apache.org/jira/browse/DATALAB-953
> Project: Apache DataLab
> Issue Type: Task
> Components: DataLab Main
> Reporter: Vira Vitanska
> Assignee: Mykola Bodnar
> Priority: Minor
> Labels: 2.3_old, AWS, Debian, RedHat
> Attachments: image-2019-07-26-17-49-24-994.png, image-2019-07-26-17-54-18-544.png, image-2019-07-26-17-55-04-023.png
>
>
--
This message was sent by Atlassian Jira
(v8.3.4#803005)
---------------------------------------------------------------------
To unsubscribe, e-mail: dev-unsubscribe@datalab.apache.org
For additional commands, e-mail: dev-help@datalab.apache.org