Posted to dev@dlab.apache.org by "Vira Vitanska (JIRA)" <ji...@apache.org> on 2019/07/26 14:57:00 UTC
[jira] [Created] (DLAB-953) [Notebook]: Investigate how memory is allocated for Spark
Vira Vitanska created DLAB-953:
----------------------------------
Summary: [Notebook]: Investigate how memory is allocated for Spark
Key: DLAB-953
URL: https://issues.apache.org/jira/browse/DLAB-953
Project: Apache DLab
Issue Type: Task
Components: DLab Main
Reporter: Vira Vitanska
Assignee: Mykola
Fix For: v.2.2
Attachments: image-2019-07-26-17-49-24-994.png, image-2019-07-26-17-54-18-544.png, image-2019-07-26-17-55-04-023.png
*Previous behavior:*
1. the Spark memory value equals 75% of total memory (if the instance shape has up to 8 GB)
2. the spark.executor.memory value is computed as total memory minus 3.5 GB (if the instance shape has more than 8 GB), as sketched below.
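A minimal Python sketch of that rule (the function name is hypothetical; the thresholds and formulas are exactly as stated above):
{code:python}
def expected_spark_executor_memory_mb(total_gb):
    # Shapes up to 8 GB: Spark gets 75% of total memory.
    if total_gb <= 8:
        return int(total_gb * 0.75 * 1024)
    # Shapes over 8 GB: total memory minus a 3.5 GB overhead.
    return int((total_gb - 3.5) * 1024)
{code}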
*Current behavior:*
+Case 1:+ RStudio [4 GB]
Theoretical: 4 * 0.75 * 1024 = 3072 MB
But in practice:
!image-2019-07-26-17-55-04-023.png!
+Case 2:+ RStudio [122 GB]
Theoretical: (122 - 3.5) * 1024 = 121344 MB
But in practice:
!image-2019-07-26-17-54-18-544.png!
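Applying the sketch above to both cases reproduces the theoretical values quoted:
{code:python}
expected_spark_executor_memory_mb(4)    # 3072 MB   (Case 1)
expected_spark_executor_memory_mb(122)  # 121344 MB (Case 2)
{code}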
Please investigate how memory should be allocated for Spark.
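One generic way to check the value actually in effect on a notebook (plain PySpark, not DLab-specific code):
{code:python}
from pyspark.sql import SparkSession

spark = SparkSession.builder.appName("memory-check").getOrCreate()
# Prints the effective executor memory (e.g. "3072m"); compare it
# against the theoretical values computed above.
print(spark.conf.get("spark.executor.memory", "not set"))
{code}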
--
This message was sent by Atlassian JIRA
(v7.6.14#76016)