Posted to dev@dlab.apache.org by "Vira Vitanska (JIRA)" <ji...@apache.org> on 2018/12/19 19:34:00 UTC

[jira] [Created] (DLAB-157) After updating the configuration for Spark Standalone, the value of Memory per Executor is less than 75% of total

Vira Vitanska created DLAB-157:
----------------------------------

             Summary: After updating the configuration for Spark Standalone, the value of Memory per Executor is less than 75% of total
                 Key: DLAB-157
                 URL: https://issues.apache.org/jira/browse/DLAB-157
             Project: Apache DLab
          Issue Type: Bug
            Reporter: Vira Vitanska
            Assignee: Oleh Martushevskyi
             Fix For: 2.1 release


*Preconditions:*
1. Notebook is created

*Steps to reproduce:*
1. Create a custom Spark Standalone cluster with the following parameters:
[
  {
    "Classification": "spark-defaults",
    "Properties": {
      "spark.shuffle.sort.bypassMergeThreshold": "200"
    }
  }
]
2. Update the custom Spark Standalone cluster with the following parameters:
[
  {
    "Classification": "spark-defaults",
    "Properties": {
      "spark.shuffle.sort.bypassMergeThreshold": "250"
    }
  }
]
3. Run a playbook on the remote kernel
4. Go to the Spark Standalone (master) UI

*Actual result:*
Only 13% of total memory is allocated as Memory per Executor

*Expected result:*
75% of total memory is allocated as Memory per Executor
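The expected behavior can be illustrated with a minimal sketch. This is not DLab's actual implementation; the function name and the 75% fraction are taken from the expected result above, purely for illustration of the allocation rule that reconfiguration should preserve:

```python
# Illustrative sketch (not DLab code): memory per executor should remain
# 75% of the node's total memory after a configuration update, instead of
# dropping to the observed 13%.

def expected_executor_memory_mb(total_memory_mb: int, fraction: float = 0.75) -> int:
    """Return the memory per executor as a fraction of total node memory."""
    return int(total_memory_mb * fraction)

total = 16384  # e.g. a node with 16 GiB of RAM (hypothetical size)
print(expected_executor_memory_mb(total))        # expected after update: 12288 MB (75%)
print(expected_executor_memory_mb(total, 0.13))  # observed after update: 2129 MB (13%)
```

The bug is that applying an updated `spark-defaults` classification appears to recompute executor memory with the wrong fraction rather than reusing the original 75% rule.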



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)

---------------------------------------------------------------------
To unsubscribe, e-mail: dev-unsubscribe@dlab.apache.org
For additional commands, e-mail: dev-help@dlab.apache.org