Posted to issues@spark.apache.org by "Rakesh Raushan (Jira)" <ji...@apache.org> on 2019/09/27 09:37:00 UTC

[jira] [Updated] (SPARK-29152) Spark Executor Plugin API shutdown is not proper when dynamic allocation enabled[SPARK-24918]

     [ https://issues.apache.org/jira/browse/SPARK-29152?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]

Rakesh Raushan updated SPARK-29152:
-----------------------------------
    Description: 
*Issue Description*

Spark Executor Plugin API *shutdown handling is not proper* when dynamic allocation is enabled. The plugin's shutdown method is not invoked when dynamic allocation is enabled and *executors become dead* after the inactive time.

*Test Precondition*
1. Create a plugin and build a jar from it named SparkExecutorPlugin.jar:

import org.apache.spark.ExecutorPlugin;

public class ExecutoTest1 implements ExecutorPlugin {
    @Override
    public void init() {
        System.out.println("Executor Plugin Initialised.");
    }

    @Override
    public void shutdown() {
        System.out.println("Executor plugin closed successfully.");
    }
}

2. Place the jar in the folder /spark/examples/jars.
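The expected behaviour of the plugin above can be sketched self-contained. Note the assumptions: the nested `ExecutorPlugin` interface below is only a stand-in for `org.apache.spark.ExecutorPlugin` (which lives in spark-core and is not on the classpath here), and the JVM shutdown hook is an illustrative fallback, not part of the reported setup:

```java
// Self-contained sketch. The nested ExecutorPlugin interface is a
// stand-in for org.apache.spark.ExecutorPlugin (Spark 2.4); the JVM
// shutdown hook is an illustrative assumption that logs on normal JVM
// exit even when Spark never invokes plugin.shutdown() -- which is the
// behaviour this issue reports under dynamic allocation.
public class PluginShutdownSketch {

    // Stand-in for Spark's plugin interface; in Spark 2.4 both methods
    // are likewise default no-ops.
    interface ExecutorPlugin {
        default void init() {}
        default void shutdown() {}
    }

    static class LoggingPlugin implements ExecutorPlugin {
        @Override
        public void init() {
            System.out.println("Executor Plugin Initialised.");
            // Fallback: fires on normal JVM exit even if shutdown() is skipped.
            Runtime.getRuntime().addShutdownHook(
                new Thread(() -> System.out.println("JVM shutdown hook fired.")));
        }

        @Override
        public void shutdown() {
            System.out.println("Executor plugin closed successfully.");
        }
    }

    public static void main(String[] args) {
        ExecutorPlugin plugin = new LoggingPlugin();
        plugin.init();
        // Per this report, with dynamic allocation Spark never reaches the
        // equivalent of this call before the idle executor is torn down.
        plugin.shutdown();
    }
}
```

Running the sketch prints both messages in order; in the reported scenario only the init line ever appears in the executor's stdout.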

*Test Steps*

1. Launch bin/spark-sql with dynamic allocation enabled:

./spark-sql --master yarn \
  --conf spark.executor.plugins=ExecutoTest1 \
  --jars /opt/HA/C10/install/spark/spark/examples/jars/SparkExecutorPlugin.jar \
  --conf spark.dynamicAllocation.enabled=true \
  --conf spark.dynamicAllocation.initialExecutors=2 \
  --conf spark.dynamicAllocation.minExecutors=1

2. Create a table, insert data, and run "select * from tablename".
3. Check the Spark UI Jobs tab / SQL tab.
4. Check every executor's application log file (the Executors tab lists all executors) for the plugin initialisation and shutdown messages.
Example: /yarn/logdir/application_1567156749079_0025/container_e02_1567156749079_0025_01_000005/stdout

5. Wait for an executor to become dead after the inactive time, then check the same container log.
6. Kill the spark-sql session and check the container log for the executor plugin shutdown message.

*Expected Output*

1. The job should succeed: the create table, insert, and select queries should all complete successfully.

2. While the query is running, every executor's log should contain the plugin initialisation message:
"Executor Plugin Initialised."

3. Once the executors are dead, the shutdown message should appear in the log file:
"Executor plugin closed successfully."

4. Once the SQL application is closed, the shutdown message should appear in the log:
"Executor plugin closed successfully."


*Actual Output*

The shutdown message is not logged when an executor becomes dead after the inactive time.

*Observation*
Without dynamic allocation, the executor plugin works fine. But after enabling dynamic allocation, executor shutdown is not processed.




  was:
*Issue Description*

Spark Executor Plugin API *shutdown handling is not proper* when dynamic allocation is enabled. The plugin's shutdown method is not invoked while dynamic allocation is enabled and *executors become dead* after the inactive time.

*Test Precondition*
1. Prepared four Spark applications with the executor plugin interface.
First application: SparkExecutorplugin.jar

import org.apache.spark.ExecutorPlugin;

public class ExecutoTest1 implements ExecutorPlugin {
    @Override
    public void init() {
        System.out.println("Executor Plugin Initialised.");
    }

    @Override
    public void shutdown() {
        System.out.println("Executor plugin closed successfully.");
    }
}



2. Build the jar from the class above and place it in the folder /spark/examples/jars.

*Test Steps*

1. Launch bin/spark-sql with dynamic allocation enabled:

./spark-sql --master yarn \
  --conf spark.executor.plugins=ExecutoTest1 \
  --conf spark.executor.extraClassPath="/opt/HA/C10/install/spark/spark/examples/jars/*" \
  --conf spark.dynamicAllocation.enabled=true \
  --conf spark.dynamicAllocation.initialExecutors=2 \
  --conf spark.dynamicAllocation.minExecutors=0

2. Create a table, insert data, and run "select * from tablename".
3. Check the Spark UI Jobs tab / SQL tab.
4. Check every executor's application log file (the Executors tab lists all executors) for the plugin initialisation and shutdown messages.
Example: /yarn/logdir/application_1567156749079_0025/container_e02_1567156749079_0025_01_000005/stdout

5. Wait for an executor to become dead after the inactive time, then check the same container log.
6. Kill the spark-sql session and check the container log for the executor plugin shutdown message.

*Expected Output*

1. The job should succeed: the create table, insert, and select queries should all complete successfully.

2. While the query is running, every executor's log should contain the plugin initialisation and shutdown messages:
"Executor Plugin Initialised."

3. Once the executors are dead, executor shutdown should be invoked and the shutdown message should appear in the log file:
"Executor plugin closed successfully."

4. Once the SQL application is closed, shutdown should be invoked and the message should appear in the log:
"Executor plugin closed successfully."


*Actual Output*

The shutdown message is not logged when an executor becomes dead or after the application is closed.

*Observation*
Without dynamic allocation, the executor plugin works fine. But after enabling dynamic allocation, executor shutdown is not processed.





> Spark Executor Plugin API shutdown is not proper when dynamic allocation enabled[SPARK-24918]
> ---------------------------------------------------------------------------------------------
>
>                 Key: SPARK-29152
>                 URL: https://issues.apache.org/jira/browse/SPARK-29152
>             Project: Spark
>          Issue Type: Bug
>          Components: Spark Core
>    Affects Versions: 2.4.4
>            Reporter: jobit mathew
>            Priority: Major
>



--
This message was sent by Atlassian Jira
(v8.3.4#803005)
