Posted to issues@spark.apache.org by "angerszhu (Jira)" <ji...@apache.org> on 2019/10/10 09:47:00 UTC

[jira] [Updated] (SPARK-29424) Prevent Spark from submitting a stage with too many tasks

     [ https://issues.apache.org/jira/browse/SPARK-29424?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]

angerszhu updated SPARK-29424:
------------------------------
    Description: 
Our users often submit bad SQL through the query platform, such as:
# writing a wrong join condition but submitting the query anyway
# writing a wrong where condition
# etc.

Such queries make the Spark scheduler submit a huge number of tasks. This makes Spark run very slowly and impacts other users (e.g. on the Spark Thrift Server); it can even run out of memory because of the many objects generated by such a large number of tasks.

So I added a constraint on the number of tasks when submitting a stage. I wonder if the community would accept it.
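The patch itself is not attached to this issue. As a rough user-level sketch of the same idea, a SparkListener can watch stage submissions and cancel any stage whose task count exceeds a limit. The class name and threshold below are illustrative only, and the actual change would live inside the scheduler, but the listener and cancellation calls are real Spark APIs:

    import org.apache.spark.SparkContext
    import org.apache.spark.scheduler.{SparkListener, SparkListenerStageSubmitted}

    // Illustrative sketch: cancel any stage whose task count exceeds a limit.
    // TaskCountLimitListener is a hypothetical name, not part of the patch
    // described in this issue.
    class TaskCountLimitListener(sc: SparkContext, maxTasks: Int) extends SparkListener {
      override def onStageSubmitted(event: SparkListenerStageSubmitted): Unit = {
        val numTasks = event.stageInfo.numTasks
        if (numTasks > maxTasks) {
          // Kill the runaway stage before it floods the cluster with tasks.
          sc.cancelStage(event.stageInfo.stageId)
        }
      }
    }

    // Usage: register the listener before running queries, e.g.
    //   sc.addSparkListener(new TaskCountLimitListener(sc, 100000))

Because listeners are invoked asynchronously on the listener bus, a few tasks may still launch before the cancellation takes effect; enforcing the limit inside the scheduler itself, as proposed, avoids that race.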

> Prevent Spark from submitting a stage with too many tasks
> ---------------------------------------------------------
>
>                 Key: SPARK-29424
>                 URL: https://issues.apache.org/jira/browse/SPARK-29424
>             Project: Spark
>          Issue Type: Improvement
>          Components: Spark Core
>    Affects Versions: 2.4.0, 3.0.0
>            Reporter: angerszhu
>            Priority: Major
>
> Our users often submit bad SQL through the query platform, such as:
> # writing a wrong join condition but submitting the query anyway
> # writing a wrong where condition
> # etc.
>  Such queries make the Spark scheduler submit a huge number of tasks. This makes Spark run very slowly and impacts other users (e.g. on the Spark Thrift Server); it can even run out of memory because of the many objects generated by such a large number of tasks.
> So I added a constraint on the number of tasks when submitting a stage. I wonder if the community would accept it.



--
This message was sent by Atlassian Jira
(v8.3.4#803005)

---------------------------------------------------------------------
To unsubscribe, e-mail: issues-unsubscribe@spark.apache.org
For additional commands, e-mail: issues-help@spark.apache.org