Posted to dev@linkis.apache.org by lucky059 <lu...@126.com> on 2022/08/03 02:58:15 UTC

How to set Spark runtime parameters when a DSS workflow is submitted to Schedulis


Parameters set through the management console work fine in Scripts, but they do not take effect once the workflow is submitted to Schedulis.
For example, I want to set spark.executor.cores and spark.executor.memory.

lucky059
lucky059@126.com

Re: How to set Spark runtime parameters when a DSS workflow is submitted to Schedulis

Posted by "casion@apache.org" <ca...@apache.org>.
I created a new issue on GitHub:
https://github.com/apache/incubator-linkis/issues/2566
Best Regards!

Chen Xia



Re: How to set Spark runtime parameters when a DSS workflow is submitted to Schedulis

Posted by "casion@apache.org" <ca...@apache.org>.
https://github.com/apache/incubator-linkis/issues/2566

1. If you run via linkis-cli, pass the parameters with -confMap, for example (see the sketch after this list for the parameters asked about):
`sh ./bin/linkis-cli -engineType spark-2.4.3 -codeType sql -confMap wds.linkis.yarnqueue=q02 -confMap spark.executor.instances=3 -code "select count(*) from testdb.test;" -submitUser hadoop -proxyUser hadoop`
See details at
https://linkis.apache.org/zh-CN/docs/1.1.3/user_guide/linkiscli-manual/

2. System level: add a new system-level configuration for the creator in the Linkis web console.
![16595320571217](https://user-images.githubusercontent.com/7869972/182617776-3a2ac180-af11-4381-a857-18e29b356957.png)

3. Set it for a single task in DSS (similar to example 1).
![16595322452353](https://user-images.githubusercontent.com/7869972/182618096-f22cfc9b-3686-4dec-969b-d691f2d1fce6.png)


Best Regards!

Chen Xia

