Posted to issues@kylin.apache.org by "Xiaoxiang Yu (Jira)" <ji...@apache.org> on 2022/06/24 02:31:00 UTC

[jira] [Commented] (KYLIN-5201) build and merge job use different spark config

    [ https://issues.apache.org/jira/browse/KYLIN-5201?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=17558293#comment-17558293 ] 

Xiaoxiang Yu commented on KYLIN-5201:
-------------------------------------

[~penghj]  Yes, these config entries can be set at the cube level or calculated automatically.

Please check our wiki [https://cwiki.apache.org/confluence/display/KYLIN/How+to+improve+cube+building+and+query+performance] .
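As a sketch, the entries listed in the issue could be overridden per cube via cube-level configuration overrides, using the same keys as kylin.properties (the values below are hypothetical placeholders, not tuning recommendations):

```
# Cube-level Spark config overrides (hypothetical example values)
kylin.engine.spark-conf.spark.executor.instances=4
kylin.engine.spark-conf.spark.executor.memory=8g
kylin.engine.spark-conf.spark.executor.cores=4
kylin.engine.spark-conf.spark.executor.memoryOverhead=1g
```

Setting these at the cube level lets a resource-heavy cube's build use larger executors without changing the global defaults that other jobs inherit.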

> build and merge job use different spark config
> ----------------------------------------------
>
>                 Key: KYLIN-5201
>                 URL: https://issues.apache.org/jira/browse/KYLIN-5201
>             Project: Kylin
>          Issue Type: Improvement
>          Components: Job Engine
>    Affects Versions: v4.0.1
>            Reporter: PENGHUAJIE
>            Priority: Major
>
> The build job and the merge job have different requirements for Spark resources. I hope to distinguish the Spark configuration between them, including:
> *kylin.engine.spark-conf.spark.executor.instances*
> *kylin.engine.spark-conf.spark.executor.memory*
> *kylin.engine.spark-conf.spark.executor.cores*
> *kylin.engine.spark-conf.spark.executor.memoryOverhead*



--
This message was sent by Atlassian Jira
(v8.20.7#820007)