Posted to issues@kylin.apache.org by "YANG LIU (Jira)" <ji...@apache.org> on 2021/03/10 01:54:00 UTC

[jira] [Closed] (KYLIN-4922) Failed to build cube

     [ https://issues.apache.org/jira/browse/KYLIN-4922?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]

YANG LIU closed KYLIN-4922.
---------------------------
    Resolution: Not A Bug

This turned out to be a problem with my environment, not with Kylin.

At the same time, I tried the lower version (Kylin 3.1.1) and the build works fine there.

> Failed to build cube 
> ---------------------
>
>                 Key: KYLIN-4922
>                 URL: https://issues.apache.org/jira/browse/KYLIN-4922
>             Project: Kylin
>          Issue Type: Bug
>          Components: Spark Engine
>    Affects Versions: v4.0.0-beta
>         Environment: kylin v4.0-beta, CDH-6.3.1, Spark version CDH-2.4.0
>            Reporter: YANG LIU
>            Priority: Major
>              Labels: build
>             Fix For: v4.0.0-beta
>
>         Attachments: error.png, server config.png
>
>
> Error Log:
> java.io.IOException: OS command error exit with return code: 1, error message: WARNING: User-defined SPARK_HOME (/opt/cloudera/parcels/CDH-6.3.1-1.cdh6.3.1.p0.1470567/lib/spark) overrides detected (/opt/cloudera/parcels/CDH/lib/spark).
> WARNING: Running spark-class from user-defined location.
> Exception in thread "main" org.apache.spark.SparkException: Cluster deploy mode is not compatible with master "local"
>  at org.apache.spark.deploy.SparkSubmit.error(SparkSubmit.scala:859)
>  at org.apache.spark.deploy.SparkSubmit.prepareSubmitEnvironment(SparkSubmit.scala:290)
>  at org.apache.spark.deploy.SparkSubmit.submit(SparkSubmit.scala:143)
>  at org.apache.spark.deploy.SparkSubmit.doSubmit(SparkSubmit.scala:86)
>  at org.apache.spark.deploy.SparkSubmit$$anon$2.doSubmit(SparkSubmit.scala:926)
>  at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:935)
>  at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)
> The command is: 
> export HADOOP_CONF_DIR=/opt/kylin/hadoop_conf && /opt/cloudera/parcels/CDH/lib/spark/bin/spark-submit --class org.apache.kylin.engine.spark.application.SparkEntry --conf 'spark.yarn.queue=default' --conf 'spark.history.fs.logDirectory=hdfs:///kylin/spark-history' --conf 'spark.master=local' --conf 'spark.hadoop.yarn.timeline-service.enabled=false' --conf 'spark.driver.cores=1' --conf  
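For reference, the exception above is Spark rejecting the combination of cluster deploy mode with master "local": the generated command pins spark.master=local, and a cluster deploy-mode setting presumably follows in the truncated part of the command. Below is a minimal sketch of the same spark-submit invocation with the master and deploy mode made consistent, assuming the Hadoop cluster is YARN-managed; the application jar and job arguments are placeholders, since they are truncated in the log above. In a Kylin deployment the equivalent change would normally go through Kylin's Spark configuration overrides in kylin.properties rather than by editing the generated command by hand.

    # Sketch only: master and deploy mode made consistent (yarn + cluster),
    # assuming the cluster runs YARN. Paths and the main class mirror the command above.
    export HADOOP_CONF_DIR=/opt/kylin/hadoop_conf
    /opt/cloudera/parcels/CDH/lib/spark/bin/spark-submit \
        --class org.apache.kylin.engine.spark.application.SparkEntry \
        --master yarn \
        --deploy-mode cluster \
        --conf 'spark.yarn.queue=default' \
        --conf 'spark.history.fs.logDirectory=hdfs:///kylin/spark-history' \
        --conf 'spark.hadoop.yarn.timeline-service.enabled=false' \
        --conf 'spark.driver.cores=1' \
        <kylin-job-jar> <job-args>   # placeholders: the jar and arguments are truncated in the original log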


