Posted to dev@zeppelin.apache.org by "Nilanjan Roy (Jira)" <ji...@apache.org> on 2021/05/29 11:07:00 UTC

[jira] [Created] (ZEPPELIN-5397) SPARK interpreter not starting

Nilanjan Roy created ZEPPELIN-5397:
--------------------------------------

             Summary: SPARK interpreter not starting
                 Key: ZEPPELIN-5397
                 URL: https://issues.apache.org/jira/browse/ZEPPELIN-5397
             Project: Zeppelin
          Issue Type: Bug
          Components: interpreter-launcher, spark
    Affects Versions: 0.9.0
         Environment: We are using docker to run zeppelin. Both zeppelin and spark are installed inside the same container running debian buster. We are running mesos in a separate cluster and setting SPARK_MASTER to point to the mesos cluster.
            Reporter: Nilanjan Roy
         Attachments: interpreter.sh

We are setting the following SPARK_SUBMIT_OPTIONS in the zeppelin-env.sh file:

{code:java}
export SPARK_SUBMIT_OPTIONS="$SPARK_SUBMIT_OPTIONS --conf 'spark.driver.extraJavaOptions=-Dcom.example.env=production -Dcom.example.role=zeppelin'"
export SPARK_SUBMIT_OPTIONS="$SPARK_SUBMIT_OPTIONS --conf 'spark.executor.extraJavaOptions=-Dcom.example.env=production -Dcom.example.role=zeppelin -Dfile.encoding=UTF-8'"
{code}
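The failure can be reproduced outside of Zeppelin. The following is a minimal sketch (illustrative, not the actual interpreter.sh code): when a variable that contains embedded single quotes is expanded unquoted, the shell word-splits on the inner whitespace and keeps the quote characters as literal text instead of treating them as grouping.
{code:bash}
#!/bin/bash
# Illustrative reproduction only -- not taken from interpreter.sh.
OPTS="--conf 'spark.driver.extraJavaOptions=-Dcom.example.env=production -Dcom.example.role=zeppelin'"

# Unquoted expansion: the shell splits on the space INSIDE the single
# quotes, and the quote characters themselves remain literal.
set -- $OPTS
echo "argc=$#"            # 3 words instead of the expected 2
printf 'arg=<%s>\n' "$@"
{code}
Running this yields three words: --conf, a fragment starting with a literal single quote, and a trailing fragment ending with one, which is consistent with spark-submit rejecting the stray fragment as an unrecognized option.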
The Spark interpreter fails to start with the following error:

{code:java}
Error: Unrecognized option: -Dcom.example.role='
{code}
We investigated and the issue appears to be in the bin/interpreter.sh script, which fails to parse SPARK_SUBMIT_OPTIONS correctly. When INTERPRETER_RUN_COMMAND is expanded, we can see how SPARK_SUBMIT_OPTIONS is interpreted:
{code:java}
--conf ''\''spark.driver.extraJavaOptions=-Dcom.example.env=production' '-Dcom.example.role=zeppelin'\''' --conf ''\''spark.executor.extraJavaOptions=-Dcom.example.env=production' -Dcom.example.role=zeppelin '-Dfile.encoding=UTF-8'\'''
{code}
Upon further investigation we found that the problem is the whitespace between -Dcom.example.env=production and -Dcom.example.role=zeppelin in spark.driver.extraJavaOptions, and similarly in spark.executor.extraJavaOptions.
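For comparison, here is a hedged sketch of how arguments containing whitespace survive intact in general (illustrative only; this is not what interpreter.sh currently does): building the argument list as a bash array keeps each --conf value as a single word regardless of embedded spaces.
{code:bash}
#!/bin/bash
# Illustrative only -- an array element is passed as one word, so the
# whole key=value pair (including its inner space) stays intact.
SPARK_ARGS=(--conf 'spark.driver.extraJavaOptions=-Dcom.example.env=production -Dcom.example.role=zeppelin')

echo "argc=${#SPARK_ARGS[@]}"    # 2: --conf plus one intact value
printf 'arg=<%s>\n' "${SPARK_ARGS[@]}"
{code}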

This issue was not present in an earlier SNAPSHOT build of 0.9.0 (we are no longer able to find that build). Attaching the working interpreter.sh from that version. [^interpreter.sh]

--
This message was sent by Atlassian Jira
(v8.3.4#803005)