Posted to users@zeppelin.apache.org by Manuel Sopena Ballesteros <ma...@garvan.org.au> on 2019/08/12 07:40:52 UTC

multiple interpreters for spark python2 and 3

Dear Zeppelin community,

I have a Zeppelin installation and a Spark cluster. I need to give users the option to run either Python 2 or Python 3 code with pyspark. At the moment the only way to do this is to edit the spark interpreter and change `zeppelin.pyspark.python` from python to python3.6.
Is there a way to copy/duplicate the spark interpreter so that one uses Python 2 and the other Python 3, and I can choose which one to use without leaving the notebook?

Thank you


Re: multiple interpreters for spark python2 and 3

Posted by Jeff Zhang <zj...@gmail.com>.
Create 2 spark interpreter groups; do it in the interpreter settings page
instead of editing the json file manually.
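
A minimal sketch, assuming the second interpreter group is created in the
settings page under a name such as spark3 (the name is an assumption, it can
be anything) with zeppelin.pyspark.python set to python3.6; a notebook could
then pick the Python version per paragraph:

    %spark.pyspark
    # first setting: zeppelin.pyspark.python left at "python" (Python 2)
    import sys
    print(sys.version)

    %spark3.pyspark
    # hypothetical second setting named "spark3", zeppelin.pyspark.python = python3.6
    import sys
    print(sys.version)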


-- 
Best Regards

Jeff Zhang

RE: multiple interpreters for spark python2 and 3

Posted by Manuel Sopena Ballesteros <ma...@garvan.org.au>.
Hi,

Do I need to create 2 spark interpreter groups, or can I just create a new py3spark interpreter inside the existing spark interpreter group, like in the example below?

…
  {
    "group": "spark",
    "name": "pyspark",
    "className": "org.apache.zeppelin.spark.PySparkInterpreter",
    "properties": {
      "zeppelin.pyspark.python": {
        "envName": "PYSPARK_PYTHON",
        "propertyName": null,
        "defaultValue": "python",
        "description": "Python command to run pyspark with",
        "type": "string"
      },
      "zeppelin.pyspark.useIPython": {
        "envName": null,
        "propertyName": "zeppelin.pyspark.useIPython",
        "defaultValue": true,
        "description": "whether use IPython when it is available",
        "type": "checkbox"
      }
    },
    "editor": {
      "language": "python",
      "editOnDblClick": false,
      "completionKey": "TAB",
      "completionSupport": true
    }
  },
  {
    "group": "spark",
    "name": "py3spark",
    "className": "org.apache.zeppelin.spark.PySparkInterpreter",
    "properties": {
      "zeppelin.py3spark.python": {
        "envName": "PYSPARK_PYTHON",
        "propertyName": null,
        "defaultValue": "python3.6",
        "description": "Python3.6 command to run pyspark with",
        "type": "string"
      },
      "zeppelin.pyspark.useIPython": {
        "envName": null,
        "propertyName": "zeppelin.pyspark.useIPython",
        "defaultValue": true,
        "description": "whether use IPython when it is available",
        "type": "checkbox"
      }
    },
    "editor": {
      "language": "python",
      "editOnDblClick": false,
      "completionKey": "TAB",
      "completionSupport": true
    }
  },
…
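
For reference, a rough sketch of how the two interpreters from the json above
would be addressed in a notebook, if this edit were picked up at all (py3spark
and zeppelin.py3spark.python are the names proposed above; whether
PySparkInterpreter actually reads a renamed property is part of what is being
asked):

    %spark.pyspark
    # existing interpreter, defaultValue "python" (Python 2)
    import sys
    print(sys.version)

    %spark.py3spark
    # proposed interpreter from the json above, intended to run python3.6
    import sys
    print(sys.version)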

Thank you

Manuel


Re: multiple interpreters for spark python2 and 3

Posted by Jeff Zhang <zj...@gmail.com>.
2 Approaches:
1.  create 2 spark interpreters, one with python2 and another with python3
2.  use the generic configuration interpreter (see the sketch below):
https://medium.com/@zjffdu/zeppelin-0-8-0-new-features-ea53e8810235
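
A minimal sketch of approach 2, assuming Zeppelin 0.8+ where the generic
configuration interpreter is available as %spark.conf; the conf paragraph has
to run before the note's Spark interpreter process starts, and each line is a
property name and value:

    %spark.conf
    zeppelin.pyspark.python python3.6

    %spark.pyspark
    import sys
    print(sys.version)  # this note should now report Python 3.6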



-- 
Best Regards

Jeff Zhang