Posted to users@zeppelin.apache.org by Anton Kulaga <an...@gmail.com> on 2019/11/08 10:11:35 UTC

Re: Zeppelin 0.8.2 New Spark Interpreter

Are there clear instructions on how to use the spark.jars.packages property?
For instance, if I want to depend on the Bintray repo https://dl.bintray.com/comp-bio-aging/main with "group.research.aging:spark-extensions_2.11:0.0.7.2" as a dependency, what should I do with the new interpreter?

On 2019/10/12 01:18:09, Jeff Zhang <zj...@gmail.com> wrote: 
> Glad to hear that.
> 
> Mark Bidewell <mb...@gmail.com> 于2019年10月12日周六 上午1:30写道:
> 
> > Just wanted to say "thanks"!  Using spark.jars.packages, etc worked great!
> >
> > On Fri, Oct 11, 2019 at 9:45 AM Jeff Zhang <zj...@gmail.com> wrote:
> >
> >> That's right, the documentation should also be updated.
> >>
> >> Mark Bidewell <mb...@gmail.com> 于2019年10月11日周五 下午9:28写道:
> >>
> >>> Also the interpreter setting UI is still listed as the first way to
> >>> handle dependencies in the documentation - Maybe it should be marked as
> >>> deprecated?
> >>>
> >>> http://zeppelin.apache.org/docs/0.8.2/interpreter/spark.html
> >>>
> >>>
> >>> On Thu, Oct 10, 2019 at 9:58 PM Jeff Zhang <zj...@gmail.com> wrote:
> >>>
> >>>> It looks like many users are still used to specifying spark dependencies
> >>>> in the interpreter setting UI; spark.jars and spark.jars.packages seem too
> >>>> difficult to understand and not transparent, so I created ticket
> >>>> https://issues.apache.org/jira/browse/ZEPPELIN-4374 so that users can
> >>>> still set dependencies in the interpreter setting UI.
> >>>>
> >>>> Jeff Zhang <zj...@gmail.com> 于2019年10月11日周五 上午9:54写道:
> >>>>
> >>>>> Like I said above, try to set them via spark.jars and
> >>>>> spark.jars.packages.
> >>>>>
> >>>>> Don't set them here
> >>>>>
> >>>>> [image: image.png]
> >>>>>
> >>>>>
> >>>>> Mark Bidewell <mb...@gmail.com> 于2019年10月11日周五 上午9:35写道:
> >>>>>
> >>>>>> I was specifying them in the interpreter settings in the UI.
> >>>>>>
> >>>>>> On Thu, Oct 10, 2019 at 9:30 PM Jeff Zhang <zj...@gmail.com> wrote:
> >>>>>>
> >>>>>>> How do you specify your spark interpreter dependencies? You need to
> >>>>>>> specify them via the property spark.jars or spark.jars.packages for
> >>>>>>> non-local mode.
> >>>>>>>
> >>>>>>> Mark Bidewell <mb...@gmail.com> 于2019年10月11日周五 上午3:45写道:
> >>>>>>>
> >>>>>>>> I am running some initial tests of Zeppelin 0.8.2 and I am seeing
> >>>>>>>> some weird issues with dependencies.  When I use the old interpreter,
> >>>>>>>> everything works as expected.  When I use the new interpreter, classes in
> >>>>>>>> my interpreter dependencies cannot be resolved when connecting to a master
> >>>>>>>> that is not local[*].  I did not encounter issues with either interpreter
> >>>>>>>> on 0.8.1.
> >>>>>>>>
> >>>>>>>> Has anyone else seen this?
> >>>>>>>>
> >>>>>>>> Thanks!
> >>>>>>>>
> >>>>>>>> --
> >>>>>>>> Mark Bidewell
> >>>>>>>> http://www.linkedin.com/in/markbidewell
> >>>>>>>>
> >>>>>>>
> >>>>>>>
> >>>>>>> --
> >>>>>>> Best Regards
> >>>>>>>
> >>>>>>> Jeff Zhang
> >>>>>>>
> >>>>>>
> >>>>>>
> >>>>>> --
> >>>>>> Mark Bidewell
> >>>>>> http://www.linkedin.com/in/markbidewell
> >>>>>>
> >>>>>
> >>>>>
> >>>>> --
> >>>>> Best Regards
> >>>>>
> >>>>> Jeff Zhang
> >>>>>
> >>>>
> >>>>
> >>>> --
> >>>> Best Regards
> >>>>
> >>>> Jeff Zhang
> >>>>
> >>>
> >>>
> >>> --
> >>> Mark Bidewell
> >>> http://www.linkedin.com/in/markbidewell
> >>>
> >>
> >>
> >> --
> >> Best Regards
> >>
> >> Jeff Zhang
> >>
> >
> >
> > --
> > Mark Bidewell
> > http://www.linkedin.com/in/markbidewell
> >
> 
> 
> -- 
> Best Regards
> 
> Jeff Zhang
> 

Re: Zeppelin 0.8.2 New Spark Interpreter

Posted by Mark Bidewell <mb...@gmail.com>.
Number 2 under http://zeppelin.apache.org/docs/0.8.2/interpreter/spark.html is
the best guide.  spark.jars.packages can be set on the interpreter.  I also had
to add export SPARK_SUBMIT_OPTIONS="--repositories <URL>" to
zeppelin-env.sh to add my repository to the mix.
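For anyone following along, here is a rough sketch of that setup. The repository URL and artifact coordinates below are just the ones from the question above; substitute your own, and note the exact property names are as documented for Spark / Zeppelin 0.8.2:

```shell
#!/bin/sh
# conf/zeppelin-env.sh
# Point spark-submit at the extra Maven repository so spark.jars.packages
# can resolve artifacts that are not on Maven Central.
# (URL here is the Bintray repo from the question; replace with yours.)
export SPARK_SUBMIT_OPTIONS="--repositories https://dl.bintray.com/comp-bio-aging/main"

# Then, in the Spark interpreter settings (Interpreter page in the UI),
# set the property:
#
#   spark.jars.packages = group.research.aging:spark-extensions_2.11:0.0.7.2
#
# Multiple packages are comma-separated; restart the interpreter afterwards.
```

The key point, per Jeff's advice in this thread, is that with the new interpreter the dependencies go through the spark.jars / spark.jars.packages properties (i.e. through spark-submit), not through the old per-interpreter dependency list in the UI.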

On Fri, Nov 8, 2019 at 5:11 AM Anton Kulaga <an...@gmail.com> wrote:

> Are there clear instructions on how to use the spark.jars.packages property?
> For instance, if I want to depend on the Bintray repo
> https://dl.bintray.com/comp-bio-aging/main with
> "group.research.aging:spark-extensions_2.11:0.0.7.2" as a dependency, what
> should I do with the new interpreter?
>


-- 
Mark Bidewell
http://www.linkedin.com/in/markbidewell