Posted to users@zeppelin.apache.org by Andrea Santurbano <sa...@gmail.com> on 2016/07/30 12:58:51 UTC

Import jars from Spark packages

Hi all,
I want to import this library:
https://github.com/databricks/spark-corenlp
which is on Spark Packages:
https://spark-packages.org/package/databricks/spark-corenlp
If, in the artifact section of my interpreter settings, I insert:
databricks:spark-corenlp:0.1
or
com.databricks:spark-corenlp:0.1

no package is found.
Where am I going wrong?

Thanks
Andrea
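[Editor's note: Spark Packages artifacts are not published to Maven Central, so the coordinate only resolves once the spark-packages repository is known to the resolver. A minimal sketch of doing this from a Zeppelin %dep paragraph follows; the Bintray URL is taken from the spark-packages.org site and the repository name "spark-packages" is arbitrary, so treat both as assumptions, not a verified fix for this exact setup.]

```
%dep
// Register the spark-packages repository first (URL assumed from
// spark-packages.org; the repository name can be anything).
z.addRepo("spark-packages").url("http://dl.bintray.com/spark-packages/maven")
// Then load the coordinate exactly as listed on the package page.
z.load("databricks:spark-corenlp:0.1")
```

[Note that %dep paragraphs only take effect if they run before the Spark interpreter starts, so the interpreter may need a restart first.]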

Re: Import jars from Spark packages

Posted by DuyHai Doan <do...@gmail.com>.
I can't find the package "databricks:spark-corenlp:0.1" on Maven Central
either. Normally, when you specify dependencies in the Zeppelin interpreter
settings, Zeppelin looks them up in your local Maven repository and in
Maven Central.

I suspect this package is only available in Databricks' own repository,
whose URL I don't know. If you find that URL, you can add it as a
repository in Zeppelin so the jar can be downloaded from there.
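[Editor's note: outside Zeppelin, the same idea can be sketched with spark-shell's --repositories flag. The repository URL below is an assumption taken from the spark-packages.org site, not a Databricks-hosted repository, and the command is illustrative rather than verified against this setup.]

```
# Sketch: resolve a Spark Packages coordinate by pointing the resolver
# at the spark-packages repository (URL assumed, not verified here).
spark-shell --repositories http://dl.bintray.com/spark-packages/maven \
            --packages databricks:spark-corenlp:0.1
```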

On Sat, Jul 30, 2016 at 8:33 PM, Andrea Santurbano <sa...@gmail.com>
wrote:

> Here is the log[1]
> I have a standard zeppelin configuration:
> zeppelin.dep.additionalRemoteRepository=spark-packages,
> http://dl.bintray.com/spark-packages/maven,false;
>
>
> [1] https://gist.github.com/conker84/d2ad350850f39022e594825b6fda980e
>
> Il giorno sab 30 lug 2016 alle ore 15:14 DuyHai Doan <do...@gmail.com>
> ha scritto:
>
>> What is exactly the error message you have in the logs ?
>>
>> On Sat, Jul 30, 2016 at 2:58 PM, Andrea Santurbano <sa...@gmail.com>
>> wrote:
>>
>>> Hi all,
>>> i want to import this library:
>>> https://github.com/databricks/spark-corenlp
>>> which is under spark packages:
>>> https://spark-packages.org/package/databricks/spark-corenlp
>>> If in my interpreter settings, in artifact section i insert:
>>> databricks:spark-corenlp:0.1
>>> or
>>> com.databricks:spark-corenlp:0.1
>>>
>>> no package is found.
>>> Where am i wrong?
>>>
>>> Thanks
>>> Andrea
>>>
>>
>>

Re: Import jars from Spark packages

Posted by Andrea Santurbano <sa...@gmail.com>.
Here is the log [1].
I have a standard Zeppelin configuration:
zeppelin.dep.additionalRemoteRepository=spark-packages,http://dl.bintray.com/spark-packages/maven,false;


[1] https://gist.github.com/conker84/d2ad350850f39022e594825b6fda980e
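[Editor's note: for context, the `zeppelin.dep.additionalRemoteRepository` property holds semicolon-terminated entries of the form `id,url,isSnapshot`, so the value above is a single entry that adds the spark-packages repository alongside Maven Central. The field meanings annotated below are an interpretation of that layout, not output from this configuration.]

```
# zeppelin.dep.additionalRemoteRepository entry format: id,url,isSnapshot;
#   id         -> spark-packages   (arbitrary repository name)
#   url        -> http://dl.bintray.com/spark-packages/maven
#   isSnapshot -> false            (not a snapshot repository)
```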


Re: Import jars from Spark packages

Posted by DuyHai Doan <do...@gmail.com>.
What exactly is the error message in the logs?
