Posted to user@livy.apache.org by Daniel Seybold <da...@uni-ulm.de> on 2018/09/04 15:36:24 UTC

How to deploy generic Spark applications via Livy using the Java client

Hi guys,

I'd like to use Livy Server and its Java client to deploy generic Spark 
applications by integrating the Java client into a custom orchestration 
engine.

After going through the docs and experimenting with the code, I am not sure 
whether this is already possible with Livy. Consider the following example:

The orchestration engine can receive generic Spark binaries and 
additional input parameters, which should be executed programmatically 
on a Spark cluster (with the Livy Server).

Yet, according to the PiJob example 
<https://livy.incubator.apache.org/docs/latest/programmatic-api.html>, 
it seems that I need to wrap the code of any Spark application in order 
to submit it via the Java client?

Hence, the following code snippet would not work:

LivyClient client = new LivyClientBuilder()
     .setURI(new URI("http://IP:8998")).build();

client.uploadJar(new File("generic-spark-app.jar"));

Yet, it seems that this kind of execution would be possible by using the 
REST API directly (and not the Java client)?

Thanks for any advice!

Cheers,
Daniel

Re: How to deploy generic Spark applications via Livy using the Java client

Posted by Marcelo Vanzin <va...@cloudera.com>.
If your "orchestrator engine" is receiving pre-built apps from others
and needs to execute them in the cluster, you could just use Livy's
batch API. I don't think there are Java bindings for that, you'd need
to talk to the REST endpoints directly.

The code you're referring to is for "interactive" sessions, where you
can send closures or even code snippets to be executed in Spark.
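For reference, a batch submission against Livy's REST API (POST /batches)
could look roughly like the sketch below. It only uses the JDK's built-in
HTTP client; the server URL ("http://IP:8998"), jar path, and class name are
placeholders, not values from this thread, and the JSON is built by hand here
purely to keep the sketch dependency-free (a real orchestrator would use a
JSON library). The actual POST is guarded behind a LIVY_URL environment
variable so the sketch runs without a live server.

```java
import java.io.File;
import java.net.URI;
import java.net.http.HttpClient;
import java.net.http.HttpRequest;
import java.net.http.HttpResponse;

public class LivyBatchSubmit {

    // Build the JSON body for Livy's POST /batches endpoint:
    // "file" is the application jar, "className" its main class,
    // "args" the extra input parameters the orchestrator received.
    static String batchPayload(String jarPath, String className, String... args) {
        StringBuilder sb = new StringBuilder();
        sb.append("{\"file\": \"").append(jarPath).append("\"");
        sb.append(", \"className\": \"").append(className).append("\"");
        if (args.length > 0) {
            sb.append(", \"args\": [");
            for (int i = 0; i < args.length; i++) {
                if (i > 0) sb.append(", ");
                sb.append("\"").append(args[i]).append("\"");
            }
            sb.append("]");
        }
        sb.append("}");
        return sb.toString();
    }

    public static void main(String[] args) throws Exception {
        // Placeholder jar path and class name.
        String payload = batchPayload(
                new File("generic-spark-app.jar").getPath(),
                "com.example.Main", "input1", "input2");
        System.out.println(payload);

        // Only attempt the POST when a Livy server URL is provided,
        // e.g. LIVY_URL=http://IP:8998
        String livyUrl = System.getenv("LIVY_URL");
        if (livyUrl != null) {
            HttpRequest request = HttpRequest.newBuilder()
                    .uri(URI.create(livyUrl + "/batches"))
                    .header("Content-Type", "application/json")
                    .POST(HttpRequest.BodyPublishers.ofString(payload))
                    .build();
            HttpResponse<String> response = HttpClient.newHttpClient()
                    .send(request, HttpResponse.BodyHandlers.ofString());
            // On success Livy replies with the batch session metadata (id, state, ...).
            System.out.println(response.body());
        }
    }
}
```

The batch id returned in the response can then be polled via GET
/batches/{id} to track the application's state.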

On Tue, Sep 4, 2018 at 8:37 AM Daniel Seybold <da...@uni-ulm.de> wrote:
>
> Hi guys,
>
> I'd like to use Livy Server and its Java client to deploy generic Spark applications by integrating the Java client into a custom orchestration engine.
>
> After going through the docs and experimenting the code I am not sure if this is already possible with Livy, see the following example:
>
> The orchestration engine can reveive generic Spark binaries and additional input parameters, which should be executed programmitcally at a Spark Cluster (with the Livy Server).
>
> Yet, according to the PiJob example it seems that I need to wrap the code of any Spark application to submit it via the Java client?
>
> Hence, the following code snippet would not work:
>
> LivyClient client = new LivyClientBuilder()
>     .setURI(new URI("http://IP:8998")).build();
>
> client.uploadJar(new File("genberic-spark-app.jar"));
>
> Yet, it seems that this kind of execution would be possible by using the REST-API directly (and not the Java Client)?
>
> Thanks for any advice!
>
> Cheers,
> Daniel



-- 
Marcelo