Posted to user@spark.apache.org by Mahender Sarangam <Ma...@outlook.com> on 2016/10/26 08:35:44 UTC

Any Dynamic Compilation of Scala Query

Hi,

Is there any way to dynamically execute a string which has Scala code
against the Spark engine? We are dynamically creating a Scala file and
would like to submit it to Spark, but currently Spark accepts only a
JAR file as input for remote job submission. Is there any other way to
submit a .scala file instead of a .jar to Spark's REST API?
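(For context, compiling and running a Scala code string inside the same JVM is possible with Scala 2's reflection ToolBox. The sketch below assumes scala-compiler and scala-reflect are on the classpath; it is an in-process workaround only and does not by itself solve remote REST submission.)

```scala
// Compile and evaluate a Scala code string at runtime, in-process.
// Assumes a Scala 2 distribution with scala-compiler on the classpath.
import scala.reflect.runtime.currentMirror
import scala.tools.reflect.ToolBox

val toolbox = currentMirror.mkToolBox()

// Parse the string into an AST, then compile and run it.
val result = toolbox.eval(toolbox.parse("(1 to 10).sum")).asInstanceOf[Int]
println(result)  // 55
```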

/MS


Re: Any Dynamic Compilation of Scala Query

Posted by Vadim Semenov <va...@datadoghq.com>.
You can use Cloudera Livy for that https://github.com/cloudera/livy
take a look at this example https://github.com/cloudera/livy#spark-example
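(To make the Livy flow concrete: per the README linked above, you create an interactive session and then POST Scala snippets to it as JSON statements. The sketch below only builds the statement payload; the server host, port 8998, and session id 0 in the comments are illustrative assumptions.)

```scala
// Livy REST flow (assuming a Livy server on the default port 8998):
//   1. POST /sessions                  {"kind": "spark"}      -> create session
//   2. POST /sessions/{id}/statements  {"code": "<scala>"}    -> run a snippet
//   3. GET  /sessions/{id}/statements  -> poll for the result
// Here we only build the JSON body for step 2.

def livyStatementPayload(scalaCode: String): String = {
  // Minimal JSON escaping for backslashes and quotes in the submitted code.
  val escaped = scalaCode.replace("\\", "\\\\").replace("\"", "\\\"")
  s"""{"code": "$escaped"}"""
}

val payload = livyStatementPayload("sc.parallelize(1 to 100).sum()")
println(payload)
// Then, e.g.:
//   curl -H "Content-Type: application/json" -d '<payload>' \
//        http://livy-host:8998/sessions/0/statements
```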

On Wed, Oct 26, 2016 at 4:35 AM, Mahender Sarangam <
Mahender.BigData@outlook.com> wrote:

> Hi,
>
> Is there any way to dynamically execute a string which has Scala code
> against the Spark engine? We are dynamically creating a Scala file and
> would like to submit it to Spark, but currently Spark accepts only a
> JAR file as input for remote job submission. Is there any other way to
> submit a .scala file instead of a .jar to Spark's REST API?
>
> /MS
>
>

Re: Any Dynamic Compilation of Scala Query

Posted by Mahender Sarangam <Ma...@outlook.com>.
Hi Kiran,

Thanks for responding. We would like to know how the industry handles scenarios like UPDATE in Spark. Here is our scenario, Manjunath: we are in the process of migrating our SQL Server data to Spark. Our logic lives in stored procedures, where we dynamically build a SQL string and execute it (dynamic SQL); we would like to build a dynamic string the same way, submit it to the HiveContext, and execute it.

Here is the query in SQL

UPDATE  Table1
SET     X = A,
        Y = B
FROM    Table1
WHERE   ISNULL([Z], '') <> ''
  AND   [ColumnW] NOT IN ('X', 'ACD', 'A', 'B', 'C')
  AND   [ColumnA] IS NULL


We would like to convert this using Spark SQL. The other way I can think of is using a DataFrame with withColumn together with a WHEN condition for each column (X and Y); there, the same repetitive condition from the WHERE clause above would be applied for each column. I would like to know the industry practice for this kind of scenario.
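(A sketch of what that translation could look like. The DataFrame calls in the comments assume a Spark 2.x-style API and are illustrative; the runnable part below models the same row-wise semantics in plain Scala so the logic is concrete.)

```scala
// The UPDATE only rewrites X and Y on rows matching the WHERE clause.
// In DataFrame terms that is withColumn + when/otherwise, e.g. (API sketch):
//
//   import org.apache.spark.sql.functions.{when, col, coalesce, lit}
//   val cond = (coalesce(col("Z"), lit("")) =!= "") &&
//              !col("ColumnW").isin("X", "ACD", "A", "B", "C") &&
//              col("ColumnA").isNull
//   val result = table1
//     .withColumn("X", when(cond, col("A")).otherwise(col("X")))
//     .withColumn("Y", when(cond, col("B")).otherwise(col("Y")))
//
// The same row logic in plain Scala:
case class Rec(x: String, y: String, a: String, b: String,
               z: Option[String], columnW: String, columnA: Option[String])

def condition(r: Rec): Boolean =
  r.z.getOrElse("").nonEmpty &&                          // ISNULL([Z],'') <> ''
  !Set("X", "ACD", "A", "B", "C").contains(r.columnW) && // [ColumnW] NOT IN (...)
  r.columnA.isEmpty                                      // [ColumnA] IS NULL

def update(r: Rec): Rec =
  if (condition(r)) r.copy(x = r.a, y = r.b) else r

val rows = Seq(
  Rec("x1", "y1", "a1", "b1", Some("z"), "OTHER", None), // matches: updated
  Rec("x2", "y2", "a2", "b2", None,      "OTHER", None)  // Z empty: untouched
)
val updated = rows.map(update)
```

This keeps the shared condition in one place, so the repetitive per-column WHEN logic is written once and reused for X and Y.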

On 10/26/2016 4:09 AM, Manjunath, Kiran wrote:
Hi,

Can you elaborate, with a sample example, on why you would want to do this?
Ideally there would be a better approach to such problems than the one described below.

A sample example would help in understanding the problem.

Regards,
Kiran

From: Mahender Sarangam <Ma...@outlook.com>
Date: Wednesday, October 26, 2016 at 2:05 PM
To: user <us...@spark.apache.org>
Subject: Any Dynamic Compilation of Scala Query

Hi,

Is there any way to dynamically execute a string which has Scala code
against the Spark engine? We are dynamically creating a Scala file and
would like to submit it to Spark, but currently Spark accepts only a
JAR file as input for remote job submission. Is there any other way to
submit a .scala file instead of a .jar to Spark's REST API?

/MS




Re: Any Dynamic Compilation of Scala Query

Posted by "Manjunath, Kiran" <ki...@akamai.com>.
Hi,

Can you elaborate, with a sample example, on why you would want to do this?
Ideally there would be a better approach to such problems than the one described below.

A sample example would help in understanding the problem.

Regards,
Kiran

From: Mahender Sarangam <Ma...@outlook.com>
Date: Wednesday, October 26, 2016 at 2:05 PM
To: user <us...@spark.apache.org>
Subject: Any Dynamic Compilation of Scala Query

Hi,

Is there any way to dynamically execute a string which has Scala code
against the Spark engine? We are dynamically creating a Scala file and
would like to submit it to Spark, but currently Spark accepts only a
JAR file as input for remote job submission. Is there any other way to
submit a .scala file instead of a .jar to Spark's REST API?

/MS