Posted to user@spark.apache.org by Andrew Or <an...@databricks.com> on 2014/06/12 18:47:18 UTC

Re: use spark-shell in the source

Not sure if this is what you're looking for, but have you looked at Java's
ProcessBuilder? You can do something like

for (line <- lines) {
  val command = line.split(" ") // you may need to deal with quoted strings
  // ProcessBuilder takes varargs (or a java.util.List), so splat the array
  val builder = new ProcessBuilder(command: _*)
  builder.inheritIO() // redirect the child's stdin/stdout/stderr to this JVM's
  val process = builder.start()
  process.waitFor() // block until the command finishes
}
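
For what it's worth, if you only need to shell out and capture each command's
output as a string (rather than wire up streams by hand), Scala's standard
scala.sys.process package is a lighter-weight wrapper over ProcessBuilder.
A small sketch, with hypothetical echo commands standing in for your lines:

```scala
import scala.sys.process._

object RunLines {
  def main(args: Array[String]): Unit = {
    val lines = Seq("echo hello", "echo world") // hypothetical commands

    for (line <- lines) {
      // .!! runs the command and returns its stdout as a String
      // (it throws an exception if the exit code is nonzero)
      val output = line.split(" ").toSeq.!!
      print(output)
    }
  }
}
```

Note this still launches external processes; it does not interpret Scala code.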

Or are you trying to launch an interactive REPL in the middle of your
application?


2014-06-11 22:56 GMT-07:00 JaeBoo Jung <it...@samsung.com>:

>  Hi all,
>
>
>
> Can I use spark-shell programmatically in my Spark application (in Java or
> Scala)?
>
> I want to convert Scala lines to a string array and run them automatically
> in my application.
>
> For example,
>
>     for (line <- lines) {
>
>         // run this line in spark-shell style and get outputs
>
>         run(line)
>
>     }
>
> Thanks
>
> _____________________________________________
>
> *JaeBoo, Jung*
> Assistant Engineer / BDA Lab / Samsung SDS
>

Re: use spark-shell in the source

Posted by Kevin Jung <it...@samsung.com>.
Thanks for the answer.
Yes, I'm trying to launch an interactive REPL in the middle of my application
:)
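
In that case, one possible direction is embedding the Scala interpreter (the
engine underneath spark-shell) directly. A minimal sketch, assuming a
Scala 2.10/2.11-era compiler on the classpath (matching Spark of this era;
the IMain constructor signature changed in later Scala versions):

```scala
import scala.tools.nsc.Settings
import scala.tools.nsc.interpreter.{IMain, Results}

object EmbeddedRepl {
  def main(args: Array[String]): Unit = {
    val settings = new Settings
    settings.usejavacp.value = true // reuse this application's classpath

    val interpreter = new IMain(settings)

    // Hypothetical input: the lines you would otherwise type into the shell
    val lines = Seq(
      "val x = 1 + 1",
      "println(x * 2)"
    )

    for (line <- lines) {
      interpreter.interpret(line) match {
        case Results.Success    => // line compiled and ran; output went to stdout
        case Results.Error      => println(s"error in: $line")
        case Results.Incomplete => println(s"incomplete statement: $line")
      }
    }
    interpreter.close()
  }
}
```

To make a SparkContext visible inside the interpreted lines, you could bind
it in with something like interpreter.bind("sc",
"org.apache.spark.SparkContext", sc) — though spark-shell itself does extra
setup (class server, REPL class loading) beyond this sketch, so treat it as a
starting point rather than a drop-in replacement.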




--
View this message in context: http://apache-spark-user-list.1001560.n3.nabble.com/use-spark-shell-in-the-source-tp7453p7539.html