Posted to user@spark.apache.org by fernandrez1987 <an...@wellsfargo.com> on 2016/01/26 15:47:34 UTC

RE: how to correctly run scala script using spark-shell through stdin (spark v1.0.0)

spark-shell -i file.scala is not working for me in Spark 1.6.0. Was this
removed, or is there something I need to take into account? The script does
not run at all. What could be happening?
<http://apache-spark-user-list.1001560.n3.nabble.com/file/n26071/script.png> 

<http://apache-spark-user-list.1001560.n3.nabble.com/file/n26071/shell-call.png> 

<http://apache-spark-user-list.1001560.n3.nabble.com/file/n26071/no-println.png> 



--
View this message in context: http://apache-spark-user-list.1001560.n3.nabble.com/how-to-correctly-run-scala-script-using-spark-shell-through-stdin-spark-v1-0-0-tp12972p26071.html
Sent from the Apache Spark User List mailing list archive at Nabble.com.

---------------------------------------------------------------------
To unsubscribe, e-mail: user-unsubscribe@spark.apache.org
For additional commands, e-mail: user-help@spark.apache.org


RE: how to correctly run scala script using spark-shell through stdin (spark v1.0.0)

Posted by An...@wellsfargo.com.
True, thank you. Is there a way to keep the shell open afterwards (i.e., to avoid the implicit :quit)? Thank you both.

Andres

From: Ewan Leith [mailto:ewan.leith@realitymine.com]
Sent: Tuesday, January 26, 2016 1:50 PM
To: Iulian Dragoș; Fernandez, Andres
Cc: user
Subject: RE: how to correctly run scala script using spark-shell through stdin (spark v1.0.0)

I’ve just tried running this using a normal stdin redirect:

~/spark/bin/spark-shell < simple.scala

That worked: it started spark-shell, executed the script, then stopped the shell.

Thanks,
Ewan
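
(For reference, a minimal simple.scala for this kind of stdin test could look like the sketch below; the contents are assumed here, not taken from the thread. Note that spark-shell already provides the SparkContext as sc, so the script can use it directly.)

// simple.scala -- assumed example contents for the stdin test above.
// spark-shell creates the SparkContext (sc) on startup, so the script
// only needs the job logic itself.
val data = sc.parallelize(1 to 100)
println("count = " + data.count())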



Re: how to correctly run scala script using spark-shell through stdin (spark v1.0.0)

Posted by Iulian Dragoș <iu...@typesafe.com>.
On Fri, Jan 29, 2016 at 5:22 PM, Iulian Dragoș <iu...@typesafe.com>
wrote:

> I found the issue in the 2.11 version of the REPL; a PR will follow shortly.
>


https://github.com/apache/spark/pull/10984



>
> The 2.10 version of Spark doesn't have this issue, so you could use that
> in the meantime.
>
> iulian


-- 

--
Iulian Dragos

------
Reactive Apps on the JVM
www.typesafe.com

Re: how to correctly run scala script using spark-shell through stdin (spark v1.0.0)

Posted by Iulian Dragoș <iu...@typesafe.com>.
I found the issue in the 2.11 version of the REPL; a PR will follow shortly.

The 2.10 version of Spark doesn't have this issue, so you could use that in
the meantime.

iulian
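
(For what it's worth, you can check which Scala version a given spark-shell build was compiled against from inside the REPL itself; this uses only the Scala standard library, nothing Spark-specific:)

scala> scala.util.Properties.versionString
// prints something like "version 2.10.5" or "version 2.11.7",
// depending on which Spark build you are running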

On Wed, Jan 27, 2016 at 3:17 PM, <An...@wellsfargo.com> wrote:

> So far I still cannot find a way to run a small Scala script right after
> the shell starts and then have the shell remain open. Is there a way to do
> this?
>
> It feels like a simple, naive question, but I really couldn't find an answer.



-- 

--
Iulian Dragos

------
Reactive Apps on the JVM
www.typesafe.com

RE: how to correctly run scala script using spark-shell through stdin (spark v1.0.0)

Posted by An...@wellsfargo.com.
So far I still cannot find a way to run a small Scala script right after the shell starts and then have the shell remain open. Is there a way to do this?
It feels like a simple, naive question, but I really couldn't find an answer.
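
(A common shell-level workaround, assuming a Unix-like shell: concatenate the script with stdin itself, so that once the script has been consumed the REPL keeps reading from the terminal. A sketch, with init.scala standing in for whatever setup file you use:)

cat init.scala - | ~/spark/bin/spark-shell
# the trailing "-" makes cat forward the interactive terminal after the
# file, so the session stays open; line editing may be limited because
# spark-shell's stdin is a pipe rather than a tty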



RE: how to correctly run scala script using spark-shell through stdin (spark v1.0.0)

Posted by Ewan Leith <ew...@realitymine.com>.
I’ve just tried running this using a normal stdin redirect:

~/spark/bin/spark-shell < simple.scala

That worked: it started spark-shell, executed the script, then stopped the shell.

Thanks,
Ewan



Re: how to correctly run scala script using spark-shell through stdin (spark v1.0.0)

Posted by Iulian Dragoș <iu...@typesafe.com>.
On Tue, Jan 26, 2016 at 4:08 PM, <An...@wellsfargo.com> wrote:

> Yes, no option -i. Thanks Iulian, but do you know how I can send three
> lines to be executed just after spark-shell has started? Please check
> http://apache-spark-user-list.1001560.n3.nabble.com/how-to-correctly-run-scala-script-using-spark-shell-through-stdin-spark-v1-0-0-td12972.html#a26071
> .
>

To be honest, I think this might be a regression I introduced; at the very
least, it's something that works in the 2.10 version of Spark. Just by
looking at the code, it should accept the same arguments as the Scala
interpreter. I'll look into it.

iulian


>
>
> Thank you very much for your time.
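
(For comparison, the plain Scala REPL's -i option preloads a file and then stays at the interactive prompt. So if spark-shell forwarded its options to the underlying REPL unchanged, the following would both run a script and keep the shell open -- init.scala is a hypothetical file name:)

scala -i init.scala
// loads init.scala, then drops into the usual scala> prompt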



-- 

--
Iulian Dragos

------
Reactive Apps on the JVM
www.typesafe.com

Re: how to correctly run scala script using spark-shell through stdin (spark v1.0.0)

Posted by Iulian Dragoș <iu...@typesafe.com>.
I don’t see -i in the output of spark-shell --help. Moreover, in master I
get an error:

$ bin/spark-shell -i test.scala
bad option: '-i'

iulian



-- 

--
Iulian Dragos

------
Reactive Apps on the JVM
www.typesafe.com