Posted to user@spark.apache.org by Pariksheet Barapatre <pb...@gmail.com> on 2014/03/18 07:37:56 UTC
Running spark examples/scala scripts
Hello all,
I am trying to run the examples that ship with Spark, i.e. those in the examples directory.
[cloudera@aster2 examples]$ ls
bagel ExceptionHandlingTest.scala HdfsTest2.scala
LocalKMeans.scala MultiBroadcastTest.scala SparkHdfsLR.scala
SparkPi.scala
BroadcastTest.scala graphx HdfsTest.scala
LocalLR.scala SimpleSkewedGroupByTest.scala SparkKMeans.scala
SparkTC.scala
CassandraTest.scala GroupByTest.scala LocalALS.scala
LocalPi.scala SkewedGroupByTest.scala SparkLR.scala
DriverSubmissionTest.scala HBaseTest.scala LocalFileLR.scala
LogQuery.scala SparkALS.scala SparkPageRank.scala
I am able to run these examples using the run-example script, but how can I
run them without using the run-example script?
Thanks,
Pari
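[Editor's note: the run-example script essentially builds a classpath containing the examples assembly jar and launches the example's main class with java. A minimal sketch of doing that by hand is below; SPARK_HOME, the jar path, and the master string are assumptions to adjust for your installation.]

```shell
# Rough sketch of what run-example does under the hood. SPARK_HOME and the
# examples jar location are assumptions -- adjust to your installation.
SPARK_HOME="${SPARK_HOME:-/usr/lib/spark}"
EXAMPLES_JAR="$SPARK_HOME/examples/spark-examples-assembly.jar"
CLASS="org.apache.spark.examples.SparkPi"
MASTER="local"

# Build the command and print it for inspection instead of running it blindly.
CMD="java -cp $EXAMPLES_JAR:$SPARK_HOME/conf $CLASS $MASTER"
echo "$CMD"
# Once the jar path is correct for your install, run it with: eval "$CMD"
```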
Re: Running spark examples/scala scripts
Posted by Mayur Rustagi <ma...@gmail.com>.
You have to pick the hadoop-client version that matches your Hadoop
installation, so basically it is going to be your Hadoop version. A mapping
of Hadoop versions to CDH and Hortonworks releases is given on the Spark
website.
Regards
Mayur
Mayur Rustagi
Ph: +1 (760) 203 3257
http://www.sigmoidanalytics.com
@mayur_rustagi <https://twitter.com/mayur_rustagi>
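[Editor's note: one quick way to find which Hadoop version you are on, and hence which hadoop-client to depend on, is the hadoop CLI itself. A small sketch, assuming the CLI may or may not be on PATH:]

```shell
# Print the installed Hadoop version if the CLI is available; the exact
# output format varies by distribution (CDH, HDP, vanilla Apache).
if command -v hadoop >/dev/null 2>&1; then
  HADOOP_VERSION="$(hadoop version | head -n 1)"
else
  HADOOP_VERSION="unknown (hadoop CLI not on PATH)"
fi
echo "$HADOOP_VERSION"
```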
On Wed, Mar 19, 2014 at 2:55 AM, Pariksheet Barapatre
<pb...@gmail.com>wrote:
> :-) Thanks for the suggestion.
>
> I was actually asking how to run Spark scripts as a standalone app. I am
> able to run Java code and Python code as standalone apps.
>
> One more doubt: the documentation says that to read an HDFS file, we need
> to add the dependency
> <dependency>
> <groupId>org.apache.hadoop</groupId>
> <artifactId>hadoop-client</artifactId>
> <version>1.0.1</version>
> </dependency>
>
> How do I find out the HDFS version? I just guessed 1.0.1 and it worked.
>
>
> Next task is to run scala code with sbt.
>
> Cheers
> Pari
> --
> Cheers,
> Pari
>
Re: Running spark examples/scala scripts
Posted by Pariksheet Barapatre <pb...@gmail.com>.
:-) Thanks for the suggestion.
I was actually asking how to run Spark scripts as a standalone app. I am
able to run Java code and Python code as standalone apps.
One more doubt: the documentation says that to read an HDFS file, we need to
add the dependency
<dependency>
<groupId>org.apache.hadoop</groupId>
<artifactId>hadoop-client</artifactId>
<version>1.0.1</version>
</dependency>
How do I find out the HDFS version? I just guessed 1.0.1 and it worked.
Next task is to run scala code with sbt.
Cheers
Pari
On 18 March 2014 22:33, Mayur Rustagi <ma...@gmail.com> wrote:
> print out the last line & run it outside on the shell :)
--
Cheers,
Pari
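[Editor's note: on the "run scala code with sbt" follow-up, a minimal standalone project needs just a build.sbt and one source file. The layout below is a sketch; the project name and the Spark/Scala version numbers are assumptions and should match your cluster (0.9.x-era Spark used Scala 2.10).]

```shell
# Create a minimal sbt project for a standalone Spark app. The version
# numbers below are assumptions -- match spark-core, hadoop-client, and
# scalaVersion to your cluster before building.
mkdir -p simple-app/src/main/scala

cat > simple-app/build.sbt <<'EOF'
name := "Simple App"
version := "1.0"
scalaVersion := "2.10.3"
libraryDependencies += "org.apache.spark" %% "spark-core" % "0.9.0-incubating"
libraryDependencies += "org.apache.hadoop" % "hadoop-client" % "1.0.1"
EOF

cat > simple-app/src/main/scala/SimpleApp.scala <<'EOF'
import org.apache.spark.SparkContext

object SimpleApp {
  def main(args: Array[String]) {
    // Local-mode context, so the app runs without a cluster.
    val sc = new SparkContext("local", "Simple App")
    println("count = " + sc.parallelize(1 to 100).count())
    sc.stop()
  }
}
EOF

# Then build and run it with: cd simple-app && sbt run
```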
Re: Running spark examples/scala scripts
Posted by Mayur Rustagi <ma...@gmail.com>.
print out the last line & run it outside on the shell :)
Mayur Rustagi
Ph: +1 (760) 203 3257
http://www.sigmoidanalytics.com
@mayur_rustagi <https://twitter.com/mayur_rustagi>
On Tue, Mar 18, 2014 at 2:37 AM, Pariksheet Barapatre
<pb...@gmail.com>wrote: