Posted to user@spark.apache.org by Exie <tf...@prodevelop.com.au> on 2015/07/01 05:40:51 UTC

Re: Spark run errors on Raspberry Pi

FWIW, I had some trouble getting Spark running on a Pi.

My core problem was Snappy compression: it ships as a pre-built native
binary for i386, and I couldn't find one for ARM.

The workaround was to switch the compression codec to LZO, after which
everything worked.

Off the top of my head, it was something like:
spark.sql.parquet.compression.codec=lzo

This might be worth trying.
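For reference, here is one way to apply that setting at runtime. This is a
minimal sketch, assuming a Spark 1.x spark-shell where sqlContext is already
defined; the DataFrame name "df" and the output path are illustrative, not
from the original post:

    // Switch Parquet writes from the default snappy codec to LZO,
    // so no native snappy library is needed on ARM.
    sqlContext.setConf("spark.sql.parquet.compression.codec", "lzo")

    // Any Parquet output written afterwards uses LZO:
    df.write.parquet("/tmp/people.parquet")

The same property can also be set once in conf/spark-defaults.conf, or passed
on the command line to spark-shell/spark-submit with
--conf spark.sql.parquet.compression.codec=lzo.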



--
View this message in context: http://apache-spark-user-list.1001560.n3.nabble.com/Spark-run-errors-on-Raspberry-Pi-tp23532p23561.html
Sent from the Apache Spark User List mailing list archive at Nabble.com.

---------------------------------------------------------------------
To unsubscribe, e-mail: user-unsubscribe@spark.apache.org
For additional commands, e-mail: user-help@spark.apache.org


Re: Spark run errors on Raspberry Pi

Posted by Akhil Das <ak...@sigmoidanalytics.com>.
Now I'm feeling a strange urge to try this on KBOX
<http://kevinboone.net/kbox.html> :/

Thanks
Best Regards

On Wed, Jul 1, 2015 at 9:10 AM, Exie <tf...@prodevelop.com.au> wrote:

> FWIW, I had some trouble getting Spark running on a Pi.
> [...]
> Off the top of my head, it was something like:
> spark.sql.parquet.compression.codec=lzo
>
> This might be worth trying.