Posted to dev@systemml.apache.org by Krishna Kalyan <kr...@gmail.com> on 2017/07/17 06:22:23 UTC

spark hybrid mode on HDFS

Hello All,
I have some questions about running SystemML scripts on HDFS (in
hybrid_spark execution mode).

My Current Configuration:
Standalone HDFS on OSX (version 2.8)
and Spark pre-built for Hadoop 2.7 (version 2.1.0)

*jps* output from my system:
[image: Inline image 1]


Both of them have been installed separately.
As far as I understand, to enable HDFS support we need to run Spark with
its master set to yarn-client or yarn-cluster. (Is this understanding
correct?)

My question:
I don't have access to a cluster. Is there a way to set up yarn-client /
yarn-cluster on my local system so that I can run SystemML scripts in
hybrid_spark mode with HDFS? If yes, could you please point me to some
documentation?
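To clarify what I am after, I imagine a single-node (pseudo-distributed) setup would look roughly like the following. This is untested on my machine; the property name comes from the Hadoop docs, but the paths, port, and HDFS file locations are guesses on my part:

```shell
# In $HADOOP_HOME/etc/hadoop/yarn-site.xml (assumed minimal single-node config):
# <configuration>
#   <property>
#     <name>yarn.nodemanager.aux-services</name>
#     <value>mapreduce_shuffle</value>
#   </property>
# </configuration>

# Start the HDFS and YARN daemons on the one node
start-dfs.sh
start-yarn.sh

# Point Spark at the Hadoop config so it can locate YARN and HDFS
export HADOOP_CONF_DIR=$HADOOP_HOME/etc/hadoop

# Submit a SystemML script in hybrid_spark mode, reading input from HDFS
# (jar/script names and the hdfs:// path are placeholders)
spark-submit --master yarn --deploy-mode client \
  SystemML.jar -f myscript.dml -exec hybrid_spark \
  -nvargs X=hdfs://localhost:9000/user/krishna/X.csv
```

Is this the right general shape, or is there a simpler way to get hybrid_spark talking to HDFS locally?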

Thank you so much,
Krishna


PS: the console output of what I have already tried is attached below.