Posted to user@spark.apache.org by aaronjosephs <aa...@placeiq.com> on 2014/11/11 20:00:25 UTC

Re: How to execute a function from class in distributed jar on each worker node?

I'm not sure that this will work, but it makes sense to me: you write the
initialization in a static block in a class, and reference that class from
your tasks so it gets loaded (and initialized) on each worker JVM. Not sure
what your use case is, but I need to load a native library and want to avoid
running the init in mapPartitions if it's not necessary (just to make the
code look cleaner).
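A minimal sketch of the static-block idea, assuming a hypothetical holder
class (the class and library names here are made up, not from the original
thread). The static initializer runs exactly once per JVM, the first time
the class is loaded, so on a cluster it would fire once per worker JVM when
a task first touches the class:

```java
// Hypothetical holder class: the static block runs once per JVM on first use.
public class NativeLibHolder {
    static {
        // One-time per-JVM init goes here, e.g. loading a native library:
        // System.loadLibrary("mynative");  // hypothetical library name
        System.out.println("initialized");
    }

    // Any method call forces class loading, and therefore the static block.
    public static int transform(int x) {
        return x + 1;
    }

    public static void main(String[] args) {
        // First reference triggers the static block; later calls reuse it.
        int a = NativeLibHolder.transform(41);
        int b = NativeLibHolder.transform(a);
        System.out.println(a + " " + b);
    }
}
```

Calling NativeLibHolder.transform inside a map or mapPartitions closure
would have the same effect: the class loads lazily on each executor, so no
explicit per-partition init call is needed.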



--
View this message in context: http://apache-spark-user-list.1001560.n3.nabble.com/How-to-execute-a-function-from-class-in-distributed-jar-on-each-worker-node-tp3870p18611.html
Sent from the Apache Spark User List mailing list archive at Nabble.com.

---------------------------------------------------------------------
To unsubscribe, e-mail: user-unsubscribe@spark.apache.org
For additional commands, e-mail: user-help@spark.apache.org