Posted to user@spark.apache.org by RuiyangChen <rc...@illinois.edu> on 2018/11/01 02:30:50 UTC

Rack Awareness in Spark

Hello everyone,
Is there a way to specify rack awareness in Spark? For example, if I want
to use aggregateByKey, is there a way to let Spark aggregate within the
same rack first and then aggregate between racks? I'm interested in this
because I am trying to figure out whether there is a way to cope with a
slow inter-rack network; roughly, what I have in mind is the two-stage
pattern sketched below.
I have searched through the mailing list and StackOverflow, but everything
I found is about rack awareness in HDFS rather than Spark.
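
To make the question concrete, here is a minimal sketch of the dataflow I
am imagining, not a claim about how Spark works. The rackOf lookup and the
TwoStageAggSketch object are made up for illustration; on a real cluster
rackOf would have to read the same topology mapping HDFS uses, and I
realize nothing here forces the stage-1 tasks to actually run on the rack
they aggregate for.

import java.net.InetAddress
import org.apache.spark.{SparkConf, SparkContext}

object TwoStageAggSketch {
  // Hypothetical host -> rack lookup. On a real cluster this would read
  // the same topology mapping HDFS uses; hard-coded purely for illustration.
  def rackOf(host: String): String =
    if (host.hashCode % 2 == 0) "rack-a" else "rack-b"

  def main(args: Array[String]): Unit = {
    val sc = new SparkContext(
      new SparkConf().setAppName("two-stage-agg-sketch").setMaster("local[*]"))
    val data = sc.parallelize(Seq(("k1", 1), ("k1", 2), ("k2", 3), ("k2", 4)))

    // Stage 1: tag each record with the rack of the executor holding it,
    // then aggregate per (rack, key) so each rack yields one partial sum
    // per key. (In local mode every partition maps to the same rack.)
    val perRack = data
      .mapPartitions { it =>
        val rack = rackOf(InetAddress.getLocalHost.getHostName)
        it.map { case (k, v) => ((rack, k), v) }
      }
      .aggregateByKey(0)(_ + _, _ + _)

    // Stage 2: drop the rack component and combine the per-rack partials;
    // only one small record per key per rack crosses racks at this point.
    val total = perRack
      .map { case ((_, k), partial) => (k, partial) }
      .aggregateByKey(0)(_ + _, _ + _)

    total.collect().foreach(println)
    sc.stop()
  }
}

Even if this bounds the second shuffle at one record per key per rack, the
stage-1 shuffle can still cross racks unless the scheduler is rack-aware,
which is exactly why I am asking whether Spark exposes the topology.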
Thanks a lot!

Ruiyang



--
Sent from: http://apache-spark-user-list.1001560.n3.nabble.com/

---------------------------------------------------------------------
To unsubscribe e-mail: user-unsubscribe@spark.apache.org