Posted to user@spark.apache.org by Lee Ho Yeung <jo...@gmail.com> on 2016/06/15 10:02:55 UTC

Can Spark help to prevent a memory error for itertools.combinations(initlist, 2) in a Python script?

I wrote a Python script that uses itertools.combinations(initlist, 2),

but it runs out of memory when the number of elements in initlist exceeds 14,000.
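
(For context: itertools.combinations is itself a lazy generator, so the memory error usually comes from materializing all pairs at once, e.g. with list(...); with 14,000 elements that is roughly 98 million pairs. A minimal sketch of the streaming pattern, assuming the original script does some per-pair work; the process function and the placeholder data here are hypothetical:

    import itertools

    def process(a, b):
        # hypothetical per-pair work standing in for the original script
        pass

    initlist = list(range(14000))  # placeholder data

    # Iterating the generator directly keeps memory flat; the blow-up only
    # happens if all ~98 million pairs are collected into a list at once.
    for a, b in itertools.combinations(initlist, 2):
        process(a, b)
)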

Is it possible to use Spark to do this work?

I have seen that Yatel can do this. Do Spark and Yatel use the hard disk as memory?

If so, what would I need to change in the Python code?
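
(One possible direction: Spark distributes the work across partitions and can spill intermediate data to disk rather than holding everything in one process's memory. A minimal PySpark sketch, assuming the goal is just to reproduce the unordered pairs from itertools.combinations(initlist, 2); initlist, the app name, and numSlices are placeholders, not part of the original script:

    from pyspark import SparkContext

    sc = SparkContext(appName="pairwise-combinations")

    initlist = list(range(14000))  # placeholder data

    # Index each element so keeping only (i, j) with i < j yields each
    # unordered pair exactly once, matching itertools.combinations(initlist, 2).
    indexed = sc.parallelize(list(enumerate(initlist)), numSlices=200)

    pairs = (indexed.cartesian(indexed)
                    .filter(lambda ab: ab[0][0] < ab[1][0])
                    .map(lambda ab: (ab[0][1], ab[1][1])))

    # Evaluation is lazy and partitioned, so the pairs are never all held in
    # the driver's memory at once; counting them is just a demonstration.
    print(pairs.count())

Note that cartesian produces n^2 candidate pairs before filtering, so for very large inputs it may be worth broadcasting the list and generating pairs per index range instead.)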