Posted to user@spark.apache.org by Harihar Nahak <hn...@wynyardgroup.com> on 2014/11/19 22:29:24 UTC
Can we make EdgeRDD and VertexRDD storage level to MEMORY_AND_DISK?
Hi,
I'm running out of memory when I run a GraphX program on a dataset of more than
10 GB. The same data was handled fine by normal Spark operations when I used
StorageLevel.MEMORY_AND_DISK.
With GraphX I found that only in-memory storage is allowed, because the Graph
constructor sets this property by default. When I tried to change the storage
level to what I need, it was not allowed and threw an error message saying
"Cannot Modify StorageLevel when Its already set".
Please help me with these questions:
1 > How can I override the current storage level to MEMORY_AND_DISK?
2 > If it's not possible through the constructor, what if I modify the
Graph.scala class and rebuild it to make it work? If I do that, is there
anything else I need to know?
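For reference, the failing approach probably looks roughly like the sketch below (the input path, tab-separated edge format, and use of `sc` from a Spark shell are assumptions, not taken from the original post). GraphX materializes and caches the vertex and edge RDDs when the graph is built, so a later persist with a different level is rejected:

```scala
import org.apache.spark.graphx.{Edge, Graph}
import org.apache.spark.rdd.RDD
import org.apache.spark.storage.StorageLevel

// Hypothetical edge input: one "srcId\tdstId\tattr" line per edge.
val edges: RDD[Edge[String]] = sc.textFile("edges.txt").map { line =>
  val fields = line.split("\t")
  Edge(fields(0).toLong, fields(1).toLong, fields(2))
}

// Default constructor: underlying RDDs are cached memory-only.
val graph = Graph.fromEdges(edges, defaultValue = ("", ""))

// Re-persisting an RDD whose storage level is already assigned throws,
// which matches the error the original poster describes.
graph.vertices.persist(StorageLevel.MEMORY_AND_DISK)
```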
Thanks
-----
--Harihar
--
View this message in context: http://apache-spark-user-list.1001560.n3.nabble.com/Can-we-make-EdgeRDD-and-VertexRDD-storage-level-to-MEMORY-AND-DISK-tp19307.html
Sent from the Apache Spark User List mailing list archive at Nabble.com.
---------------------------------------------------------------------
To unsubscribe, e-mail: user-unsubscribe@spark.apache.org
For additional commands, e-mail: user-help@spark.apache.org
Re: Can we make EdgeRDD and VertexRDD storage level to MEMORY_AND_DISK?
Posted by Harihar Nahak <hn...@wynyardgroup.com>.
Just figured it out: using the Graph constructor you can pass the storage level
for both edges and vertices:
Graph.fromEdges(edges, defaultValue = ("", ""), StorageLevel.MEMORY_AND_DISK, StorageLevel.MEMORY_AND_DISK)
Thanks to this post : https://issues.apache.org/jira/browse/SPARK-1991
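Spelled out with the parameter names, the call looks roughly like this (a sketch assuming an existing `edges: RDD[Edge[String]]`; the named-argument form matches the `Graph.fromEdges` signature introduced by SPARK-1991):

```scala
import org.apache.spark.graphx.Graph
import org.apache.spark.storage.StorageLevel

// The 3rd and 4th arguments set the storage level of the underlying
// EdgeRDD and VertexRDD, letting both spill to disk instead of
// failing with OOM on large graphs.
val graph = Graph.fromEdges(
  edges,                             // RDD[Edge[ED]]
  defaultValue = ("", ""),           // attribute for vertices only seen in edges
  edgeStorageLevel = StorageLevel.MEMORY_AND_DISK,
  vertexStorageLevel = StorageLevel.MEMORY_AND_DISK)
```

Graph.fromEdgeTuples and the Graph.apply constructor take the same pair of storage-level parameters.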
-----
--Harihar
--
View this message in context: http://apache-spark-user-list.1001560.n3.nabble.com/Can-we-make-EdgeRDD-and-VertexRDD-storage-level-to-MEMORY-AND-DISK-tp19307p19335.html