Posted to user@spark.apache.org by mharwida <ma...@yahoo.com> on 2014/01/10 17:38:43 UTC

Default Storage Level in Spark

Hi,

I'm using Spark 0.8.0 and Shark 0.8.0. After creating a cached table in
memory with Shark, the Spark UI reports the storage level as 'Disk Memory
Deserialized 1x Replicated'. I was under the impression that MEMORY_ONLY is
the default storage level in Spark. Did that change in 0.8.0? And how can I
change the storage level in Spark?
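
For reference, here is my understanding of how an explicit storage level
is set through the Scala API (a minimal sketch assuming Spark 0.8.0's
org.apache.spark package layout; the RDD contents and app name below are
just placeholders):

import org.apache.spark.SparkContext
import org.apache.spark.storage.StorageLevel

object StorageLevelSketch {
  def main(args: Array[String]) {
    val sc = new SparkContext("local", "StorageLevelSketch")

    // Illustrative RDD. Calling persist() with an explicit StorageLevel
    // overrides the default; rdd.cache() is shorthand for
    // persist(StorageLevel.MEMORY_ONLY).
    val rdd = sc.parallelize(1 to 1000)
    rdd.persist(StorageLevel.MEMORY_ONLY)

    println(rdd.count())  // forces evaluation so the blocks get cached
    sc.stop()
  }
}

If Shark persists its cached tables with MEMORY_AND_DISK rather than
MEMORY_ONLY, that would match the 'Disk Memory Deserialized 1x Replicated'
description shown in the UI, but I haven't found where that is configured.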

Thanks
Majd
