Posted to user@spark.apache.org by Nick Chammas <ni...@gmail.com> on 2017/05/05 13:55:13 UTC

Reading ORC file - fine on 1.6; GC timeout on 2+

I have an ORC file that was generated by a Spark 1.6 program. It opens
fine in Spark 1.6 with 6GB of driver memory, and probably with less.

However, when I try to open the same file in Spark 2.0 or 2.1, I get GC
timeout exceptions, even with 6, 8, or 10GB of driver memory.

This is strange and smells like buggy behavior. How can I debug this or
work around it in Spark 2+?
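(For context, a minimal sketch of how I've been launching the read, in case it helps with debugging. The script name and file path are placeholders; the GC-logging flags are standard JVM options, added here to confirm whether the driver is stuck in full GC cycles rather than genuinely out of memory.)

```shell
# Hypothetical script and path; driver memory matches what was tried above.
# -XX:+PrintGCDetails / -XX:+PrintGCTimeStamps write GC activity to the
# driver log, which should show whether full GCs dominate before the timeout.
spark-submit \
  --driver-memory 10g \
  --conf "spark.driver.extraJavaOptions=-XX:+PrintGCDetails -XX:+PrintGCTimeStamps" \
  read_orc.py /path/to/file.orc
```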

Nick

--
View this message in context: http://apache-spark-user-list.1001560.n3.nabble.com/Reading-ORC-file-fine-on-1-6-GC-timeout-on-2-tp28654.html
Sent from the Apache Spark User List mailing list archive at Nabble.com.