Posted to issues@spark.apache.org by "Andrew Hu Zheng (JIRA)" <ji...@apache.org> on 2017/09/26 21:16:00 UTC

[jira] [Created] (SPARK-22134) StackOverflowError issue when applying large nested UDF calls

Andrew Hu Zheng created SPARK-22134:
---------------------------------------

             Summary: StackOverflowError issue when applying large nested UDF calls
                 Key: SPARK-22134
                 URL: https://issues.apache.org/jira/browse/SPARK-22134
             Project: Spark
          Issue Type: Bug
          Components: Spark Core
    Affects Versions: 2.1.0
         Environment: Spark 2.1.0 on Cloudera CDH 5u8
            Reporter: Andrew Hu Zheng
            Priority: Critical


Spark throws a StackOverflowError whenever UDF calls are nested deeply enough. I have tried increasing the memory, but the same issue still occurs.
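
The definition of u_concat is not included in this report; a minimal sketch, assuming it is an ordinary Scala UDF that concatenates two string columns:

{code:java}
import org.apache.spark.sql.functions.udf

// Assumed definition (not part of the original report): a plain UDF that
// concatenates two string columns.
val u_concat = udf((a: String, b: String) => a + b)
{code}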

Sample code of the nested calls:

{code:java}
val v4 = u_concat(u_concat(u_concat(u_concat(u_concat(u_concat(u_concat(u_concat(u_concat(u_concat(u_concat(u_concat(u_concat(u_concat(u_concat(u_concat(u_concat(u_concat(u_concat(u_concat(u_concat(u_concat(u_concat(u_concat(u_concat(u_concat(u_concat(u_concat(u_concat(u_concat(u_concat(u_concat(u_concat(u_concat(u_concat(u_concat(u_concat(u_concat(u_concat(u_concat(u_concat(u_concat(u_concat(u_concat(u_concat(u_concat(u_concat(u_concat(u_concat(u_concat($"C0_0", $"C0_0"), $"C0_0"), $"C0_0"), $"C0_0"), $"C0_0"), $"C0_0"), $"C0_0"), $"C0_0"), $"C0_0"), $"C0_0"), $"C0_0"), $"C0_0"), $"C0_0"), $"C0_0"), $"C0_0"), $"C0_0"), $"C0_0"), $"C0_0"), $"C0_0"), $"C0_0"), $"C0_0"), $"C0_0"), $"C0_0"), $"C0_0"), $"C0_0"), $"C0_0"), $"C0_0"), $"C0_0"), $"C0_0"), $"C0_0"), $"C0_0"), $"C0_0"), $"C0_0"), $"C0_0"), $"C0_0"), $"C0_0"), $"C0_0"), $"C0_0"), $"C0_0"), $"C0_0"), $"C0_0"), $"C0_0"), $"C0_0"), $"C0_0"), $"C0_0"), $"C0_0"), $"C0_0"), $"C0_0"), $"C0_0"), $"C0_0");
{code}
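
The same expression can also be built programmatically; this sketch (an illustration, not taken from the report) folds the column into itself and produces an equally deep expression tree:

{code:java}
import org.apache.spark.sql.Column
import spark.implicits._ // assumes an active SparkSession named `spark`

// Illustrative equivalent of the hand-written nesting above: each iteration
// wraps the previous expression in one more u_concat call, so the Catalyst
// expression tree grows linearly with `depth` (~50 levels in the example).
val depth = 49
val v4 = (1 to depth).foldLeft($"C0_0": Column) { (expr, _) =>
  u_concat(expr, $"C0_0")
}
{code}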


Stack trace:
{code:java}
java.lang.reflect.InvocationTargetException
	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
	at java.lang.reflect.Method.invoke(Method.java:498)
	at com.informatica.compiler.InfaSparkMain$.main(InfaSparkMain.scala:74)
	at com.informatica.compiler.InfaSparkMain.main(InfaSparkMain.scala)
	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
	at java.lang.reflect.Method.invoke(Method.java:498)
	at org.apache.spark.deploy.yarn.ApplicationMaster$$anon$2.run(ApplicationMaster.scala:637)
Caused by: java.lang.StackOverflowError
	at org.apache.spark.sql.catalyst.trees.TreeNode$$anonfun$5.apply(TreeNode.scala:358)
	at org.apache.spark.sql.catalyst.trees.TreeNode.mapProductIterator(TreeNode.scala:188)
	at org.apache.spark.sql.catalyst.trees.TreeNode.transformChildren(TreeNode.scala:329)
	at org.apache.spark.sql.catalyst.trees.TreeNode.transformUp(TreeNode.scala:307)
	at org.apache.spark.sql.catalyst.trees.TreeNode$$anonfun$4.apply(TreeNode.scala:307)
	at org.apache.spark.sql.catalyst.trees.TreeNode$$anonfun$4.apply(TreeNode.scala:307)
	at org.apache.spark.sql.catalyst.trees.TreeNode$$anonfun$5$$anonfun$apply$11.apply(TreeNode.scala:360)
	at scala.collection.TraversableLike$$anonfun$map$1.apply(TraversableLike.scala:234)
	at scala.collection.TraversableLike$$anonfun$map$1.apply(TraversableLike.scala:234)
	at scala.collection.mutable.ResizableArray$class.foreach(ResizableArray.scala:59)
	at scala.collection.mutable.ArrayBuffer.foreach(ArrayBuffer.scala:48)
	at scala.collection.TraversableLike$class.map(TraversableLike.scala:234)
	at scala.collection.AbstractTraversable.map(Traversable.scala:104)
	at org.apache.spark.sql.catalyst.trees.TreeNode$$anonfun$5.apply(TreeNode.scala:358)
	at org.apache.spark.sql.catalyst.trees.TreeNode.mapProductIterator(TreeNode.scala:188)
	at org.apache.spark.sql.catalyst.trees.TreeNode.transformChildren(TreeNode.scala:329)
	at org.apache.spark.sql.catalyst.trees.TreeNode.transformUp(TreeNode.scala:307)
	at org.apache.spark.sql.catalyst.trees.TreeNode$$anonfun$4.apply(TreeNode.scala:307)
	at org.apache.spark.sql.catalyst.trees.TreeNode$$anonfun$4.apply(TreeNode.scala:307)
	at org.apache.spark.sql.catalyst.trees.TreeNode$$anonfun$5$$anonfun$apply$11.apply(TreeNode.scala:360)
	at scala.collection.TraversableLike$$anonfun$map$1.apply(TraversableLike.scala:234)
	at scala.collection.TraversableLike$$anonfun$map$1.apply(TraversableLike.scala:234)
	at scala.collection.mutable.ResizableArray$class.foreach(ResizableArray.scala:59)
	at scala.collection.mutable.ArrayBuffer.foreach(ArrayBuffer.scala:48)
{code}
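
The repeated TreeNode.transformUp frames indicate the overflow happens while Catalyst recursively transforms the deeply nested expression tree, which is bounded by the JVM thread stack size rather than heap memory. A possible mitigation (an untested assumption, not part of the original report) is to raise the thread stack size at submit time, for example:

{code}
# Illustrative spark-submit options only; the -Xss value is an assumption.
spark-submit \
  --conf "spark.driver.extraJavaOptions=-Xss16m" \
  --conf "spark.executor.extraJavaOptions=-Xss16m" \
  <application jar> [args]
{code}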




