Posted to issues@spark.apache.org by "Rafal Wojdyla (Jira)" <ji...@apache.org> on 2021/12/10 22:39:00 UTC

[jira] [Created] (SPARK-37609) Transient StackOverflowError on DataFrame from Catalyst QueryPlan

Rafal Wojdyla created SPARK-37609:
-------------------------------------

             Summary: Transient StackOverflowError on DataFrame from Catalyst QueryPlan
                 Key: SPARK-37609
                 URL: https://issues.apache.org/jira/browse/SPARK-37609
             Project: Spark
          Issue Type: Bug
          Components: PySpark
    Affects Versions: 3.1.2
            Reporter: Rafal Wojdyla


I sporadically observe a StackOverflowError from Catalyst's QueryPlan for a relatively complicated query; below is a stacktrace from the {{count}} on that DataFrame.  It's a bit troubling because the error is transient: with enough retries (no change to the code, possibly some kind of cache?), I can get the operation to succeed :(

{noformat}
---------------------------------------------------------------------------
Py4JJavaError                             Traceback (most recent call last)
~/miniconda3/envs/tr-dev/lib/python3.9/site-packages/pyspark/sql/dataframe.py in count(self)
    662         2
    663         """
--> 664         return int(self._jdf.count())
    665 
    666     def collect(self):

~/miniconda3/envs/tr-dev/lib/python3.9/site-packages/py4j/java_gateway.py in __call__(self, *args)
   1302 
   1303         answer = self.gateway_client.send_command(command)
-> 1304         return_value = get_return_value(
   1305             answer, self.gateway_client, self.target_id, self.name)
   1306 

~/miniconda3/envs/tr-dev/lib/python3.9/site-packages/pyspark/sql/utils.py in deco(*a, **kw)
    109     def deco(*a, **kw):
    110         try:
--> 111             return f(*a, **kw)
    112         except py4j.protocol.Py4JJavaError as e:
    113             converted = convert_exception(e.java_exception)

~/miniconda3/envs/tr-dev/lib/python3.9/site-packages/py4j/protocol.py in get_return_value(answer, gateway_client, target_id, name)
    324             value = OUTPUT_CONVERTER[type](answer[2:], gateway_client)
    325             if answer[1] == REFERENCE_TYPE:
--> 326                 raise Py4JJavaError(
    327                     "An error occurred while calling {0}{1}{2}.\n".
    328                     format(target_id, ".", name), value)

Py4JJavaError: An error occurred while calling o9123.count.
: java.lang.StackOverflowError
	at org.apache.spark.sql.catalyst.plans.QueryPlan.rewrite$1(QueryPlan.scala:188)
	at org.apache.spark.sql.catalyst.plans.QueryPlan.$anonfun$transformUpWithNewOutput$1(QueryPlan.scala:193)
	at org.apache.spark.sql.catalyst.trees.TreeNode.$anonfun$mapChildren$1(TreeNode.scala:408)
	at org.apache.spark.sql.catalyst.trees.TreeNode.mapProductIterator(TreeNode.scala:244)
	at org.apache.spark.sql.catalyst.trees.TreeNode.mapChildren(TreeNode.scala:406)
	at org.apache.spark.sql.catalyst.trees.TreeNode.mapChildren(TreeNode.scala:359)
	at org.apache.spark.sql.catalyst.plans.QueryPlan.rewrite$1(QueryPlan.scala:192)
	at org.apache.spark.sql.catalyst.plans.QueryPlan.$anonfun$transformUpWithNewOutput$1(QueryPlan.scala:193)
	at org.apache.spark.sql.catalyst.trees.TreeNode.$anonfun$mapChildren$1(TreeNode.scala:408)
	at org.apache.spark.sql.catalyst.trees.TreeNode.mapProductIterator(TreeNode.scala:244)
	at org.apache.spark.sql.catalyst.trees.TreeNode.mapChildren(TreeNode.scala:406)
	at org.apache.spark.sql.catalyst.trees.TreeNode.mapChildren(TreeNode.scala:359)
	at org.apache.spark.sql.catalyst.plans.QueryPlan.rewrite$1(QueryPlan.scala:192)
...
{noformat}
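
For reference, a rough workaround sketch, not a confirmed fix: assuming the overflow is driven by the sheer depth of the analyzed plan (e.g. a DataFrame built up iteratively) rather than a Catalyst bug, two common mitigations are a larger JVM thread stack and truncating the lineage with a checkpoint. The {{-Xss}} value, the checkpoint directory, and the {{build_complicated_df}} helper below are placeholders.

{code:python}
# Rough workaround sketch, not a confirmed fix. Assumption: the overflow comes
# from a very deep analyzed plan; -Xss value and checkpoint dir are placeholders.
from pyspark.sql import SparkSession

spark = (
    SparkSession.builder
    # A larger thread stack only takes effect if set before the JVM starts,
    # e.g. via spark-submit --conf, not on an already-running session.
    .config("spark.driver.extraJavaOptions", "-Xss16m")
    .config("spark.executor.extraJavaOptions", "-Xss16m")
    .getOrCreate()
)

# checkpoint() materializes the data and cuts the logical plan, which keeps
# Catalyst recursion (e.g. QueryPlan.transformUpWithNewOutput) shallow.
spark.sparkContext.setCheckpointDir("/tmp/spark-checkpoints")  # placeholder dir

df = build_complicated_df(spark)  # hypothetical helper that builds the big query
df = df.checkpoint(eager=True)
df.count()
{code}

An alternative to {{checkpoint()}} that avoids configuring a checkpoint directory is {{df.localCheckpoint(eager=True)}}, at the cost of keeping the materialized data on the executors.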


