Posted to issues@spark.apache.org by "Max Gekk (Jira)" <ji...@apache.org> on 2022/11/09 06:30:00 UTC

[jira] [Created] (SPARK-41072) Convert the internal error about failed stream to user-facing error

Max Gekk created SPARK-41072:
--------------------------------

             Summary: Convert the internal error about failed stream to user-facing error
                 Key: SPARK-41072
                 URL: https://issues.apache.org/jira/browse/SPARK-41072
             Project: Spark
          Issue Type: Sub-task
          Components: SQL
    Affects Versions: 3.4.0
            Reporter: Max Gekk


Assign an error class to the following internal error, since it is a user-facing error:

{code}
java.lang.Exception: org.apache.spark.sql.streaming.StreamingQueryException: Query cloudtrail_pipeline [id = 5a3758c3-3b3a-47ff-843a-23292cde3b4f, runId = c1a90694-daa2-4929-b749-82b8a43fa2b1] terminated with exception: [INTERNAL_ERROR] Execution of the stream cloudtrail_pipeline failed. Please, fill a bug report in, and provide the full stack trace.
	at org.apache.spark.sql.execution.streaming.StreamExecution.org$apache$spark$sql$execution$streaming$StreamExecution$$runStream(StreamExecution.scala:403)
	at org.apache.spark.sql.execution.streaming.StreamExecution$$anon$1.$anonfun$run$4(StreamExecution.scala:269)
	at scala.runtime.java8.JFunction0$mcV$sp.apply(JFunction0$mcV$sp.java:23)
	at com.databricks.unity.EmptyHandle$.runWithAndClose(UCSHandle.scala:42)
	at org.apache.spark.sql.execution.streaming.StreamExecution$$anon$1.run(StreamExecution.scala:269)
Caused by: java.lang.Exception: org.apache.spark.SparkException: [INTERNAL_ERROR] Execution of the stream cloudtrail_pipeline failed. Please, fill a bug report in, and provide the full stack trace.
	at 
{code}
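Concretely, the task is to replace the generic [INTERNAL_ERROR] wrapper with a dedicated, documented error class, so the user sees a stable error identifier instead of a request to file a bug. A minimal sketch of the idea (the class name {{StreamFailedException}} and the error class {{STREAM_FAILED}} here are illustrative, not the exact Spark API):

```scala
// Sketch only: shows the shape of a user-facing error class replacing
// a generic internal error. Names are hypothetical, not Spark's actual API.
object ErrorClassSketch {

  // Carries a stable error class identifier instead of [INTERNAL_ERROR].
  final class StreamFailedException(
      val errorClass: String,
      val queryName: String,
      cause: Throwable)
    extends Exception(
      s"[$errorClass] Execution of the stream $queryName failed.", cause)

  // Wrap a stream failure in the user-facing error class.
  def wrapStreamFailure(queryName: String, cause: Throwable): StreamFailedException =
    new StreamFailedException("STREAM_FAILED", queryName, cause)

  def main(args: Array[String]): Unit = {
    val e = wrapStreamFailure("cloudtrail_pipeline", new RuntimeException("boom"))
    // Prints: [STREAM_FAILED] Execution of the stream cloudtrail_pipeline failed.
    println(e.getMessage)
  }
}
```

The point of the conversion is that the message no longer asks the user to file a bug report: the failure is expected behavior with a named, searchable error class.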




--
This message was sent by Atlassian Jira
(v8.20.10#820010)

---------------------------------------------------------------------
To unsubscribe, e-mail: issues-unsubscribe@spark.apache.org
For additional commands, e-mail: issues-help@spark.apache.org