Posted to issues@spark.apache.org by "Lantao Jin (Jira)" <ji...@apache.org> on 2019/09/29 06:35:00 UTC

[jira] [Updated] (SPARK-29283) Error message is hidden when query from JDBC, especially enabled adaptive execution

     [ https://issues.apache.org/jira/browse/SPARK-29283?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]

Lantao Jin updated SPARK-29283:
-------------------------------
    Summary: Error message is hidden when query from JDBC, especially enabled adaptive execution  (was: Error information is hidden when query from JDBC, especially enabled adaptive execution)

> Error message is hidden when query from JDBC, especially enabled adaptive execution
> -----------------------------------------------------------------------------------
>
>                 Key: SPARK-29283
>                 URL: https://issues.apache.org/jira/browse/SPARK-29283
>             Project: Spark
>          Issue Type: Bug
>          Components: SQL
>    Affects Versions: 2.4.4, 3.0.0
>            Reporter: Lantao Jin
>            Priority: Major
>
> When adaptive execution is enabled, Spark users connected via JDBC always get an adaptive execution error, whatever the underlying root cause is. This is very confusing, and we have to check the driver log to find out why.
> {code}
> 0: jdbc:hive2://localhost:10000> SELECT * FROM testData join testData2 ON key = v;
> SELECT * FROM testData join testData2 ON key = v;
> Error: Error running query: org.apache.spark.SparkException: Adaptive execution failed due to stage materialization failures. (state=,code=0)
> 0: jdbc:hive2://localhost:10000> 
> {code}
> For example, when a job queried from JDBC fails due to an HDFS missing block, the user still gets only the generic error message {{Adaptive execution failed due to stage materialization failures}}.
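The behavior described above is a general property of exception wrapping: if a client reports only the top-level message of a wrapped exception, the root cause is lost unless the cause chain is walked. The sketch below illustrates this in plain Scala (it is not Spark's actual code; `messageChain` and the sample block id are hypothetical, for illustration only):

```scala
// Minimal sketch (not Spark's actual code) of why the JDBC client sees only
// the generic message: the wrapper's message replaces the cause's message,
// and a client that does not walk the cause chain loses the root cause.
object ErrorWrapping {
  // Hypothetical helper: collect every message along the cause chain.
  def messageChain(t: Throwable): List[String] =
    Option(t).map(e => e.getMessage :: messageChain(e.getCause)).getOrElse(Nil)

  def main(args: Array[String]): Unit = {
    val rootCause = new RuntimeException("HDFS missing block (illustrative)")
    val wrapped = new RuntimeException(
      "Adaptive execution failed due to stage materialization failures",
      rootCause)

    // What a client that reports only the top-level message shows:
    println(wrapped.getMessage)

    // Walking the cause chain recovers the real failure:
    messageChain(wrapped).foreach(println)
  }
}
```

A fix along these lines would have the server either include the cause's message in the reported error or unwrap the chain before reporting.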



--
This message was sent by Atlassian Jira
(v8.3.4#803005)

---------------------------------------------------------------------
To unsubscribe, e-mail: issues-unsubscribe@spark.apache.org
For additional commands, e-mail: issues-help@spark.apache.org