Posted to issues@spark.apache.org by "zzzzming95 (Jira)" <ji...@apache.org> on 2023/04/30 07:12:00 UTC
[jira] [Created] (SPARK-43327) Trigger `committer.setupJob` before plan execute in `FileFormatWriter`
zzzzming95 created SPARK-43327:
----------------------------------
Summary: Trigger `committer.setupJob` before plan execute in `FileFormatWriter`
Key: SPARK-43327
URL: https://issues.apache.org/jira/browse/SPARK-43327
Project: Spark
Issue Type: Bug
Components: SQL
Affects Versions: 3.2.3
Reporter: zzzzming95
A previous jira resolved the case where `outputOrdering` might not take effect when AQE is enabled. However, because that fix materializes the AQE plan in advance (it triggers `getFinalPhysicalPlan`), `committer.setupJob(job)` may never execute when `AdaptiveSparkPlanExec#getFinalPhysicalPlan()` fails with an error.
Normally, plan materialization should only happen after `committer.setupJob(job)`.
This can ultimately result in the INSERT OVERWRITE target directory being deleted even though the job failed.
{code:scala}
import org.apache.hadoop.fs.{FileSystem, Path}
import org.apache.spark.sql.catalyst.TableIdentifier

sql("CREATE TABLE IF NOT EXISTS spark32_overwrite(amt1 int) STORED AS ORC")
sql("CREATE TABLE IF NOT EXISTS spark32_overwrite2(amt1 long) STORED AS ORC")
sql("INSERT OVERWRITE TABLE spark32_overwrite2 select 6000044164")
sql("set spark.sql.ansi.enabled=true")

val loc =
  spark.sessionState.catalog.getTableMetadata(TableIdentifier("spark32_overwrite")).location
val fs = FileSystem.get(loc, spark.sparkContext.hadoopConfiguration)
println("Location exists: " + fs.exists(new Path(loc)))
try {
  // 6000044164 overflows int, so with ANSI mode enabled the cast fails while
  // the AQE plan is being materialized, before committer.setupJob(job) runs.
  sql("INSERT OVERWRITE TABLE spark32_overwrite select amt1 from " +
    "(select cast(amt1 as int) as amt1 from spark32_overwrite2 distribute by amt1)")
} finally {
  // With the bug, the second check prints false: the directory was deleted.
  println("Location exists: " + fs.exists(new Path(loc)))
}
{code}
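The ordering problem can be sketched without Spark at all. The following minimal Scala mock (all names here — `MockCommitter`, `writeBuggy`, `writeFixed` — are hypothetical illustrations, not Spark's actual `FileCommitProtocol` API) shows why `setupJob` must run before the plan is materialized: if materialization throws first, the cleanup path runs against a job that was never set up.

```scala
object SetupJobOrdering {
  // Hypothetical stand-in for a commit protocol; tracks whether setup ran.
  class MockCommitter {
    var setupDone = false
    def setupJob(): Unit = setupDone = true
  }

  // Buggy ordering: plan materialization (which may throw, e.g. on an ANSI
  // cast overflow) happens before setupJob, so setupJob is never reached.
  def writeBuggy(committer: MockCommitter, materialize: () => Unit): Unit = {
    materialize()
    committer.setupJob()
  }

  // Fixed ordering: setupJob runs first, so even if materialization fails
  // the committer is in a consistent state for cleanup.
  def writeFixed(committer: MockCommitter, materialize: () => Unit): Unit = {
    committer.setupJob()
    materialize()
  }

  def main(args: Array[String]): Unit = {
    val failing = () => throw new RuntimeException("ANSI cast overflow")

    val c1 = new MockCommitter
    try writeBuggy(c1, failing) catch { case _: RuntimeException => () }
    println(s"buggy: setupJob ran = ${c1.setupDone}")

    val c2 = new MockCommitter
    try writeFixed(c2, failing) catch { case _: RuntimeException => () }
    println(s"fixed: setupJob ran = ${c2.setupDone}")
  }
}
```

In the buggy ordering `setupDone` stays false after the failure, which mirrors how `FileFormatWriter` can reach its abort/cleanup path for a job that `committer.setupJob(job)` never initialized.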
--
This message was sent by Atlassian Jira
(v8.20.10#820010)