Posted to issues@spark.apache.org by "sinlang (Jira)" <ji...@apache.org> on 2022/12/22 08:54:00 UTC

[jira] [Created] (SPARK-41684) Spark 3 fails when reading one partition of a table and writing to another partition of the same table

sinlang created SPARK-41684:
-------------------------------

             Summary: Spark 3 fails when reading one partition of a table and writing to another partition of the same table
                 Key: SPARK-41684
                 URL: https://issues.apache.org/jira/browse/SPARK-41684
             Project: Spark
          Issue Type: Bug
          Components: Spark Core
    Affects Versions: 3.2.2
            Reporter: sinlang


Spark 3 fails when reading one partition of a table and writing (INSERT OVERWRITE) to another partition of the same table.
{code:sql}
-- 1. Create the temporary view t1:
CREATE TEMPORARY VIEW t1 AS
SELECT * FROM jt_ods.ods_ebi_stm_retail_settle_detail_full_di
WHERE dt = '2022-12-21'
UNION ALL (
  SELECT * FROM jt_ods.ods_ebi_stm_retail_settle_detail_full_df AS i
  WHERE i.dt = '2022-12-20'
    AND NOT EXISTS (
      SELECT 1 FROM jt_ods.ods_ebi_stm_retail_settle_detail_full_di AS d
      WHERE d.dt = '2022-12-21' AND i.id = d.id));

-- 2. Insert the data:
INSERT OVERWRITE TABLE jt_ods.ods_ebi_stm_retail_settle_detail_full_df PARTITION (dt = '2022-12-21')
SELECT * FROM t1 DISTRIBUTE BY rand(); {code}
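
Stripped of the business-specific table names, the failing pattern is simply: the query reads one partition of a table while the INSERT OVERWRITE targets a different partition of that same table. Below is a minimal sketch of that pattern; it assumes a Hive-enabled SparkSession, and the database/table names (db.target_df, db.delta_di) are placeholders, not the tables from this report.
{code:scala}
import org.apache.spark.sql.SparkSession

// Minimal sketch of the reported pattern, with placeholder table names.
// db.target_df and db.delta_di are assumed to be Hive tables partitioned by dt.
object OverwriteWhileReadingSameTable {
  def main(args: Array[String]): Unit = {
    val spark = SparkSession.builder()
      .appName("spark-41684-repro-sketch")
      .enableHiveSupport()
      .getOrCreate()

    // View that reads yesterday's partition of the target table plus today's delta.
    spark.sql(
      """CREATE OR REPLACE TEMPORARY VIEW t1 AS
        |SELECT * FROM db.delta_di WHERE dt = '2022-12-21'
        |UNION ALL
        |SELECT * FROM db.target_df AS i
        |WHERE i.dt = '2022-12-20'
        |  AND NOT EXISTS (SELECT 1 FROM db.delta_di AS d
        |                  WHERE d.dt = '2022-12-21' AND i.id = d.id)""".stripMargin)

    // Per the report, overwriting a *different* partition of db.target_df still
    // hits "Cannot overwrite a path that is also being read from" on 3.2.2.
    spark.sql(
      """INSERT OVERWRITE TABLE db.target_df PARTITION (dt = '2022-12-21')
        |SELECT * FROM t1 DISTRIBUTE BY rand()""".stripMargin)

    spark.stop()
  }
}
{code}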
{code:java}
2022-12-22 16:29:48 Driver ERROR org.apache.spark.deploy.yarn.ApplicationMaster 
 User class threw exception: org.apache.spark.sql.AnalysisException: Cannot overwrite a path that is also being read from.
org.apache.spark.sql.AnalysisException: Cannot overwrite a path that is also being read from.
    at org.apache.spark.sql.errors.QueryCompilationErrors$.cannotOverwritePathBeingReadFromError(QueryCompilationErrors.scala:1834)
    at org.apache.spark.sql.execution.command.DDLUtils$.verifyNotReadPath(ddl.scala:980)
    at org.apache.spark.sql.execution.datasources.DataSourceAnalysis$$anonfun$apply$1.applyOrElse(DataSourceStrategy.scala:221) {code}
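
Not a fix for the reported behavior, but one commonly suggested way to sidestep the analysis check is to materialize the intermediate result into a staging table and insert from that, so the read side of the overwrite no longer touches the target table's path. The sketch below continues the placeholder example above (same spark session and view t1); the staging table name db.target_df_stage is illustrative only.
{code:scala}
// Hedged workaround sketch: break the read/write overlap on the target table's
// path by materializing the view into a staging table first.
spark.sql("DROP TABLE IF EXISTS db.target_df_stage")
spark.sql("CREATE TABLE db.target_df_stage AS SELECT * FROM t1")

// The overwrite now only reads the staging table, so the read side no longer
// includes the path being overwritten.
spark.sql(
  """INSERT OVERWRITE TABLE db.target_df PARTITION (dt = '2022-12-21')
    |SELECT * FROM db.target_df_stage DISTRIBUTE BY rand()""".stripMargin)
{code}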


