Posted to issues@spark.apache.org by "miaowang (Jira)" <ji...@apache.org> on 2023/03/28 10:20:00 UTC

[jira] [Created] (SPARK-42948) Execution plan error, unable to obtain desired results

miaowang created SPARK-42948:
--------------------------------

             Summary: Execution plan error, unable to obtain desired results
                 Key: SPARK-42948
                 URL: https://issues.apache.org/jira/browse/SPARK-42948
             Project: Spark
          Issue Type: Bug
          Components: SQL
    Affects Versions: 3.2.0
         Environment: !image-2023-03-28-18-15-55-189.png!

!image-2023-03-28-18-17-08-017.png!

!image-2023-03-28-18-18-41-754.png!
            Reporter: miaowang


The job is packaged as a jar that uses SparkSession to submit the Spark SQL passed in as an argument:
{code:java}
val spark = SparkSession.builder()
  .appName(args(0))
  .config("spark.sql.crossJoin.enabled", true)
  .enableHiveSupport()
  .getOrCreate()

spark.sql(args(1))
{code}
Execute the following SQL fragment:
{code:java}
INSERT INTO gjdw.aa PARTITION(dt='20230327')
SELECT t1.mandt,
       t1.pur_no,
       t1.pur_item,
       t1.pur_comp_code,
       t1.pur_pur_org,
       t1.zzcoca,
       t1.zzycgdd
FROM
  (SELECT *
   FROM gjdw.aa
   WHERE dt=from_unixtime(unix_timestamp(date_add(from_unixtime(unix_timestamp('20230327','yyyymmdd'),'yyyy-mm-dd'),-1),'yyyy-mm-dd'),'yyyymmdd')) t1
LEFT JOIN
  (SELECT *
   FROM gjdw.aa
   WHERE dt='20230327') t ON t.pur_no = t1.pur_no
AND t.pur_item = t1.pur_item
WHERE (t.pur_no = ''
       AND t.pur_item = ''
       OR (t.pur_no IS NULL
           AND t.pur_item IS NULL)) {code}
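As an aside, the dt filter builds the previous day's partition key with the pattern 'yyyymmdd', where lowercase mm means minute-of-hour, not month-of-year (MM). Depending on spark.sql.legacy.timeParserPolicy, this can silently produce a wrong date or a NULL, either of which would leave t1 empty. A minimal sketch of the legacy (SimpleDateFormat) behavior, outside Spark:
{code:java}
import java.text.SimpleDateFormat

object PatternCheck {
  def main(args: Array[String]): Unit = {
    val out = new SimpleDateFormat("yyyy-MM-dd")
    // 'mm' is minute-of-hour, so the month field is never set and
    // defaults to January when '20230327' is parsed with this pattern:
    val wrong = new SimpleDateFormat("yyyymmdd").parse("20230327")
    // 'MM' is month-of-year, giving the intended date:
    val right = new SimpleDateFormat("yyyyMMdd").parse("20230327")
    println(out.format(wrong))  // 2023-01-27
    println(out.format(right))  // 2023-03-27
  }
}
{code}
If the filter resolves to a date no partition has (or to NULL), t1 is empty and the insert writes nothing, with no error reported.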
 

Strangely, I did not get the expected result: the source table contains data, so rows should have been inserted, yet the job produced no output and reported no task errors. The problem shows up in the execution plan:

!image-2023-03-28-18-15-07-115.png!
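For reference, the LEFT JOIN plus NULL filter above is the usual left-anti-join pattern: it keeps only the previous-day rows with no match in the current-day partition. A plain-Scala sketch of that filtering logic (covering only the IS NULL branch of the WHERE clause, with simplified illustrative rows rather than the real gjdw.aa schema):
{code:java}
// Hypothetical, simplified rows: only the two join keys are modeled.
case class Row(purNo: String, purItem: String)

object AntiJoinSketch {
  def main(args: Array[String]): Unit = {
    val t1 = Seq(Row("P1", "10"), Row("P2", "20")) // previous day's partition
    val t  = Seq(Row("P1", "10"))                  // dt = '20230327'

    // Equivalent of: LEFT JOIN t ON the keys, then keep rows whose
    // right side is NULL -- an anti join on (pur_no, pur_item).
    val unmatched = t1.filterNot { l =>
      t.exists(r => r.purNo == l.purNo && r.purItem == l.purItem)
    }
    unmatched.foreach(println)  // Row(P2,20)
  }
}
{code}
Spark SQL can also express this directly with LEFT ANTI JOIN, which makes the intent explicit and may be worth trying as a workaround while the plan issue is investigated.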



--
This message was sent by Atlassian Jira
(v8.20.10#820010)

---------------------------------------------------------------------
To unsubscribe, e-mail: issues-unsubscribe@spark.apache.org
For additional commands, e-mail: issues-help@spark.apache.org