Posted to issues@spark.apache.org by "cheng (JIRA)" <ji...@apache.org> on 2018/06/30 13:42:00 UTC
[jira] [Issue Comment Deleted] (SPARK-24705)
Self-join query fails when spark.sql.adaptive.enabled=true is enabled
[ https://issues.apache.org/jira/browse/SPARK-24705?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]
cheng updated SPARK-24705:
--------------------------
Comment: was deleted
(was: For example, my query:
The device_loc table comes from a JDBC data source.
select tv_a.imei
from ( select a.imei, a.speed from device_loc a ) tv_a
inner join ( select a.imei, a.speed from device_loc a ) tv_b on tv_a.imei = tv_b.imei
group by tv_a.imei
When "cache table device_loc" is executed before this query, everything is fine. However, if the table is not cached first, unexpected results occur and the query fails to execute.
)
> Self-join query fails when spark.sql.adaptive.enabled=true is enabled
> ---------------------------------------------------------------------
>
> Key: SPARK-24705
> URL: https://issues.apache.org/jira/browse/SPARK-24705
> Project: Spark
> Issue Type: Bug
> Components: SQL
> Affects Versions: 2.2.1, 2.3.1
> Reporter: cheng
> Priority: Minor
>
> When data is loaded over JDBC and spark.sql.adaptive.enabled=true is set (for example, when loading a table tableA), the following query can produce unexpected results.
> For example, my query:
> The device_loc table comes from a JDBC data source.
> select tv_a.imei
> from ( select a.imei,a.speed from device_loc a) tv_a
> inner join ( select a.imei,a.speed from device_loc a ) tv_b on tv_a.imei = tv_b.imei
> group by tv_a.imei
> When "cache table device_loc" is executed before this query, everything is fine. However, if the table is not cached first, unexpected results occur and the query fails to execute.
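
The reproduction described above can be sketched as a spark-sql session. This is an illustrative sketch only: the SET statement and the commented-out CACHE TABLE line are assumptions added for clarity, and device_loc is assumed to be already registered as a JDBC-backed table, as in the report.

```sql
-- Enable adaptive query execution, the setting the report identifies
-- as triggering the failure.
SET spark.sql.adaptive.enabled=true;

-- Per the report, running this first makes the query below succeed;
-- without it, the self-join fails to execute.
-- CACHE TABLE device_loc;

-- The self-join from the report: device_loc joined with itself on imei.
SELECT tv_a.imei
FROM   ( SELECT a.imei, a.speed FROM device_loc a ) tv_a
INNER JOIN ( SELECT a.imei, a.speed FROM device_loc a ) tv_b
       ON tv_a.imei = tv_b.imei
GROUP BY tv_a.imei;
```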
--
This message was sent by Atlassian JIRA
(v7.6.3#76005)
---------------------------------------------------------------------
To unsubscribe, e-mail: issues-unsubscribe@spark.apache.org
For additional commands, e-mail: issues-help@spark.apache.org