Posted to issues@spark.apache.org by "Damien Carol (JIRA)" <ji...@apache.org> on 2014/12/09 09:11:12 UTC
[jira] [Created] (SPARK-4794) Wrong parse of GROUP BY query
Damien Carol created SPARK-4794:
-----------------------------------
Summary: Wrong parse of GROUP BY query
Key: SPARK-4794
URL: https://issues.apache.org/jira/browse/SPARK-4794
Project: Spark
Issue Type: Bug
Components: SQL
Affects Versions: 1.2.0
Reporter: Damien Carol
Spark is not able to parse this query:
{code:sql}
select
    `cf_encaissement_fact_pq`.`annee` as `Annee`,
    `cf_encaissement_fact_pq`.`mois` as `Mois`,
    `cf_encaissement_fact_pq`.`jour` as `Jour`,
    `cf_encaissement_fact_pq`.`heure` as `Heure`,
    `cf_encaissement_fact_pq`.`nom_societe` as `Societe`,
    `cf_encaissement_fact_pq`.`id_magasin` as `Magasin`,
    `cf_encaissement_fact_pq`.`CarteFidelitePresentee` as `CF_Presentee`,
    `cf_encaissement_fact_pq`.`CompteCarteFidelite` as `CompteCarteFidelite`,
    `cf_encaissement_fact_pq`.`NbCompteCarteFidelite` as `NbCompteCarteFidelite`,
    `cf_encaissement_fact_pq`.`DetentionCF` as `DetentionCF`,
    `cf_encaissement_fact_pq`.`NbCarteFidelite` as `NbCarteFidelite`,
    `cf_encaissement_fact_pq`.`Id_CF_Dim_DUCB` as `Plage_DUCB`,
    `cf_encaissement_fact_pq`.`NbCheque` as `NbCheque`,
    `cf_encaissement_fact_pq`.`CACheque` as `CACheque`,
    `cf_encaissement_fact_pq`.`NbImpaye` as `NbImpaye`,
    `cf_encaissement_fact_pq`.`Id_Ensemble` as `NbEnsemble`,
    `cf_encaissement_fact_pq`.`ZIBZIN` as `NbCompte`,
    `cf_encaissement_fact_pq`.`ResteDuImpaye` as `ResteDuImpaye`
from
    `testsimon3`.`cf_encaissement_fact_pq` as `cf_encaissement_fact_pq`
where
    `cf_encaissement_fact_pq`.`annee` = 2013
    and `cf_encaissement_fact_pq`.`mois` = 7
    and `cf_encaissement_fact_pq`.`jour` = 12
order by
    `cf_encaissement_fact_pq`.`annee` ASC,
    `cf_encaissement_fact_pq`.`mois` ASC,
    `cf_encaissement_fact_pq`.`jour` ASC,
    `cf_encaissement_fact_pq`.`heure` ASC,
    `cf_encaissement_fact_pq`.`nom_societe` ASC,
    `cf_encaissement_fact_pq`.`id_magasin` ASC,
    `cf_encaissement_fact_pq`.`CarteFidelitePresentee` ASC,
    `cf_encaissement_fact_pq`.`CompteCarteFidelite` ASC,
    `cf_encaissement_fact_pq`.`NbCompteCarteFidelite` ASC,
    `cf_encaissement_fact_pq`.`DetentionCF` ASC,
    `cf_encaissement_fact_pq`.`NbCarteFidelite` ASC,
    `cf_encaissement_fact_pq`.`Id_CF_Dim_DUCB` ASC
{code}
If I remove the table qualifier from the columns in the ORDER BY clause, Spark can parse the query:
{code:sql}
select
    `cf_encaissement_fact_pq`.`annee` as `Annee`,
    `cf_encaissement_fact_pq`.`mois` as `Mois`,
    `cf_encaissement_fact_pq`.`jour` as `Jour`,
    `cf_encaissement_fact_pq`.`heure` as `Heure`,
    `cf_encaissement_fact_pq`.`nom_societe` as `Societe`,
    `cf_encaissement_fact_pq`.`id_magasin` as `Magasin`,
    `cf_encaissement_fact_pq`.`CarteFidelitePresentee` as `CFPresentee`,
    `cf_encaissement_fact_pq`.`CompteCarteFidelite` as `CompteCarteFidelite`,
    `cf_encaissement_fact_pq`.`NbCompteCarteFidelite` as `NbCompteCarteFidelite`,
    `cf_encaissement_fact_pq`.`DetentionCF` as `DetentionCF`,
    `cf_encaissement_fact_pq`.`NbCarteFidelite` as `NbCarteFidelite`,
    `cf_encaissement_fact_pq`.`Id_CF_Dim_DUCB` as `PlageDUCB`,
    `cf_encaissement_fact_pq`.`NbCheque` as `NbCheque`,
    `cf_encaissement_fact_pq`.`CACheque` as `CACheque`,
    `cf_encaissement_fact_pq`.`NbImpaye` as `NbImpaye`,
    `cf_encaissement_fact_pq`.`Id_Ensemble` as `NbEnsemble`,
    `cf_encaissement_fact_pq`.`ZIBZIN` as `NbCompte`,
    `cf_encaissement_fact_pq`.`ResteDuImpaye` as `ResteDuImpaye`
from
    `testsimon3`.`cf_encaissement_fact_pq` as `cf_encaissement_fact_pq`
where
    `cf_encaissement_fact_pq`.`annee` = 2013
    and `cf_encaissement_fact_pq`.`mois` = 7
    and `cf_encaissement_fact_pq`.`jour` = 12
order by
    `annee` ASC,
    `mois` ASC,
    `jour` ASC,
    `heure` ASC,
    `nom_societe` ASC,
    `id_magasin` ASC,
    `CarteFidelitePresentee` ASC,
    `CompteCarteFidelite` ASC,
    `NbCompteCarteFidelite` ASC,
    `DetentionCF` ASC,
    `NbCarteFidelite` ASC,
    `Id_CF_Dim_DUCB` ASC
{code}
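For triage, the failure can probably be distilled to a much smaller pair of queries (hypothetical table `db`.`t` and column `a`; this is a sketch of the pattern, not a query from the affected dataset). The only difference between the failing and working forms is whether the ORDER BY column carries a table qualifier:

{code:sql}
-- Expected to fail to parse on Spark SQL 1.2.0: table-qualified column in ORDER BY
select `t`.`a` as `A`
from `db`.`t` as `t`
where `t`.`a` = 2013
order by `t`.`a` ASC

-- Expected to parse: same query, qualifier dropped in ORDER BY
select `t`.`a` as `A`
from `db`.`t` as `t`
where `t`.`a` = 2013
order by `a` ASC
{code}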
--
This message was sent by Atlassian JIRA
(v6.3.4#6332)