Posted to issues@spark.apache.org by "U Shaw (JIRA)" <ji...@apache.org> on 2019/06/12 04:10:00 UTC

[jira] [Created] (SPARK-28011) SQL parse error when there are too many aliases in the table

U Shaw created SPARK-28011:
------------------------------

             Summary: SQL parse error when there are too many aliases in the table
                 Key: SPARK-28011
                 URL: https://issues.apache.org/jira/browse/SPARK-28011
             Project: Spark
          Issue Type: Bug
          Components: SQL
    Affects Versions: 2.1.1
            Reporter: U Shaw


An SQL syntax error is reported when the following statement is executed. The statement left-joins menu_item_categories_tmp to itself five times (aliases t1 through t6); its leading and trailing parts are elided below.

......
FROM
	menu_item_categories_tmp t1
	LEFT JOIN menu_item_categories_tmp t2 ON t1.icat_id = t2.icat_parent_icat_id 
	AND t1.tenant_id = t2.tenant_id 
	AND t2.icat_status != 'd'
	LEFT JOIN menu_item_categories_tmp t3 ON t2.icat_id = t3.icat_parent_icat_id 
	AND t2.tenant_id = t3.tenant_id 
	AND t3.icat_status != 'd'
	LEFT JOIN menu_item_categories_tmp t4 ON t3.icat_id = t4.icat_parent_icat_id 
	AND t3.tenant_id = t4.tenant_id 
	AND t4.icat_status != 'd'
	LEFT JOIN menu_item_categories_tmp t5 ON t4.icat_id = t5.icat_parent_icat_id 
	AND t4.tenant_id = t5.tenant_id 
	AND t5.icat_status != 'd'
	LEFT JOIN menu_item_categories_tmp t6 ON t5.icat_id = t6.icat_parent_icat_id 
	AND t5.tenant_id = t6.tenant_id 
	AND t6.icat_status != 'd' 
WHERE
	t1.icat_parent_icat_id = '0' 
	AND t1.icat_status != 'd' 
	) SELECT DISTINCT
	tenant_id AS tenant_id,
	type AS type,
	CASE WHEN t2.num >= 1 THEN level0 ELSE NULL END AS level0,
	CASE WHEN t2.num >= 2 THEN level1 ELSE NULL END AS level1,
	CASE WHEN t2.num >= 3 THEN level2 ELSE NULL END AS level2,
	CASE
......
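
For anyone trying to reproduce this, below is a minimal sketch (my own, not part of the report above; the schema, sample rows, and column types are assumptions inferred from the snippet). It registers a small temp view shaped like menu_item_categories_tmp and submits the same chain of self LEFT JOINs through spark.sql. The join chain is generated programmatically, so the number of aliases can be increased to probe the point at which the reported parse error appears.

    // Hypothetical minimal reproduction sketch; not from the original report.
    // Schema and sample rows model only the columns the query references.
    import org.apache.spark.sql.SparkSession

    object ManyAliasesRepro {
      def main(args: Array[String]): Unit = {
        val spark = SparkSession.builder()
          .appName("SPARK-28011-repro")
          .master("local[*]")
          .getOrCreate()
        import spark.implicits._

        // Hypothetical rows shaped like menu_item_categories_tmp.
        Seq((1L, 0L, "a", 1L), (2L, 1L, "a", 1L), (3L, 2L, "a", 1L))
          .toDF("icat_id", "icat_parent_icat_id", "icat_status", "tenant_id")
          .createOrReplaceTempView("menu_item_categories_tmp")

        // Generate the LEFT JOIN chain for aliases t2..t6, mirroring the report.
        val joins = (2 to 6).map { i =>
          s"""LEFT JOIN menu_item_categories_tmp t$i
             |  ON t${i - 1}.icat_id = t$i.icat_parent_icat_id
             |  AND t${i - 1}.tenant_id = t$i.tenant_id
             |  AND t$i.icat_status != 'd'""".stripMargin
        }.mkString("\n")

        val sql =
          s"""SELECT t1.icat_id, t1.tenant_id
             |FROM menu_item_categories_tmp t1
             |$joins
             |WHERE t1.icat_parent_icat_id = 0
             |  AND t1.icat_status != 'd'""".stripMargin

        spark.sql(sql).show()  // parsing and analysis happen here

        spark.stop()
      }
    }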
		


