Posted to issues@spark.apache.org by "Hyukjin Kwon (JIRA)" <ji...@apache.org> on 2016/11/03 00:50:58 UTC
[jira] [Closed] (SPARK-14195) Error: org.apache.spark.sql.AnalysisException: cannot recognize input near 'select' 'MAC' 'from' in expression specification; line 1 pos 16 (state=,code=0)
[ https://issues.apache.org/jira/browse/SPARK-14195?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]
Hyukjin Kwon closed SPARK-14195.
--------------------------------
Resolution: Cannot Reproduce
I am closing this as Cannot Reproduce, as I can't reproduce it against the current master.
{code}
scala> spark.sql("select a.imei, (select MAC from testolap where imei like '%AA%'), MAC from testolap a limit 10").show()
+------+----------------+---+
| imei|scalarsubquery()|MAC|
+------+----------------+---+
|12AA34| 123|123|
+------+----------------+---+
{code}
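The repro above relies on a scalar subquery in the SELECT list, which older Spark versions (such as the reported 1.5.1) could not parse. For readers who want to try the same query shape without a Spark cluster, here is a minimal standalone sketch using Python's sqlite3; the table name {{testolap}} and columns {{imei}}/{{MAC}} are taken from the report, but the row data is invented for illustration.

{code}
import sqlite3

# In-memory table mirroring the "testolap" table from the report
# (column names from the issue; the single row of data is made up).
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE testolap (imei TEXT, MAC TEXT)")
conn.execute("INSERT INTO testolap VALUES ('12AA34', '123')")

# The same query shape: a scalar subquery used as an expression
# in the SELECT list, alongside ordinary column references.
rows = conn.execute(
    "SELECT a.imei, "
    "(SELECT MAC FROM testolap WHERE imei LIKE '%AA%'), "
    "MAC FROM testolap a LIMIT 10"
).fetchall()
print(rows)  # [('12AA34', '123', '123')]
{code}

This matches the output shown in the Spark shell above: the scalar subquery returns the single matching MAC value for every outer row.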
> Error: org.apache.spark.sql.AnalysisException: cannot recognize input near 'select' 'MAC' 'from' in expression specification; line 1 pos 16 (state=,code=0)
> -----------------------------------------------------------------------------------------------------------------------------------------------------------
>
> Key: SPARK-14195
> URL: https://issues.apache.org/jira/browse/SPARK-14195
> Project: Spark
> Issue Type: Bug
> Components: SQL
> Affects Versions: 1.5.1
> Environment: SUSE 11
> Reporter: Chetan Bhat
> Priority: Minor
> Original Estimate: 0.05h
> Remaining Estimate: 0.05h
>
> In Beeline when the SQL is executed using Spark the following error is displayed.
> select a.imei, (select MAC from testolap where imei like '%AA%' 1) MAC from testolap a limit 10;
> Error: org.apache.spark.sql.AnalysisException: cannot recognize input near 'select' 'MAC' 'from' in expression specification; line 1 pos 16 (state=,code=0)
--
This message was sent by Atlassian JIRA
(v6.3.4#6332)