Posted to issues@spark.apache.org by "Herman van Hovell (JIRA)" <ji...@apache.org> on 2016/03/28 15:09:25 UTC
[jira] [Comment Edited] (SPARK-14195) Error: org.apache.spark.sql.AnalysisException: cannot recognize input near 'select' 'MAC' 'from' in expression specification; line 1 pos 16 (state=,code=0)
[ https://issues.apache.org/jira/browse/SPARK-14195?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=15214167#comment-15214167 ]
Herman van Hovell edited comment on SPARK-14195 at 3/28/16 1:08 PM:
--------------------------------------------------------------------
Uncorrelated subqueries are not supported in Spark 1.x. They were added in 2.x. Could you try this with the latest master?
> Error: org.apache.spark.sql.AnalysisException: cannot recognize input near 'select' 'MAC' 'from' in expression specification; line 1 pos 16 (state=,code=0)
> -----------------------------------------------------------------------------------------------------------------------------------------------------------
>
> Key: SPARK-14195
> URL: https://issues.apache.org/jira/browse/SPARK-14195
> Project: Spark
> Issue Type: Bug
> Components: SQL
> Affects Versions: 1.5.1
> Environment: SUSE 11
> Reporter: Chetan Bhat
> Priority: Minor
> Original Estimate: 0.05h
> Remaining Estimate: 0.05h
>
> When the following SQL is executed in Beeline against Spark, this error is displayed.
> select a.imei, (select MAC from testolap where imei like '%AA%' 1) MAC from testolap a limit 10;
> Error: org.apache.spark.sql.AnalysisException: cannot recognize input near 'select' 'MAC' 'from' in expression specification; line 1 pos 16 (state=,code=0)
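For context, the query shape that trips the Spark 1.x parser is a scalar (uncorrelated) subquery in the SELECT list. A minimal sketch of that shape, using sqlite3 rather than Spark purely to illustrate the syntax, and assuming the stray `1` in the reported query was intended as `LIMIT 1`:

```python
# Illustration only: the SQL shape rejected by Spark 1.x but accepted in 2.x --
# a scalar (uncorrelated) subquery in the SELECT list. Table and column names
# (testolap, imei, MAC) follow the reporter's query; data is made up.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE testolap (imei TEXT, MAC TEXT)")
conn.executemany("INSERT INTO testolap VALUES (?, ?)",
                 [("X-AA-1", "00:11"), ("X-BB-2", "00:22")])

# The subquery is uncorrelated: it does not reference the outer row, so it
# evaluates to the same single value for every row of the outer query.
rows = conn.execute("""
    SELECT a.imei,
           (SELECT MAC FROM testolap WHERE imei LIKE '%AA%' LIMIT 1) AS MAC
    FROM testolap a
    LIMIT 10
""").fetchall()
print(rows)
```

Because the subquery is independent of the outer row, every output row carries the same MAC value; this is the construct that Spark only learned to parse in the 2.x line.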
--
This message was sent by Atlassian JIRA
(v6.3.4#6332)
---------------------------------------------------------------------
To unsubscribe, e-mail: issues-unsubscribe@spark.apache.org
For additional commands, e-mail: issues-help@spark.apache.org