Posted to issues@spark.apache.org by "Ravindra Pesala (JIRA)" <ji...@apache.org> on 2014/10/30 15:14:34 UTC
[jira] [Created] (SPARK-4154) Query does not work if it has "not between" in Spark SQL and HQL
Ravindra Pesala created SPARK-4154:
--------------------------------------
Summary: Query does not work if it has "not between" in Spark SQL and HQL
Key: SPARK-4154
URL: https://issues.apache.org/jira/browse/SPARK-4154
Project: Spark
Issue Type: Bug
Components: SQL
Affects Versions: 1.1.0
Reporter: Ravindra Pesala
A query that contains "not between" does not work.
{code}
SELECT * FROM src where key not between 10 and 20
{code}
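For reference, a minimal way to reproduce this from Scala (a sketch assuming a local Spark 1.1.0 build with Hive support and the standard Hive `src` test table, which has an integer `key` column):
{code}
import org.apache.spark.{SparkConf, SparkContext}
import org.apache.spark.sql.hive.HiveContext

val conf = new SparkConf().setMaster("local").setAppName("SPARK-4154-repro")
val sc = new SparkContext(conf)
val hiveContext = new HiveContext(sc)

// Fails while parsing: HiveQl has no rule for the KW_TRUE node that Hive
// emits for the negated form of BETWEEN.
hiveContext.sql("SELECT * FROM src where key not between 10 and 20").collect()
{code}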
Running the query gives the following error:
{code}
Exception in thread "main" java.lang.RuntimeException:
Unsupported language features in query: SELECT * FROM src where key not between 10 and 20
TOK_QUERY
  TOK_FROM
    TOK_TABREF
      TOK_TABNAME
        src
  TOK_INSERT
    TOK_DESTINATION
      TOK_DIR
        TOK_TMP_FILE
    TOK_SELECT
      TOK_SELEXPR
        TOK_ALLCOLREF
    TOK_WHERE
      TOK_FUNCTION
        between
        KW_TRUE
        TOK_TABLE_OR_COL
          key
        10
        20
scala.NotImplementedError: No parse rules for ASTNode type: 256, text: KW_TRUE :
KW_TRUE
" +
org.apache.spark.sql.hive.HiveQl$.nodeToExpr(HiveQl.scala:1088)
at scala.sys.package$.error(package.scala:27)
at org.apache.spark.sql.hive.HiveQl$.createPlan(HiveQl.scala:251)
at org.apache.spark.sql.hive.ExtendedHiveQlParser$$anonfun$hiveQl$1.apply(ExtendedHiveQlParser.scala:50)
at org.apache.spark.sql.hive.ExtendedHiveQlParser$$anonfun$hiveQl$1.apply(ExtendedHiveQlParser.scala:49)
at scala.util.parsing.combinator.Parsers$Success.map(Parsers.scala:136)
at scala.util.parsing.combinator.Parsers$Success.map(Parsers.scala:135)
at scala.util.parsing.combinator.Parsers$Parser$$anonfun$map$1.apply(Parsers.scala:242)
at scala.util.parsing.combinator.Parsers$Parser$$anonfun$map$1.apply(Parsers.scala:242)
at scala.util.parsing.combinator.Parsers$$anon$3.apply(Parsers.scala:222)
{code}
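Until the parser supports the negated form, one workaround is to rewrite the predicate using comparisons that already parse; the two forms are semantically equivalent (including for a NULL `key`, where both predicates evaluate to unknown and the row is filtered out):
{code}
// Semantically equivalent rewrite of `key not between 10 and 20`:
hiveContext.sql("SELECT * FROM src where key < 10 OR key > 20").collect()
{code}
The AST dump also points at the cause: Hive encodes NOT BETWEEN as a `between` TOK_FUNCTION whose first child is KW_TRUE (it is KW_FALSE for a plain BETWEEN), and nodeToExpr in HiveQl.scala apparently only matches the KW_FALSE case, so the KW_TRUE node falls through to the generic function handler and fails. A hedged sketch of a parse rule covering both cases, written in the style of the existing rules (the exact pattern shape in the real code may differ; And, Not, GreaterThanOrEqual and LessThanOrEqual are Catalyst expression classes):
{code}
case Token("TOK_FUNCTION",
       Token(BETWEEN(), Nil) :: kw :: target :: minValue :: maxValue :: Nil) =>
  val targetExpression = nodeToExpr(target)
  val betweenExpr = And(
    GreaterThanOrEqual(targetExpression, nodeToExpr(minValue)),
    LessThanOrEqual(targetExpression, nodeToExpr(maxValue)))
  kw match {
    case Token("KW_FALSE", Nil) => betweenExpr       // key between 10 and 20
    case Token("KW_TRUE", Nil)  => Not(betweenExpr)  // key not between 10 and 20
  }
{code}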