Posted to issues@spark.apache.org by "wangwei (JIRA)" <ji...@apache.org> on 2015/08/25 15:46:46 UTC

[jira] [Updated] (SPARK-10226) Error occurred in SparkSQL when using !=

     [ https://issues.apache.org/jira/browse/SPARK-10226?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]

wangwei updated SPARK-10226:
----------------------------
    Description: 
DataSource:
    src/main/resources/kv1.txt

SQL:
    1. create table src(id string, name string);
    2. load data local inpath '${SparkHome}/examples/src/main/resources/kv1.txt' into table src;
    3. select count(*) from src where id != '0';

[ERROR] Could not expand event
java.lang.IllegalArgumentException: != 0;: event not found
	at jline.console.ConsoleReader.expandEvents(ConsoleReader.java:779)
	at jline.console.ConsoleReader.finishBuffer(ConsoleReader.java:631)
	at jline.console.ConsoleReader.accept(ConsoleReader.java:2019)
	at jline.console.ConsoleReader.readLine(ConsoleReader.java:2666)
	at jline.console.ConsoleReader.readLine(ConsoleReader.java:2269)
	at org.apache.spark.sql.hive.thriftserver.SparkSQLCLIDriver$.main(SparkSQLCLIDriver.scala:231)
	at org.apache.spark.sql.hive.thriftserver.SparkSQLCLIDriver.main(SparkSQLCLIDriver.scala)
	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
	at java.lang.reflect.Method.invoke(Method.java:601)
	at org.apache.spark.deploy.SparkSubmit$.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:666)
	at org.apache.spark.deploy.SparkSubmit$.doRunMain$1(SparkSubmit.scala:178)
	at org.apache.spark.deploy.SparkSubmit$.submit(SparkSubmit.scala:203)
	at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:118)
	at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)
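Note: the stack trace shows the failure inside jline2's ConsoleReader.expandEvents, i.e. before the statement ever reaches the SQL parser. The CLI's line reader performs bash-style history ("event") expansion by default, so the "!" in "!=" is taken as an event designator and "!= 0;" is rejected as an unknown event. Rewriting the predicate as id <> '0', or submitting the query non-interactively (for example from a script file), should avoid the expansion. Below is a minimal Scala sketch, assuming only jline2 on the classpath, of how a console loop can turn event expansion off so that input containing "!=" is returned verbatim; the names are illustrative and this is not the actual SparkSQLCLIDriver code.

    import jline.console.ConsoleReader

    object NoExpandEventsDemo {
      def main(args: Array[String]): Unit = {
        val reader = new ConsoleReader()
        // Disable bash-style "!" history expansion so a line such as
        //   select count(*) from src where id != '0';
        // comes back exactly as typed instead of raising
        //   java.lang.IllegalArgumentException: != 0;: event not found
        reader.setExpandEvents(false)
        var line = reader.readLine("sql> ")
        while (line != null) {
          // Hand the raw line to the SQL layer here; this sketch just echoes it.
          println(s"got: $line")
          line = reader.readLine("sql> ")
        }
      }
    }

Turning expansion off at the reader keeps line editing and history navigation working while leaving all interpretation of the text to the SQL parser.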

> Error occurred in SparkSQL when using !=
> ----------------------------------------
>
>                 Key: SPARK-10226
>                 URL: https://issues.apache.org/jira/browse/SPARK-10226
>             Project: Spark
>          Issue Type: Bug
>          Components: SQL
>    Affects Versions: 1.5.0
>            Reporter: wangwei
>


