Posted to issues@spark.apache.org by "Ravindra Pesala (JIRA)" <ji...@apache.org> on 2014/10/07 07:05:33 UTC

[jira] [Commented] (SPARK-3814) Bitwise & does not work in Hive

    [ https://issues.apache.org/jira/browse/SPARK-3814?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=14161472#comment-14161472 ] 

Ravindra Pesala commented on SPARK-3814:
----------------------------------------

Currently, Bitwise & is not supported in Spark HiveQL or in Spark SQL.

I am working on this issue.
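
For context, here is a minimal, self-contained sketch of the kind of expression node this involves. It is plain Scala with made-up names (Expr, Column, Literal, BitwiseAnd), not Spark's actual Catalyst API; it only illustrates evaluating the bit_field & 1 predicate from the reported query.

  // Illustrative only: a toy expression tree with a bitwise-AND node.
  // None of these types are Spark's real Catalyst classes.
  sealed trait Expr { def eval(row: Map[String, Int]): Int }

  case class Column(name: String) extends Expr {
    def eval(row: Map[String, Int]): Int = row(name)
  }

  case class Literal(value: Int) extends Expr {
    def eval(row: Map[String, Int]): Int = value
  }

  case class BitwiseAnd(left: Expr, right: Expr) extends Expr {
    // Evaluate both children, then combine with the JVM's integer '&'.
    def eval(row: Map[String, Int]): Int = left.eval(row) & right.eval(row)
  }

  object BitwiseAndDemo extends App {
    // Mirrors the predicate from the reported query: bit_field & 1 = 1
    val expr = BitwiseAnd(Column("bit_field"), Literal(1))
    println(expr.eval(Map("bit_field" -> 5)) == 1)  // true: lowest bit of 5 is set
  }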

We also need to support Bitwise |, Bitwise ^, and Bitwise ~. I will file separate JIRA issues for these operators and work on them as well.
Thank you.
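
As a quick reminder of the integer semantics those follow-up operators carry on the JVM (plain Scala, illustrative only):

  // Plain JVM integer semantics of the four operators mentioned above.
  object BitwiseOpsDemo extends App {
    val a = 0xF0   // 240
    val b = 0x3C   //  60
    println(a & b) //  48 (0x30): bits set in both operands
    println(a | b) // 252 (0xFC): bits set in either operand
    println(a ^ b) // 204 (0xCC): bits set in exactly one operand
    println(~a)    // -241: every bit flipped (two's complement)
  }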

> Bitwise & does not work in Hive
> --------------------------------
>
>                 Key: SPARK-3814
>                 URL: https://issues.apache.org/jira/browse/SPARK-3814
>             Project: Spark
>          Issue Type: Bug
>          Components: SQL
>    Affects Versions: 1.1.0
>            Reporter: Yana Kadiyska
>            Priority: Minor
>
> Error: java.lang.RuntimeException: 
> Unsupported language features in query: select (case when bit_field & 1=1 then r_end - r_start else NULL end) from mytable where pkey='0178-2014-07' LIMIT 2
> TOK_QUERY
>   TOK_FROM
>     TOK_TABREF
>       TOK_TABNAME
>        mytable 
>   TOK_INSERT
>     TOK_DESTINATION
>       TOK_DIR
>         TOK_TMP_FILE
>     TOK_SELECT
>       TOK_SELEXPR
>         TOK_FUNCTION
>           when
>           =
>             &
>               TOK_TABLE_OR_COL
>                 bit_field
>               1
>             1
>           -
>             TOK_TABLE_OR_COL
>               r_end
>             TOK_TABLE_OR_COL
>               r_start
>           TOK_NULL
>     TOK_WHERE
>       =
>         TOK_TABLE_OR_COL
>           pkey
>         '0178-2014-07'
>     TOK_LIMIT
>       2
> SQLState:  null
> ErrorCode: 0


