Posted to issues@spark.apache.org by "Herman van Hovell (JIRA)" <ji...@apache.org> on 2016/03/02 23:10:18 UTC

[jira] [Resolved] (SPARK-13535) Script Transformation returns analysis errors when using backticks

     [ https://issues.apache.org/jira/browse/SPARK-13535?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]

Herman van Hovell resolved SPARK-13535.
---------------------------------------
          Resolution: Resolved
            Assignee: Xiao Li
    Target Version/s: 2.0.0

> Script Transformation returns analysis errors when using backticks
> ------------------------------------------------------------------
>
>                 Key: SPARK-13535
>                 URL: https://issues.apache.org/jira/browse/SPARK-13535
>             Project: Spark
>          Issue Type: Bug
>          Components: SQL
>    Affects Versions: 2.0.0
>            Reporter: Xiao Li
>            Assignee: Xiao Li
>
> {code}
> FROM
> (FROM test SELECT TRANSFORM(key, value) USING 'cat' AS (`thing1` int, thing2 string)) t
> SELECT thing1 + 1
> {code}
> This query fails with an analysis error like:
> {code}
> Failed to analyze query: org.apache.spark.sql.AnalysisException: cannot resolve '`thing1`' given input columns: [`thing1`, thing2]; line 3 pos 7
> 'Project [unresolvedalias(('thing1 + 1), None)]
> +- SubqueryAlias t
>    +- ScriptTransformation [key#2,value#3], cat, [`thing1`#6,thing2#7], HiveScriptIOSchema(List(),List(),Some(org.apache.hadoop.hive.serde2.lazy.LazySimpleSerDe),Some(org.apache.hadoop.hive.serde2.lazy.LazySimpleSerDe),List((field.delim,	)),List((field.delim,	)),Some(org.apache.hadoop.hive.ql.exec.TextRecordReader),Some(org.apache.hadoop.hive.ql.exec.TextRecordWriter),false)
>       +- SubqueryAlias test
>          +- Project [_1#0 AS key#2,_2#1 AS value#3]
>             +- LocalRelation [_1#0,_2#1], [[1,1],[2,2],[3,3],[4,4],[5,5]]
> {code}
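The plan above shows where the resolution breaks: the {{ScriptTransformation}} output schema keeps the raw quoted token ({{`thing1`#6}}, backticks included), so the outer {{thing1 + 1}} cannot match it. Purely for illustration (this is not Spark's actual code, and {{strip_backticks}} is a hypothetical helper), a minimal Python sketch of the kind of identifier cleanup the fix needs:

```python
def strip_backticks(identifier: str) -> str:
    """Remove one pair of enclosing backticks from a column identifier."""
    if len(identifier) >= 2 and identifier.startswith("`") and identifier.endswith("`"):
        return identifier[1:-1]
    return identifier

# Simulating the transform's output columns as parsed from the AS clause:
# the first kept its quoting, so name-based lookup of "thing1" fails.
raw_columns = ["`thing1`", "thing2"]
cleaned = [strip_backticks(c) for c in raw_columns]
# cleaned -> ["thing1", "thing2"], which "thing1 + 1" can now resolve against
```

With the backticks stripped when the output attributes are created, both quoted and unquoted aliases resolve the same way.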



--
This message was sent by Atlassian JIRA
(v6.3.4#6332)
