Posted to issues@spark.apache.org by "Cheng Hao (JIRA)" <ji...@apache.org> on 2015/04/24 03:01:38 UTC

[jira] [Reopened] (SPARK-7044) [Spark SQL] query would hang when using scripts in SQL statement

     [ https://issues.apache.org/jira/browse/SPARK-7044?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]

Cheng Hao reopened SPARK-7044:
------------------------------

Reopening to backport the fix to Spark 1.3.

> [Spark SQL] query would hang when using scripts in SQL statement
> ----------------------------------------------------------------
>
>                 Key: SPARK-7044
>                 URL: https://issues.apache.org/jira/browse/SPARK-7044
>             Project: Spark
>          Issue Type: Bug
>          Components: SQL
>    Affects Versions: 1.3.0, 1.4.0
>            Reporter: Yi Zhou
>            Assignee: Cheng Hao
>             Fix For: 1.3.0, 1.4.0
>
>
> A query with a 'USING' clause that invokes an external script, like the one below, hangs:
> {code}
> INSERT INTO TABLE ${hiveconf:RESULT_TABLE}
> SELECT pid1, pid2, COUNT (*) AS cnt
> FROM (
>   --Make items basket
>   FROM (
>     -- Joining two tables
>     SELECT s.ss_ticket_number AS oid, s.ss_item_sk AS pid
>     FROM store_sales s
>     INNER JOIN item i ON (s.ss_item_sk = i.i_item_sk)
>     WHERE i.i_category_id in (${hiveconf:q01_i_category_id_IN})
>     AND s.ss_store_sk in (${hiveconf:q01_ss_store_sk_IN})
>     CLUSTER BY oid
>   ) q01_map_output
>   REDUCE q01_map_output.oid, q01_map_output.pid
>   USING '${env:BIG_BENCH_JAVA} ${env:BIG_BENCH_java_child_process_xmx} -cp bigbenchqueriesmr.jar de.bankmark.bigbench.queries.q01.Red -ITEM_SET_MAX ${hiveconf:q01_NPATH_ITEM_SET_MAX} '
>   AS (pid1 BIGINT, pid2 BIGINT)
> ) q01_temp_basket
> {code}
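For anyone trying to isolate the problem, the same 'USING' script-transform code path can be exercised without the BigBench tables and scripts. The sketch below is a hypothetical minimal variant, not taken from the ticket; it assumes the standard Hive test table src (columns key, value) and uses 'cat' as the external script:

{code}
-- Hypothetical minimal variant: pipes rows through an external process
-- ('cat') via the same script-transform (USING) machinery as the query above.
SELECT TRANSFORM (key, value)
USING 'cat'
AS (k STRING, v STRING)
FROM src;
{code}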



--
This message was sent by Atlassian JIRA
(v6.3.4#6332)
