Posted to issues@spark.apache.org by "marymwu (JIRA)" <ji...@apache.org> on 2016/06/21 08:04:57 UTC

[jira] [Created] (SPARK-16092) Spark 2.0 ignores hive.exec.dynamic.partition.mode=nonstrict set as a global variable in the Spark 2.0 configuration file, while Spark 1.6 honors it

marymwu created SPARK-16092:
-------------------------------

             Summary: Spark 2.0 ignores hive.exec.dynamic.partition.mode=nonstrict set as a global variable in the Spark 2.0 configuration file, while Spark 1.6 honors it
                 Key: SPARK-16092
                 URL: https://issues.apache.org/jira/browse/SPARK-16092
             Project: Spark
          Issue Type: Bug
    Affects Versions: 2.0.0
            Reporter: marymwu


Spark 2.0 ignores hive.exec.dynamic.partition.mode=nonstrict set as a global variable in the Spark 2.0 configuration file, while Spark 1.6 honors it.

Precondition:
Set hive.exec.dynamic.partition.mode=nonstrict as a global variable in the Spark 2.0 configuration file:
"<property>
    <name>hive.exec.dynamic.partition.mode</name>
    <value>nonstrict</value>
</property>"

Testcase:
"insert overwrite table d_test_tpc_2g_txt.marytest1 partition (dt)  select t.nid, t.price, t.dt from (select nid, price, dt from d_test_tpc_2g_txt.marytest where dt >= '2016-06-20' and dt <= '2016-06-21') t group by t.nid, t.price, t.dt;"

Result:
Error: org.apache.spark.SparkException: Dynamic partition strict mode requires at least one static partition column. To turn this off set hive.exec.dynamic.partition.mode=nonstrict (state=,code=0)
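
As the error message itself suggests, the property can also be set per session instead of globally; a minimal workaround sketch (whether the session-level setting takes effect in Spark 2.0 is not confirmed by this report):
"set hive.exec.dynamic.partition.mode=nonstrict;
insert overwrite table d_test_tpc_2g_txt.marytest1 partition (dt) select t.nid, t.price, t.dt from (select nid, price, dt from d_test_tpc_2g_txt.marytest where dt >= '2016-06-20' and dt <= '2016-06-21') t group by t.nid, t.price, t.dt; -- same statement as in the testcase"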

Note:
Spark 1.6 executes the above SQL statement successfully after hive.exec.dynamic.partition.mode=nonstrict is set as a global variable in the Spark configuration file:
"<property>
    <name>hive.exec.dynamic.partition.mode</name>
    <value>nonstrict</value>
</property>"




