Posted to issues@spark.apache.org by "Hyukjin Kwon (Jira)" <ji...@apache.org> on 2019/09/24 13:15:00 UTC
[jira] [Updated] (SPARK-29174) LOCAL is not supported in INSERT
OVERWRITE DIRECTORY to data source
[ https://issues.apache.org/jira/browse/SPARK-29174?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]
Hyukjin Kwon updated SPARK-29174:
---------------------------------
Description:
*USING does not work for INSERT OVERWRITE into a LOCAL directory, but works for INSERT OVERWRITE into an HDFS directory*
{code}
0: jdbc:hive2://10.18.18.214:23040/default> insert overwrite directory '/user/trash2/' using parquet select * from trash1 a where a.country='PAK';
+---------+--+
| Result |
+---------+--+
+---------+--+
No rows selected (0.448 seconds)
0: jdbc:hive2://10.18.18.214:23040/default> insert overwrite local directory '/opt/trash2/' using parquet select * from trash1 a where a.country='PAK';
Error: org.apache.spark.sql.catalyst.parser.ParseException:
LOCAL is not supported in INSERT OVERWRITE DIRECTORY to data source(line 1, pos 0)
== SQL ==
insert overwrite local directory '/opt/trash2/' using parquet select * from trash1 a where a.country='PAK'
^^^ (state=,code=0)
0: jdbc:hive2://10.18.18.214:23040/default> insert overwrite local directory '/opt/trash2/' stored as parquet select * from trash1 a where a.country='PAK';
+---------+--+
| Result |
+---------+--+
| | |
{code}
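For clarity, the two directory-insert syntaxes exercised in the session above are sketched below (paths are the illustrative ones from the session; the parser rejects LOCAL only in the data source form):

{code}
-- Data source (USING) syntax: the parser rejects the LOCAL keyword here
INSERT OVERWRITE DIRECTORY '/user/trash2/'
USING parquet
SELECT * FROM trash1 a WHERE a.country = 'PAK';

-- Hive (STORED AS) syntax: LOCAL is accepted
INSERT OVERWRITE LOCAL DIRECTORY '/opt/trash2/'
STORED AS parquet
SELECT * FROM trash1 a WHERE a.country = 'PAK';
{code}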
> LOCAL is not supported in INSERT OVERWRITE DIRECTORY to data source
> -------------------------------------------------------------------
>
> Key: SPARK-29174
> URL: https://issues.apache.org/jira/browse/SPARK-29174
> Project: Spark
> Issue Type: Bug
> Components: SQL
> Affects Versions: 3.0.0
> Reporter: ABHISHEK KUMAR GUPTA
> Priority: Major
>
> *USING does not work for INSERT OVERWRITE into a LOCAL directory, but works for INSERT OVERWRITE into an HDFS directory*
>
> {code}
> 0: jdbc:hive2://10.18.18.214:23040/default> insert overwrite directory '/user/trash2/' using parquet select * from trash1 a where a.country='PAK';
> +---------+--+
> | Result |
> +---------+--+
> +---------+--+
> No rows selected (0.448 seconds)
> 0: jdbc:hive2://10.18.18.214:23040/default> insert overwrite local directory '/opt/trash2/' using parquet select * from trash1 a where a.country='PAK';
> Error: org.apache.spark.sql.catalyst.parser.ParseException:
> LOCAL is not supported in INSERT OVERWRITE DIRECTORY to data source(line 1, pos 0)
>
> == SQL ==
> insert overwrite local directory '/opt/trash2/' using parquet select * from trash1 a where a.country='PAK'
> ^^^ (state=,code=0)
> 0: jdbc:hive2://10.18.18.214:23040/default> insert overwrite local directory '/opt/trash2/' stored as parquet select * from trash1 a where a.country='PAK';
> +---------+--+
> | Result |
> +---------+--+
> | | |
> {code}
>
--
This message was sent by Atlassian Jira
(v8.3.4#803005)
---------------------------------------------------------------------
To unsubscribe, e-mail: issues-unsubscribe@spark.apache.org
For additional commands, e-mail: issues-help@spark.apache.org