Posted to issues@spark.apache.org by "Tran Quyet Thang (JIRA)" <ji...@apache.org> on 2016/10/06 01:01:20 UTC

[jira] [Created] (SPARK-17796) spark HiveThriftServer2 sql AnalysisException: LOAD DATA input path does not exist, if the SQL query contains wildcard characters

Tran Quyet Thang created SPARK-17796:
----------------------------------------

             Summary: spark HiveThriftServer2 sql AnalysisException: LOAD DATA input path does not exist, if the SQL query contains wildcard characters
                 Key: SPARK-17796
                 URL: https://issues.apache.org/jira/browse/SPARK-17796
             Project: Spark
          Issue Type: Bug
          Components: Spark Core
    Affects Versions: 2.0.0
         Environment: CentOS release 6.6 (Final)
            Reporter: Tran Quyet Thang
             Fix For: 1.6.2


1. I'm using an ETL tool and connecting to Spark 2 HiveThriftServer2 over the connection string URL: "jdbc:hive2://10.30.164.132:10000/nat"


2. I hit the error below when the SQL contains wildcard characters ("*"):
LOAD DATA LOCAL INPATH '${SOURCE_DB_FILE}/V_SOURCE_BTS/${YYYYMMDD}/s0_V_SOURCE_BTS_*_part_*' INTO TABLE ${SCHEMA_BI}V_SOURCE_BTS;

 Couldn't execute SQL: LOAD DATA LOCAL INPATH '/u02/CDR/HAITI/MakeFile/V_SOURCE_BTS/20160927/s0_V_SOURCE_BTS_*_part_*' INTO TABLE nat.V_SOURCE_BTS
2016/10/05 14:38:43 - V_SOURCE_BTS -
2016/10/05 14:38:43 - V_SOURCE_BTS - org.apache.spark.sql.AnalysisException: LOAD DATA input path does not exist: /u02/CDR/HAITI/MakeFile/V_SOURCE_BTS/20160927/s0_V_SOURCE_BTS_*_part_*;


3. The SQL query executes without error if I remove the wildcards:
LOAD DATA LOCAL INPATH '/u02/CDR/HAITI/MakeFile/V_SOURCE_BTS/20160927/s0_V_SOURCE_BTS_20160927_part_0000000' INTO TABLE ${SCHEMA_BI}V_SOURCE_BTS;
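Until the server accepts globs again, the wildcard can also be expanded on the client side and one LOAD DATA statement issued per matching file. A minimal sketch in Python (the function name `expand_load_data` is illustrative, not part of any Spark or Hive API; it assumes the client machine can see the same local files as the Thrift server):

```python
import glob

def expand_load_data(pattern, table):
    """Work around SPARK-17796: expand the wildcard locally and
    build one LOAD DATA LOCAL INPATH statement per matching file."""
    matches = sorted(glob.glob(pattern))
    if not matches:
        raise FileNotFoundError("no files match pattern: %s" % pattern)
    return ["LOAD DATA LOCAL INPATH '%s' INTO TABLE %s" % (path, table)
            for path in matches]
```

Each returned statement could then be executed over the same JDBC connection; this only helps when LOAD DATA LOCAL is used and the client and server share the filesystem.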



4. The problem started in Spark 2.0.0.





--
This message was sent by Atlassian JIRA
(v6.3.4#6332)

---------------------------------------------------------------------
To unsubscribe, e-mail: issues-unsubscribe@spark.apache.org
For additional commands, e-mail: issues-help@spark.apache.org