Posted to issues@spark.apache.org by "Xiaochen Ouyang (JIRA)" <ji...@apache.org> on 2018/07/26 09:06:00 UTC
[jira] [Created] (SPARK-24930) Exception information is not accurate when using `LOAD DATA LOCAL INPATH`
Xiaochen Ouyang created SPARK-24930:
---------------------------------------
Summary: Exception information is not accurate when using `LOAD DATA LOCAL INPATH`
Key: SPARK-24930
URL: https://issues.apache.org/jira/browse/SPARK-24930
Project: Spark
Issue Type: Improvement
Components: SQL
Affects Versions: 2.3.1, 2.3.0, 2.2.2, 2.2.1
Reporter: Xiaochen Ouyang
# As the root user, create a test.txt file containing a single record '123' in the /root/ directory
# Switch to the mr user and execute spark-shell --master local
{code:java}
scala> spark.version
res2: String = 2.2.1
scala> spark.sql("load data local inpath '/root/test.txt' into table t1 partition(area ='025')")
org.apache.spark.sql.AnalysisException: LOAD DATA input path does not exist: /root/test.txt;
at org.apache.spark.sql.execution.command.LoadDataCommand.run(tables.scala:339)
at org.apache.spark.sql.execution.command.ExecutedCommandExec.sideEffectResult$lzycompute(commands.scala:58)
at org.apache.spark.sql.execution.command.ExecutedCommandExec.sideEffectResult(commands.scala:56)
at org.apache.spark.sql.execution.command.ExecutedCommandExec.executeCollect(commands.scala:67)
at org.apache.spark.sql.Dataset.<init>(Dataset.scala:183)
at org.apache.spark.sql.Dataset$.ofRows(Dataset.scala:68)
at org.apache.spark.sql.SparkSession.sql(SparkSession.scala:639)
... 48 elided
scala>
{code}
In fact, the input path exists, but the mr user does not have permission to access the directory `/root/`, so the message carried by the `AnalysisException` can mislead users trying to diagnose the problem: it reports a missing path when the real issue is insufficient permissions.
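A plausible explanation for the confusing message (a sketch, not the actual Spark code): `java.io.File.exists` returns false both when a path is absent and when a parent directory denies traversal, so a bare existence check cannot tell the two cases apart. The hypothetical helper below (all names are illustrative, not from LoadDataCommand) shows one way to probe the nearest existing ancestor and produce a more accurate diagnostic:

{code:java}
import java.io.File

// Hypothetical diagnostic helper. File.exists() returns false for both a
// truly missing path and a path hidden behind an unreadable directory, so
// we walk up to the nearest existing ancestor and check its permissions
// to distinguish "does not exist" from "permission denied".
object LoadPathCheck {
  def diagnose(path: String): String = {
    val file = new File(path)
    if (file.exists()) {
      "input path exists"
    } else {
      // Walk up until we find an ancestor that is visible to this user.
      var ancestor = file.getParentFile
      while (ancestor != null && !ancestor.exists()) {
        ancestor = ancestor.getParentFile
      }
      if (ancestor != null && !ancestor.canExecute()) {
        // Traversal (execute) permission is missing on the ancestor, so
        // the original path may well exist but be inaccessible.
        s"permission denied traversing ${ancestor.getPath}"
      } else {
        "input path does not exist"
      }
    }
  }
}
{code}

With a check like this, the scenario above would report a permission problem on `/root` instead of claiming the file does not exist.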
--
This message was sent by Atlassian JIRA
(v7.6.3#76005)
---------------------------------------------------------------------
To unsubscribe, e-mail: issues-unsubscribe@spark.apache.org
For additional commands, e-mail: issues-help@spark.apache.org