Posted to issues@spark.apache.org by "Apache Spark (JIRA)" <ji...@apache.org> on 2017/06/13 06:45:00 UTC
[jira] [Commented] (SPARK-20338) Spaces in spark.eventLog.dir are not correctly handled
[ https://issues.apache.org/jira/browse/SPARK-20338?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16047501#comment-16047501 ]
Apache Spark commented on SPARK-20338:
--------------------------------------
User 'zuotingbing' has created a pull request for this issue:
https://github.com/apache/spark/pull/18285
> Spaces in spark.eventLog.dir are not correctly handled
> ------------------------------------------------------
>
> Key: SPARK-20338
> URL: https://issues.apache.org/jira/browse/SPARK-20338
> Project: Spark
> Issue Type: Bug
> Components: Spark Core
> Affects Versions: 2.1.0
> Reporter: zuotingbing
>
> Set spark.eventLog.dir=/home/mr/event log and submit an app; we get the following error:
> 2017-04-14 17:28:40,378 INFO org.apache.spark.SparkContext: Successfully stopped SparkContext
> Exception in thread "main" ExitCodeException exitCode=1: chmod: cannot access `/home/mr/event%20log/app-20170414172839-0000.inprogress': No such file or directory
> at org.apache.hadoop.util.Shell.runCommand(Shell.java:561)
> at org.apache.hadoop.util.Shell.run(Shell.java:478)
> at org.apache.hadoop.util.Shell$ShellCommandExecutor.execute(Shell.java:738)
> at org.apache.hadoop.util.Shell.execCommand(Shell.java:831)
> at org.apache.hadoop.util.Shell.execCommand(Shell.java:814)
> at org.apache.hadoop.fs.RawLocalFileSystem.setPermission(RawLocalFileSystem.java:712)
> at org.apache.hadoop.fs.FilterFileSystem.setPermission(FilterFileSystem.java:506)
> at org.apache.spark.scheduler.EventLoggingListener.start(EventLoggingListener.scala:125)
> at org.apache.spark.SparkContext.<init>(SparkContext.scala:516)
> at org.apache.spark.SparkContext$.getOrCreate(SparkContext.scala:2258)
> at org.apache.spark.sql.SparkSession$Builder$$anonfun$9.apply(SparkSession.scala:879)
> at org.apache.spark.sql.SparkSession$Builder$$anonfun$9.apply(SparkSession.scala:871)
> at scala.Option.getOrElse(Option.scala:121)
> at org.apache.spark.sql.SparkSession$Builder.getOrCreate(SparkSession.scala:871)
> at org.apache.spark.sql.hive.thriftserver.SparkSQLEnv$.init(SparkSQLEnv.scala:58)
> at org.apache.spark.sql.hive.thriftserver.SparkSQLCLIDriver.<init>(SparkSQLCLIDriver.scala:288)
> at org.apache.spark.sql.hive.thriftserver.SparkSQLCLIDriver$.main(SparkSQLCLIDriver.scala:137)
> at org.apache.spark.sql.hive.thriftserver.SparkSQLCLIDriver.main(SparkSQLCLIDriver.scala)
> at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
> at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
> at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
> at java.lang.reflect.Method.invoke(Method.java:606)
> at org.apache.spark.deploy.SparkSubmit$.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:736)
> at org.apache.spark.deploy.SparkSubmit$.doRunMain$1(SparkSubmit.scala:185)
> at org.apache.spark.deploy.SparkSubmit$.submit(SparkSubmit.scala:210)
> at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:124)
> at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)
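The chmod failure above is consistent with the directory path being round-tripped through a URI, which percent-encodes the space: the directory on disk contains a literal space, but the filesystem call receives the encoded form. A minimal Java sketch of that mismatch (the path and class name here are illustrative, not Spark's actual code):

```java
import java.net.URI;

public class EventLogDirDemo {
    public static void main(String[] args) throws Exception {
        // Hypothetical event-log directory mirroring the one in the report.
        String dir = "/home/mr/event log";

        // Building a URI from the raw string percent-encodes the space...
        URI uri = new URI("file", null, dir, null);
        System.out.println(uri);  // the path component now reads event%20log

        // ...but the directory on disk still contains a literal space, so a
        // shell command run against the encoded form (as in the stack trace's
        // RawLocalFileSystem.setPermission -> Shell.execCommand path) cannot
        // find the file.
        System.out.println(uri.getPath());  // decoding recovers the real path
    }
}
```

Comparing `uri.toString()` with `uri.getPath()` shows the two views of the same directory; a fix along these lines would keep the decoded path when handing it to local filesystem operations.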
--
This message was sent by Atlassian JIRA
(v6.4.14#64029)
---------------------------------------------------------------------
To unsubscribe, e-mail: issues-unsubscribe@spark.apache.org
For additional commands, e-mail: issues-help@spark.apache.org