Posted to issues@spark.apache.org by "Hyukjin Kwon (JIRA)" <ji...@apache.org> on 2019/05/21 04:13:08 UTC
[jira] [Resolved] (SPARK-21859) SparkFiles.get failed on driver in yarn-cluster and yarn-client mode
[ https://issues.apache.org/jira/browse/SPARK-21859?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]
Hyukjin Kwon resolved SPARK-21859.
----------------------------------
Resolution: Incomplete
> SparkFiles.get failed on driver in yarn-cluster and yarn-client mode
> --------------------------------------------------------------------
>
> Key: SPARK-21859
> URL: https://issues.apache.org/jira/browse/SPARK-21859
> Project: Spark
> Issue Type: Bug
> Components: Spark Core
> Affects Versions: 1.6.2
> Reporter: Cyanny
> Priority: Major
> Labels: bulk-closed
>
> When SparkFiles.get is used to look up a file on the driver in yarn-client or yarn-cluster mode, it throws a file-not-found exception.
> The exception occurs only on the driver; SparkFiles.get works fine on executors.
>
> we can reproduce the bug as follows:
> ```scala
> import java.io.File
> import scala.io.Source
> import org.apache.spark.SparkFiles
>
> def testOnDriver(fileName: String): Unit = {
>   val file = new File(SparkFiles.get(fileName))
>   if (!file.exists()) {
>     println(s"$file does not exist")
>   } else {
>     // print file content on driver
>     val content = Source.fromFile(file).getLines().mkString("\n")
>     println(s"File content: $content")
>   }
> }
> // the output will be "<file> does not exist"
> ```
>
> ```python
> import os
>
> from pyspark import SparkConf, SparkContext, SparkFiles
>
> conf = SparkConf().setAppName("test files")
> sc = SparkContext(conf=conf)
>
> def test_on_driver(filename):
>     file = SparkFiles.get(filename)
>     print("file path: {}".format(file))
>     if os.path.exists(file):
>         with open(file) as f:
>             lines = f.readlines()
>         print(lines)
>     else:
>         print("file doesn't exist")
>         run_command("ls .")  # run_command is the reporter's helper, not shown in the ticket
> ```
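The ticket was closed as Incomplete, but the driver-side pattern it describes can be worked around by probing more than one candidate directory, since files shipped via `--files` typically land in the YARN container's working directory. The sketch below is an editorial addition, not from the ticket: the helper name `resolve_spark_file` and the candidate-directory list are assumptions, and in a real job the candidates would be something like `SparkFiles.getRootDirectory()` followed by `os.getcwd()`.

```python
import os


def resolve_spark_file(filename, candidate_dirs):
    """Return the first existing path for `filename` among the given
    candidate directories, or None if the file is found in none of them.

    In a Spark job the candidates might be
    [SparkFiles.getRootDirectory(), os.getcwd()]: the first works on
    executors, the second covers the YARN container working directory
    on the driver.
    """
    for d in candidate_dirs:
        path = os.path.join(d, filename)
        if os.path.exists(path):
            return path
    return None
```

This keeps the executor code path unchanged while giving the driver a fallback, at the cost of masking a genuine missing-file error until every candidate has been checked.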
--
This message was sent by Atlassian JIRA
(v7.6.3#76005)
---------------------------------------------------------------------
To unsubscribe, e-mail: issues-unsubscribe@spark.apache.org
For additional commands, e-mail: issues-help@spark.apache.org