Posted to issues@spark.apache.org by "Steve Loughran (JIRA)" <ji...@apache.org> on 2016/10/13 16:23:20 UTC
[jira] [Commented] (SPARK-14561) History Server does not see new logs in S3
[ https://issues.apache.org/jira/browse/SPARK-14561?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=15572401#comment-15572401 ]
Steve Loughran commented on SPARK-14561:
----------------------------------------
To clarify: it's not changes in existing files that aren't showing up; *it is new files added to the same destination directory*.
If that's the case, something is up with the scanning:
# Set the logging of org.apache.spark.deploy.history.FsHistoryProvider to DEBUG.
# Have a look at the scan interval. Is it too long?
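The two checks above come down to one log4j logger and one history-server property. A minimal sketch, assuming the standard log4j 1.x properties syntax and the `spark.history.fs.update.interval` setting (the directory re-scan period, 10s by default):

```properties
# log4j.properties on the history server: surface the provider's scan activity
log4j.logger.org.apache.spark.deploy.history.FsHistoryProvider=DEBUG

# spark-defaults.conf: how often FsHistoryProvider re-scans the log directory
# (shown at its default here; if it has been raised, new logs appear slowly)
spark.history.fs.update.interval  10s
```

With DEBUG on, each scan of the log directory should show up in the server log, which makes it easy to see whether new files are being listed at all.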
> History Server does not see new logs in S3
> ------------------------------------------
>
> Key: SPARK-14561
> URL: https://issues.apache.org/jira/browse/SPARK-14561
> Project: Spark
> Issue Type: Bug
> Affects Versions: 1.6.1
> Reporter: Miles Crawford
>
> If you set the Spark history server to use a log directory with an s3a:// url, everything appears to work fine at first, but new log files written by applications are not picked up by the server.
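For reference, the setup the reporter describes would look roughly like this; the bucket name and path below are hypothetical placeholders, and the property names are the standard event-log and history-server settings:

```properties
# spark-defaults.conf -- application side: write event logs to S3
spark.eventLog.enabled            true
spark.eventLog.dir                s3a://some-bucket/spark-logs

# history server side: read the same directory via the s3a connector
spark.history.fs.logDirectory     s3a://some-bucket/spark-logs
```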
--
This message was sent by Atlassian JIRA
(v6.3.4#6332)
---------------------------------------------------------------------
To unsubscribe, e-mail: issues-unsubscribe@spark.apache.org
For additional commands, e-mail: issues-help@spark.apache.org