Posted to user@spark.apache.org by Andrew Or <an...@databricks.com> on 2014/09/04 20:20:29 UTC

Re: Viewing web UI after fact

Hi Grzegorz,

Sorry for the late response. Unfortunately, if the Master UI doesn't know
about your applications (they are "completed" with respect to a different
Master), then it can't regenerate the UIs even if the logs exist. You will
have to use the history server for that.

How did you start the history server? If you are using Spark <=1.0, you can
pass the directory as an argument to the sbin/start-history-server.sh
script. Otherwise, you may need to set the following in your
conf/spark-env.sh to specify the log directory:

export SPARK_HISTORY_OPTS=-Dspark.history.fs.logDirectory=/tmp/spark-events
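
Concretely, the two launch styles look like this (a minimal sketch, assuming
Spark's standard sbin/ layout and /tmp/spark-events as the log directory):

# Spark <= 1.0: pass the log directory as the first argument
./sbin/start-history-server.sh /tmp/spark-events

# Newer versions: set SPARK_HISTORY_OPTS as above in conf/spark-env.sh,
# then launch with no arguments
./sbin/start-history-server.sh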

It could also be a permissions thing. Make sure your logs in
/tmp/spark-events are accessible by the JVM that runs the history server.
Also, there's a chance that "/tmp/spark-events" is interpreted as an HDFS
path depending on which Spark version you're running. To resolve any
ambiguity, you may set the log path to "file:/tmp/spark-events" instead.
But first verify whether they actually exist.
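
As a quick sanity check, something like the following should help (a rough
sketch; adjust the path, and run it as the same user that launches the
history server):

# verify that the event logs actually exist and are readable
ls -l /tmp/spark-events/
# open up read permissions if the history server runs as a different user
chmod -R a+rX /tmp/spark-events
# force a local-filesystem interpretation of the path
export SPARK_HISTORY_OPTS="-Dspark.history.fs.logDirectory=file:/tmp/spark-events"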

Let me know if you get it working,
-Andrew



2014-08-19 8:23 GMT-07:00 Grzegorz Białek <gr...@codilime.com>:

> Hi,
> Is there any way to view the history of application statistics in the
> master UI after restarting the master server? I have all the logs in
> /tmp/spark-events/, but when I start the history server on this directory
> it says "No Completed Applications Found". Maybe I could copy these logs
> to the directory used by the master server, but I couldn't find any. Or
> maybe I'm doing something wrong launching the history server.
> Do you have any idea how to solve it?
>
> Thanks,
> Grzegorz
>
>
> On Thu, Aug 14, 2014 at 10:53 AM, Grzegorz Białek <
> grzegorz.bialek@codilime.com> wrote:
>
>> Hi,
>>
>> Thank you both for your answers. Browsing using the Master UI works fine.
>> Unfortunately, the History Server shows "No Completed Applications Found"
>> even though logs exist under the given directory, but using the Master UI
>> is enough for me.
>>
>> Best regards,
>> Grzegorz
>>
>>
>>
>> On Wed, Aug 13, 2014 at 8:09 PM, Andrew Or <an...@databricks.com> wrote:
>>
>>> The Spark UI isn't kept available at the same address; otherwise new
>>> applications wouldn't be able to bind to it. Once the old application
>>> finishes, the standalone Master renders the after-the-fact application UI
>>> and exposes it under a different URL. To see this, go to the Master UI
>>> (<master-url>:8080) and click on your application in the "Completed
>>> Applications" table.
>>>
>>>
>>> 2014-08-13 10:56 GMT-07:00 Matei Zaharia <ma...@gmail.com>:
>>>
>>> Take a look at http://spark.apache.org/docs/latest/monitoring.html --
>>>> you need to launch a history server to serve the logs.
>>>>
>>>> Matei
>>>>
>>>> On August 13, 2014 at 2:03:08 AM, grzegorz-bialek (
>>>> grzegorz.bialek@codilime.com) wrote:
>>>>
>>>> Hi,
>>>> I wanted to access the Spark web UI after the application stops. I set
>>>> spark.eventLog.enabled to true and the logs are available in JSON format
>>>> in /tmp/spark-events, but the web UI isn't available at
>>>> http://<driver-node>:4040
>>>> I'm running Spark in standalone mode.
>>>>
>>>> What should I do to access web UI after application ends?
>>>>
>>>> Thanks,
>>>> Grzegorz

Re: Viewing web UI after fact

Posted by lihu <li...@gmail.com>.
Hi Grzegorz,
      I have a similar scenario to yours, but even though I called
sc.stop(), there is no APPLICATION_COMPLETE file in the log directory. Can
you share your experience with this problem? Thanks very much.

On Mon, Sep 15, 2014 at 4:10 PM, Grzegorz Białek <
grzegorz.bialek@codilime.com> wrote:

> [...]

Re: Viewing web UI after fact

Posted by lihu <li...@gmail.com>.
How did you solve this problem? I ran a standalone application, but there
is no APPLICATION_COMPLETE file either.

On Sat, Nov 8, 2014 at 2:11 PM, Arun Ahuja <aa...@gmail.com> wrote:

> [...]


-- 
Best Wishes!

Li Hu (李浒) | Graduate Student
Institute for Interdisciplinary Information Sciences (IIIS,
http://iiis.tsinghua.edu.cn/)
Tsinghua University, China

Email: lihu723@gmail.com <li...@gmail.com>
Homepage: http://iiis.tsinghua.edu.cn/zh/lihu/

Re: Viewing web UI after fact

Posted by Arun Ahuja <aa...@gmail.com>.
We are running our applications through YARN and only sometimes see them
in the History Server.  Most do not seem to have the APPLICATION_COMPLETE
file.  Specifically, any job that ends because of "yarn application -kill"
does not show up.  For the other ones, what would be a reason for them not
to appear in the Spark UI?  Is there any update on this?
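
Concretely, the pattern we see is something like this (the application ID
and log path here are made up for illustration):

# a job killed this way never gets the APPLICATION_COMPLETE marker...
yarn application -kill application_1415000000000_0001
# ...so it never shows up, even though its event log directory exists
ls /tmp/spark-events/application_1415000000000_0001/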

Thanks,
Arun

On Mon, Sep 15, 2014 at 4:10 AM, Grzegorz Białek <
grzegorz.bialek@codilime.com> wrote:

> [...]

Re: Viewing web UI after fact

Posted by Grzegorz Białek <gr...@codilime.com>.
Hi Andrew,

Sorry for the late response. Thank you very much for solving my problem.
There was no APPLICATION_COMPLETE file in the log directory because I was
not calling sc.stop() at the end of the program. With the Spark context
stopped properly, everything works correctly, so thank you again.

Best regards,
Grzegorz


On Fri, Sep 5, 2014 at 8:06 PM, Andrew Or <an...@databricks.com> wrote:

> [...]

Re: Viewing web UI after fact

Posted by Andrew Or <an...@databricks.com>.
Hi Grzegorz,

Can you verify that there are "APPLICATION_COMPLETE" files in the event log
directories? For example, does
file:/tmp/spark-events/app-name-1234567890/APPLICATION_COMPLETE exist? If
not, it could be that your application didn't call sc.stop(), so the
"ApplicationEnd" event was never actually logged. The HistoryServer looks
for this special file to identify which applications to display. You could
also try manually adding the "APPLICATION_COMPLETE" file to this directory;
the HistoryServer should pick it up and display the application, though the
information displayed will be incomplete, because the log did not capture
all the events (sc.stop() does a final close() on the file being written).
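
For example (a rough sketch; "app-name-1234567890" stands in for your
application's actual event log directory):

# list which applications already have the completion marker
ls -l /tmp/spark-events/*/APPLICATION_COMPLETE
# manually mark one application as complete so the HistoryServer lists it
touch /tmp/spark-events/app-name-1234567890/APPLICATION_COMPLETE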

Andrew


2014-09-05 1:50 GMT-07:00 Grzegorz Białek <gr...@codilime.com>:

> [...]

Re: Viewing web UI after fact

Posted by Grzegorz Białek <gr...@codilime.com>.
Hi Andrew,

Thank you very much for your answer. Unfortunately it still doesn't work.
I'm using Spark 1.0.0, and I start the history server by running
sbin/start-history-server.sh <dir>, although I also set
SPARK_HISTORY_OPTS=-Dspark.history.fs.logDirectory in conf/spark-env.sh. I
also tried other directories than /tmp/spark-events, with all possible
permissions enabled. Adding file: (and file://) didn't help either - the
history server still shows:

History Server
Event Log Location: file:/tmp/spark-events/
No Completed Applications Found.

Best regards,
Grzegorz


On Thu, Sep 4, 2014 at 8:20 PM, Andrew Or <an...@databricks.com> wrote:

> [...]