Posted to commits@mxnet.apache.org by gi...@git.apache.org on 2017/08/22 09:14:02 UTC
[GitHub] ChidanandKumarKS opened a new issue #7557: Parsing Training accuracy, Training loss log files to plot Training accuracy, Training loss curves
URL: https://github.com/apache/incubator-mxnet/issues/7557
For bugs or installation issues, please provide the following information.
The more information you provide, the more likely people will be able to help you.
## Environment info
Operating System: Ubuntu
Compiler: python
Package used (Python/R/Scala/Julia):
MXNet version: 0.9.5
Or if installed from source:
MXNet commit hash (`git rev-parse HEAD`):
If you are using the Python package, please provide
Python version and distribution: 2.7
**I have a log file containing the training accuracy/loss metric for every 10th batch iteration. I need to parse the log file and plot the training accuracy and training loss curves. Appending the log file below. Kindly provide a solution.**
Epoch[0] Batch [10] Speed: 3.45 samples/sec Train-FCNLogLoss=2.907518,
Epoch[0] Batch [20] Speed: 3.40 samples/sec Train-FCNLogLoss=2.873399,
Epoch[0] Batch [30] Speed: 3.43 samples/sec Train-FCNLogLoss=2.829898,
Epoch[0] Batch [40] Speed: 3.43 samples/sec Train-FCNLogLoss=2.774505,
Epoch[0] Batch [50] Speed: 3.42 samples/sec Train-FCNLogLoss=2.707620,
Epoch[0] Batch [60] Speed: 3.42 samples/sec Train-FCNLogLoss=2.613852,
Epoch[0] Batch [70] Speed: 3.41 samples/sec Train-FCNLogLoss=2.504943,
Epoch[0] Batch [80] Speed: 3.42 samples/sec Train-FCNLogLoss=2.423512,
Epoch[0] Batch [90] Speed: 3.44 samples/sec Train-FCNLogLoss=2.337298,
Epoch[0] Batch [100] Speed: 3.44 samples/sec Train-FCNLogLoss=2.285616,
Epoch[0] Batch [110] Speed: 3.42 samples/sec Train-FCNLogLoss=2.207729,
Epoch[0] Batch [120] Speed: 3.43 samples/sec Train-FCNLogLoss=2.137323,
Epoch[0] Batch [130] Speed: 3.43 samples/sec Train-FCNLogLoss=2.068040,
Epoch[0] Batch [140] Speed: 3.39 samples/sec Train-FCNLogLoss=2.013316,
Epoch[0] Batch [150] Speed: 3.39 samples/sec Train-FCNLogLoss=1.964707,
Epoch[0] Batch [160] Speed: 3.28 samples/sec Train-FCNLogLoss=1.915245,
Epoch[0] Batch [170] Speed: 3.21 samples/sec Train-FCNLogLoss=1.861571,
Epoch[0] Batch [180] Speed: 3.41 samples/sec Train-FCNLogLoss=1.806498,
Epoch[0] Batch [190] Speed: 3.23 samples/sec Train-FCNLogLoss=1.783524,
Epoch[0] Batch [200] Speed: 3.39 samples/sec Train-FCNLogLoss=1.738048,
Epoch[0] Batch [210] Speed: 3.32 samples/sec Train-FCNLogLoss=1.692679,
Epoch[0] Batch [220] Speed: 3.29 samples/sec Train-FCNLogLoss=1.659245,
Epoch[0] Batch [230] Speed: 3.39 samples/sec Train-FCNLogLoss=1.623011,
Epoch[0] Batch [240] Speed: 3.38 samples/sec Train-FCNLogLoss=1.595014,
Epoch[0] Batch [250] Speed: 3.26 samples/sec Train-FCNLogLoss=1.566186,
Epoch[0] Batch [260] Speed: 3.33 samples/sec Train-FCNLogLoss=1.531082,
Epoch[0] Batch [270] Speed: 3.39 samples/sec Train-FCNLogLoss=1.498752,
Epoch[0] Batch [280] Speed: 3.33 samples/sec Train-FCNLogLoss=1.471670,
Epoch[0] Batch [290] Speed: 3.24 samples/sec Train-FCNLogLoss=1.449113,
Epoch[0] Batch [300] Speed: 3.36 samples/sec Train-FCNLogLoss=1.422568,
Epoch[0] Batch [310] Speed: 3.29 samples/sec Train-FCNLogLoss=1.398356,
Epoch[0] Batch [320] Speed: 3.38 samples/sec Train-FCNLogLoss=1.373092,
Epoch[0] Batch [330] Speed: 3.23 samples/sec Train-FCNLogLoss=1.348424,
Epoch[0] Batch [340] Speed: 3.24 samples/sec Train-FCNLogLoss=1.335490,
Epoch[0] Batch [350] Speed: 3.42 samples/sec Train-FCNLogLoss=1.319229,
Epoch[0] Batch [360] Speed: 3.28 samples/sec Train-FCNLogLoss=1.300550,
Epoch[0] Batch [370] Speed: 3.35 samples/sec Train-FCNLogLoss=1.287138,
Epoch[0] Batch [380] Speed: 3.36 samples/sec Train-FCNLogLoss=1.270809,
Epoch[0] Batch [390] Speed: 3.40 samples/sec Train-FCNLogLoss=1.257176,
Epoch[0] Batch [400] Speed: 3.21 samples/sec Train-FCNLogLoss=1.239309,
Epoch[0] Batch [410] Speed: 3.26 samples/sec Train-FCNLogLoss=1.224328,
Epoch[0] Batch [420] Speed: 3.41 samples/sec Train-FCNLogLoss=1.208191,
Epoch[0] Batch [430] Speed: 3.36 samples/sec Train-FCNLogLoss=1.192171,
Epoch[0] Batch [440] Speed: 3.24 samples/sec Train-FCNLogLoss=1.177886,
Epoch[0] Batch [450] Speed: 3.35 samples/sec Train-FCNLogLoss=1.164295,
Epoch[0] Batch [460] Speed: 3.24 samples/sec Train-FCNLogLoss=1.148030,
Epoch[0] Batch [470] Speed: 3.28 samples/sec Train-FCNLogLoss=1.132920,
Epoch[0] Batch [480] Speed: 3.34 samples/sec Train-FCNLogLoss=1.123244,
Epoch[0] Batch [490] Speed: 3.28 samples/sec Train-FCNLogLoss=1.110626,
Epoch[0] Batch [500] Speed: 3.39 samples/sec Train-FCNLogLoss=1.098466,
Epoch[0] Batch [510] Speed: 3.30 samples/sec Train-FCNLogLoss=1.090236,
Epoch[0] Batch [520] Speed: 3.29 samples/sec Train-FCNLogLoss=1.080485,
Epoch[0] Batch [530] Speed: 3.40 samples/sec Train-FCNLogLoss=1.074670,
Epoch[0] Batch [540] Speed: 3.25 samples/sec Train-FCNLogLoss=1.063372,
Epoch[0] Batch [550] Speed: 3.45 samples/sec Train-FCNLogLoss=1.051009,
Epoch[0] Batch [560] Speed: 3.25 samples/sec Train-FCNLogLoss=1.042338,
Epoch[0] Batch [570] Speed: 3.28 samples/sec Train-FCNLogLoss=1.038065,
Epoch[0] Batch [580] Speed: 3.32 samples/sec Train-FCNLogLoss=1.028653,
Epoch[0] Batch [590] Speed: 3.28 samples/sec Train-FCNLogLoss=1.021546,
Epoch[0] Batch [600] Speed: 3.23 samples/sec Train-FCNLogLoss=1.013576,
Epoch[0] Batch [610] Speed: 3.40 samples/sec Train-FCNLogLoss=1.004878,
Epoch[0] Batch [620] Speed: 3.38 samples/sec Train-FCNLogLoss=0.995920,
Epoch[0] Batch [630] Speed: 3.24 samples/sec Train-FCNLogLoss=0.989954,
Epoch[0] Batch [640] Speed: 3.41 samples/sec Train-FCNLogLoss=0.982139,
Epoch[0] Batch [650] Speed: 3.23 samples/sec Train-FCNLogLoss=0.978145,
Epoch[0] Batch [660] Speed: 3.42 samples/sec Train-FCNLogLoss=0.970109,
Epoch[0] Batch [670] Speed: 3.25 samples/sec Train-FCNLogLoss=0.964531,
Epoch[0] Batch [680] Speed: 3.33 samples/sec Train-FCNLogLoss=0.958440,
Epoch[0] Batch [690] Speed: 3.33 samples/sec Train-FCNLogLoss=0.951381,
Epoch[0] Batch [700] Speed: 3.28 samples/sec Train-FCNLogLoss=0.943728,
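One way to do this is a minimal Python sketch that parses the log format shown above with a regular expression and plots the loss with matplotlib. The file name `train.log` is a hypothetical placeholder; adjust the regex if your logger emits a different metric name than `Train-FCNLogLoss`.

```python
import re

# Matches lines of the form shown in the log above, e.g.:
#   Epoch[0] Batch [10] Speed: 3.45 samples/sec Train-FCNLogLoss=2.907518,
LINE_RE = re.compile(
    r"Epoch\[(\d+)\]\s+Batch\s+\[(\d+)\].*?Train-FCNLogLoss=([\d.]+)"
)

def parse_log(text):
    """Return parallel lists: (epoch, batch) pairs and loss values."""
    batches, losses = [], []
    for match in LINE_RE.finditer(text):
        epoch, batch, loss = match.groups()
        batches.append((int(epoch), int(batch)))
        losses.append(float(loss))
    return batches, losses

if __name__ == "__main__":
    # "train.log" is a placeholder; point this at your actual log file.
    with open("train.log") as f:
        batches, losses = parse_log(f.read())

    import matplotlib.pyplot as plt
    plt.plot([b for _, b in batches], losses)
    plt.xlabel("Batch")
    plt.ylabel("Train-FCNLogLoss")
    plt.title("Training loss")
    plt.show()
```

The same `parse_log` function can be reused for a training-accuracy metric by changing the metric name in the regex.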
----------------------------------------------------------------
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
For queries about this service, please contact Infrastructure at:
users@infra.apache.org
With regards,
Apache Git Services