Posted to commits@mxnet.apache.org by jx...@apache.org on 2017/12/18 20:16:51 UTC

[incubator-mxnet] branch master updated: Update linear-regression.md (#9103)

This is an automated email from the ASF dual-hosted git repository.

jxie pushed a commit to branch master
in repository https://gitbox.apache.org/repos/asf/incubator-mxnet.git


The following commit(s) were added to refs/heads/master by this push:
     new 5dcdd5d  Update linear-regression.md (#9103)
5dcdd5d is described below

commit 5dcdd5d3d856fde86bd9a2a2d8480fbb33f1d289
Author: Steffen Rochel <sr...@users.noreply.github.com>
AuthorDate: Mon Dec 18 12:16:46 2017 -0800

    Update linear-regression.md (#9103)
    
    * Update linear-regression.md
    
    Reduce the number of epochs to 20, as validation accuracy doesn't improve any further.
    Added an assertion to check the achieved accuracy in preparation for tutorial regression testing.
    
    * fixed location of assertion
    
    moved assertion to correct location
---
 docs/tutorials/python/linear-regression.md | 5 +++--
 1 file changed, 3 insertions(+), 2 deletions(-)

diff --git a/docs/tutorials/python/linear-regression.md b/docs/tutorials/python/linear-regression.md
index c26435d..fc3e713 100644
--- a/docs/tutorials/python/linear-regression.md
+++ b/docs/tutorials/python/linear-regression.md
@@ -156,9 +156,9 @@ parameters of the model to fit the training data. This is accomplished using the
 ```python
 model.fit(train_iter, eval_iter,
             optimizer_params={'learning_rate':0.005, 'momentum': 0.9},
-            num_epoch=50,
+            num_epoch=20,
             eval_metric='mse',
-            batch_end_callback = mx.callback.Speedometer(batch_size, 2))
+            batch_end_callback = mx.callback.Speedometer(batch_size, 2))	    
 ```
 
 ## Using a trained model: (Testing and Inference)
@@ -176,6 +176,7 @@ evaluating our model's mean squared error (MSE) on the evaluation data.
 ```python
 metric = mx.metric.MSE()
 model.score(eval_iter, metric)
+assert model.score(eval_iter, metric)[0][1] < 0.01001, "Achieved MSE (%f) is larger than expected (0.01001)" % model.score(eval_iter, metric)[0][1]
 ```
 
 Let us try and add some noise to the evaluation data and see how the MSE changes:
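
The tutorial's next code block falls outside this diff's context. As a rough illustration only, here is a minimal sketch of "adding noise", assuming the tutorial's NumPy arrays eval_data and eval_label and its batch_size variable are still in scope (these names are assumptions, not shown in this diff):

```python
import numpy as np
import mxnet as mx

# Hypothetical sketch: perturb the evaluation labels with Gaussian noise,
# rebuild the evaluation iterator, and re-score the trained model.
# The reported MSE should rise as the noise level grows.
noisy_label = eval_label + np.random.normal(0.0, 1.0, size=eval_label.shape)
noisy_iter = mx.io.NDArrayIter(eval_data, noisy_label, batch_size, shuffle=False)
print(model.score(noisy_iter, mx.metric.MSE()))
```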

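A side note on the assertion added in this commit: as written, it calls model.score(eval_iter, metric) again inside the assert condition and once more when formatting the failure message, on top of the call on the preceding line, so evaluation may run several times. A sketch (not the committed code) that evaluates once and reuses the value, keeping the same 0.01001 threshold:

```python
import mxnet as mx

# Evaluate once, cache the MSE, then assert on the cached value.
# Relies on the tutorial's trained `model` and `eval_iter` being in scope.
mse = model.score(eval_iter, mx.metric.MSE())[0][1]
assert mse < 0.01001, "Achieved MSE (%f) is larger than expected (0.01001)" % mse
```
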
-- 
To stop receiving notification emails like this one, please contact
commits@mxnet.apache.org.