Posted to commits@singa.apache.org by wa...@apache.org on 2020/09/07 06:22:36 UTC

[singa] branch dev updated: fixed imdb train script args

This is an automated email from the ASF dual-hosted git repository.

wangwei pushed a commit to branch dev
in repository https://gitbox.apache.org/repos/asf/singa.git


The following commit(s) were added to refs/heads/dev by this push:
     new 8a46d23  fixed imdb train script args
     new 2cdb881  Merge pull request #789 from dcslin/imdb_script
8a46d23 is described below

commit 8a46d23e48c24f30412646ba2c62d91ae33bd7a6
Author: root <13...@users.noreply.github.com>
AuthorDate: Sat Sep 5 03:22:55 2020 +0000

    fixed imdb train script args
---
 examples/rnn/imdb_train.py | 14 ++++++++------
 1 file changed, 8 insertions(+), 6 deletions(-)

diff --git a/examples/rnn/imdb_train.py b/examples/rnn/imdb_train.py
index e9de5af..669c5ad 100644
--- a/examples/rnn/imdb_train.py
+++ b/examples/rnn/imdb_train.py
@@ -68,16 +68,18 @@ parser.add_argument('--mode',
                     default='lstm',
                     help='relu, tanh, lstm, gru',
                     dest='mode')
-parser.add_argument('--return-sequences',
-                    default='False',
+parser.add_argument('-s', '--return-sequences',
+                    default=False,
+                    action='store_true',
                     help='return sequences',
                     dest='return_sequences')
-parser.add_argument('--bidirectional',
-                    default='False',
+parser.add_argument('-d', '--bidirectional',
+                    default=False,
+                    action='store_true',
                     help='bidirectional lstm',
                     dest='bidirectional')
-parser.add_argument('--num-layers',
-                    default=1,
+parser.add_argument('-n', '--num-layers',
+                    default=2,
                     type=int,
                     help='num layers',
                     dest='num_layers')
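
The fix above matters because the old `default='False'` was a non-empty string, which Python treats as truthy, so the boolean options could never actually be turned off; the commit replaces them with real boolean flags via `action='store_true'`. A minimal sketch of the resulting parser behavior (a hypothetical stand-in for the full `examples/rnn/imdb_train.py` parser, not its complete argument list):

```python
import argparse

# Sketch of the flag style adopted in the commit. With action='store_true',
# the option is False unless the flag is present on the command line.
parser = argparse.ArgumentParser()
parser.add_argument('-s', '--return-sequences',
                    default=False,
                    action='store_true',
                    help='return sequences',
                    dest='return_sequences')
parser.add_argument('-d', '--bidirectional',
                    default=False,
                    action='store_true',
                    help='bidirectional lstm',
                    dest='bidirectional')
parser.add_argument('-n', '--num-layers',
                    default=2,
                    type=int,
                    help='num layers',
                    dest='num_layers')

args = parser.parse_args(['-s', '-n', '3'])
print(args.return_sequences, args.bidirectional, args.num_layers)
# → True False 3
```

Contrast with the old style: `default='False'` without an action stores the literal string `'False'`, and `bool('False')` is `True`, so downstream checks like `if args.bidirectional:` always fired.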