Posted to commits@tvm.apache.org by GitBox <gi...@apache.org> on 2021/10/18 23:17:50 UTC

[GitHub] [tvm] apivovarov commented on a change in pull request #9303: [Keras] Support return_sequences in LSTM

apivovarov commented on a change in pull request #9303:
URL: https://github.com/apache/tvm/pull/9303#discussion_r731380430



##########
File path: tests/python/frontend/keras/test_forward.py
##########
@@ -417,12 +417,20 @@ def test_forward_reuse_layers(self, keras):
         keras_model = keras.models.Model(data, z)
         verify_keras_frontend(keras_model)
 
+    def test_forward_lstm(self, keras):
+        data = keras.layers.Input(shape=(10, 32))
+        rnn_funcs = [
+            keras.layers.LSTM(16),
+            keras.layers.LSTM(16, return_sequences=True),
+        ]
+        for rnn_func in rnn_funcs:
+            x = rnn_func(data)
+            keras_model = keras.models.Model(data, x)
+            verify_keras_frontend(keras_model, need_transpose=False)
+
     def test_forward_rnn(self, keras):
         data = keras.layers.Input(shape=(1, 32))
         rnn_funcs = [
-            keras.layers.LSTM(

Review comment:
       OK, I moved the LSTM test back into `test_forward_rnn`. Let's also test both LSTMs (with and without the `return_sequences` flag) in `test_forward_lstm`. The new `test_forward_lstm` test uses a more realistic input shape of `(10, 32)` instead of `(1, 32)`: feeding the LSTM a single timestep never exercises the recurrence logic inside the LSTM implementation.
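       For context, here is a minimal standalone Keras sketch (not part of this PR's diff; `tensorflow.keras` and illustrative variable names are assumed) showing the output-shape difference the two test cases cover:

       ```python
       import numpy as np
       import tensorflow as tf

       # A batch of 2 sequences, each with 10 timesteps of 32 features,
       # matching the (10, 32) input shape used in test_forward_lstm.
       x = np.random.rand(2, 10, 32).astype("float32")

       # Default return_sequences=False: only the final hidden state, shape (2, 16).
       last_state = tf.keras.layers.LSTM(16)(x)
       print(last_state.shape)

       # return_sequences=True: the hidden state at every timestep, shape (2, 10, 16).
       all_states = tf.keras.layers.LSTM(16, return_sequences=True)(x)
       print(all_states.shape)
       ```

       With only one timestep, the step-to-step recurrence never fires, which is why the longer `(10, 32)` input gives the frontend's LSTM conversion real coverage.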



