Posted to github@beam.apache.org by "damccorm (via GitHub)" <gi...@apache.org> on 2023/04/28 15:31:50 UTC

[GitHub] [beam] damccorm commented on a diff in pull request #26472: [Python] Add saved_weights example to tf notebook

damccorm commented on code in PR #26472:
URL: https://github.com/apache/beam/pull/26472#discussion_r1180544965


##########
examples/notebooks/beam-ml/run_inference_tensorflow.ipynb:
##########
@@ -101,20 +110,44 @@
         "To use RunInference with the built-in Tensorflow model handler, install Apache Beam version 2.46.0 or later."
       ],
       "metadata": {
-        "id": "gVCtGOKTHMm4"
+        "id": "YDHPlMjZRuY0"
       }
     },
     {
       "cell_type": "code",
       "metadata": {
-        "id": "jBakpNZnAhqk"
+        "id": "jBakpNZnAhqk",
+        "colab": {
+          "base_uri": "https://localhost:8080/"
+        },
+        "outputId": "375cd47b-b837-4091-88e2-cdac12e3a4a1"
       },
       "source": [
         "!pip install protobuf --quiet\n",
         "!pip install apache_beam==2.46.0 --quiet"
       ],
-      "execution_count": null,
-      "outputs": []
+      "execution_count": 1,
+      "outputs": [
+        {
+          "output_type": "stream",
+          "name": "stdout",
+          "text": [
+            "\u001b[2K     \u001b[90m━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━\u001b[0m \u001b[32m14.2/14.2 MB\u001b[0m \u001b[31m24.8 MB/s\u001b[0m eta \u001b[36m0:00:00\u001b[0m\n",
+            "\u001b[2K     \u001b[90m━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━\u001b[0m \u001b[32m152.0/152.0 kB\u001b[0m \u001b[31m12.5 MB/s\u001b[0m eta \u001b[36m0:00:00\u001b[0m\n",
+            "\u001b[?25h  Preparing metadata (setup.py) ... \u001b[?25l\u001b[?25hdone\n",
+            "\u001b[2K     \u001b[90m━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━\u001b[0m \u001b[32m135.6/135.6 kB\u001b[0m \u001b[31m2.8 MB/s\u001b[0m eta \u001b[36m0:00:00\u001b[0m\n",
+            "\u001b[2K     \u001b[90m━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━\u001b[0m \u001b[32m2.7/2.7 MB\u001b[0m \u001b[31m51.3 MB/s\u001b[0m eta \u001b[36m0:00:00\u001b[0m\n",
+            "\u001b[2K     \u001b[90m━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━\u001b[0m \u001b[32m516.2/516.2 kB\u001b[0m \u001b[31m14.1 MB/s\u001b[0m eta \u001b[36m0:00:00\u001b[0m\n",
+            "\u001b[2K     \u001b[90m━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━\u001b[0m \u001b[32m2.6/2.6 MB\u001b[0m \u001b[31m53.9 MB/s\u001b[0m eta \u001b[36m0:00:00\u001b[0m\n",
+            "\u001b[2K     \u001b[90m━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━\u001b[0m \u001b[32m89.7/89.7 kB\u001b[0m \u001b[31m5.7 MB/s\u001b[0m eta \u001b[36m0:00:00\u001b[0m\n",
+            "\u001b[?25h  Preparing metadata (setup.py) ... \u001b[?25l\u001b[?25hdone\n",
+            "  Preparing metadata (setup.py) ... \u001b[?25l\u001b[?25hdone\n",
+            "  Building wheel for crcmod (setup.py) ... \u001b[?25l\u001b[?25hdone\n",
+            "  Building wheel for dill (setup.py) ... \u001b[?25l\u001b[?25hdone\n",
+            "  Building wheel for docopt (setup.py) ... \u001b[?25l\u001b[?25hdone\n"
+          ]
+        }
+      ]

Review Comment:
   ```suggestion
         "execution_count": null,
         "outputs": []
   ```
   
   Let's drop this output for the sake of a cleaner notebook.
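   
   For anyone clearing outputs like this in bulk, here is a minimal sketch using the `nbformat` library; the notebook path is taken from the diff header and the snippet simply blanks every code cell's outputs and execution count:
   
   ```python
   import nbformat
   
   NOTEBOOK = "examples/notebooks/beam-ml/run_inference_tensorflow.ipynb"
   
   # Load the notebook as a version-4 document.
   nb = nbformat.read(NOTEBOOK, as_version=4)
   
   # Clear execution counts and stored outputs from every code cell.
   for cell in nb.cells:
       if cell.cell_type == "code":
           cell["execution_count"] = None
           cell["outputs"] = []
   
   # Write the cleaned notebook back in place.
   nbformat.write(nb, NOTEBOOK)
   ```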



##########
examples/notebooks/beam-ml/run_inference_tensorflow.ipynb:
##########
@@ -313,14 +350,34 @@
       "metadata": {
         "id": "2JbE7WkGcAkK"
       },
-      "execution_count": 8,
+      "execution_count": 18,
+      "outputs": []
+    },
+    {
+      "cell_type": "markdown",
+      "source": [
+        "Save the weights."

Review Comment:
   Maybe something along the lines of: `Instead of saving the model, you can also save/load a model using just the model weights and class. This is a smaller, more efficient way to represent your model.`
   
   Basically, I want something that justifies to the user why we're having them save the model two ways, so they know that in reality they only need to pick one.
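   
   For reference, a minimal sketch of the two approaches in Keras (the tiny architecture here is a placeholder, not necessarily the model used in the notebook):
   
   ```python
   import tensorflow as tf
   
   # Placeholder architecture; the notebook's actual model may differ.
   def build_model():
       model = tf.keras.Sequential([
           tf.keras.Input(shape=(1,)),
           tf.keras.layers.Dense(1),
       ])
       model.compile(optimizer="adam", loss="mse")
       return model
   
   model = build_model()
   # ... train the model here ...
   
   # Option 1: save the full model (architecture + weights + optimizer state).
   model.save("saved_model_dir")
   
   # Option 2: save only the weights. The artifact is smaller, but you need the
   # model-building code (the class or a builder function) to reconstruct the
   # architecture before loading the weights back in.
   model.save_weights("model_weights.h5")
   
   restored = build_model()
   restored.load_weights("model_weights.h5")
   ```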


