Posted to commits@mxnet.apache.org by GitBox <gi...@apache.org> on 2022/05/20 11:16:33 UTC

[GitHub] [incubator-mxnet] bgawrych opened a new pull request, #21034: [FEATURE] Add tanh approximation for GeLU activation

bgawrych opened a new pull request, #21034:
URL: https://github.com/apache/incubator-mxnet/pull/21034

   ## Description ##
   Add support for the GELU activation with tanh approximation; it is used, e.g., in the GPT-2 model in GluonNLP:
   https://github.com/dmlc/gluon-nlp/blob/5ff0519aa5a89e7e2a2c0afab164e17de55231b4/src/gluonnlp/layers.py#L324
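
   A minimal usage sketch of the new option, assuming an MXNet 2.x build that includes this PR's `approximation` argument on `mxnet.gluon.nn.GELU` (the setup lines are assumptions about the environment, not part of this PR's description):

   ```python
   import mxnet as mx
   from mxnet.gluon import nn

   mx.npx.set_np()  # numpy semantics for Gluon blocks (assumed MXNet 2.x setup)

   gelu_exact = nn.GELU(approximation='erf')   # exact: 0.5*x*(1 + erf(x/sqrt(2)))
   gelu_tanh = nn.GELU(approximation='tanh')   # tanh approximation added by this PR

   x = mx.np.array([-1.0, -0.1, 0.1, 1.0])
   print(gelu_exact(x))
   print(gelu_tanh(x))   # closely matches the exact form on this range
   ```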
   ## Checklist ##
   ### Essentials ###
   - [x] PR's title starts with a category (e.g. [BUGFIX], [MODEL], [TUTORIAL], [FEATURE], [DOC], etc)
   - [x] Changes are complete (i.e. I finished coding on this PR)
   - [x] All changes have test coverage
   - [x] Code is well-documented
   
   


-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: commits-unsubscribe@mxnet.apache.org

For queries about this service, please contact Infrastructure at:
users@infra.apache.org


[GitHub] [incubator-mxnet] mxnet-bot commented on pull request #21034: [FEATURE] Add tanh approximation for GeLU activation

Posted by GitBox <gi...@apache.org>.
mxnet-bot commented on PR #21034:
URL: https://github.com/apache/incubator-mxnet/pull/21034#issuecomment-1164334610

   Jenkins CI successfully triggered : [clang]


[GitHub] [incubator-mxnet] mxnet-bot commented on pull request #21034: [FEATURE] Add tanh approximation for GeLU activation

Posted by GitBox <gi...@apache.org>.
mxnet-bot commented on PR #21034:
URL: https://github.com/apache/incubator-mxnet/pull/21034#issuecomment-1147281914

   Jenkins CI successfully triggered : [unix-gpu, clang, edge, unix-cpu]


[GitHub] [incubator-mxnet] mxnet-bot commented on pull request #21034: [FEATURE] Add tanh approximation for GeLU activation

Posted by GitBox <gi...@apache.org>.
mxnet-bot commented on PR #21034:
URL: https://github.com/apache/incubator-mxnet/pull/21034#issuecomment-1161333785

   Jenkins CI successfully triggered : [windows-gpu, unix-cpu]


[GitHub] [incubator-mxnet] bgawrych commented on a diff in pull request #21034: [FEATURE] Add tanh approximation for GeLU activation

Posted by GitBox <gi...@apache.org>.
bgawrych commented on code in PR #21034:
URL: https://github.com/apache/incubator-mxnet/pull/21034#discussion_r881341406


##########
src/api/operator/numpy_extension/npx_leaky_relu_op.cc:
##########
@@ -41,8 +41,10 @@ inline int String2ActType(const std::string& s) {
     return leakyrelu::kELU;
   } else if (s == "selu") {
     return leakyrelu::kSELU;
-  } else if (s == "gelu") {
-    return leakyrelu::kGELU;
+  } else if (s == "gelu" || s == "gelu_erf") {

Review Comment:
   backward compatibility



##########
src/operator/leaky_relu.cc:
##########
@@ -166,7 +166,10 @@ when the input is negative and has a slope of one when input is positive.
 The following modified ReLU Activation functions are supported:
 
 - *elu*: Exponential Linear Unit. `y = x > 0 ? x : slope * (exp(x)-1)`
-- *gelu*: Gaussian Error Linear Unit. `y = 0.5 * x * (1 + erf(x / sqrt(2)))`
+- *gelu*: Same as gelu_erf
+- *gelu_erf*: Gaussian Error Linear Unit. `y = 0.5 * x * (1 + erf(x / sqrt(2)))`

Review Comment:
   done



[GitHub] [incubator-mxnet] bgawrych commented on pull request #21034: [FEATURE] Add tanh approximation for GeLU activation

Posted by GitBox <gi...@apache.org>.
bgawrych commented on PR #21034:
URL: https://github.com/apache/incubator-mxnet/pull/21034#issuecomment-1160493623

   @mxnet-bot run ci [centos-cpu, unix-cpu]


[GitHub] [incubator-mxnet] bgawrych commented on a diff in pull request #21034: [FEATURE] Add tanh approximation for GeLU activation

Posted by GitBox <gi...@apache.org>.
bgawrych commented on code in PR #21034:
URL: https://github.com/apache/incubator-mxnet/pull/21034#discussion_r904916530


##########
src/operator/operator_tune.cc:
##########
@@ -277,6 +277,7 @@ IMPLEMENT_UNARY_WORKLOAD_BWD(mxnet::op::mshadow_op::relu_grad);
 IMPLEMENT_UNARY_WORKLOAD_FWD(mxnet::op::mshadow_op::selu);                         // NOLINT()
 IMPLEMENT_UNARY_WORKLOAD_BWD(mxnet::op::mshadow_op::selu_grad);                    // NOLINT()
 IMPLEMENT_UNARY_WORKLOAD_FWD(mxnet::op::mshadow_op::gelu);                         // NOLINT()

Review Comment:
   done



##########
src/operator/mshadow_op.h:
##########
@@ -617,9 +617,25 @@ MXNET_UNARY_MATH_OP(gelu,
                           (1.0f + math::erf(static_cast<float>(a) / SQRT_2))));
 
 MXNET_BINARY_MATH_OP_NC(gelu_grad,
-                        DType(0.5f * (1.0f + math::erf(static_cast<float>(a) / SQRT_2) +
-                                      static_cast<float>(a) *
-                                          erf_grad::Map(static_cast<float>(a) / SQRT_2) / SQRT_2)));
+                        DType(static_cast<float>(b) / static_cast<float>(a) +
+                              0.5f * static_cast<float>(a) *
+                                  erf_grad::Map(static_cast<float>(a) / SQRT_2) / SQRT_2));
+
+MXNET_UNARY_MATH_OP(gelu_tanh,
+                    DType(0.5f * static_cast<float>(a) *
+                          (1.0f + math::tanh(math::sqrt(2.0f / PI) *
+                                             (static_cast<float>(a) +
+                                              0.044715 * math::pow(static_cast<float>(a), 3))))));
+
+MXNET_BINARY_MATH_OP_NC(
+    gelu_tanh_grad,
+    DType(static_cast<float>(b) *
+          (1.0f / static_cast<float>(a) +
+           (1.0f -
+            math::tanh(math::sqrt(2.0f / PI) *
+                       (static_cast<float>(a) + 0.044715 * math::pow(static_cast<float>(a), 3))) *
+                (math::sqrt(2.0f / PI) *
+                 (1.0f + 0.134145 * math::pow(static_cast<float>(a), 2)))))));

Review Comment:
   done
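
   As a reference for the formulas above, a standalone NumPy sketch (local helper names mirroring the `gelu_tanh`/`gelu_tanh_grad` macros, not MXNet API) that checks the analytic gradient against a central finite difference:

   ```python
   import numpy as np

   SQRT_2_OVER_PI = np.sqrt(2.0 / np.pi)

   def gelu_tanh(a):
       # 0.5 * a * (1 + tanh(sqrt(2/pi) * (a + 0.044715 * a^3)))
       return 0.5 * a * (1.0 + np.tanh(SQRT_2_OVER_PI * (a + 0.044715 * a**3)))

   def gelu_tanh_grad(a, b):
       # b is the forward output; mirrors the gelu_tanh_grad expression above,
       # where b/a = 0.5 * (1 + tanh(inner))
       inner = SQRT_2_OVER_PI * (a + 0.044715 * a**3)
       return b * (1.0 / a
                   + (1.0 - np.tanh(inner))
                   * (SQRT_2_OVER_PI * (1.0 + 0.134145 * a**2)))

   a = np.array([-2.0, -0.5, 0.3, 1.7])  # avoid 0 because of the b/a term
   eps = 1e-6
   numeric = (gelu_tanh(a + eps) - gelu_tanh(a - eps)) / (2 * eps)
   assert np.allclose(gelu_tanh_grad(a, gelu_tanh(a)), numeric, atol=1e-5)
   ```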



[GitHub] [incubator-mxnet] bartekkuncer commented on a diff in pull request #21034: [FEATURE] Add tanh approximation for GeLU activation

Posted by GitBox <gi...@apache.org>.
bartekkuncer commented on code in PR #21034:
URL: https://github.com/apache/incubator-mxnet/pull/21034#discussion_r902744238


##########
src/operator/leaky_relu.cc:
##########
@@ -167,6 +179,9 @@ The following modified ReLU Activation functions are supported:
 
 - *elu*: Exponential Linear Unit. `y = x > 0 ? x : slope * (exp(x)-1)`
 - *gelu*: Gaussian Error Linear Unit. `y = 0.5 * x * (1 + erf(x / sqrt(2)))`
+- *gelu_erf*: Same as gelu

Review Comment:
   ```suggestion
   - *gelu_erf*: Same as gelu.
   ```



##########
python/mxnet/gluon/nn/activations.py:
##########
@@ -200,18 +200,26 @@ class GELU(HybridBlock):
         "Gaussian Error Linear Units (GELUs)", Hendrycks et al, 2016
         https://arxiv.org/abs/1606.08415
 
+    Parameters
+    ----------
+    approximation : string
+        Which approximation of GELU calculation to use (erf or tanh) 
 
     Inputs:
         - **data**: input tensor with arbitrary shape.
 
     Outputs:
         - **out**: output tensor with the same shape as `data`.
     """
-    def __init__(self, **kwargs):
+    def __init__(self, approximation='erf', **kwargs):
+        if approximation not in ['erf', 'tanh']:
+            raise ValueError("Unsupported approximation! Support values are 'erf' and 'tanh', "

Review Comment:
   ```suggestion
               raise ValueError("Unsupported approximation! Supported values are 'erf' and 'tanh', "
   ```



##########
src/operator/leaky_relu.cc:
##########
@@ -157,6 +157,18 @@ static bool LRChangeLayout(nnvm::NodeAttrs* attrs,
   return false;
 }
 
+static void LeakyReLUParamParser(nnvm::NodeAttrs* attrs) {
+  // For backward compatible, replace gelu to gelu_erf

Review Comment:
   ```suggestion
  // For backward compatibility, replace gelu with gelu_erf
   ```



##########
tests/python/unittest/test_operator.py:
##########
@@ -634,6 +634,29 @@ def fselu_grad(grad, x, y):
 
 
 def test_gelu():
+    np_erf = np.vectorize(math.erf)
+    def fgelu(x):
+        return 0.5 * x * (1.0 + np_erf(x/np.sqrt(2)))
+
+    def fgelu_grad(grad, x, y):
+        return grad * (y / x + x / np.sqrt(2 * math.pi) * np.exp(-0.5*(x**2)))
+
+    shape = (3, 4)
+    x = mx.sym.Variable("x")
+    y = mx.sym.LeakyReLU(data=x, act_type="gelu")
+    for dtype in [np.float16, np.float32, np.float64]:
+        xa = np.random.uniform(low=-0.1,high=0.1,size=shape).astype(dtype)

Review Comment:
   ```suggestion
           xa = np.random.uniform(low=-0.1, high=0.1, size=shape).astype(dtype)
   ```



##########
tests/python/unittest/test_operator.py:
##########
@@ -634,6 +634,29 @@ def fselu_grad(grad, x, y):
 
 
 def test_gelu():
+    np_erf = np.vectorize(math.erf)
+    def fgelu(x):
+        return 0.5 * x * (1.0 + np_erf(x/np.sqrt(2)))
+
+    def fgelu_grad(grad, x, y):
+        return grad * (y / x + x / np.sqrt(2 * math.pi) * np.exp(-0.5*(x**2)))

Review Comment:
   Inconsistent whitespaces.
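
   For context on the helper under discussion: `y / x` recovers `0.5 * (1 + erf(x / sqrt(2)))`, and the second term is `x` times the derivative of the erf factor. A whitespace-consistent standalone sketch of the same helpers (plain NumPy, mirroring the diff above):

   ```python
   import math
   import numpy as np

   np_erf = np.vectorize(math.erf)

   def fgelu(x):
       return 0.5 * x * (1.0 + np_erf(x / np.sqrt(2)))

   def fgelu_grad(grad, x, y):
       # y / x == 0.5 * (1 + erf(x / sqrt(2))); the remaining term is
       # x * d/dx[0.5 * erf(x / sqrt(2))] = x / sqrt(2 * pi) * exp(-x**2 / 2)
       return grad * (y / x + x / np.sqrt(2 * math.pi) * np.exp(-0.5 * x**2))
   ```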



[GitHub] [incubator-mxnet] mxnet-bot commented on pull request #21034: [FEATURE] Add tanh approximation for GeLU activation

Posted by GitBox <gi...@apache.org>.
mxnet-bot commented on PR #21034:
URL: https://github.com/apache/incubator-mxnet/pull/21034#issuecomment-1132781032

   Hey @bgawrych, thanks for submitting the PR.
   All tests are already queued to run once. If tests fail, you can trigger one or more tests again with the following commands:
   - To trigger all jobs: @mxnet-bot run ci [all]
   - To trigger specific jobs: @mxnet-bot run ci [job1, job2]
   ***
   **CI supported jobs**: [clang, centos-gpu, website, edge, miscellaneous, windows-cpu, windows-gpu, sanity, centos-cpu, unix-gpu, unix-cpu]
   ***
   _Note_: Only the following 3 categories can trigger CI: PR Author, MXNet Committer, Jenkins Admin.
   All CI tests must pass before the PR can be merged.
   


[GitHub] [incubator-mxnet] bgawrych commented on pull request #21034: [FEATURE] Add tanh approximation for GeLU activation

Posted by GitBox <gi...@apache.org>.
bgawrych commented on PR #21034:
URL: https://github.com/apache/incubator-mxnet/pull/21034#issuecomment-1161333706

   @mxnet-bot run ci [windows-gpu, unix-cpu]


[GitHub] [incubator-mxnet] bartekkuncer commented on a diff in pull request #21034: [FEATURE] Add tanh approximation for GeLU activation

Posted by GitBox <gi...@apache.org>.
bartekkuncer commented on code in PR #21034:
URL: https://github.com/apache/incubator-mxnet/pull/21034#discussion_r880208764


##########
src/api/operator/numpy_extension/npx_leaky_relu_op.cc:
##########
@@ -41,8 +41,10 @@ inline int String2ActType(const std::string& s) {
     return leakyrelu::kELU;
   } else if (s == "selu") {
     return leakyrelu::kSELU;
-  } else if (s == "gelu") {
-    return leakyrelu::kGELU;
+  } else if (s == "gelu" || s == "gelu_erf") {

Review Comment:
   Why did you leave the old name here?



##########
src/operator/leaky_relu.cc:
##########
@@ -166,7 +166,10 @@ when the input is negative and has a slope of one when input is positive.
 The following modified ReLU Activation functions are supported:
 
 - *elu*: Exponential Linear Unit. `y = x > 0 ? x : slope * (exp(x)-1)`
-- *gelu*: Gaussian Error Linear Unit. `y = 0.5 * x * (1 + erf(x / sqrt(2)))`
+- *gelu*: Same as gelu_erf
+- *gelu_erf*: Gaussian Error Linear Unit. `y = 0.5 * x * (1 + erf(x / sqrt(2)))`

Review Comment:
   Would it not be better to have the full explanation first and then "Same as XXX"?



[GitHub] [incubator-mxnet] bgawrych commented on pull request #21034: [FEATURE] Add tanh approximation for GeLU activation

Posted by GitBox <gi...@apache.org>.
bgawrych commented on PR #21034:
URL: https://github.com/apache/incubator-mxnet/pull/21034#issuecomment-1147281813

   @mxnet-bot run ci [unix-gpu, unix-cpu, edge, clang]


[GitHub] [incubator-mxnet] bgawrych commented on pull request #21034: [FEATURE] Add tanh approximation for GeLU activation

Posted by GitBox <gi...@apache.org>.
bgawrych commented on PR #21034:
URL: https://github.com/apache/incubator-mxnet/pull/21034#issuecomment-1164334524

   @mxnet-bot run ci [clang]


[GitHub] [incubator-mxnet] mxnet-bot commented on pull request #21034: [FEATURE] Add tanh approximation for GeLU activation

Posted by GitBox <gi...@apache.org>.
mxnet-bot commented on PR #21034:
URL: https://github.com/apache/incubator-mxnet/pull/21034#issuecomment-1156035674

   Jenkins CI successfully triggered : [edge, unix-gpu, clang, unix-cpu]


[GitHub] [incubator-mxnet] mxnet-bot commented on pull request #21034: [FEATURE] Add tanh approximation for GeLU activation

Posted by GitBox <gi...@apache.org>.
mxnet-bot commented on PR #21034:
URL: https://github.com/apache/incubator-mxnet/pull/21034#issuecomment-1164302425

   Jenkins CI successfully triggered : [clang, unix-cpu, miscellaneous]


[GitHub] [incubator-mxnet] bgawrych commented on pull request #21034: [FEATURE] Add tanh approximation for GeLU activation

Posted by GitBox <gi...@apache.org>.
bgawrych commented on PR #21034:
URL: https://github.com/apache/incubator-mxnet/pull/21034#issuecomment-1162676126

   @mxnet-bot run ci [centos-cpu]


[GitHub] [incubator-mxnet] RafLit commented on a diff in pull request #21034: [FEATURE] Add tanh approximation for GeLU activation

Posted by GitBox <gi...@apache.org>.
RafLit commented on code in PR #21034:
URL: https://github.com/apache/incubator-mxnet/pull/21034#discussion_r903408675


##########
src/operator/operator_tune.cc:
##########
@@ -277,6 +277,7 @@ IMPLEMENT_UNARY_WORKLOAD_BWD(mxnet::op::mshadow_op::relu_grad);
 IMPLEMENT_UNARY_WORKLOAD_FWD(mxnet::op::mshadow_op::selu);                         // NOLINT()
 IMPLEMENT_UNARY_WORKLOAD_BWD(mxnet::op::mshadow_op::selu_grad);                    // NOLINT()
 IMPLEMENT_UNARY_WORKLOAD_FWD(mxnet::op::mshadow_op::gelu);                         // NOLINT()

Review Comment:
   Maybe change the name to gelu_erf to keep it consistent?



##########
src/operator/mshadow_op.h:
##########
@@ -617,9 +617,25 @@ MXNET_UNARY_MATH_OP(gelu,
                           (1.0f + math::erf(static_cast<float>(a) / SQRT_2))));
 
 MXNET_BINARY_MATH_OP_NC(gelu_grad,
-                        DType(0.5f * (1.0f + math::erf(static_cast<float>(a) / SQRT_2) +
-                                      static_cast<float>(a) *
-                                          erf_grad::Map(static_cast<float>(a) / SQRT_2) / SQRT_2)));
+                        DType(static_cast<float>(b) / static_cast<float>(a) +
+                              0.5f * static_cast<float>(a) *
+                                  erf_grad::Map(static_cast<float>(a) / SQRT_2) / SQRT_2));
+
+MXNET_UNARY_MATH_OP(gelu_tanh,
+                    DType(0.5f * static_cast<float>(a) *
+                          (1.0f + math::tanh(math::sqrt(2.0f / PI) *
+                                             (static_cast<float>(a) +
+                                              0.044715 * math::pow(static_cast<float>(a), 3))))));
+
+MXNET_BINARY_MATH_OP_NC(
+    gelu_tanh_grad,
+    DType(static_cast<float>(b) *
+          (1.0f / static_cast<float>(a) +
+           (1.0f -
+            math::tanh(math::sqrt(2.0f / PI) *
+                       (static_cast<float>(a) + 0.044715 * math::pow(static_cast<float>(a), 3))) *
+                (math::sqrt(2.0f / PI) *
+                 (1.0f + 0.134145 * math::pow(static_cast<float>(a), 2)))))));

Review Comment:
   It would be cleaner to define 0.044715 and 0.134145 as constants.
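
   For what it's worth, the two literals are linked: differentiating the cubic term gives d/dx(0.044715 * x^3) = 3 * 0.044715 * x^2 = 0.134145 * x^2, so only one constant is really independent. A tiny sketch of that relation (the constant name is illustrative, not from the PR):

   ```python
   GELU_TANH_CUBIC_COEF = 0.044715  # hypothetical name, for illustration only
   # 0.134145 in the gradient is just 3x the cubic coefficient from the forward pass.
   assert abs(3 * GELU_TANH_CUBIC_COEF - 0.134145) < 1e-12
   ```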



[GitHub] [incubator-mxnet] bgawrych commented on pull request #21034: [FEATURE] Add tanh approximation for GeLU activation

Posted by GitBox <gi...@apache.org>.
bgawrych commented on PR #21034:
URL: https://github.com/apache/incubator-mxnet/pull/21034#issuecomment-1156035617

   @mxnet-bot run ci [unix-gpu, unix-cpu, edge, clang]


[GitHub] [incubator-mxnet] mxnet-bot commented on pull request #21034: [FEATURE] Add tanh approximation for GeLU activation

Posted by GitBox <gi...@apache.org>.
mxnet-bot commented on PR #21034:
URL: https://github.com/apache/incubator-mxnet/pull/21034#issuecomment-1160493696

   Jenkins CI successfully triggered : [unix-cpu, centos-cpu]


[GitHub] [incubator-mxnet] bgawrych commented on pull request #21034: [FEATURE] Add tanh approximation for GeLU activation

Posted by GitBox <gi...@apache.org>.
bgawrych commented on PR #21034:
URL: https://github.com/apache/incubator-mxnet/pull/21034#issuecomment-1164302365

   @mxnet-bot run ci [clang, miscellaneous, unix-cpu]


[GitHub] [incubator-mxnet] mxnet-bot commented on pull request #21034: [FEATURE] Add tanh approximation for GeLU activation

Posted by GitBox <gi...@apache.org>.
mxnet-bot commented on PR #21034:
URL: https://github.com/apache/incubator-mxnet/pull/21034#issuecomment-1162676204

   Jenkins CI successfully triggered : [centos-cpu]


[GitHub] [incubator-mxnet] bgawrych merged pull request #21034: [FEATURE] Add tanh approximation for GeLU activation

Posted by GitBox <gi...@apache.org>.
bgawrych merged PR #21034:
URL: https://github.com/apache/incubator-mxnet/pull/21034

