Posted to commits@mxnet.apache.org by zh...@apache.org on 2018/09/17 18:14:40 UTC

[incubator-mxnet] branch master updated: Updating news, readme files and bumping master version to 1.3.1 (#12525)

This is an automated email from the ASF dual-hosted git repository.

zhasheng pushed a commit to branch master
in repository https://gitbox.apache.org/repos/asf/incubator-mxnet.git


The following commit(s) were added to refs/heads/master by this push:
     new 9032e93  Updating news, readme files and bumping master version to 1.3.1 (#12525)
9032e93 is described below

commit 9032e9326d4e46e9fe202a37296daf8323abaaeb
Author: Roshani Nagmote <ro...@gmail.com>
AuthorDate: Mon Sep 17 11:14:11 2018 -0700

    Updating news, readme files and bumping master version to 1.3.1 (#12525)
    
    * news, readme update on master
    
    * Bumping version on master
    
    * clojure files version updated
---
 NEWS.md                                            | 191 +++++++++++++++++++++
 R-package/DESCRIPTION                              |   2 +-
 README.md                                          |   1 +
 contrib/clojure-package/README.md                  |   8 +-
 .../examples/cnn-text-classification/project.clj   |   2 +-
 contrib/clojure-package/examples/gan/project.clj   |   2 +-
 .../examples/imclassification/project.clj          |   2 +-
 .../clojure-package/examples/module/project.clj    |   2 +-
 .../examples/multi-label/project.clj               |   2 +-
 .../examples/neural-style/project.clj              |   2 +-
 .../examples/pre-trained-models/project.clj        |   2 +-
 .../clojure-package/examples/profiler/project.clj  |   2 +-
 contrib/clojure-package/examples/rnn/project.clj   |   2 +-
 .../clojure-package/examples/tutorial/project.clj  |   2 +-
 .../examples/visualization/project.clj             |   2 +-
 contrib/clojure-package/project.clj                |   4 +-
 include/mxnet/base.h                               |   2 +-
 python/mxnet/libinfo.py                            |   2 +-
 scala-package/assembly/linux-x86_64-cpu/pom.xml    |   8 +-
 scala-package/assembly/linux-x86_64-gpu/pom.xml    |   8 +-
 scala-package/assembly/osx-x86_64-cpu/pom.xml      |   8 +-
 scala-package/assembly/pom.xml                     |   2 +-
 scala-package/core/pom.xml                         |   6 +-
 scala-package/examples/pom.xml                     |   6 +-
 scala-package/infer/pom.xml                        |   4 +-
 scala-package/init-native/linux-x86_64/pom.xml     |   4 +-
 scala-package/init-native/osx-x86_64/pom.xml       |   4 +-
 scala-package/init-native/pom.xml                  |   2 +-
 scala-package/init/pom.xml                         |   2 +-
 scala-package/macros/pom.xml                       |   6 +-
 scala-package/native/linux-x86_64-cpu/pom.xml      |   4 +-
 scala-package/native/linux-x86_64-gpu/pom.xml      |   4 +-
 scala-package/native/osx-x86_64-cpu/pom.xml        |   4 +-
 scala-package/native/pom.xml                       |   2 +-
 scala-package/pom.xml                              |   2 +-
 scala-package/spark/pom.xml                        |   4 +-
 snapcraft.yaml                                     |   2 +-
 37 files changed, 253 insertions(+), 61 deletions(-)

diff --git a/NEWS.md b/NEWS.md
index 461bb6d..4dcf8bf 100644
--- a/NEWS.md
+++ b/NEWS.md
@@ -1,5 +1,196 @@
 MXNet Change Log
 ================
+## 1.3.0
+
+### New Features - Gluon RNN layers are now HybridBlocks
+- In this release, Gluon RNN layers such as `gluon.rnn.RNN`, `gluon.rnn.LSTM`, and `gluon.rnn.GRU` become `HybridBlock`s as part of the [gluon.rnn improvements project](https://github.com/apache/incubator-mxnet/projects/11) (#11482).
+- This is the result of newly available fused RNN operators added for CPU: LSTM ([#10104](https://github.com/apache/incubator-mxnet/pull/10104)), vanilla RNN ([#11399](https://github.com/apache/incubator-mxnet/pull/11399)), GRU ([#10311](https://github.com/apache/incubator-mxnet/pull/10311)).
+- Many dynamic networks based on Gluon RNN layers can now be completely hybridized, exported, and used with the inference APIs of other language bindings such as R and Scala.
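+
+A minimal sketch (the layer sizes and shapes below are illustrative) of hybridizing and exporting a Gluon LSTM:
+
+```python
+from mxnet import gluon, nd
+
+lstm = gluon.rnn.LSTM(hidden_size=64, num_layers=1)   # now a HybridBlock
+lstm.initialize()
+lstm.hybridize()
+out = lstm(nd.random.uniform(shape=(10, 32, 16)))     # (seq_len, batch, input_size), default 'TNC' layout
+lstm.export("lstm_model")                             # writes lstm_model-symbol.json and lstm_model-0000.params
+```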
+
+### MKL-DNN improvements
+- Introduced additional MKL-DNN functionality:
+  - Added support for more activation functions: "sigmoid", "tanh", and "softrelu". ([#10336](https://github.com/apache/incubator-mxnet/pull/10336))
+  - Added debugging functionality: result check ([#12069](https://github.com/apache/incubator-mxnet/pull/12069)) and backend switch ([#12058](https://github.com/apache/incubator-mxnet/pull/12058)).
+
+### New Features - Gluon Model Zoo Pre-trained Models
+- Gluon Vision Model Zoo now provides MobileNetV2 pre-trained models (#10879) in addition to
+  AlexNet, DenseNet, Inception V3, MobileNetV1, ResNet V1 and V2, SqueezeNet 1.0 and 1.1, and VGG
+  pretrained models.
+- Updated pre-trained models provide state-of-the-art performance on all resnetv1, resnetv2, vgg16, vgg19, vgg16_bn, and vgg19_bn models (#11327, #11860, #11830).
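+
+A short sketch (the model choice below is illustrative) of loading one of the pre-trained models from the Gluon model zoo:
+
+```python
+from mxnet.gluon.model_zoo import vision
+
+net = vision.mobilenet_v2_1_0(pretrained=True)   # weights are downloaded on first use
+print(net)
+```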
+
+### New Features - Clojure package (experimental)
+- MXNet now supports the Clojure programming language. The MXNet Clojure package brings flexible and efficient GPU computing and state-of-the-art deep learning to Clojure. It enables you to write seamless tensor/matrix computations with multiple GPUs in Clojure, and to construct and customize state-of-the-art deep learning models and apply them to tasks such as image classification and data science challenges. ([#11205](https://github.com/apache/incubator-mxnet/pull/11205))
+- Check out the examples and API documentation [here](http://mxnet.incubator.apache.org/api/clojure/index.html).
+
+### New Features - Synchronized Cross-GPU Batch Norm (experimental)
+- Gluon now supports Synchronized Batch Normalization (#11502).
+- This enables stable training on large-scale networks with high memory consumption such as FCN for image segmentation.
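+
+A minimal sketch (the device count below is an illustrative assumption) of using the synchronized layer in place of `gluon.nn.BatchNorm`:
+
+```python
+from mxnet import gluon
+from mxnet.gluon.contrib.nn import SyncBatchNorm
+
+net = gluon.nn.HybridSequential()
+net.add(gluon.nn.Conv2D(64, kernel_size=3, padding=1))
+net.add(SyncBatchNorm(num_devices=4))   # statistics are synchronized across 4 GPUs
+net.add(gluon.nn.Activation('relu'))
+```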
+
+### New Features - Sparse Tensor Support for Gluon (experimental)
+- Sparse gradient support is added to `gluon.nn.Embedding`. Set `sparse_grad=True` to enable when constructing the Embedding block. ([#10924](https://github.com/apache/incubator-mxnet/pull/10924))
+- Gluon Parameter now supports the "row_sparse" storage type, which reduces communication cost and memory consumption for multi-GPU training of large models. `gluon.contrib.nn.SparseEmbedding` is an example empowered by this. ([#11001](https://github.com/apache/incubator-mxnet/pull/11001), [#11429](https://github.com/apache/incubator-mxnet/pull/11429))
+- Gluon HybridBlock now supports hybridization with sparse operators ([#11306](https://github.com/apache/incubator-mxnet/pull/11306)).
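+
+A brief sketch (vocabulary and embedding sizes below are illustrative) of enabling sparse gradients on a Gluon Embedding:
+
+```python
+import mxnet as mx
+from mxnet import gluon, nd
+
+embedding = gluon.nn.Embedding(input_dim=10000, output_dim=128, sparse_grad=True)
+embedding.initialize()
+with mx.autograd.record():
+    out = embedding(nd.array([1, 4, 7]))
+out.backward()
+print(embedding.weight.grad().stype)   # 'row_sparse': only touched rows carry gradient
+```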
+
+### New Features - Control flow operators (experimental)
+- This is the first step towards optimizing dynamic neural networks with variable computation graphs, by adding symbolic and imperative control flow operators. [Proposal](https://cwiki.apache.org/confluence/display/MXNET/Optimize+dynamic+neural+network+models+with+control+flow+operators).
+- New operators introduced: foreach([#11531](https://github.com/apache/incubator-mxnet/pull/11531)), while_loop([#11566](https://github.com/apache/incubator-mxnet/pull/11566)), cond([#11760](https://github.com/apache/incubator-mxnet/pull/11760)).
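+
+A small sketch (the data values below are illustrative) of the imperative `foreach` operator computing a running sum over the first axis:
+
+```python
+from mxnet import nd
+
+def step(data, states):
+    acc = states[0] + data
+    return acc, [acc]          # (per-step output, new states)
+
+data = nd.arange(6).reshape((3, 2))
+outputs, final_states = nd.contrib.foreach(step, data, [nd.zeros((2,))])
+print(outputs)                 # row i holds the sum of rows 0..i of data
+```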
+
+### New Features - Scala API Improvements (experimental)
+- Improvements to MXNet Scala API usability([#10660](https://github.com/apache/incubator-mxnet/pull/10660), [#10787](https://github.com/apache/incubator-mxnet/pull/10787), [#10991](https://github.com/apache/incubator-mxnet/pull/10991))
+- Symbol.api and NDArray.api bring a new set of functions that have complete definitions for all arguments.
+- Please see this [Type safe API design document](https://cwiki.apache.org/confluence/display/MXNET/Scala+Type-safe+API+Design+Doc) for more details.
+
+### New Features - Rounding GPU Memory Pool for dynamic networks with variable-length inputs and outputs (experimental)
+- MXNet now supports a new memory pool type for GPU memory (#11041).
+- Unlike the default memory pool, which requires an exact size match to reuse released memory chunks, this new pool uses exponential-linear rounding so that similarly sized memory chunks can be reused. This is better suited to workloads with dynamic-shape inputs and outputs. Set the environment variable `MXNET_GPU_MEM_POOL_TYPE=Round` to enable, as sketched below.
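+
+For example (a sketch; the variable must be set before MXNet allocates GPU memory):
+
+```python
+import os
+os.environ["MXNET_GPU_MEM_POOL_TYPE"] = "Round"   # default pool type is "Naive"
+
+import mxnet as mx
+# subsequent GPU allocations can reuse released chunks of similar (rounded) sizes
+a = mx.nd.zeros((1024, 1024), ctx=mx.gpu(0))
+```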
+
+### New Features - Topology-aware AllReduce (experimental)
+- This feature uses trees to perform the Reduce and Broadcast operations. It applies the idea of minimum spanning trees to build a binary-tree Reduce communication pattern. This topology-aware approach addresses the single-machine communication limitations of methods such as parameter server and NCCL ring reduction. It is an experimental feature ([#11591](https://github.com/apache/incubator-mxnet/pull/11591)).
+- Paper followed for implementation: [Optimal message scheduling for aggregation](https://www.sysml.cc/doc/178.pdf).
+- Set environment variable `MXNET_KVSTORE_USETREE=1` to enable.
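+
+For example (a sketch of enabling it for single-machine multi-GPU training):
+
+```python
+import os
+os.environ["MXNET_KVSTORE_USETREE"] = "1"
+
+import mxnet as mx
+kv = mx.kv.create("device")   # single-machine multi-GPU kvstore; reductions now use the tree schedule
+```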
+
+### New Features - Export MXNet models to ONNX format (experimental)
+- With this feature, MXNet models can now be exported to the ONNX format ([#11213](https://github.com/apache/incubator-mxnet/pull/11213)). Currently, MXNet supports ONNX v1.2.1. [API documentation](http://mxnet.incubator.apache.org/api/python/contrib/onnx.html).
+- Check out this [tutorial](http://mxnet.incubator.apache.org/tutorials/onnx/export_mxnet_to_onnx.html), which shows how to use the MXNet-to-ONNX exporter APIs to save models as ONNX protobuf files so they can be imported into other frameworks for inference.
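+
+A minimal sketch of the exporter API (the file names and input shape below are illustrative placeholders):
+
+```python
+import numpy as np
+from mxnet.contrib import onnx as onnx_mxnet
+
+onnx_path = onnx_mxnet.export_model(
+    "resnet-18-symbol.json",     # saved symbol file (placeholder name)
+    "resnet-18-0000.params",     # saved params file (placeholder name)
+    [(1, 3, 224, 224)],          # input shape(s)
+    np.float32,                  # input type
+    "resnet-18.onnx",            # output ONNX file
+)
+```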
+
+### New Features - TensorRT Runtime Integration (experimental)
+- [TensorRT](https://developer.nvidia.com/tensorrt) provides significant acceleration of model inference on NVIDIA GPUs compared to running the full graph in MXNet using unfused GPU operators. In addition to faster fp32 inference, TensorRT optimizes fp16 inference and is capable of int8 inference (provided the quantization steps are performed). Besides increasing throughput, TensorRT significantly reduces inference latency, especially for small batches.
+- This release introduces runtime integration of TensorRT into MXNet in order to accelerate inference ([#11325](https://github.com/apache/incubator-mxnet/pull/11325)).
+- Currently, it is available in the contrib package.
+
+### New Examples - Scala
+- Refurbished Scala examples with improved APIs, documentation, and CI test coverage. ([#11753](https://github.com/apache/incubator-mxnet/pull/11753), [#11621](https://github.com/apache/incubator-mxnet/pull/11621))
+- All Scala examples now have:
+  - No blocking bugs
+  - A good README to get started with
+  - Type-safe API usage
+  - CI monitoring on every PR run
+
+### Maintenance - Flaky Tests improvement effort
+- Fixed 130 flaky tests on CI. Progress of the project is tracked [here](https://github.com/apache/incubator-mxnet/projects/9).
+- Added a flakiness checker (#11572)
+
+### Maintenance - MXNet Model Backwards Compatibility Checker
+- This tool ([#11626](https://github.com/apache/incubator-mxnet/pull/11626)) helps in ensuring consistency and sanity while performing inference on the latest version of MXNet using models trained on older versions of MXNet.
+- This tool helps detect backwards-compatibility-breaking issues earlier in the development cycle and contributes towards healthy and stable MXNet releases.
+
+### Maintenance - Integrated testing for "the Straight Dope"
+- ["Deep Learning - The Straight Dope"](http://gluon.mxnet.io) is a deep learning book based on Apache MXNet Gluon that are contributed by many Gluon users.
+- Now the testing of this book is integrated in the nightly tests.
+
+### Bug-fixes
+- Fix gperftools/jemalloc and lapack warning bug. (#11110)
+- Fix mkldnn performance regression + improve test logging (#11262)
+- Fix row_sparse_param.save() (#11266)
+- Fix trainer init_kvstore (#11266)
+- Fix axis Bug in MKLDNN Softmax (#11335)
+- Fix 'AttributeError: '_thread._local' object has no attribute 'value'' on distributed processing applications (#11332)
+- Fix recordfile dataset with multi worker (#11370)
+- Manually check node existence in CachedOp (#11545)
+- Javadoc fix (#11239)
+- Fix bugs in MKLDNN operators to handle the kAddTo request (#11129)
+- Fix InferStorage for sparse fallback in FullyConnected (#11498)
+- Fix batchnorm problem with sparse matrices when fix_gamma=True (#11656)
+- Fix rnn layer save (#11776)
+- Fix BucketSentenceIter bug related to #11430 (#11580)
+- Fix for _backward_softsign activation (#11827)
+- Fix a bug in CachedOp. (#11675)
+- Fix quantization divide by zero errors (#11833)
+- Refactor R optimizers to fix memory leak (#11374)
+- Avoid use of troublesome cudnnFind() results when grad_req='add' (#11338)
+- Fix shared memory with gluon dataloader, add option pin_memory (#11908)
+- Fix quantized graph pass bug (#11937)
+- Fix MXPredReshape in the c_predict_api (#11493)
+- Fix the topk regression issue (#12197)
+- Fix image-classification example and add missing optimizers w/ momentum support (#11826)
+
+### Performance Improvements
+- Added static allocation and static shape support for Gluon HybridBlock (#11320)
+- Fix RecordIO augmentation speed (#11474)
+- Improve sparse pull performance for gluon trainer (#11429)
+- CTC operator performance improvement from HawkAaron/MXNet-CTC (#11834)
+- Improve performance of broadcast ops backward pass (#11252)
+- Improved numerical stability as a result of using stable L2 norm (#11573)
+- Accelerated topk performance on both GPU and CPU (#12085, #10997; this changes the behavior of topk when NaN values occur in the input)
+- Support for dot(dns, csr) = dns and dot(dns, csr.T) = dns on CPU ([#11113](https://github.com/apache/incubator-mxnet/pull/11113))
+- Performance improvement for Batch Dot on CPU from mshadow ([mshadow PR#342](https://github.com/dmlc/mshadow/pull/342))
+
+### API Changes
+- Allow Scala users to specify data/label names for NDArrayIter (#11256)
+- Allow user to define unknown token symbol to rnn encode_sentences() (#10461)
+- Added count_include_pad argument for Avg Pooling (#11021)
+- Add standard ResNet data augmentation for ImageRecordIter (#11027)
+- Add seed_aug parameter for ImageRecordIter to fix random seed for default augmentation (#11247)
+- Add support for accepting MXNet NDArrays in ColorNormalizeAug (#11606)
+- Enhancement of take operator (#11326)
+- Add temperature parameter in Softmax operator (#11466)
+- Add support for 1D inputs in leaky relu (#11850)
+- Add verify_ssl option to gluon.utils.download (#11546)
+
+### Other features
+- Added ccache reporting to CI (#11322)
+- Restructure dockcross dockerfiles to fix caching (#11302)
+- Added tests for MKLDNN backward operators  (#11232)
+- Add elemwise_add/sub between rsp and rsp on GPU (#11179)
+- Add clip_global_norm(row_sparse_grad) (#11266)
+- Add subgraph storage type inference to CachedOp  (#11306)
+- Enable support for dense weight and sparse grad Adagrad updates (#11355)
+- Added Histogram Operator (#10931)
+- Added Matthew's Correlation Coefficient to metrics (#10524)
+- Added support for add_n(dense, csr, dense) = dense on CPU & GPU (#11330)
+- Added support for add_n(any combination longer than 4 with at least one dense storage) = dense on CPU & GPU (#11330)
+- L1 Normalization (#11229)
+- Add support for int64 data type in CSVIter (#11446)
+- Add test for new int64 type in CSVIter (#11499)
+- Add sample ratio for ROI Align (#11145)
+- Shape and Size Operator (#10889)
+- Add HybridSequentialRNNCell, which can be nested in HybridBlock (#11003)
+- Support for many unary functions on CSR matrices (#11559)
+- Added NDArrayCollector to dispose intermediate allocated NDArrays automatically (#11751)
+- Added the diag() operator (#11643)
+- Added broadcast_like operator (#11820)
+- Allow Partial shape infer for Slice (#11406)
+- Added support to profile kvstore server during distributed training  (#11215)
+- Add function for GPU Memory Query to C API (#12083)
+- Generalized reshape_like operator to be more flexible (#11928)
+- Add support for selu activation function (#12059)
+- Add support for accepting NDArray as input to Module predict API (#12166)
+- Add DataDesc type for the Scala Package (#11844)
+
+### Usability Improvements
+- Added NDArray auto-collector for Scala (#11751, #12232)
+- Added docs for mx.initializer.Constant (#10637)
+- Added build-from-source instructions for Windows (#11276)
+- Added a tutorial explaining how to use the profiler (#11274)
+- Added two tutorials on Learning Rate Schedules (#11296)
+- Added a tutorial for mixed precision training with float16 (#10391)
+- Create CPP test for concat MKLDNN operator (#11371)
+- Update large word language model example (#11405)
+- MNIST Examples for Scala new API (#11250)
+- Updated installation info to have latest packages and more clarity (#11503)
+- GAN MNIST Examples for Scala new API (#11547)
+- Added Learning Rate Finder tutorial (#11304)
+- Fix Installation instructions for R bindings on Linux systems. (#11590)
+- Integration Test for Scala (#11596)
+- Documentation enhancement for optimizers (#11657)
+- Update rcnn example (#11373)
+- Gluon ModelZoo, Gluon examples for Perl APIs (#11642)
+- Fix R installation in CI (#11761,#11755, #11768, #11805, #11954, #11976)
+- CNN Examples for Scala new API (#11292)
+- Custom Operator Example for Scala (#11401)
+- Added detailed doc about global pool layers in Gluon (#11832)
+- Updated MultiTask example to use new infer api (#11605)
+- Added logistic regression tutorial (#11651)
+- Added Support for integer type in ImageIter (#11864)
+- Added depth_to_space and space_to_depth operators (#11587)
+- Increased operator support for ONNX to MXNet importer (#11856)
+- Add linux and macos MKLDNN Building Instruction (#11049)
+- Add download utility for Scala APIs (#11866)
+- Improving documentation and error messages for Async distributed training with Gluon (#11910)
+- Added NeuralStyle Example for Scala (#11621)
+
+For more information and examples, see [full release notes](https://cwiki.apache.org/confluence/display/MXNET/Apache+MXNet+%28incubating%29+1.3.0+Release+Notes)
+
 ## 1.2.0
 ### New Features - Added Scala Inference APIs
 - Implemented new [Scala Inference APIs](https://cwiki.apache.org/confluence/display/MXNET/MXNetScalaInferenceAPI) which offer an easy-to-use, Scala Idiomatic and thread-safe high level APIs for performing predictions with deep learning models trained with MXNet (#9678). Implemented a new ImageClassifier class which provides APIs for classification tasks on a Java BufferedImage using a pre-trained model you provide (#10054). Implemented a new ObjectDetector class which provides APIs for  [...]
diff --git a/R-package/DESCRIPTION b/R-package/DESCRIPTION
index bd6a84d..0990c63 100644
--- a/R-package/DESCRIPTION
+++ b/R-package/DESCRIPTION
@@ -1,7 +1,7 @@
 Package: mxnet
 Type: Package
 Title: MXNet: A Flexible and Efficient Machine Learning Library for Heterogeneous Distributed Systems
-Version: 1.3.0
+Version: 1.3.1
 Date: 2017-06-27
 Author: Tianqi Chen, Qiang Kou, Tong He
 Maintainer: Qiang Kou <qk...@qkou.info>
diff --git a/README.md b/README.md
index 3d570ee..23b9d32 100644
--- a/README.md
+++ b/README.md
@@ -33,6 +33,7 @@ How to Contribute
 
 What's New
 ----------
+* [Version 1.3.0 Release](https://github.com/apache/incubator-mxnet/releases/tag/1.3.0) - MXNet 1.3.0 Release.
 * [Version 1.2.0 Release](https://github.com/apache/incubator-mxnet/releases/tag/1.2.0) - MXNet 1.2.0 Release.
 * [Version 1.1.0 Release](https://github.com/apache/incubator-mxnet/releases/tag/1.1.0) - MXNet 1.1.0 Release.
 * [Version 1.0.0 Release](https://github.com/apache/incubator-mxnet/releases/tag/1.0.0) - MXNet 1.0.0 Release.
diff --git a/contrib/clojure-package/README.md b/contrib/clojure-package/README.md
index ea678cc..66ba77b 100644
--- a/contrib/clojure-package/README.md
+++ b/contrib/clojure-package/README.md
@@ -42,11 +42,11 @@ and _Install MXNet dependencies_
 
 To use the prebuilt jars (easiest), you will need to replace the native version of the line in the project dependencies with your configuration.
 
-`[org.apache.mxnet/mxnet-full_2.11-linux-x86_64-gpu "1.2.1"]`
+`[org.apache.mxnet/mxnet-full_2.11-linux-x86_64-gpu "1.3.0"]`
 or
-`[org.apache.mxnet/mxnet-full_2.11-linux-x86_64-cpu "1.2.1"]`
+`[org.apache.mxnet/mxnet-full_2.11-linux-x86_64-cpu "1.3.0"]`
 or
-`[org.apache.mxnet/mxnet-full_2.11-osx-x86_64-cpu "1.2.1"]`
+`[org.apache.mxnet/mxnet-full_2.11-osx-x86_64-cpu "1.3.0"]`
 
 If you are using the prebuilt jars they may have slightly different dependencies than building from source:
 
@@ -116,7 +116,7 @@ Checkout the latest SHA from the main package:
 
 If you need to checkout a particular release you can do it with:
 
-`git checkout tags/1.2.1 -b release-1.2.1`
+`git checkout tags/1.3.0 -b release-1.3.0`
 
 `git submodule update --init --recursive`
 
diff --git a/contrib/clojure-package/examples/cnn-text-classification/project.clj b/contrib/clojure-package/examples/cnn-text-classification/project.clj
index c592dcd..f441449 100644
--- a/contrib/clojure-package/examples/cnn-text-classification/project.clj
+++ b/contrib/clojure-package/examples/cnn-text-classification/project.clj
@@ -19,6 +19,6 @@
   :description "CNN text classification with MXNet"
   :plugins [[lein-cljfmt "0.5.7"]]
   :dependencies [[org.clojure/clojure "1.9.0"]
-                 [org.apache.mxnet.contrib.clojure/clojure-mxnet "1.3.0-SNAPSHOT"]]
+                 [org.apache.mxnet.contrib.clojure/clojure-mxnet "1.3.1-SNAPSHOT"]]
   :pedantic? :skip
   :main cnn-text-classification.classifier)
diff --git a/contrib/clojure-package/examples/gan/project.clj b/contrib/clojure-package/examples/gan/project.clj
index bebbc20..06f6232 100644
--- a/contrib/clojure-package/examples/gan/project.clj
+++ b/contrib/clojure-package/examples/gan/project.clj
@@ -19,6 +19,6 @@
   :description "GAN MNIST with MXNet"
   :plugins [[lein-cljfmt "0.5.7"]]
   :dependencies [[org.clojure/clojure "1.9.0"]
-                 [org.apache.mxnet.contrib.clojure/clojure-mxnet "1.3.0-SNAPSHOT"]
+                 [org.apache.mxnet.contrib.clojure/clojure-mxnet "1.3.1-SNAPSHOT"]
                  [nu.pattern/opencv "2.4.9-7"]]
   :main gan.gan-mnist)
diff --git a/contrib/clojure-package/examples/imclassification/project.clj b/contrib/clojure-package/examples/imclassification/project.clj
index e4c34e7..ad0a28a 100644
--- a/contrib/clojure-package/examples/imclassification/project.clj
+++ b/contrib/clojure-package/examples/imclassification/project.clj
@@ -19,6 +19,6 @@
   :description "Clojure examples for image classification"
   :plugins [[lein-cljfmt "0.5.7"]]
   :dependencies [[org.clojure/clojure "1.9.0"]
-                 [org.apache.mxnet.contrib.clojure/clojure-mxnet "1.3.0-SNAPSHOT"]]
+                 [org.apache.mxnet.contrib.clojure/clojure-mxnet "1.3.1-SNAPSHOT"]]
   :pedantic? :skip
   :main imclassification.train-mnist)
diff --git a/contrib/clojure-package/examples/module/project.clj b/contrib/clojure-package/examples/module/project.clj
index 7b32f32..487ceec 100644
--- a/contrib/clojure-package/examples/module/project.clj
+++ b/contrib/clojure-package/examples/module/project.clj
@@ -19,7 +19,7 @@
   :description "Clojure examples for module"
   :plugins [[lein-cljfmt "0.5.7"]]
   :dependencies [[org.clojure/clojure "1.9.0"]
-                 [org.apache.mxnet.contrib.clojure/clojure-mxnet "1.3.0-SNAPSHOT"]]
+                 [org.apache.mxnet.contrib.clojure/clojure-mxnet "1.3.1-SNAPSHOT"]]
   :pedantic? :skip
   :main mnist-mlp)
 
diff --git a/contrib/clojure-package/examples/multi-label/project.clj b/contrib/clojure-package/examples/multi-label/project.clj
index 88e3cff..a41c7fd 100644
--- a/contrib/clojure-package/examples/multi-label/project.clj
+++ b/contrib/clojure-package/examples/multi-label/project.clj
@@ -19,5 +19,5 @@
   :description "Example of multi-label classification"
   :plugins [[lein-cljfmt "0.5.7"]]
   :dependencies [[org.clojure/clojure "1.9.0"]
-                 [org.apache.mxnet.contrib.clojure/clojure-mxnet "1.3.0-SNAPSHOT"]]
+                 [org.apache.mxnet.contrib.clojure/clojure-mxnet "1.3.1-SNAPSHOT"]]
   :main multi-label.core)
diff --git a/contrib/clojure-package/examples/neural-style/project.clj b/contrib/clojure-package/examples/neural-style/project.clj
index f179618..f8d6e28 100644
--- a/contrib/clojure-package/examples/neural-style/project.clj
+++ b/contrib/clojure-package/examples/neural-style/project.clj
@@ -19,7 +19,7 @@
   :description "Neural Style Transfer with MXNet"
   :plugins [[lein-cljfmt "0.5.7"]]
   :dependencies [[org.clojure/clojure "1.9.0"]
-                 [org.apache.mxnet.contrib.clojure/clojure-mxnet "1.3.0-SNAPSHOT"]
+                 [org.apache.mxnet.contrib.clojure/clojure-mxnet "1.3.1-SNAPSHOT"]
                  [net.mikera/imagez "0.12.0"]
                  [thinktopic/think.image "0.4.16"]]
   :main neural-style.core)
diff --git a/contrib/clojure-package/examples/pre-trained-models/project.clj b/contrib/clojure-package/examples/pre-trained-models/project.clj
index e689e9a..c61e9a6 100644
--- a/contrib/clojure-package/examples/pre-trained-models/project.clj
+++ b/contrib/clojure-package/examples/pre-trained-models/project.clj
@@ -19,7 +19,7 @@
   :description "Example of using pre-trained models with MXNet"
   :plugins [[lein-cljfmt "0.5.7"]]
   :dependencies [[org.clojure/clojure "1.9.0"]
-                 [org.apache.mxnet.contrib.clojure/clojure-mxnet "1.3.0-SNAPSHOT"]
+                 [org.apache.mxnet.contrib.clojure/clojure-mxnet "1.3.1-SNAPSHOT"]
                  [net.mikera/imagez "0.12.0"]
                  [thinktopic/think.image "0.4.16"]]
   :main pre-trained-models.fine-tune)
diff --git a/contrib/clojure-package/examples/profiler/project.clj b/contrib/clojure-package/examples/profiler/project.clj
index 6ea2dfe..dfcfea7 100644
--- a/contrib/clojure-package/examples/profiler/project.clj
+++ b/contrib/clojure-package/examples/profiler/project.clj
@@ -18,5 +18,5 @@
 (defproject profiler "0.1.0-SNAPSHOT"
   :plugins [[lein-cljfmt "0.5.7"]]
   :dependencies [[org.clojure/clojure "1.9.0"]
-                 [org.apache.mxnet.contrib.clojure/clojure-mxnet "1.3.0-SNAPSHOT"]]
+                 [org.apache.mxnet.contrib.clojure/clojure-mxnet "1.3.1-SNAPSHOT"]]
   :main profiler.core)
diff --git a/contrib/clojure-package/examples/rnn/project.clj b/contrib/clojure-package/examples/rnn/project.clj
index 0175591..f411a54 100644
--- a/contrib/clojure-package/examples/rnn/project.clj
+++ b/contrib/clojure-package/examples/rnn/project.clj
@@ -19,5 +19,5 @@
   :description "RNN example"
   :plugins [[lein-cljfmt "0.5.7"]]
   :dependencies [[org.clojure/clojure "1.9.0"]
-                 [org.apache.mxnet.contrib.clojure/clojure-mxnet "1.3.0-SNAPSHOT"]]
+                 [org.apache.mxnet.contrib.clojure/clojure-mxnet "1.3.1-SNAPSHOT"]]
   :main rnn.train-char-rnn)
diff --git a/contrib/clojure-package/examples/tutorial/project.clj b/contrib/clojure-package/examples/tutorial/project.clj
index ab7606e..4910886 100644
--- a/contrib/clojure-package/examples/tutorial/project.clj
+++ b/contrib/clojure-package/examples/tutorial/project.clj
@@ -19,4 +19,4 @@
   :description "MXNET tutorials"
   :plugins [[lein-cljfmt "0.5.7"]]
   :dependencies [[org.clojure/clojure "1.9.0"]
-                 [org.apache.mxnet.contrib.clojure/clojure-mxnet "1.3.0-SNAPSHOT"]])
+                 [org.apache.mxnet.contrib.clojure/clojure-mxnet "1.3.1-SNAPSHOT"]])
diff --git a/contrib/clojure-package/examples/visualization/project.clj b/contrib/clojure-package/examples/visualization/project.clj
index 99d0e40..18882d4 100644
--- a/contrib/clojure-package/examples/visualization/project.clj
+++ b/contrib/clojure-package/examples/visualization/project.clj
@@ -19,5 +19,5 @@
   :description "Visualization example"
   :plugins [[lein-cljfmt "0.5.7"]]
   :dependencies [[org.clojure/clojure "1.9.0"]
-                 [org.apache.mxnet.contrib.clojure/clojure-mxnet "1.3.0-SNAPSHOT"]]
+                 [org.apache.mxnet.contrib.clojure/clojure-mxnet "1.3.1-SNAPSHOT"]]
   :main visualization.core)
diff --git a/contrib/clojure-package/project.clj b/contrib/clojure-package/project.clj
index 01a6b80..3779c82 100644
--- a/contrib/clojure-package/project.clj
+++ b/contrib/clojure-package/project.clj
@@ -15,7 +15,7 @@
 ;; limitations under the License.
 ;;
 
-(defproject org.apache.mxnet.contrib.clojure/clojure-mxnet "1.3.0-SNAPSHOT"
+(defproject org.apache.mxnet.contrib.clojure/clojure-mxnet "1.3.1-SNAPSHOT"
   :description "Clojure package for MXNet"
   :dependencies [[org.clojure/clojure "1.9.0"]
                  [t6/from-scala "0.3.0"]
@@ -26,7 +26,7 @@
                  ;[org.apache.mxnet/mxnet-full_2.11-linux-x86_64-gpu "1.2.1"]
 
                  ;;; CI
-                 [org.apache.mxnet/mxnet-full_2.11-linux-x86_64-cpu "1.3.0-SNAPSHOT"]
+                 [org.apache.mxnet/mxnet-full_2.11-linux-x86_64-cpu "1.3.1-SNAPSHOT"]
 
                  [org.clojure/tools.logging "0.4.0"]
                  [org.apache.logging.log4j/log4j-core "2.8.1"]
diff --git a/include/mxnet/base.h b/include/mxnet/base.h
index 75784a3..dfe1899 100644
--- a/include/mxnet/base.h
+++ b/include/mxnet/base.h
@@ -104,7 +104,7 @@
 /*! \brief minor version */
 #define MXNET_MINOR 3
 /*! \brief patch version */
-#define MXNET_PATCH 0
+#define MXNET_PATCH 1
 /*! \brief mxnet version */
 #define MXNET_VERSION (MXNET_MAJOR*10000 + MXNET_MINOR*100 + MXNET_PATCH)
 /*! \brief helper for making version number */
diff --git a/python/mxnet/libinfo.py b/python/mxnet/libinfo.py
index 74617f7..b445051 100644
--- a/python/mxnet/libinfo.py
+++ b/python/mxnet/libinfo.py
@@ -78,4 +78,4 @@ def find_lib_path():
 
 
 # current version
-__version__ = "1.3.0"
+__version__ = "1.3.1"
diff --git a/scala-package/assembly/linux-x86_64-cpu/pom.xml b/scala-package/assembly/linux-x86_64-cpu/pom.xml
index b410de1..a6a7bbd 100644
--- a/scala-package/assembly/linux-x86_64-cpu/pom.xml
+++ b/scala-package/assembly/linux-x86_64-cpu/pom.xml
@@ -6,7 +6,7 @@
   <parent>
     <groupId>org.apache.mxnet</groupId>
     <artifactId>mxnet-full-parent_2.11</artifactId>
-    <version>1.3.0-SNAPSHOT</version>
+    <version>1.3.1-SNAPSHOT</version>
     <relativePath>../pom.xml</relativePath>
   </parent>
 
@@ -18,18 +18,18 @@
     <dependency>
       <groupId>org.apache.mxnet</groupId>
       <artifactId>mxnet-core_${scala.binary.version}</artifactId>
-      <version>1.3.0-SNAPSHOT</version>
+      <version>1.3.1-SNAPSHOT</version>
     </dependency>
     <dependency>
       <groupId>org.apache.mxnet</groupId>
       <artifactId>libmxnet-scala-linux-x86_64-cpu</artifactId>
-      <version>1.3.0-SNAPSHOT</version>
+      <version>1.3.1-SNAPSHOT</version>
       <type>so</type>
     </dependency>
     <dependency>
       <groupId>org.apache.mxnet</groupId>
       <artifactId>mxnet-infer_${scala.binary.version}</artifactId>
-      <version>1.3.0-SNAPSHOT</version>
+      <version>1.3.1-SNAPSHOT</version>
     </dependency>
   </dependencies>
 
diff --git a/scala-package/assembly/linux-x86_64-gpu/pom.xml b/scala-package/assembly/linux-x86_64-gpu/pom.xml
index 0a1e795..9923489 100644
--- a/scala-package/assembly/linux-x86_64-gpu/pom.xml
+++ b/scala-package/assembly/linux-x86_64-gpu/pom.xml
@@ -6,7 +6,7 @@
   <parent>
     <groupId>org.apache.mxnet</groupId>
     <artifactId>mxnet-full-parent_2.11</artifactId>
-    <version>1.3.0-SNAPSHOT</version>
+    <version>1.3.1-SNAPSHOT</version>
     <relativePath>../pom.xml</relativePath>
   </parent>
 
@@ -18,18 +18,18 @@
     <dependency>
       <groupId>org.apache.mxnet</groupId>
       <artifactId>mxnet-core_${scala.binary.version}</artifactId>
-      <version>1.3.0-SNAPSHOT</version>
+      <version>1.3.1-SNAPSHOT</version>
     </dependency>
     <dependency>
       <groupId>org.apache.mxnet</groupId>
       <artifactId>libmxnet-scala-linux-x86_64-gpu</artifactId>
-      <version>1.3.0-SNAPSHOT</version>
+      <version>1.3.1-SNAPSHOT</version>
       <type>so</type>
     </dependency>
     <dependency>
       <groupId>org.apache.mxnet</groupId>
       <artifactId>mxnet-infer_${scala.binary.version}</artifactId>
-      <version>1.3.0-SNAPSHOT</version>
+      <version>1.3.1-SNAPSHOT</version>
     </dependency>
   </dependencies>
 
diff --git a/scala-package/assembly/osx-x86_64-cpu/pom.xml b/scala-package/assembly/osx-x86_64-cpu/pom.xml
index 8a12d80..d877333 100644
--- a/scala-package/assembly/osx-x86_64-cpu/pom.xml
+++ b/scala-package/assembly/osx-x86_64-cpu/pom.xml
@@ -6,7 +6,7 @@
   <parent>
     <groupId>org.apache.mxnet</groupId>
     <artifactId>mxnet-full-parent_2.11</artifactId>
-    <version>1.3.0-SNAPSHOT</version>
+    <version>1.3.1-SNAPSHOT</version>
     <relativePath>../pom.xml</relativePath>
   </parent>
 
@@ -18,18 +18,18 @@
     <dependency>
       <groupId>org.apache.mxnet</groupId>
       <artifactId>mxnet-core_${scala.binary.version}</artifactId>
-      <version>1.3.0-SNAPSHOT</version>
+      <version>1.3.1-SNAPSHOT</version>
     </dependency>
     <dependency>
       <groupId>org.apache.mxnet</groupId>
       <artifactId>libmxnet-scala-osx-x86_64-cpu</artifactId>
-      <version>1.3.0-SNAPSHOT</version>
+      <version>1.3.1-SNAPSHOT</version>
       <type>jnilib</type>
     </dependency>
     <dependency>
       <groupId>org.apache.mxnet</groupId>
       <artifactId>mxnet-infer_${scala.binary.version}</artifactId>
-      <version>1.3.0-SNAPSHOT</version>
+      <version>1.3.1-SNAPSHOT</version>
     </dependency>
   </dependencies>
 
diff --git a/scala-package/assembly/pom.xml b/scala-package/assembly/pom.xml
index aef50ce..127e69e 100644
--- a/scala-package/assembly/pom.xml
+++ b/scala-package/assembly/pom.xml
@@ -6,7 +6,7 @@
   <parent>
     <groupId>org.apache.mxnet</groupId>
     <artifactId>mxnet-parent_2.11</artifactId>
-    <version>1.3.0-SNAPSHOT</version>
+    <version>1.3.1-SNAPSHOT</version>
     <relativePath>../pom.xml</relativePath>
   </parent>
 
diff --git a/scala-package/core/pom.xml b/scala-package/core/pom.xml
index c74b00f..0ee7494 100644
--- a/scala-package/core/pom.xml
+++ b/scala-package/core/pom.xml
@@ -6,7 +6,7 @@
   <parent>
     <groupId>org.apache.mxnet</groupId>
     <artifactId>mxnet-parent_2.11</artifactId>
-    <version>1.3.0-SNAPSHOT</version>
+    <version>1.3.1-SNAPSHOT</version>
     <relativePath>../pom.xml</relativePath>
   </parent>
 
@@ -95,13 +95,13 @@
     <dependency>
       <groupId>org.apache.mxnet</groupId>
       <artifactId>mxnet-init_${scala.binary.version}</artifactId>
-      <version>1.3.0-SNAPSHOT</version>
+      <version>1.3.1-SNAPSHOT</version>
       <scope>provided</scope>
     </dependency>
     <dependency>
       <groupId>org.apache.mxnet</groupId>
       <artifactId>mxnet-macros_${scala.binary.version}</artifactId>
-      <version>1.3.0-SNAPSHOT</version>
+      <version>1.3.1-SNAPSHOT</version>
       <scope>provided</scope>
     </dependency>
     <dependency>
diff --git a/scala-package/examples/pom.xml b/scala-package/examples/pom.xml
index d24785b..436f299 100644
--- a/scala-package/examples/pom.xml
+++ b/scala-package/examples/pom.xml
@@ -6,7 +6,7 @@
   <parent>
     <groupId>org.apache.mxnet</groupId>
     <artifactId>mxnet-parent_2.11</artifactId>
-    <version>1.3.0-SNAPSHOT</version>
+    <version>1.3.1-SNAPSHOT</version>
     <relativePath>../pom.xml</relativePath>
   </parent>
 
@@ -151,13 +151,13 @@
     <dependency>
       <groupId>org.apache.mxnet</groupId>
       <artifactId>mxnet-core_${scala.binary.version}</artifactId>
-      <version>1.3.0-SNAPSHOT</version>
+      <version>1.3.1-SNAPSHOT</version>
       <scope>provided</scope>
     </dependency>
     <dependency>
       <groupId>org.apache.mxnet</groupId>
       <artifactId>mxnet-infer_${scala.binary.version}</artifactId>
-      <version>1.3.0-SNAPSHOT</version>
+      <version>1.3.1-SNAPSHOT</version>
       <scope>provided</scope>
     </dependency>
     <dependency>
diff --git a/scala-package/infer/pom.xml b/scala-package/infer/pom.xml
index 71c85af..e501001 100644
--- a/scala-package/infer/pom.xml
+++ b/scala-package/infer/pom.xml
@@ -6,7 +6,7 @@
     <parent>
         <artifactId>mxnet-parent_2.11</artifactId>
         <groupId>org.apache.mxnet</groupId>
-        <version>1.3.0-SNAPSHOT</version>
+        <version>1.3.1-SNAPSHOT</version>
         <relativePath>../pom.xml</relativePath>
     </parent>
 
@@ -91,7 +91,7 @@
         <dependency>
             <groupId>org.apache.mxnet</groupId>
             <artifactId>mxnet-core_${scala.binary.version}</artifactId>
-            <version>1.3.0-SNAPSHOT</version>
+            <version>1.3.1-SNAPSHOT</version>
             <scope>provided</scope>
         </dependency>
         <!-- https://mvnrepository.com/artifact/org.mockito/mockito-all -->
diff --git a/scala-package/init-native/linux-x86_64/pom.xml b/scala-package/init-native/linux-x86_64/pom.xml
index 7a6d908..811308e 100644
--- a/scala-package/init-native/linux-x86_64/pom.xml
+++ b/scala-package/init-native/linux-x86_64/pom.xml
@@ -6,7 +6,7 @@
   <parent>
     <groupId>org.apache.mxnet</groupId>
     <artifactId>mxnet-scala-init-native-parent</artifactId>
-    <version>1.3.0-SNAPSHOT</version>
+    <version>1.3.1-SNAPSHOT</version>
     <relativePath>../pom.xml</relativePath>
   </parent>
 
@@ -20,7 +20,7 @@
     <dependency>
       <groupId>org.apache.mxnet</groupId>
       <artifactId>mxnet-init_${scala.binary.version}</artifactId>
-      <version>1.3.0-SNAPSHOT</version>
+      <version>1.3.1-SNAPSHOT</version>
       <type>jar</type>
       <scope>compile</scope>
     </dependency>
diff --git a/scala-package/init-native/osx-x86_64/pom.xml b/scala-package/init-native/osx-x86_64/pom.xml
index 0eaf4e3..1ab17ea 100644
--- a/scala-package/init-native/osx-x86_64/pom.xml
+++ b/scala-package/init-native/osx-x86_64/pom.xml
@@ -6,7 +6,7 @@
   <parent>
     <groupId>org.apache.mxnet</groupId>
     <artifactId>mxnet-scala-init-native-parent</artifactId>
-    <version>1.3.0-SNAPSHOT</version>
+    <version>1.3.1-SNAPSHOT</version>
     <relativePath>../pom.xml</relativePath>
   </parent>
 
@@ -20,7 +20,7 @@
     <dependency>
       <groupId>org.apache.mxnet</groupId>
       <artifactId>mxnet-init_${scala.binary.version}</artifactId>
-      <version>1.3.0-SNAPSHOT</version>
+      <version>1.3.1-SNAPSHOT</version>
       <type>jar</type>
       <scope>compile</scope>
     </dependency>
diff --git a/scala-package/init-native/pom.xml b/scala-package/init-native/pom.xml
index 1cd79a8..834a699 100644
--- a/scala-package/init-native/pom.xml
+++ b/scala-package/init-native/pom.xml
@@ -6,7 +6,7 @@
   <parent>
     <groupId>org.apache.mxnet</groupId>
     <artifactId>mxnet-parent_2.11</artifactId>
-    <version>1.3.0-SNAPSHOT</version>
+    <version>1.3.1-SNAPSHOT</version>
     <relativePath>../pom.xml</relativePath>
   </parent>
 
diff --git a/scala-package/init/pom.xml b/scala-package/init/pom.xml
index 2c00ca5..80f883d 100644
--- a/scala-package/init/pom.xml
+++ b/scala-package/init/pom.xml
@@ -6,7 +6,7 @@
   <parent>
     <groupId>org.apache.mxnet</groupId>
     <artifactId>mxnet-parent_2.11</artifactId>
-    <version>1.3.0-SNAPSHOT</version>
+    <version>1.3.1-SNAPSHOT</version>
 <!--  <relativePath>../pom.xml</relativePath>-->
   </parent>
 
diff --git a/scala-package/macros/pom.xml b/scala-package/macros/pom.xml
index 7959009..ae4c60a 100644
--- a/scala-package/macros/pom.xml
+++ b/scala-package/macros/pom.xml
@@ -6,7 +6,7 @@
   <parent>
     <groupId>org.apache.mxnet</groupId>
     <artifactId>mxnet-parent_2.11</artifactId>
-    <version>1.3.0-SNAPSHOT</version>
+    <version>1.3.1-SNAPSHOT</version>
     <relativePath>../pom.xml</relativePath>
   </parent>
 
@@ -53,13 +53,13 @@
     <dependency>
       <groupId>org.apache.mxnet</groupId>
       <artifactId>mxnet-init_${scala.binary.version}</artifactId>
-      <version>1.3.0-SNAPSHOT</version>
+      <version>1.3.1-SNAPSHOT</version>
       <scope>provided</scope>
     </dependency>
     <dependency>
       <groupId>org.apache.mxnet</groupId>
       <artifactId>libmxnet-init-scala-${platform}</artifactId>
-      <version>1.3.0-SNAPSHOT</version>
+      <version>1.3.1-SNAPSHOT</version>
       <scope>provided</scope>
       <type>${libtype}</type>
     </dependency>
diff --git a/scala-package/native/linux-x86_64-cpu/pom.xml b/scala-package/native/linux-x86_64-cpu/pom.xml
index c45635e..7603c03 100644
--- a/scala-package/native/linux-x86_64-cpu/pom.xml
+++ b/scala-package/native/linux-x86_64-cpu/pom.xml
@@ -6,7 +6,7 @@
   <parent>
     <groupId>org.apache.mxnet</groupId>
     <artifactId>mxnet-scala-native-parent</artifactId>
-    <version>1.3.0-SNAPSHOT</version>
+    <version>1.3.1-SNAPSHOT</version>
     <relativePath>../pom.xml</relativePath>
   </parent>
 
@@ -20,7 +20,7 @@
     <dependency>
       <groupId>org.apache.mxnet</groupId>
       <artifactId>mxnet-core_${scala.binary.version}</artifactId>
-      <version>1.3.0-SNAPSHOT</version>
+      <version>1.3.1-SNAPSHOT</version>
       <type>jar</type>
       <scope>compile</scope>
     </dependency>
diff --git a/scala-package/native/linux-x86_64-gpu/pom.xml b/scala-package/native/linux-x86_64-gpu/pom.xml
index a1f5ec3..df2ec91 100644
--- a/scala-package/native/linux-x86_64-gpu/pom.xml
+++ b/scala-package/native/linux-x86_64-gpu/pom.xml
@@ -6,7 +6,7 @@
   <parent>
     <groupId>org.apache.mxnet</groupId>
     <artifactId>mxnet-scala-native-parent</artifactId>
-    <version>1.3.0-SNAPSHOT</version>
+    <version>1.3.1-SNAPSHOT</version>
     <relativePath>../pom.xml</relativePath>
   </parent>
 
@@ -20,7 +20,7 @@
     <dependency>
       <groupId>org.apache.mxnet</groupId>
       <artifactId>mxnet-core_${scala.binary.version}</artifactId>
-      <version>1.3.0-SNAPSHOT</version>
+      <version>1.3.1-SNAPSHOT</version>
       <type>jar</type>
       <scope>compile</scope>
     </dependency>
diff --git a/scala-package/native/osx-x86_64-cpu/pom.xml b/scala-package/native/osx-x86_64-cpu/pom.xml
index e1c6310..cff0b14 100644
--- a/scala-package/native/osx-x86_64-cpu/pom.xml
+++ b/scala-package/native/osx-x86_64-cpu/pom.xml
@@ -6,7 +6,7 @@
   <parent>
     <groupId>org.apache.mxnet</groupId>
     <artifactId>mxnet-scala-native-parent</artifactId>
-    <version>1.3.0-SNAPSHOT</version>
+    <version>1.3.1-SNAPSHOT</version>
     <relativePath>../pom.xml</relativePath>
   </parent>
 
@@ -20,7 +20,7 @@
     <dependency>
       <groupId>org.apache.mxnet</groupId>
       <artifactId>mxnet-core_${scala.binary.version}</artifactId>
-      <version>1.3.0-SNAPSHOT</version>
+      <version>1.3.1-SNAPSHOT</version>
       <type>jar</type>
       <scope>compile</scope>
     </dependency>
diff --git a/scala-package/native/pom.xml b/scala-package/native/pom.xml
index 485b69f..49922a9 100644
--- a/scala-package/native/pom.xml
+++ b/scala-package/native/pom.xml
@@ -6,7 +6,7 @@
   <parent>
     <groupId>org.apache.mxnet</groupId>
     <artifactId>mxnet-parent_2.11</artifactId>
-    <version>1.3.0-SNAPSHOT</version>
+    <version>1.3.1-SNAPSHOT</version>
     <relativePath>../pom.xml</relativePath>
   </parent>
 
diff --git a/scala-package/pom.xml b/scala-package/pom.xml
index c221b47..fe78a62 100644
--- a/scala-package/pom.xml
+++ b/scala-package/pom.xml
@@ -10,7 +10,7 @@
   </parent>
   <groupId>org.apache.mxnet</groupId>
   <artifactId>mxnet-parent_2.11</artifactId>
-  <version>1.3.0-SNAPSHOT</version>
+  <version>1.3.1-SNAPSHOT</version>
   <name>MXNet Scala Package - Parent</name>
   <url>https://github.com/apache/incubator-mxnet/tree/master/scala-package</url>
   <description>
diff --git a/scala-package/spark/pom.xml b/scala-package/spark/pom.xml
index f2b8060..f26ba19 100644
--- a/scala-package/spark/pom.xml
+++ b/scala-package/spark/pom.xml
@@ -6,7 +6,7 @@
   <parent>
     <groupId>org.apache.mxnet</groupId>
     <artifactId>mxnet-parent_2.11</artifactId>
-    <version>1.3.0-SNAPSHOT</version>
+    <version>1.3.1-SNAPSHOT</version>
     <relativePath>../pom.xml</relativePath>
   </parent>
 
@@ -40,7 +40,7 @@
     <dependency>
       <groupId>org.apache.mxnet</groupId>
       <artifactId>mxnet-core_${scala.binary.version}</artifactId>
-      <version>1.3.0-SNAPSHOT</version>
+      <version>1.3.1-SNAPSHOT</version>
       <scope>provided</scope>
     </dependency>
     <dependency>
diff --git a/snapcraft.yaml b/snapcraft.yaml
index 284efa2..ba54972 100644
--- a/snapcraft.yaml
+++ b/snapcraft.yaml
@@ -1,5 +1,5 @@
 name: mxnet
-version: '1.3.0'
+version: '1.3.1'
 summary: MXNet is a deep learning framework designed for efficiency and flexibility.
 description: |
   MXNet is a deep learning framework designed for both efficiency and