Posted to dev@mxnet.apache.org by Sheng Zha <no...@github.com> on 2020/02/24 22:52:44 UTC

[apache/incubator-mxnet] [RFC] MXNet 2.0 API Deprecation (#17676)

As the MXNet community is working on the next major version of MXNet as described in #16167, this RFC seeks to clarify the scope of API deprecation, to inform the community of the replacement API design, and to ensure informed consensus.

Thanks to the long history of MXNet and the accumulated efforts of the community, MXNet now supports a wide range of neural network model training and deployment use cases. Many of these use cases have seen several generations of API design and implementation. Take model training as an example: there have been the Symbol Model API, the Symbol Module API, and the Gluon HybridBlock API, all of which coexist in MXNet. Older generations of APIs often retain a significant body of users and thus require time from the community to maintain, even though the use cases they support are mostly covered by a newer generation. This maintenance requirement not only consumes the community's time and energy and distracts it from longer-term goals, but also puts pressure on CI and binary distribution.

In this RFC, we list several candidate APIs to be deprecated, along with the new generation of APIs that replace them. Unless otherwise stated, these APIs will continue to be supported in future 1.x releases, which happen in parallel to the 2.0 development. On the other hand, participating in the RFC for the replacement of a feature you are interested in is the best way to ensure continued support for that feature in 2.0. To make navigation easier, the replacement feature RFCs are linked in each section.

To make the discussion more productive, I recommend the following actions:

* If a feature in MXNet that you are interested in currently depends on any of the deprecated APIs, and you plan to switch from 1.x to 2.0, please participate in the RFC for the replacement feature. Please also direct any questions about how the replacement feature covers your use cases to the replacement feature RFCs.
* If, after discussion in the replacement RFCs, you believe that the replacement feature cannot replace the candidate feature for deprecation, please call it out in this RFC.
    * Make sure to include your argument for why this is the case, and clarify which use cases cannot be supported by the new feature.
    * If you wish to commit time and sponsor the continued maintenance beyond what’s specified in the following sections, please state so along with your comment.
    * You may also seek other community members to sponsor the feature as comments in this RFC.
    * The group of sponsors needs to collectively clarify what additional support the feature will receive from them, and commit to the cost of maintenance, development, and operations (such as CI).


Please always keep the discussion civilized and informative. Comments otherwise will be folded.

## mxnet.numpy and mxnet.ndarray

Traditionally, MXNet provided the `mx.nd` API with operators inspired by, but often incompatible with, NumPy. Based on RFC [#14253](https://github.com/apache/incubator-mxnet/issues/14253), there has been a large and ongoing effort to provide NumPy-compatible operators in the `mx.np` namespace.
This means that MXNet currently provides two incompatible APIs with separate backend implementations that achieve similar goals, doubling the maintenance burden for developers. Note that some deep learning operators in `mx.nd` have no counterparts in `mx.np`. These operators will be migrated to the `mx.npx` namespace and are tracked in #17096.

Given the wide impact of this decision, the following contributors convened on 2/19/2020 and reached consensus on recommending **Removal**, with parallel maintenance of 1.x and 2.x, as the way forward: @eric-haibin-lin, @mli, @haojin2, @szhengac, @yizhiliu, @sxjscience, @reminisce, @leezu, @zhreshold, @apeforest, @oorqueda, @rondogency

| Options     | Removal                                                                                                                                                                 | Deprecation                                                                                                                                                                                                                                                                                                                                                                                                             | Separate Compatibility API                                                                                                                                                                                                                                                                                                                                                    |
|-------------|-------------------------------------------------------------------------------------------------------------------------------------------------------------------------|-------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|-------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
| Description | 1. Drop the `mx.nd` and `mx.sym` namespaces in Python and require analogous changes in other frontends. 2. Remove operators not exposed via `mx.np` and `mx.npx` from the backend. | 1. Keep `mx.nd`, `mx.sym`, and `mx.sym.np` but discourage their use, for example via deprecation warnings. Only fix regressions introduced in MXNet 2; remove in MXNet 3. 2. Provide backwards compatibility in `mx.gluon` for `F` and other MXNet 1 features. 3. May introduce breaking changes at the operator level, such as improved optimizer operators (PR #17400). Any such change must provide migration instructions for users. | 1. Provide `mx.v1.nd`, `mx.v1.sym`, and `mx.v1.gluon` frontend namespaces; discourage their use and remove them in MXNet 3. 2. Drop `mx.nd`, `mx.sym`, and `mx.sym.np` in MXNet 2 and potentially introduce breaking changes in `mx.gluon`. 3. May introduce breaking changes at the operator level, such as improved optimizer operators (PR #17400). Any such change must provide migration instructions for users. |
| Pros        | 1. Simplifies development; no overhead from taking old APIs into consideration. 2. Speeds up CI and saves costs by removing tests for dropped functionality. | 1. Easy migration to MXNet 2. 2. Need not spend many resources on maintaining MXNet 1. | 1. Easy migration to MXNet 2. 2. Need not spend many resources on maintaining MXNet 1. |
| Cons        | 1. Existing code will stop working. 2. The 1.x branch may need to be maintained for many years, as some customers won't migrate. | 1. Developers must avoid introducing regressions in both the frontend and the backend, which adds development overhead. 2. No cost or speed improvements for CI. | 1. Developers must avoid introducing regressions in the backend, which adds development overhead. 2. No cost or speed improvements for CI. |

APIs to remove or deprecate: `mx.nd`, `mx.sym`
Replacement APIs: `mx.np`, `mx.npx`


## Symbol and NDArray

Traditionally, MXNet recommended that users statically declare their machine learning models with the `mx.sym` symbolic API. In 2018, Amazon, in collaboration with Microsoft, published the [Gluon API](https://github.com/gluon-api/gluon-api), which the MXNet community then implemented so that users could enjoy the flexibility of imperative mode together with the benefits of a symbolic computational graph.

Gluon exploited the similarity between `mx.sym` and `mx.nd` and asked users to write code that would work irrespective of the namespace used in a `gluon.HybridBlock`, via a placeholder `F` that could refer to either `mx.sym` or `mx.nd`. Because `Symbol` and `NDArray`, the basic building blocks of `mx.sym` and `mx.nd`, have diverging behaviour, using `nn.HybridBlock` required users to learn the details of each.
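The `F` dispatch pattern can be illustrated with a small self-contained toy in plain Python (not the actual MXNet API; the `Eager` and `Sym` namespaces and their operators are invented for illustration): a function written once against the placeholder `F` runs either eagerly or builds a symbolic expression, depending on which namespace is passed in.

```python
# Toy illustration of Gluon's `F` placeholder dispatch (not the real MXNet API):
# the same function body either computes immediately or records a symbolic
# expression, depending on which namespace is bound to `F`.

class Eager:
    """Eager namespace: operations compute immediately, like mx.nd."""
    @staticmethod
    def add(a, b):
        return a + b

    @staticmethod
    def square(a):
        return a * a


class Sym:
    """Symbolic namespace: operations build an expression string, like mx.sym."""
    @staticmethod
    def add(a, b):
        return f"({a} + {b})"

    @staticmethod
    def square(a):
        return f"({a} * {a})"


def hybrid_forward(F, x, y):
    # Written once against the placeholder `F`; works with either namespace.
    return F.add(F.square(x), y)


print(hybrid_forward(Eager, 3, 4))    # eager: computes 13
print(hybrid_forward(Sym, "x", "y"))  # symbolic: "((x * x) + y)"
```

The toy also shows the cost of the pattern: because `Eager` and `Sym` values behave differently (numbers vs. expressions), a user writing against `F` must keep both behaviours in mind, which is exactly the burden the unified API aims to remove.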

Taking a step back, exposing the distinction between `mx.sym` and `mx.nd` to users in the frontend is a sufficient but not necessary approach to providing the flexibility of imperative mode together with the benefits of a symbolic computational graph. To improve the user experience, we would like to reconsider this approach and provide a unified imperative and symbolic API based on the concept of deferred computation.

Deferred computation (RFC: [#16376](https://github.com/apache/incubator-mxnet/issues/16376), PR: [#17530](https://github.com/apache/incubator-mxnet/pull/17530)) extends the NDArray in the MXNet backend so that (when enabled) only metadata such as shape is computed eagerly, while the computational graph is tracked symbolically and storage allocation and computation are deferred until the results are accessed. It further provides APIs to export the recorded graph as a Symbol. Together, these are used to provide Gluon hybridization and export to other language frontends.

APIs to remove or deprecate: `mx.sym`, `mx.sym.np`
Replacement APIs: `mx.np`

### Gluon API


The deferred compute PR contains the changes to the Gluon API required to base Gluon on deferred compute: [#17530](https://github.com/apache/incubator-mxnet/pull/17530)
Gluon 1 usage (where the user implements `HybridBlock.hybrid_forward`) is auto-detected and distinguished from the new API, in which the user implements `forward` directly:

```python
class MyBlock(mx.gluon.HybridBlock):
    def __init__(self, *, prefix=None, params=None):
        super().__init__(prefix, params)
        with self.name_scope():
            self.dense = mx.gluon.nn.Dense(units=10)
            self.weight = self.params.get('weight', allow_deferred_init=True)

    def infer_shape(self, x):
        self.weight.shape = (x.shape[1], )

    def forward(self, x):
        return self.dense(x) + self.weight.data(x.context)

net = MyBlock()
net.initialize()
net.hybridize()
net(mx.nd.ones(shape=(8, 10), ctx=mx.cpu()))
```



## mx.model and mx.module

Both mx.model and mx.module were introduced before MXNet 0.7 as high-level APIs to describe a model's architecture and its associated parameters. The Gluon API became generally available in MXNet 1.0 and is easier to use than the model and module APIs. In MXNet 2.0, I propose:

1. Remove the mx.model.FeedForward and mx.module.* Python APIs. To migrate to Gluon, users can use mx.gluon.SymbolBlock to create a Gluon block from an existing symbol.
2. In C++, unify the graph executor (the backend for module) and the imperative/cached_op executor (the backend for Gluon), so that Gluon is on par with module in terms of functionality and performance.

## C-API clean-up

As part of the efforts in #17097 to improve performance for imperative execution, we are adopting the PackedFunc-based FFI described in [1]. The design of this FFI can be found in [2], and the implementation of PackedFunc in [3]. Once the PackedFunc-based FFI is merged, the C APIs will be registered as PackedFuncs in the runtime system. This reduces the need to directly maintain an optimized FFI, such as our Cython implementation, for a large number of functions.

Note that this change is limited to the C-APIs in `include/mxnet/c_api.h` and it does not affect `include/mxnet/c_predict_api.h` or `include/mxnet/c_api_test.h`.
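Conceptually, a PackedFunc is a single type-erased calling convention: every backend function is registered under a name and looked up at runtime, so a frontend needs only one call path instead of one binding per C API entry point. The toy sketch below illustrates the idea in plain Python (the registry and function names are invented for illustration; the real mechanism lives in TVM's C++ runtime).

```python
# Conceptual sketch of a PackedFunc-style registry (plain Python; illustrative
# only).  Functions are registered under string names and invoked through one
# uniform lookup, so a language binding implements a single call path rather
# than a dedicated wrapper per C API function.

_REGISTRY = {}

def register_func(name):
    """Decorator registering a function under a global name."""
    def deco(fn):
        _REGISTRY[name] = fn
        return fn
    return deco

def get_global_func(name):
    # Frontends look functions up by name at runtime.
    return _REGISTRY[name]

@register_func("ndarray.zeros")
def _zeros(n):
    return [0] * n

@register_func("ndarray.add")
def _add(a, b):
    return [x + y for x, y in zip(a, b)]

# A frontend only needs `get_global_func` plus one calling convention:
zeros = get_global_func("ndarray.zeros")
add = get_global_func("ndarray.add")
print(add(zeros(3), [1, 2, 3]))  # [1, 2, 3]
```

This is why adding a new backend function no longer requires touching every optimized binding: once registered, it is reachable by name from any frontend that speaks the calling convention.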

### Support for other frontend languages

Since MXNet's frontend languages all rely on the C API, this change affects the other language bindings too. As stated in the MXNet 2.0 roadmap RFC [4], the language bindings are expected to move together with this change, as initiated by their maintainers.

Currently, the PackedFunc implementation in TVM already supports Python, Java/Scala, C++/Rust, and JavaScript, so our existing language bindings for Python, Java, Scala, C++, and JavaScript can be supported directly. This leaves support for Perl and R, which is feasible but pending development.

### Deprecated API

As the project evolved, we found the need to extend some of the existing APIs and thus added new versions to supersede and deprecate the old ones. The old versions were left in the API definition for backward compatibility, which increases the support surface area. In MXNet 2.0, we will remove these deprecated versions and rename the substitute *Ex APIs to remove the Ex suffix. This also includes the new APIs for large tensor support with the *64 suffix. The list of such APIs includes:

<details>
<summary>List of groups of API</summary>

```c
MXNET_DLL int MXAggregateProfileStatsPrintEx(const char **out_str, int reset, int format,
                                            int sort_by, int ascending);
--
MXNET_DLL int MXAggregateProfileStatsPrint(const char **out_str, int reset);



MXNET_DLL int MXNDArrayCreate(const uint32_t *shape,
                              uint32_t ndim,
                              int dev_type,
                              int dev_id,
                              int delay_alloc,
                              NDArrayHandle *out);
--
MXNET_DLL int MXNDArrayCreateEx(const uint32_t *shape,
                                uint32_t ndim,
                                int dev_type,
                                int dev_id,
                                int delay_alloc,
                                int dtype,
                                NDArrayHandle *out);
--
MXNET_DLL int MXNDArrayCreateEx64(const int64_t *shape,
                                  int ndim,
                                  int dev_type,
                                  int dev_id,
                                  int delay_alloc,
                                  int dtype,
                                  NDArrayHandle *out);



MXNET_DLL int MXNDArrayCreateSparseEx(int storage_type,
                                      const uint32_t *shape,
                                      uint32_t ndim,
                                      int dev_type,
                                      int dev_id,
                                      int delay_alloc,
                                      int dtype,
--
MXNET_DLL int MXNDArrayCreateSparseEx64(int storage_type,
                                        const int64_t *shape,
                                        int ndim,
                                        int dev_type,
                                        int dev_id,
                                        int delay_alloc,
                                        int dtype,



MXNET_DLL int MXNDArrayGetShape(NDArrayHandle handle,
                                uint32_t *out_dim,
                                const uint32_t **out_pdata);
--
MXNET_DLL int MXNDArrayGetShapeEx(NDArrayHandle handle,
                                  int *out_dim,
                                  const int **out_pdata);
--
MXNET_DLL int MXNDArrayGetShapeEx64(NDArrayHandle handle,
                                    int *out_dim,
                                    const int64_t **out_pdata);




MXNET_DLL int MXNDArrayFromDLPack(DLManagedTensorHandle dlpack,
                                  NDArrayHandle *out_handle);
--
MXNET_DLL int MXNDArrayFromDLPackEx(DLManagedTensorHandle dlpack,
                                    const bool transient_handle,
                                    NDArrayHandle *out_handle);




MXNET_DLL int MXFuncInvoke(FunctionHandle fun,
                           NDArrayHandle *use_vars,
                           float *scalar_args,
                           NDArrayHandle *mutate_vars);
--
MXNET_DLL int MXFuncInvokeEx(FunctionHandle fun,
                             NDArrayHandle *use_vars,
                             float *scalar_args,
                             NDArrayHandle *mutate_vars,
                             int num_params,
                             char **param_keys,
                             char **param_vals);



MXNET_DLL int MXImperativeInvoke(AtomicSymbolCreator creator,
                                 int num_inputs,
                                 NDArrayHandle *inputs,
                                 int *num_outputs,
                                 NDArrayHandle **outputs,
                                 int num_params,
                                 const char **param_keys,
                                 const char **param_vals);
--
MXNET_DLL int MXImperativeInvokeEx(AtomicSymbolCreator creator,
                                   int num_inputs,
                                   NDArrayHandle *inputs,
                                   int *num_outputs,
                                   NDArrayHandle **outputs,
                                   int num_params,
                                   const char **param_keys,
                                   const char **param_vals,
                                   const int **out_stypes);




MXNET_DLL int MXCreateCachedOp(SymbolHandle handle, CachedOpHandle *out);
--
MXNET_DLL int MXCreateCachedOpEx(SymbolHandle handle,
                                 int num_flags,
                                 const char** keys,
                                 const char** vals,
                                 CachedOpHandle *out);
--
MXNET_DLL int MXCreateCachedOpEX(SymbolHandle handle,
                                 int num_flags,
                                 const char** keys,
                                 const char** vals,
                                 CachedOpHandle *out,
                                 bool thread_safe DEFAULT(false));



MXNET_DLL int MXInvokeCachedOp(CachedOpHandle handle,
                               int num_inputs,
                               NDArrayHandle *inputs,
                               int *num_outputs,
                               NDArrayHandle **outputs);
--
MXNET_DLL int MXInvokeCachedOpEx(CachedOpHandle handle,
                                 int num_inputs,
                                 NDArrayHandle *inputs,
                                 int *num_outputs,
                                 NDArrayHandle **outputs,
                                 const int** out_stypes);



MXNET_DLL int MXSymbolInferShape(SymbolHandle sym,
                                 uint32_t num_args,
                                 const char** keys,
                                 const uint32_t *arg_ind_ptr,
                                 const uint32_t *arg_shape_data,
                                 uint32_t *in_shape_size,
                                 const uint32_t **in_shape_ndim,
                                 const uint32_t ***in_shape_data,
                                 uint32_t *out_shape_size,
                                 const uint32_t **out_shape_ndim,
                                 const uint32_t ***out_shape_data,
                                 uint32_t *aux_shape_size,
                                 const uint32_t **aux_shape_ndim,
                                 const uint32_t ***aux_shape_data,
                                 int *complete);
--
MXNET_DLL int MXSymbolInferShapeEx(SymbolHandle sym,
                                   uint32_t num_args,
                                   const char** keys,
                                   const uint32_t *arg_ind_ptr,
                                   const int *arg_shape_data,
                                   uint32_t *in_shape_size,
                                   const int **in_shape_ndim,
                                   const int ***in_shape_data,
                                   uint32_t *out_shape_size,
                                   const int **out_shape_ndim,
                                   const int ***out_shape_data,
                                   uint32_t *aux_shape_size,
                                   const int **aux_shape_ndim,
                                   const int ***aux_shape_data,
                                   int *complete);
--
MXNET_DLL int MXSymbolInferShapeEx64(SymbolHandle sym,
                                     uint32_t num_args,
                                     const char** keys,
                                     const int64_t *arg_ind_ptr,
                                     const int64_t *arg_shape_data,
                                     size_t *in_shape_size,
                                     const int **in_shape_ndim,
                                     const int64_t ***in_shape_data,
                                     size_t *out_shape_size,
                                     const int **out_shape_ndim,
                                     const int64_t ***out_shape_data,
                                     size_t *aux_shape_size,
                                     const int **aux_shape_ndim,
                                     const int64_t ***aux_shape_data,
                                     int *complete);



MXNET_DLL int MXSymbolInferShapePartial(SymbolHandle sym,
                                        uint32_t num_args,
                                        const char** keys,
                                        const uint32_t *arg_ind_ptr,
                                        const uint32_t *arg_shape_data,
                                        uint32_t *in_shape_size,
                                        const uint32_t **in_shape_ndim,
                                        const uint32_t ***in_shape_data,
                                        uint32_t *out_shape_size,
                                        const uint32_t **out_shape_ndim,
                                        const uint32_t ***out_shape_data,
                                        uint32_t *aux_shape_size,
                                        const uint32_t **aux_shape_ndim,
                                        const uint32_t ***aux_shape_data,
                                        int *complete);
--
MXNET_DLL int MXSymbolInferShapePartialEx(SymbolHandle sym,
                                          uint32_t num_args,
                                          const char** keys,
                                          const uint32_t *arg_ind_ptr,
                                          const int *arg_shape_data,
                                          uint32_t *in_shape_size,
                                          const int **in_shape_ndim,
                                          const int ***in_shape_data,
                                          uint32_t *out_shape_size,
                                          const int **out_shape_ndim,
                                          const int ***out_shape_data,
                                          uint32_t *aux_shape_size,
                                          const int **aux_shape_ndim,
                                          const int ***aux_shape_data,
                                          int *complete);
--
MXNET_DLL int MXSymbolInferShapePartialEx64(SymbolHandle sym,
                                            uint32_t num_args,
                                            const char** keys,
                                            const int64_t *arg_ind_ptr,
                                            const int64_t *arg_shape_data,
                                            size_t *in_shape_size,
                                            const int **in_shape_ndim,
                                            const int64_t ***in_shape_data,
                                            size_t *out_shape_size,
                                            const int **out_shape_ndim,
                                            const int64_t ***out_shape_data,
                                            size_t *aux_shape_size,
                                            const int **aux_shape_ndim,
                                            const int64_t ***aux_shape_data,
                                            int *complete);



MXNET_DLL int MXExecutorBackward(ExecutorHandle handle,
                                 uint32_t len,
                                 NDArrayHandle *head_grads);
--
MXNET_DLL int MXExecutorBackwardEx(ExecutorHandle handle,
                                   uint32_t len,
                                   NDArrayHandle *head_grads,
                                   int is_train);



MXNET_DLL int MXExecutorBind(SymbolHandle symbol_handle,
                             int dev_type,
                             int dev_id,
                             uint32_t len,
                             NDArrayHandle *in_args,
                             NDArrayHandle *arg_grad_store,
                             uint32_t *grad_req_type,
                             uint32_t aux_states_len,
                             NDArrayHandle *aux_states,
                             ExecutorHandle *out);
--
MXNET_DLL int MXExecutorBindX(SymbolHandle symbol_handle,
                              int dev_type,
                              int dev_id,
                              uint32_t num_map_keys,
                              const char** map_keys,
                              const int* map_dev_types,
                              const int* map_dev_ids,
                              uint32_t len,
                              NDArrayHandle *in_args,
                              NDArrayHandle *arg_grad_store,
                              uint32_t *grad_req_type,
                              uint32_t aux_states_len,
                              NDArrayHandle *aux_states,
                              ExecutorHandle *out);
--
MXNET_DLL int MXExecutorBindEX(SymbolHandle symbol_handle,
                               int dev_type,
                               int dev_id,
                               uint32_t num_map_keys,
                               const char** map_keys,
                               const int* map_dev_types,
                               const int* map_dev_ids,
                               uint32_t len,
                               NDArrayHandle *in_args,
                               NDArrayHandle *arg_grad_store,
                               uint32_t *grad_req_type,
                               uint32_t aux_states_len,
                               NDArrayHandle *aux_states,
                               ExecutorHandle shared_exec,
                               ExecutorHandle *out);



MXNET_DLL int MXExecutorSimpleBind(SymbolHandle symbol_handle,
                                   int dev_type,
                                   int dev_id,
                                   const uint32_t num_g2c_keys,
                                   const char** g2c_keys,
                                   const int* g2c_dev_types,
                                   const int* g2c_dev_ids,
                                   const uint32_t provided_grad_req_list_len,
                                   const char** provided_grad_req_names,
                                   const char** provided_grad_req_types,
                                   const uint32_t num_provided_arg_shapes,
                                   const char** provided_arg_shape_names,
                                   const uint32_t* provided_arg_shape_data,
                                   const uint32_t* provided_arg_shape_idx,
                                   const uint32_t num_provided_arg_dtypes,
                                   const char** provided_arg_dtype_names,
                                   const int* provided_arg_dtypes,
                                   const uint32_t num_provided_arg_stypes,
                                   const char** provided_arg_stype_names,
                                   const int* provided_arg_stypes,
                                   const uint32_t num_shared_arg_names,
                                   const char** shared_arg_name_list,
                                   int* shared_buffer_len,
                                   const char** shared_buffer_name_list,
                                   NDArrayHandle* shared_buffer_handle_list,
                                   const char*** updated_shared_buffer_name_list,
                                   NDArrayHandle** updated_shared_buffer_handle_list,
                                   uint32_t* num_in_args,
                                   NDArrayHandle** in_args,
                                   NDArrayHandle** arg_grads,
                                   uint32_t* num_aux_states,
                                   NDArrayHandle** aux_states,
                                   ExecutorHandle shared_exec_handle,
                                   ExecutorHandle* out);
--
MXNET_DLL int MXExecutorSimpleBindEx(SymbolHandle symbol_handle,
                                     int dev_type,
                                     int dev_id,
                                     const uint32_t num_g2c_keys,
                                     const char** g2c_keys,
                                     const int* g2c_dev_types,
                                     const int* g2c_dev_ids,
                                     const uint32_t provided_grad_req_list_len,
                                     const char** provided_grad_req_names,
                                     const char** provided_grad_req_types,
                                     const uint32_t num_provided_arg_shapes,
                                     const char** provided_arg_shape_names,
                                     const int* provided_arg_shape_data,
                                     const uint32_t* provided_arg_shape_idx,
                                     const uint32_t num_provided_arg_dtypes,
                                     const char** provided_arg_dtype_names,
                                     const int* provided_arg_dtypes,
                                     const uint32_t num_provided_arg_stypes,
                                     const char** provided_arg_stype_names,
                                     const int* provided_arg_stypes,
                                     const uint32_t num_shared_arg_names,
                                     const char** shared_arg_name_list,
                                     int* shared_buffer_len,
                                     const char** shared_buffer_name_list,
                                     NDArrayHandle* shared_buffer_handle_list,
                                     const char*** updated_shared_buffer_name_list,
                                     NDArrayHandle** updated_shared_buffer_handle_list,
                                     uint32_t* num_in_args,
                                     NDArrayHandle** in_args,
                                     NDArrayHandle** arg_grads,
                                     uint32_t* num_aux_states,
                                     NDArrayHandle** aux_states,
                                     ExecutorHandle shared_exec_handle,
                                     ExecutorHandle* out);
--
MXNET_DLL int MXExecutorSimpleBindEx64(SymbolHandle symbol_handle,
                                       int dev_type,
                                       int dev_id,
                                       const uint32_t num_g2c_keys,
                                       const char** g2c_keys,
                                       const int* g2c_dev_types,
                                       const int* g2c_dev_ids,
                                       const uint32_t provided_grad_req_list_len,
                                       const char** provided_grad_req_names,
                                       const char** provided_grad_req_types,
                                       const uint32_t num_provided_arg_shapes,
                                       const char** provided_arg_shape_names,
                                       const int64_t* provided_arg_shape_data,
                                       const uint32_t* provided_arg_shape_idx,
                                       const uint32_t num_provided_arg_dtypes,
                                       const char** provided_arg_dtype_names,
                                       const int* provided_arg_dtypes,
                                       const uint32_t num_provided_arg_stypes,
                                       const char** provided_arg_stype_names,
                                       const int* provided_arg_stypes,
                                       const uint32_t num_shared_arg_names,
                                       const char** shared_arg_name_list,
                                       int* shared_buffer_len,
                                       const char** shared_buffer_name_list,
                                       NDArrayHandle* shared_buffer_handle_list,
                                       const char*** updated_shared_buffer_name_list,
                                       NDArrayHandle** updated_shared_buffer_handle_list,
                                       uint32_t* num_in_args,
                                       NDArrayHandle** in_args,
                                       NDArrayHandle** arg_grads,
                                       uint32_t* num_aux_states,
                                       NDArrayHandle** aux_states,
                                       ExecutorHandle shared_exec_handle,
                                       ExecutorHandle* out);



MXNET_DLL int MXExecutorReshape(int partial_shaping,
                                int allow_up_sizing,
                                int dev_type,
                                int dev_id,
                                uint32_t num_map_keys,
                                const char** map_keys,
                                const int* map_dev_types,
                                const int* map_dev_ids,
                                const uint32_t num_provided_arg_shapes,
                                const char** provided_arg_shape_names,
                                const uint32_t* provided_arg_shape_data,
                                const uint32_t* provided_arg_shape_idx,
                                uint32_t* num_in_args,
                                NDArrayHandle** in_args,
                                NDArrayHandle** arg_grads,
                                uint32_t* num_aux_states,
                                NDArrayHandle** aux_states,
                                ExecutorHandle shared_exec,
                                ExecutorHandle *out);
--
MXNET_DLL int MXExecutorReshapeEx(int partial_shaping,
                                  int allow_up_sizing,
                                  int dev_type,
                                  int dev_id,
                                  uint32_t num_map_keys,
                                  const char** map_keys,
                                  const int* map_dev_types,
                                  const int* map_dev_ids,
                                  const uint32_t num_provided_arg_shapes,
                                  const char** provided_arg_shape_names,
                                  const int* provided_arg_shape_data,
                                  const uint32_t* provided_arg_shape_idx,
                                  uint32_t* num_in_args,
                                  NDArrayHandle** in_args,
                                  NDArrayHandle** arg_grads,
                                  uint32_t* num_aux_states,
                                  NDArrayHandle** aux_states,
                                  ExecutorHandle shared_exec,
                                  ExecutorHandle *out);



MXNET_DLL int MXExecutorSetMonitorCallback(ExecutorHandle handle,
                                           ExecutorMonitorCallback callback,
                                           void* callback_handle);
--
MXNET_DLL int MXExecutorSetMonitorCallbackEX(ExecutorHandle handle,
                                             ExecutorMonitorCallback callback,
                                             void *callback_handle, bool monitor_all);



MXNET_DLL int MXKVStoreInit(KVStoreHandle handle,
                            uint32_t num,
                            const int* keys,
                            NDArrayHandle* vals);
--
MXNET_DLL int MXKVStoreInitEx(KVStoreHandle handle,
                              uint32_t num,
                              const char** keys,
                              NDArrayHandle* vals);



MXNET_DLL int MXKVStorePush(KVStoreHandle handle,
                            uint32_t num,
                            const int* keys,
                            NDArrayHandle* vals,
                            int priority);
--
MXNET_DLL int MXKVStorePushEx(KVStoreHandle handle,
                              uint32_t num,
                              const char** keys,
                              NDArrayHandle* vals,
                              int priority);



MXNET_DLL int MXKVStorePullWithSparse(KVStoreHandle handle,
                                      uint32_t num,
                                      const int* keys,
                                      NDArrayHandle* vals,
                                      int priority,
                                      bool ignore_sparse);
--
MXNET_DLL int MXKVStorePullWithSparseEx(KVStoreHandle handle,
                                        uint32_t num,
                                        const char** keys,
                                        NDArrayHandle* vals,
                                        int priority,
                                        bool ignore_sparse);



MXNET_DLL int MXKVStorePull(KVStoreHandle handle,
                            uint32_t num,
                            const int* keys,
                            NDArrayHandle* vals,
                            int priority);
--
MXNET_DLL int MXKVStorePullEx(KVStoreHandle handle,
                              uint32_t num,
                              const char** keys,
                              NDArrayHandle* vals,
                              int priority);



MXNET_DLL int MXKVStorePullRowSparse(KVStoreHandle handle,
                                     uint32_t num,
                                     const int* keys,
                                     NDArrayHandle* vals,
                                     const NDArrayHandle* row_ids,
                                     int priority);
--
MXNET_DLL int MXKVStorePullRowSparseEx(KVStoreHandle handle,
                                       uint32_t num,
                                       const char** keys,
                                       NDArrayHandle* vals,
                                       const NDArrayHandle* row_ids,
                                       int priority);



MXNET_DLL int MXKVStoreBroadcast(KVStoreHandle handle,
                                 mx_uint vnum,
                                 const int* vkeys,
                                 mx_uint onum,
                                 const int* okeys,
                                 NDArrayHandle* vals,
                                 NDArrayHandle* outs,
                                 int priority);
--
MXNET_DLL int MXKVStoreBroadcastEx(KVStoreHandle handle,
                                   mx_uint vnum,
                                   const char** vkeys,
                                   mx_uint onum,
                                   const char** okeys,
                                   NDArrayHandle* vals,
                                   NDArrayHandle* outs,
                                   int priority);



MXNET_DLL int MXKVStorePushPull(KVStoreHandle handle,
                                mx_uint vnum,
                                const int* vkeys,
                                mx_uint onum,
                                const int* okeys,
                                NDArrayHandle* vals,
                                NDArrayHandle* outs,
                                int priority);
--
MXNET_DLL int MXKVStorePushPullEx(KVStoreHandle handle,
                                  mx_uint vnum,
                                  const char** vkeys,
                                  mx_uint onum,
                                  const char** okeys,
                                  NDArrayHandle* vals,
                                  NDArrayHandle* outs,
                                  int priority);



MXNET_DLL int MXKVStoreSetUpdater(KVStoreHandle handle,
                                  MXKVStoreUpdater updater,
                                  void *updater_handle);
--
MXNET_DLL int MXKVStoreSetUpdaterEx(KVStoreHandle handle,
                                    MXKVStoreUpdater updater,
                                    MXKVStoreStrUpdater str_updater,
                                    void *updater_handle);



MXNET_DLL int MXNDArrayGetSharedMemHandle(NDArrayHandle handle, int* shared_pid,
                                          int* shared_id);
--
MXNET_DLL int MXNDArrayCreateFromSharedMemEx(int shared_pid, int shared_id, const int *shape,
                                             int ndim, int dtype, NDArrayHandle *out);

```
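
The listings above follow one pattern: each `Ex` variant replaces integer key arrays (`const int*`) with string key arrays (`const char**`), and `MXKVStoreSetUpdaterEx` additionally accepts a string-keyed updater callback. Below is a toy, pure-Python sketch of the string-keyed push/pull/updater semantics; it is not the MXNet API, and all names in it are illustrative only.

```python
from typing import Callable, Dict, List

# Toy stand-in for an NDArray; the real C API passes opaque NDArrayHandles.
Array = List[float]

class ToyKVStore:
    """Illustrative string-keyed store mirroring the MXKVStore*Ex semantics."""

    def __init__(self) -> None:
        self._store: Dict[str, Array] = {}
        # Default updater: accumulate pushed values into the stored value.
        self._updater: Callable[[str, Array, Array], Array] = (
            lambda key, pushed, stored: [s + p for s, p in zip(stored, pushed)]
        )

    def init(self, keys: List[str], vals: List[Array]) -> None:
        # Analogous to MXKVStoreInitEx: one initial value per string key.
        for key, val in zip(keys, vals):
            self._store[key] = list(val)

    def push(self, keys: List[str], vals: List[Array], priority: int = 0) -> None:
        # Analogous to MXKVStorePushEx: merge pushed values via the updater.
        for key, val in zip(keys, vals):
            self._store[key] = self._updater(key, val, self._store[key])

    def pull(self, keys: List[str], priority: int = 0) -> List[Array]:
        # Analogous to MXKVStorePullEx: read values back out by string key.
        return [list(self._store[key]) for key in keys]

    def set_updater(self, updater: Callable[[str, Array, Array], Array]) -> None:
        # Analogous to MXKVStoreSetUpdaterEx: register a string-keyed callback.
        self._updater = updater

kv = ToyKVStore()
kv.init(["weight"], [[1.0, 2.0]])
kv.push(["weight"], [[0.5, 0.5]])
print(kv.pull(["weight"])[0])  # [1.5, 2.5]
```

The key point is the calling convention, not the storage: 2.0 standardizes on string keys, so the integer-keyed originals become redundant.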


## Build System with Makefile

The CMake build supports all use cases of the Makefile-based build, while the Makefile-based build covers only a subset of what CMake supports. To simplify maintenance, we will therefore remove the Makefile-based build.


## IO/DataIter API

1. Clean up the `mxnet.image` module. Similar functionality will be provided in `mxnet.gluon.data`:
    1. Remove `mxnet.image.Augmenter` and all its subclasses → replace with `mxnet.gluon.data.vision.transforms.*`
    2. Remove `mxnet.image.ImageIter` → replace with `mxnet.gluon.data.vision.ImageFolderDataset`, `mxnet.gluon.data.vision.ImageRecordDataset`, or `mxnet.gluon.data.vision.ImageListDataset`
    3. Remove everything in the `mxnet.image.detection` module, including `mxnet.image.DetAugmenter` and `mxnet.image.ImageDetIter` → replace with `mxnet.gluon.data.vision.ImageRecordDataset` or `mxnet.gluon.data.vision.ImageListDataset`

2. Keep the iterators in `mxnet.io`, however:
    1. adapt the iterators to return tuples of NDArrays directly rather than `DataBatch`, to align with `DataLoader`
    2. add an auto-reset mechanism to all iterators, again to align with `DataLoader`
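
The two iterator changes above (tuples instead of `DataBatch`, automatic reset) can be sketched as a generic wrapper. This is a pure-Python illustration of the intended behavior, not MXNet code; the class name and structure are hypothetical.

```python
class AutoResetIter:
    """Wraps a batch source so that iteration yields plain tuples and the
    iterator can be re-iterated without an explicit reset() call, matching
    DataLoader semantics."""

    def __init__(self, batches):
        # `batches` is any re-iterable of (data, label) pairs, standing in
        # for a legacy DataIter that yields DataBatch objects.
        self._batches = batches

    def __iter__(self):
        # Starting a fresh pass over the source on every __iter__ call is
        # what makes the explicit reset() of the legacy iterators unnecessary.
        for data, label in self._batches:
            yield (data, label)  # a tuple of arrays, not a DataBatch

source = [([1, 2], [0]), ([3, 4], [1])]
it = AutoResetIter(source)
first_pass = list(it)
second_pass = list(it)  # no reset() needed between epochs
print(first_pass == second_pass)  # True
```

With this shape, user training loops written against `DataLoader` (`for data, label in loader:`) work unchanged against the adapted `mxnet.io` iterators.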



## Python 2 Support Deprecation

Python 2 has been unmaintained since January 2020. The MXNet 1.6.x series is the last to support Python 2.
See #8703 and the consensus in [5].


## References

[1] https://docs.tvm.ai/dev/runtime.html#packedfunc
[2] https://cwiki.apache.org/confluence/display/MXNET/MXNet+FFI+for+Operator+Imperative+Invocation
[3] https://github.com/apache/incubator-mxnet/pull/17510
[4] https://github.com/apache/incubator-mxnet/issues/16167
[5] https://lists.apache.org/thread.html/r3a2db0f22a1680cc56804191446fef2289595798ca19fd17de1ff03e%40%3Cdev.mxnet.apache.org%3E

-- 
You are receiving this because you are subscribed to this thread.
Reply to this email directly or view it on GitHub:
https://github.com/apache/incubator-mxnet/issues/17676

Re: [apache/incubator-mxnet] [RFC] MXNet 2.0 API Deprecation (#17676)

Posted by Lanking <no...@github.com>.
> @lanking520 @zachgk @terrytangyuan @aaronmarkham could one of you start a discussion in a new issue on the JVM ecosystem support in 2.0? This topic seems to require extended discussion.

I created one here https://github.com/apache/incubator-mxnet/issues/17783

-- 
You are receiving this because you authored the thread.
Reply to this email directly or view it on GitHub:
https://github.com/apache/incubator-mxnet/issues/17676#issuecomment-595923653

Re: [apache/incubator-mxnet] [RFC] MXNet 2.0 API Deprecation (#17676)

Posted by Sheng Zha <no...@github.com>.
The mxnet.rnn module should be deprecated and removed too, given that it's designed for interacting with the symbol API.

-- 
You are receiving this because you are subscribed to this thread.
Reply to this email directly or view it on GitHub:
https://github.com/apache/incubator-mxnet/issues/17676#issuecomment-619624835

Re: [apache/incubator-mxnet] [RFC] MXNet 2.0 API Deprecation (#17676)

Posted by Carin Meier <no...@github.com>.
Is there any way to get the stats on downloads of the Maven Central Scala/Clojure jars to see how much current use there is? Whether the numbers are high or low, and what the trend is, can help shape the decision.

-- 
You are receiving this because you authored the thread.
Reply to this email directly or view it on GitHub:
https://github.com/apache/incubator-mxnet/issues/17676#issuecomment-595511949

Re: [apache/incubator-mxnet] [RFC] MXNet 2.0 API Deprecation (#17676)

Posted by Carin Meier <no...@github.com>.
If I understand this correctly, since the Scala, Java, and Clojure bindings use symbol and ndarray exclusively, this means that they will effectively be deprecated as well.

This is fine if that is what the community decides upon, but it should be called out explicitly.

cc @lanking520 @nswamy @kedarbellare 

-- 
You are receiving this because you authored the thread.
Reply to this email directly or view it on GitHub:
https://github.com/apache/incubator-mxnet/issues/17676#issuecomment-595245038

Re: [apache/incubator-mxnet] [RFC] MXNet 2.0 API Deprecation (#17676)

Posted by Sheng Zha <no...@github.com>.
@yifeim module API will continue to be supported in 1.x and users are free to stay on that version. For 2.x, we will only support numpy/npx API so users who adopt those API will have to reimplement the model anyway. the main function you listed will all be available in 2.x.

-- 
You are receiving this because you are subscribed to this thread.
Reply to this email directly or view it on GitHub:
https://github.com/apache/incubator-mxnet/issues/17676#issuecomment-639131712

Re: [apache/incubator-mxnet] [RFC] MXNet 2.0 API Deprecation (#17676)

Posted by yifeim <no...@github.com>.
> the main function you listed will all be available in 2.x.

👍 

-- 
You are receiving this because you authored the thread.
Reply to this email directly or view it on GitHub:
https://github.com/apache/incubator-mxnet/issues/17676#issuecomment-639266219

Re: [apache/incubator-mxnet] [RFC] MXNet 2.0 API Deprecation (#17676)

Posted by yifeim <no...@github.com>.
Hi there, I too am a little concerned about dropping module support. Since a large percentage of the user base started with the module APIs, dropping that support could alienate those users.

I got familiar around mxnet-1.3. The main functions I appreciate are:
1. sparse vector symbols - which are still best supported on module API or Symbol-Gluon;
2. control flow operators - which allow people to create basically any arbitrary operator that they would want (though it is still not quite arbitrary);
3. linalg package - some quite sophisticated algorithms that would not be easy to replicate.
4. Hybridize - some amount of input validation is desirable in shared libraries. I am not looking for the speed change, but rather the idea that I can easily catch edge cases with a defined graph. To that end, I would like to see more support such as memory estimation / compute time estimation / NaN checking, some of which are available in TVM.

-- 
You are receiving this because you authored the thread.
Reply to this email directly or view it on GitHub:
https://github.com/apache/incubator-mxnet/issues/17676#issuecomment-639087830

Re: [apache/incubator-mxnet] [RFC] MXNet 2.0 API Deprecation (#17676)

Posted by Yuan Tang <te...@gmail.com>.
I vote for "upgrade/rewrite Scala API and bring up MXNet 2.0 features" as
it took us a lot of efforts to bring MXNet to Scala originally and there
are already adopters of Scala API in industries.

On Fri, Mar 6, 2020 at 11:02 AM Wang Jiajun <no...@github.com>
wrote:

> > We may also drop ONNX in MXNet 2. I'm not aware of anyone working on
> ONNX in MXNet and TVM can be used as a replacement.
>
> +1 for keeping ONNX support. Although it has a lot of small problems, but
> I've converted a lot of pytorch models to mxnet for deploying with the
> following pipeline:
>
> https://docs.aws.amazon.com/dlami/latest/devguide/tutorial-onnx-pytorch-mxnet.html
>
> --
> You are receiving this because you authored the thread.
> Reply to this email directly or view it on GitHub:
>
> https://github.com/apache/incubator-mxnet/issues/17676#issuecomment-595835658



-- 
Yuan Tang
https://terrytangyuan.github.io/about/ <http://twitter.com/TerryTangYuan>
<https://terrytangyuan.github.io/about/>

Re: [apache/incubator-mxnet] [RFC] MXNet 2.0 API Deprecation (#17676)

Posted by Wang Jiajun <no...@github.com>.
> We may also drop ONNX in MXNet 2. I'm not aware of anyone working on ONNX in MXNet and TVM can be used as a replacement.

+1 for keeping ONNX support. Although it has a lot of small problems, I've converted a lot of PyTorch models to MXNet for deployment with the following pipeline:
https://docs.aws.amazon.com/dlami/latest/devguide/tutorial-onnx-pytorch-mxnet.html

-- 
You are receiving this because you authored the thread.
Reply to this email directly or view it on GitHub:
https://github.com/apache/incubator-mxnet/issues/17676#issuecomment-595835658

Re: [apache/incubator-mxnet] [RFC] MXNet 2.0 API Deprecation (#17676)

Posted by Tao Lv <no...@github.com>.
We have `v1` and `v2` APIs like:
https://mxnet.incubator.apache.org/api/python/docs/search.html?q=v1
https://mxnet.incubator.apache.org/api/python/docs/search.html?q=v2

Do we need cover them in the RFC? How to deprecate or unify these APIs?


-- 
You are receiving this because you authored the thread.
Reply to this email directly or view it on GitHub:
https://github.com/apache/incubator-mxnet/issues/17676#issuecomment-594984804

Re: [apache/incubator-mxnet] [RFC] MXNet 2.0 API Deprecation (#17676)

Posted by Haibin Lin <no...@github.com>.
Drop the following loss operators since they are used with Module API:
- mx.symbol.LinearRegressionOutput
- mx.symbol.MAERegressionOutput
- mx.symbol.LogisticRegressionOutput
- mx.symbol.SVMOutput
- mx.symbol.SoftmaxOutput


-- 
You are receiving this because you authored the thread.
Reply to this email directly or view it on GitHub:
https://github.com/apache/incubator-mxnet/issues/17676#issuecomment-642389680

Re: [apache/incubator-mxnet] [RFC] MXNet 2.0 API Deprecation (#17676)

Posted by Yuan Tang <no...@github.com>.
+1 for "upgrade/rewrite Scala API and bring up MXNet 2.0 features" as it took us a lot of effort to bring MXNet to Scala originally and there are already adopters of the Scala API in industry.

-- 
You are receiving this because you authored the thread.
Reply to this email directly or view it on GitHub:
https://github.com/apache/incubator-mxnet/issues/17676#issuecomment-595840427

Re: [apache/incubator-mxnet] [RFC] MXNet 2.0 API Deprecation (#17676)

Posted by Przemyslaw Tredak <no...@github.com>.
I am generally in favor of those deprecations. The scariest part is the removal of `mx.module` API, so definitely `Gluon is on par with module in terms of functionality and performance` is very important for this to be successful. 

-- 
You are receiving this because you authored the thread.
Reply to this email directly or view it on GitHub:
https://github.com/apache/incubator-mxnet/issues/17676#issuecomment-594945007

Re: [apache/incubator-mxnet] [RFC] MXNet 2.0 API Deprecation (#17676)

Posted by Sheng Zha <no...@github.com>.
cc @apache/mxnet-committers 

-- 
You are receiving this because you are subscribed to this thread.
Reply to this email directly or view it on GitHub:
https://github.com/apache/incubator-mxnet/issues/17676#issuecomment-591743914

Re: [apache/incubator-mxnet] [RFC] MXNet 2.0 API Deprecation (#17676)

Posted by Carin Meier <no...@github.com>.
Good question. I don't know. There wasn't a new release then. 🤷‍♀ 

-- 
You are receiving this because you authored the thread.
Reply to this email directly or view it on GitHub:
https://github.com/apache/incubator-mxnet/issues/17676#issuecomment-595885111

Re: [apache/incubator-mxnet] [RFC] MXNet 2.0 API Deprecation (#17676)

Posted by Sheng Zha <no...@github.com>.
TensorRT support is currently using ONNX to convert from NNVM: https://github.com/apache/incubator-mxnet/blob/746cbc55fd666bb4529e88d247fed8e0907270f9/src/operator/subgraph/tensorrt/tensorrt.cc#L313-L318 (see also https://github.com/apache/incubator-mxnet/blob/746cbc55fd666bb4529e88d247fed8e0907270f9/src/operator/subgraph/tensorrt/nnvm_to_onnx-inl.h)

Although I would like to see TensorRT support moved away from ONNX with a native integration using the Accelerator API compile support: https://github.com/apache/incubator-mxnet/pull/17623. But the migration from ONNX to AccAPI is still in discussion and the compile support PR is not merged yet (shameless plug: please review! :-D)

Sam

On Feb 28, 2020, at 9:06 PM, JackieWu <no...@github.com> wrote:

I think we should keep the ONNX APIs, since they are able to export many basic models, although they are not perfect. Users will train their models in MXNet 2.0, export an ONNX model, and then use the ONNX model in their deployment frameworks. (http://onnx.ai/supported-tools)

It is useful to attract users to use MXNet 2.0 to train their models with ONNX.

--
You are receiving this because you authored the thread.
Reply to this email directly or view it on GitHub:
https://github.com/apache/incubator-mxnet/issues/17676#issuecomment-592878029



-- 
You are receiving this because you are subscribed to this thread.
Reply to this email directly or view it on GitHub:
https://github.com/apache/incubator-mxnet/issues/17676#issuecomment-593574187

Re: [apache/incubator-mxnet] [RFC] MXNet 2.0 API Deprecation (#17676)

Posted by Zach Kimberg <no...@github.com>.
@gigasquid Yeah, you can view the download statistics from https://repository.apache.org/#central-stat.

-- 
You are receiving this because you authored the thread.
Reply to this email directly or view it on GitHub:
https://github.com/apache/incubator-mxnet/issues/17676#issuecomment-595522607

Re: [apache/incubator-mxnet] [RFC] MXNet 2.0 API Deprecation (#17676)

Posted by Aaron Markham <no...@github.com>.
> Thanks @zachgk - I took a couple of screenshots so I could share here
> 
> Here is the Scala package
> ![scala-mxnet](https://user-images.githubusercontent.com/340299/76096507-3b206a00-5f94-11ea-839a-168fb923a59d.png)
> 
> and here is the Clojure package
> ![clojure-mxnet-downloads](https://user-images.githubusercontent.com/340299/76096528-41164b00-5f94-11ea-8313-5bfbc97ff689.png)
> 
> There are far more Scala downloads than Clojure.
> 
> @lanking520 and other Scala package maintainers. I thank you for all the work that you've done on the Scala package so far. I will support whatever decision makes most sense for the Scala package and for the JVM MXNet users for 2.0.
> 
> Let's just make a plan and coordinate whatever that is so that the current users have the most information to plan accordingly.

What's the big spike in January?

-- 
You are receiving this because you authored the thread.
Reply to this email directly or view it on GitHub:
https://github.com/apache/incubator-mxnet/issues/17676#issuecomment-595882233

Re: [apache/incubator-mxnet] [RFC] MXNet 2.0 API Deprecation (#17676)

Posted by "Joshua Z. Zhang" <no...@github.com>.
Just confirmed that if ndarray is being deprecated, the effort of rewriting model_zoo with `np` and `npx` is minimal.

If consensus is made to move model_zoo for testing purpose to test_utils.py I can follow this up in https://github.com/apache/incubator-mxnet/pull/18480

-- 
You are receiving this because you authored the thread.
Reply to this email directly or view it on GitHub:
https://github.com/apache/incubator-mxnet/issues/17676#issuecomment-668280770

Re: [apache/incubator-mxnet] [RFC] MXNet 2.0 API Deprecation (#17676)

Posted by Carin Meier <no...@github.com>.
Further thinking this through - since the Scala language binding currently provides the base for both Java and Clojure, it would be nice to know what the future plans for the Scala language binding are. Whether or not that path is supported will determine the fate of the other JVM languages.

-- 
You are receiving this because you authored the thread.
Reply to this email directly or view it on GitHub:
https://github.com/apache/incubator-mxnet/issues/17676#issuecomment-595256200

Re: [apache/incubator-mxnet] [RFC] MXNet 2.0 API Deprecation (#17676)

Posted by Carin Meier <no...@github.com>.
Thanks @zachgk  - I took a couple of screenshots so I could share here

Here is the Scala package
![scala-mxnet](https://user-images.githubusercontent.com/340299/76096507-3b206a00-5f94-11ea-839a-168fb923a59d.png)

and here is the Clojure package
![clojure-mxnet-downloads](https://user-images.githubusercontent.com/340299/76096528-41164b00-5f94-11ea-8313-5bfbc97ff689.png)

There are far more Scala downloads than Clojure.

@lanking520 and other Scala package maintainers. I thank you for all the work that you've done on the Scala package so far. I will support whatever decision makes most sense for the Scala package and for the JVM MXNet users for 2.0.

Let's just make a plan and coordinate whatever that is so that the current users have the most information to plan accordingly.



-- 
You are receiving this because you authored the thread.
Reply to this email directly or view it on GitHub:
https://github.com/apache/incubator-mxnet/issues/17676#issuecomment-595818419

Re: [apache/incubator-mxnet] [RFC] MXNet 2.0 API Deprecation (#17676)

Posted by Sheng Zha <no...@github.com>.
I vote for "upgrade/rewrite Scala API and bring up MXNet 2.0 features" as
it took us a lot of efforts to bring MXNet to Scala originally and there
are already adopters of Scala API in industries.

On Fri, Mar 6, 2020 at 11:02 AM Wang Jiajun <no...@github.com>
wrote:

> > We may also drop ONNX in MXNet 2. I'm not aware of anyone working on
> ONNX in MXNet and TVM can be used as a replacement.
>
> +1 for keeping ONNX support. Although it has a lot of small problems, but
> I've converted a lot of pytorch models to mxnet for deploying with the
> following pipeline:
>
> https://docs.aws.amazon.com/dlami/latest/devguide/tutorial-onnx-pytorch-mxnet.html
>
> --
> You are receiving this because you authored the thread.
> Reply to this email directly or view it on GitHub:
>
> https://github.com/apache/incubator-mxnet/issues/17676#issuecomment-595835658



-- 
Yuan Tang
https://terrytangyuan.github.io/about/ <http://twitter.com/TerryTangYuan>
<https://terrytangyuan.github.io/about/>


-- 
You are receiving this because you are subscribed to this thread.
Reply to this email directly or view it on GitHub:
https://github.com/apache/incubator-mxnet/issues/17676#issuecomment-595840489

Re: [apache/incubator-mxnet] [RFC] MXNet 2.0 API Deprecation (#17676)

Posted by Sheng Zha <no...@github.com>.
Since the Gluon built-in model zoo is being deprecated and some tests still rely on its models, these models will be moved to test_utils.py

-- 
You are receiving this because you are subscribed to this thread.
Reply to this email directly or view it on GitHub:
https://github.com/apache/incubator-mxnet/issues/17676#issuecomment-667741284

Re: [apache/incubator-mxnet] [RFC] MXNet 2.0 API Deprecation (#17676)

Posted by Leonard Lausen <no...@github.com>.
NNPACK is currently only supported in the Makefile build (https://github.com/apache/incubator-mxnet/issues/15974), which will be removed. I think oneDNN (mkldnn) replaced it and we can remove it. Any concerns?

-- 
You are receiving this because you authored the thread.
Reply to this email directly or view it on GitHub:
https://github.com/apache/incubator-mxnet/issues/17676#issuecomment-659039903

Re: [apache/incubator-mxnet] [RFC] MXNet 2.0 API Deprecation (#17676)

Posted by Sheng Zha <no...@github.com>.
@TaoLv the search result shows API in the following categories:
- operator (these will be deprecated and the newest version should be covered in https://github.com/apache/incubator-mxnet/issues/17096)
- gluon blocks (e.g. Con**v1**D). they are not legacy ops and will be kept
- io (these will be deprecated and replacement is covered in 2.0 roadmap item 4.8 data API enhancement)
- model zoo (e.g. ResNet**V1**). they are not legacy API and will be kept

-- 
You are receiving this because you are subscribed to this thread.
Reply to this email directly or view it on GitHub:
https://github.com/apache/incubator-mxnet/issues/17676#issuecomment-595018894

Re: [apache/incubator-mxnet] [RFC] MXNet 2.0 API Deprecation (#17676)

Posted by Sheng Zha <no...@github.com>.
@zhreshold thanks for bringing this up. Currently the test for that is also the longest-running one, so if there's no objection I hope that we can move forward with removing it soon

-- 
You are receiving this because you are subscribed to this thread.
Reply to this email directly or view it on GitHub:
https://github.com/apache/incubator-mxnet/issues/17676#issuecomment-615394771

Re: [apache/incubator-mxnet] [RFC] MXNet 2.0 API Deprecation (#17676)

Posted by Leonard Lausen <no...@github.com>.
We may also drop ONNX in MXNet 2. I'm not aware of anyone working on ONNX in MXNet and TVM can be used as a replacement.

-- 
You are receiving this because you authored the thread.
Reply to this email directly or view it on GitHub:
https://github.com/apache/incubator-mxnet/issues/17676#issuecomment-592782629

Re: [apache/incubator-mxnet] [RFC] MXNet 2.0 API Deprecation (#17676)

Posted by Leonard Lausen <no...@github.com>.
The models in the model zoo rely on ndarray operators which are currently recommended for removal. Thus keeping these models in test_utils.py won't work when proceeding with the operator removal.

-- 
You are receiving this because you authored the thread.
Reply to this email directly or view it on GitHub:
https://github.com/apache/incubator-mxnet/issues/17676#issuecomment-668129995

Re: [apache/incubator-mxnet] [RFC] MXNet 2.0 API Deprecation (#17676)

Posted by Tao Lv <no...@github.com>.
What about those v1, v2 APIs?

Re: [apache/incubator-mxnet] [RFC] MXNet 2.0 API Deprecation (#17676)

Posted by Sheng Zha <no...@github.com>.
Caffe usage is very low now, so let's deprecate the Caffe converter too.

Re: [apache/incubator-mxnet] [RFC] MXNet 2.0 API Deprecation (#17676)

Posted by Sheng Zha <no...@github.com>.
@lanking520 @zachgk @terrytangyuan @aaronmarkham could one of you start a discussion in a new issue on the JVM ecosystem support in 2.0? This topic seems to require extended discussion.

Re: [apache/incubator-mxnet] [RFC] MXNet 2.0 API Deprecation (#17676)

Posted by "Joshua Z. Zhang" <no...@github.com>.
In the long run, the Gluon vision model zoo will be maintained in GluonCV, so mxnet.gluon.model_zoo.vision should be deprecated in 2.0 to avoid duplicate maintenance effort.

Re: [apache/incubator-mxnet] [RFC] MXNet 2.0 API Deprecation (#17676)

Posted by "Skalicky, Sam" <ss...@amazon.com.INVALID>.
TensorRT support currently uses ONNX to convert from NNVM: https://github.com/apache/incubator-mxnet/blob/746cbc55fd666bb4529e88d247fed8e0907270f9/src/operator/subgraph/tensorrt/tensorrt.cc#L313-L318<https://github.com/apache/incubator-mxnet/blob/746cbc55fd666bb4529e88d247fed8e0907270f9/src/operator/subgraph/tensorrt/nnvm_to_onnx-inl.h>

That said, I would like to see TensorRT support move away from ONNX to a native integration using the Accelerator API compile support: https://github.com/apache/incubator-mxnet/pull/17623. However, the migration from ONNX to the Accelerator API is still under discussion, and the compile-support PR is not merged yet (shameless plug: please review! :-D).

Sam

On Feb 28, 2020, at 9:06 PM, JackieWu <no...@github.com> wrote:

I think we should keep the ONNX APIs, since they can export many basic models, even though the support is not perfect. Users will train their models in MXNet 2.0, export them to ONNX, and then use the ONNX model in their deployment frameworks (http://onnx.ai/supported-tools).

ONNX export is useful for attracting users to train their models with MXNet 2.0.

Re: [apache/incubator-mxnet] [RFC] MXNet 2.0 API Deprecation (#17676)

Posted by JackieWu <no...@github.com>.
I think we should keep the ONNX APIs, since they can export many basic models, even though the support is not perfect. Users will train their models in MXNet 2.0, export them to ONNX, and then use the ONNX model in their deployment frameworks (http://onnx.ai/supported-tools).

ONNX export is useful for attracting users to train their models with MXNet 2.0.

Re: [apache/incubator-mxnet] [RFC] MXNet 2.0 API Deprecation (#17676)

Posted by Lanking <no...@github.com>.
@gigasquid @zachgk Since the Scala API was built a while ago, I can see it depends on several deprecated sections: Module, DataparallelGroup, Symbol, etc. Most of the training components would become invalid. There are three possible approaches:
- upgrade/rewrite the Scala API to bring in MXNet 2.0 features.
- drop the Scala API.
- adopt DJL (DeepJavaLibrary) to be the Java/Scala/Clojure backend.

I am not sure which approach works best; please share any thoughts you have.
