Posted to cvs@incubator.apache.org by Apache Wiki <wi...@apache.org> on 2017/12/31 23:51:30 UTC

[Incubator Wiki] Update of "January2018" by smarthi

Dear Wiki user,

You have subscribed to a wiki page or wiki category on "Incubator Wiki" for change notification.

The "January2018" page has been changed by smarthi:
https://wiki.apache.org/incubator/January2018?action=diff&rev1=24&rev2=25

Comment:
Adding MxNet Podling Report

  --------------------
  MXNet
  
- A Flexible and Efficient Library for Deep Learning
+ Apache MXNet is an open-source, distributed, high-performance deep learning framework that allows users to define, train, and deploy deep neural networks on a wide array of devices, from cloud infrastructure to mobile devices. It supports a flexible programming model and multiple language bindings, and it lets users mix symbolic and imperative programming flavors to maximize both efficiency and productivity. MXNet is built on a dynamic dependency scheduler that automatically parallelizes both symbolic and imperative operations on the fly; a graph optimization layer on top of the scheduler makes symbolic execution fast and memory efficient. The library is portable and lightweight, and it scales to multiple GPUs and multiple machines, allowing for fast model training.
  
+ 
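+ As a minimal sketch of the mixed imperative/symbolic programming model described above, the Gluon API lets a network first run eagerly and then be hybridized into a symbolic graph (the layer sizes and input below are illustrative assumptions, not taken from this report):
+ 
+   import mxnet as mx
+   from mxnet.gluon import nn
+ 
+   # Illustrative network; the layer widths here are arbitrary assumptions.
+   net = nn.HybridSequential()
+   with net.name_scope():
+       net.add(nn.Dense(64, activation='relu'))
+       net.add(nn.Dense(10))
+   net.initialize()
+ 
+   x = mx.nd.ones((8, 100))   # dummy input batch
+   y_eager = net(x)           # imperative execution, operation by operation
+ 
+   net.hybridize()            # switch to symbolic (graph) execution
+   y_graph = net(x)           # first call builds and caches the graph; later calls reuse it
+ 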
- MXNet has been incubating since 2017-01-23.
+ Apache MXNet has been incubating since 2017-01-23.
  
  Three most important issues to address in the move towards graduation:
  
-   1.
-   2.
-   3.
+   1. Establish a predictable release process consistent with the Apache Way -- ONGOING.
+   2. Grow the community -- ONGOING.
+   3. Bring the website up to Apache standards -- ONGOING.
+   4. Identify remaining ICLAs or SGAs that need signing -- ONGOING.
  
  Any issues that the Incubator PMC (IPMC) or ASF Board wish/need to be
  aware of?
  
+  None
  
- 
- How has the community developed since the last report?
- 
- 
- 
- How has the project developed since the last report?
+ How has the community developed since the last report?
  
+ a) Various Slack channels, the dev@ mailing list, and the user discussion forum (http://discuss.mxnet.io) are being used actively. Contributors have been working to hold as many discussions as possible on the public dev@ mailing list. When discussions do happen in private, they are eventually brought to dev@ with all perspectives well represented. This is an ongoing improvement effort; the goal is to limit private discussion to a small group and an early stage, so that the broader Apache MXNet community gets a fair chance to influence the final outcome of each discussion.
+  
+  b) O’Reilly published a series of blog posts about Apache MXNet, covering sentiment analysis, deep matrix factorization, and MXNet in the Wolfram Language:
+ https://www.oreilly.com/ideas/sentiment-analysis-with-apache-mxnet
+ https://www.oreilly.com/ideas/deep-matrix-factorization-using-apache-mxnet
+ https://www.oreilly.com/ideas/apache-mxnet-in-the-wolfram-language
+  
+  c) A blog post published on 25-Oct about BMXNet, an open-source binary neural network implementation based on MXNet: https://aws.amazon.com/blogs/ai/research-spotlight-bmxnet-an-open-source-binary-neural-network-implementation-based-on-mxnet/
+  
+  d) A blog post published on 01-Nov about the availability of NVIDIA Volta GPU support and sparse tensor support: https://aws.amazon.com/blogs/ai/apache-mxnet-release-adds-support-for-new-nvidia-volta-gpus-and-sparse-tensor/
+  
+  e) A blog post published on 08-Nov showing how Apache MXNet 0.12 extends Gluon functionality: https://aws.amazon.com/blogs/ai/apache-mxnet-version-0-12-extends-gluon-functionality-to-support-cutting-edge-research/
+  
+  f) A blog post published on 08-Nov introducing Model Server for MXNet: https://aws.amazon.com/blogs/ai/introducing-model-server-for-apache-mxnet/
+  
+  g) A blog post published on 07-Nov demonstrating the performance and scalability of MXNet: https://techburst.io/mxnet-the-real-world-deep-learning-framework-2690e56ef81f
+  
+  h) Members of the community have conducted open meetups to share information on Apache MXNet: https://www.meetup.com/Apache-MXNet-learning-group/
+  
+ i) Talks on Apache MXNet have been given at universities and conferences around the world, including in the US and China:
+ https://www.youtube.com/watch?v=me1qOzSg8MU
+ https://www.youtube.com/watch?v=9IrvDHRQaaA
+ https://www.youtube.com/watch?v=4PbSZRYXa3o
+ https://www.youtube.com/watch?v=RRy-3VXA0nw
+  
+ j) Apache MXNet 1.0 was released on 04-Dec-2017 with extensive support and help from various community members and timely guidance from the Apache MXNet mentors.
  
+ How has the project developed since the last report?
- How would you assess the podling's maturity?
- Please feel free to add your own commentary.
  
+ a.  The community released MXNet 1.0, which is production-ready, simplifies the deep learning experience, and significantly improves performance; its new features are described here: https://blogs.apache.org/mxnet/entry/milestone-v1-0-release-for
+  
+ b.  Based on GitHub insights (https://github.com/apache/incubator-mxnet/pulse/monthly): in Dec 2017, 51 authors pushed 115 commits to master, updating 489 files with 13K additions and 9K deletions. For comparison, in Sep 2017, 62 authors pushed 171 commits to master, updating 467 files with 26K additions and 7K deletions, and in Jul 2017, 54 authors pushed 140 commits to master, updating 358 files with 22K additions and 3K deletions. We are working on attracting more contributors to the project.
+  
+ c. Documentation, including architecture guides, how-tos, tutorials, and API references, continues to be improved.
+ 
+ d. Support for Perl language bindings was added, contributed by Sergey Kolychev.
+  
+ e. More advanced features (e.g. sparse tensors, advanced indexing, gradient compression) and bug fixes requested by the user community continue to be added.
+ 
+ f. The community took complete end-to-end ownership of the continuous integration process to enable reliable testing on a wide range of back ends (from IoT devices to GPU clusters).
+ 
+ 
+ 
+ How would you assess the podling's maturity?
+ 
+   The podling is still being established at the ASF; hence, maturity is low.
+ 
+ Please feel free to add your own commentary.
+ 
    [ ] Initial setup
    [ ] Working towards first release
-   [ ] Community building
+   [X] Community building
    [ ] Nearing graduation
    [ ] Other:
  
  Date of last release:
  
-   XXXX-XX-XX
+   2017-12-04
  
  When were the last committers or PPMC members elected?
  
+ Sergey Kolychev was elected as a committer and PPMC member in October 2017 for contributing the Perl language bindings.
+ There are plans to invite more contributors to become committers in early 2018.
  
  
  Signed-off-by:

---------------------------------------------------------------------
To unsubscribe, e-mail: cvs-unsubscribe@incubator.apache.org
For additional commands, e-mail: cvs-help@incubator.apache.org