Posted to dev@mxnet.apache.org by Zach Boldyga <za...@scalabull.com> on 2019/02/09 20:08:25 UTC

First class support for MxNet?

Any plans to take an approach similar to this for the MxNet library?

https://github.com/tensorflow/swift/blob/master/docs/WhySwiftForTensorFlow.md

-Zach

Re: First class support for MxNet?

Posted by Zach Boldyga <za...@scalabull.com>.
Really interesting stuff, Iblis. Thanks for sharing! I'm excited to stick
around and absorb :D

Zach Boldyga
Scalabull  |  Founder
1 (866) 846-8771 x 101


On Mon, Feb 11, 2019 at 6:25 AM Carin Meier <ca...@gmail.com> wrote:

> +100 on Iblis's thoughts:
>
> "We know tools and frameworks keep changing.
> People learn lessons from making and attempting.
> It's just the path of human technology evolution.
> The point is the ideas/experiences
> with which this community is going to surprise you."
>
> - Carin
>
>
> On Mon, Feb 11, 2019 at 9:08 AM iblis <ib...@hs.ntnu.edu.tw> wrote:
>
> > Well, I'm not going to talk about technical stuff.
> > You can find some design concepts in the docs or wiki.
> > (
> >
> https://mxnet.incubator.apache.org/versions/master/architecture/index.html
> > )
> >
> > For me, working on MXNet is a rare chance to verify my ideas about
> > machine learning frameworks.
> > While implementing the MXNet Julia package, I can directly compare the
> > experience of MXNet with Flux's
> > ...and then start complaining about them. :p
> > I think one way to move forward is comparison.
> > So that's why I said I want to increase the diversity of DL tools in
> Julia.
> >
> > I like the spirit of portability in the MXNet community.
> > We welcome all language packages and are open-minded.
> > Although some of these languages might not be considered popular in ML/DL,
> > this community still keeps polishing them day in and day out.
> > Yeah, someone has to try them, compare, and gain experience from this
> > process regardless of how the language has been evaluated in ML.
> > The experience is valuable.
> > (e.g. I think the lack of function overloading is a disadvantage
> >   of Python, while the file-based namespaces do help with maintainability
> >   in Python.
> >   After doing some work in Julia, I can clearly point out the pros and cons.)
> >
> >  From a long-term view... maybe twenty years from now,
> > none of the languages we are using now will be popular.
> > But I believe the meta-rules extracted from these experiences will still apply.
> >
> > So... why not have a Rust lib? Maybe Rust's macros can do something crazy.
> > e.g. the Julia package shows a more elegant way to stack a network than Python,
> > thanks to metaprogramming.
> >
> >    mlp = @mx.chain mx.Variable(:data)             =>
> >      mx.FullyConnected(name=:fc1, num_hidden=128) =>
> >      mx.Activation(name=:relu1, act_type=:relu)   =>
> >      mx.FullyConnected(name=:fc2, num_hidden=64)  =>
> >      mx.Activation(name=:relu2, act_type=:relu)   =>
> >      mx.FullyConnected(name=:fc3, num_hidden=10)  =>
> >      mx.SoftmaxOutput(name=:softmax)
> >
> >
> > > Wondering where that leaves MxNet...
> >
> > Actually, I don't care about this issue.
> > We know tools and frameworks keep changing.
> > People learn lessons from making and attempting.
> > It's just the path of human technology evolution.
> > The point is the ideas/experiences
> > with which this community is going to surprise you.
> >
> >
> > Iblis Lin
> > 林峻頤
> >
> > On 2/11/19 12:04 PM, Zach Boldyga wrote:
> > > Those are compelling points! There's also another more recent follow-up
> > > from the Julia team:
> > https://julialang.org/blog/2018/12/ml-language-compiler
> > > .
> > >
> > > It seems that Julia will likely have its place in ML regardless of how
> > > other tools progress; the latest offerings from Julia/Flux are really
> > > compelling.
> > >
> > > Wondering where that leaves MxNet...
> > >
> > > Zach Boldyga
> > > Scalabull  |  Founder
> > > 1 (866) 846-8771 x 101
> > >
> >
>

Re: First class support for MxNet?

Posted by Carin Meier <ca...@gmail.com>.
+100 on Iblis's thoughts:

"We know tools and frameworks keep changing.
People learn the lesson from making and attempting.
It's just the path of the human technology evolution.
The point is the ideas/experiences
which this community is going to surprise you at."

- Carin


On Mon, Feb 11, 2019 at 9:08 AM iblis <ib...@hs.ntnu.edu.tw> wrote:

> Well, I'm not going to talk about technical stuff.
> You can find some design concepts in the docs or wiki.
> (
> https://mxnet.incubator.apache.org/versions/master/architecture/index.html
> )
>
> For me, working on MXNet is a rare chance to verify my ideas about
> machine learning frameworks.
> While implementing the MXNet Julia package, I can directly compare the
> experience of MXNet with Flux's
> ...and then start complaining about them. :p
> I think one way to move forward is comparison.
> So that's why I said I want to increase the diversity of DL tools in Julia.
>
> I like the spirit of portability in the MXNet community.
> We welcome all language packages and are open-minded.
> Although some of these languages might not be considered popular in ML/DL,
> this community still keeps polishing them day in and day out.
> Yeah, someone has to try them, compare, and gain experience from this
> process regardless of how the language has been evaluated in ML.
> The experience is valuable.
> (e.g. I think the lack of function overloading is a disadvantage
>   of Python, while the file-based namespaces do help with maintainability
>   in Python.
>   After doing some work in Julia, I can clearly point out the pros and cons.)
>
>  From a long-term view... maybe twenty years from now,
> none of the languages we are using now will be popular.
> But I believe the meta-rules extracted from these experiences will still
> apply.
>
> So... why not have a Rust lib? Maybe Rust's macros can do something crazy.
> e.g. the Julia package shows a more elegant way to stack a network than Python,
> thanks to metaprogramming.
>
>    mlp = @mx.chain mx.Variable(:data)             =>
>      mx.FullyConnected(name=:fc1, num_hidden=128) =>
>      mx.Activation(name=:relu1, act_type=:relu)   =>
>      mx.FullyConnected(name=:fc2, num_hidden=64)  =>
>      mx.Activation(name=:relu2, act_type=:relu)   =>
>      mx.FullyConnected(name=:fc3, num_hidden=10)  =>
>      mx.SoftmaxOutput(name=:softmax)
>
>
> > Wondering where that leaves MxNet...
>
> Actually, I don't care about this issue.
> We know tools and frameworks keep changing.
> People learn lessons from making and attempting.
> It's just the path of human technology evolution.
> The point is the ideas/experiences
> with which this community is going to surprise you.
>
>
> Iblis Lin
> 林峻頤
>
> On 2/11/19 12:04 PM, Zach Boldyga wrote:
> > Those are compelling points! There's also another more recent follow-up
> > from the Julia team:
> https://julialang.org/blog/2018/12/ml-language-compiler
> > .
> >
> > It seems that Julia will likely have its place in ML regardless of how
> > other tools progress; the latest offerings from Julia/Flux are really
> > compelling.
> >
> > Wondering where that leaves MxNet...
> >
> > Zach Boldyga
> > Scalabull  |  Founder
> > 1 (866) 846-8771 x 101
> >
>

Re: First class support for MxNet?

Posted by iblis <ib...@hs.ntnu.edu.tw>.
Well, I'm not going to talk about technical stuff.
You can find some design concepts in the docs or wiki.
(https://mxnet.incubator.apache.org/versions/master/architecture/index.html)

For me, working on MXNet is a rare chance to verify my ideas about
machine learning frameworks.
While implementing the MXNet Julia package, I can directly compare the
experience of MXNet with Flux's
...and then start complaining about them. :p
I think one way to move forward is comparison.
So that's why I said I want to increase the diversity of DL tools in Julia.

I like the spirit of portability in the MXNet community.
We welcome all language packages and are open-minded.
Although some of these languages might not be considered popular in ML/DL,
this community still keeps polishing them day in and day out.
Yeah, someone has to try them, compare, and gain experience from this
process regardless of how the language has been evaluated in ML.
The experience is valuable.
(e.g. I think the lack of function overloading is a disadvantage
  of Python, while the file-based namespaces do help with maintainability
  in Python.
  After doing some work in Julia, I can clearly point out the pros and cons;
  a small sketch of what I mean by overloading follows below.)
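
To make the overloading point concrete, here is a tiny illustrative Julia
snippet (just an example I made up, not MXNet code): the same function name
gets a separate method per argument type via multiple dispatch, which plain
Python only approximates with workarounds such as functools.singledispatch.

   # Julia picks the most specific method for the given argument types.
   describe(x::Integer)        = "an integer: $x"
   describe(x::AbstractFloat)  = "a float: $x"
   describe(x::AbstractString) = "a string: $x"

   println(describe(42))       # an integer: 42
   println(describe(3.14))     # a float: 3.14
   println(describe("mxnet"))  # a string: mxnet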

 From a long-term view... maybe twenty years from now,
none of the languages we are using now will be popular.
But I believe the meta-rules extracted from these experiences will still apply.

So... why not have a Rust lib? Maybe Rust's macros can do something crazy.
e.g. the Julia package shows a more elegant way to stack a network than Python,
thanks to metaprogramming.

   mlp = @mx.chain mx.Variable(:data)             =>
     mx.FullyConnected(name=:fc1, num_hidden=128) =>
     mx.Activation(name=:relu1, act_type=:relu)   =>
     mx.FullyConnected(name=:fc2, num_hidden=64)  =>
     mx.Activation(name=:relu2, act_type=:relu)   =>
     mx.FullyConnected(name=:fc3, num_hidden=10)  =>
     mx.SoftmaxOutput(name=:softmax)
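
For comparison, this is roughly what the chain is sugar for: threading the
previous node into each call by hand (a sketch based on the MXNet.jl symbolic
API; treat the exact argument forms as an assumption):

   data  = mx.Variable(:data)
   fc1   = mx.FullyConnected(data, name=:fc1, num_hidden=128)
   relu1 = mx.Activation(fc1, name=:relu1, act_type=:relu)
   fc2   = mx.FullyConnected(relu1, name=:fc2, num_hidden=64)
   relu2 = mx.Activation(fc2, name=:relu2, act_type=:relu)
   fc3   = mx.FullyConnected(relu2, name=:fc3, num_hidden=10)
   mlp   = mx.SoftmaxOutput(fc3, name=:softmax)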


> Wondering where that leaves MxNet...

Actually, I don't care about this issue.
We know tools and frameworks keep changing.
People learn lessons from making and attempting.
It's just the path of human technology evolution.
The point is the ideas/experiences
with which this community is going to surprise you.


Iblis Lin
林峻頤

On 2/11/19 12:04 PM, Zach Boldyga wrote:
> Those are compelling points! There's also another more recent follow-up
> from the Julia team: https://julialang.org/blog/2018/12/ml-language-compiler
> .
> 
> It seems that Julia will likely have its place in ML regardless of how
> other tools progress; the latest offerings from Julia/Flux are really
> compelling.
> 
> Wondering where that leaves MxNet...
> 
> Zach Boldyga
> Scalabull  |  Founder
> 1 (866) 846-8771 x 101
> 

Re: First class support for MxNet?

Posted by Zach Boldyga <za...@scalabull.com>.
Those are compelling points! There's also another more recent follow-up
from the Julia team: https://julialang.org/blog/2018/12/ml-language-compiler
.

It seems that Julia will likely have its place in ML regardless of how
other tools progress; the latest offerings from Julia/Flux are really
compelling.

Wondering where that leaves MxNet...

Zach Boldyga
Scalabull  |  Founder
1 (866) 846-8771 x 101


On Sat, Feb 9, 2019 at 11:02 PM Iblis Lin <ib...@hs.ntnu.edu.tw> wrote:

> (Well, I'm a Julia programmer, so my opinion might be quite biased. :p)
>
> No. I think Python is still dominating at this moment.
> I agree with the Julia blog post about ML and PL
> (it's also mentioned in that Swift article):
>    https://julialang.org/blog/2017/12/ml&pl
>
>    (Chinese version)
>    https://julialang.org/blog/2017/12/ml&pl-cn
>    https://julialang.org/blog/2017/12/ml&pl-zh_tw
>
> TL;DR from my view:
> (Quote from the blog)
> "Any sufficiently complicated machine learning system contains an ad-hoc,
> informally-specified, bug-ridden, slow implementation of half of a
> programming language."
>
> Runtime & Ecosystem
> >   Basically, I will say that TensorFlow/MXNet/PyTorch are different standalone
> >   programming languages for a specific domain -- numerical computation.
> >   They use Python as their interface to build models.
>   Where do the models get computed? In their own runtime.
>   This runtime shares nothing with CPython's runtime.
> >   The user puts "+-*/" symbols and placeholders in Python,
>   but nothing is computed by CPython.
>
> >   So... what's the problem with having their own runtime?
> >   In the case of TF/MXNet/PyTorch, they split from and throw away the original
> >   ecosystem.
> >   For example, MXNet has its own array type, 'NDArray'.
> >     This type only runs on our own runtime (libmxnet).
> >     You have to abandon the great work done by scikit-learn and the
> >     ecosystem of the SciPy project, which people have already devoted tons of
> >     effort to.
> >     You need to rewrite a port for NDArray if you want something like
> >     a Gaussian Process.
> >     And this builds a wall between libmxnet and the numpy runtime.
>
> >   I feel sorry about another example:
> >
> > https://mxnet.incubator.apache.org/versions/master/api/python/ndarray/linalg.html
> >   This API was added about a year ago (or half a year ago?).
> >   It made me anxious.
> >   Tons of numerical systems have more robust and versatile linear algebra
> >   functions.
> >   But some MXNet developers have to spend their valuable time implementing
> >   linalg stuff again.
>
> About Julia's ecosystem
> >   (Although selling Julia is not the point.)
> >   Let's talk about what the Julia community has done on integrating its ecosystem.
> >   There is a package named Flux.jl[1].
> >   It fully utilizes Julia's native Array type and runtime.
> >   For a CPU run, the code is written in pure Julia, and the performance is quite
> >   competitive[2] given that all the code is written in a high-level language.
> >   So I can do some small experiments on my FreeBSD desktop
> >   without compiling any C/C++ extension.
> >   For GPU runs, there is a crazy package, CUDAnative.jl[3], that lets users write
> >   kernel code in pure Julia. It leverages LLVM's PTX backend.
> >   This package is backed by the JuliaGPU[4] community.
> >   As for AD stuff, it's supported by another group of people from JuliaDiff[5],
> >   who are doing research on ODEs/PDEs.
> >   Flux integrates them all and becomes a part of the ecosystem as well.
> >   If a user wants to use some exotic statistical distributions, just plug in
> >   another package from JuliaStats[6].
>
> > Any plans to take an approach similar to this for the MxNet library?
>
> >   TBH, I'm selfish. My answer is Julia. I only care about Julia stuff.
> >   I'm trying to make more re-use of interfaces from Julia's stdlib and runtime.
> >   It's a challenge. I hope the MXNet Julia package is more than a binding and
> >   connects with the ecosystem.
> >
> > So... you might ask why I'm here working on MXNet?
> >   I want to increase the entropy of DL tools in Julia.
> >   I think freedom is the symbol of the open source world;
> >   users should always have another choice of software.
> >   I personally dislike the state of TF -- being a huge, closed ecosystem.
> >   Many people are porting stuff into TF's system and nothing is fed back
> >   (<del> the backprop got truncated :p </del>).
> >   I think Julia can find a balance point between MXNet's ecosystem and the
> >   original one.
>
>
> [1] https://fluxml.ai/
> [2]
> https://github.com/avik-pal/DeepLearningBenchmarks#cpu-used-----intelr-xeonr-silver-4114-cpu--220ghz
> [3] https://github.com/JuliaGPU/CUDAnative.jl
> [4] https://github.com/JuliaGPU
> [5] https://github.com/JuliaDiff
> [6] https://github.com/JuliaStats
>
> Iblis Lin
> 林峻頤
>
> On 2/10/19 4:08 AM, Zach Boldyga wrote:
> > Any plans to take an approach similar to this for the MxNet library?
> >
> >
> https://github.com/tensorflow/swift/blob/master/docs/WhySwiftForTensorFlow.md
> >
> > -Zach
> >
>

Re: First class support for MxNet?

Posted by Iblis Lin <ib...@hs.ntnu.edu.tw>.
(Well, I'm a Julia programmer, so my opinion might be quite biased. :p)

No. I think Python is still dominating at this moment.
I agree with the Julia blog post about ML and PL
(it's also mentioned in that Swift article):
   https://julialang.org/blog/2017/12/ml&pl

   (Chinese version)
   https://julialang.org/blog/2017/12/ml&pl-cn
   https://julialang.org/blog/2017/12/ml&pl-zh_tw

TL;DR from my view:
(Quote from the blog)
"Any sufficiently complicated machine learning system contains an ad-hoc,
informally-specified, bug-ridden, slow implementation of half of a programming language."

Runtime & Ecosystem
  Basically, I will say that TensorFlow/MXNet/PyTorch are different standalone
  programming languages for a specific domain -- numerical computation.
  They use Python as their interface to build models.
  Where do the models get computed? In their own runtime.
  This runtime shares nothing with CPython's runtime.
  The user puts "+-*/" symbols and placeholders in Python,
  but nothing is computed by CPython.

  So... what's the problem with having their own runtime?
  In the case of TF/MXNet/PyTorch, they split from and throw away the original
  ecosystem.
  For example, MXNet has its own array type, 'NDArray'.
    This type only runs on our own runtime (libmxnet).
    You have to abandon the great work done by scikit-learn and the
    ecosystem of the SciPy project, which people have already devoted tons of
    effort to.
    You need to rewrite a port for NDArray if you want something like
    a Gaussian Process.
    And this builds a wall between libmxnet and the numpy runtime.
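
  The same wall shows up from the Julia side. A minimal sketch with MXNet.jl
  (illustrative only; the calls follow MXNet.jl's NDArray API as I remember it,
  so treat the exact names as an assumption):

    using MXNet

    x = mx.zeros(2, 3)   # an NDArray allocated and scheduled by libmxnet
    y = x + x            # the addition runs in libmxnet's engine, not in Julia itself
    z = copy(y)          # crossing the wall: copy the result back into a native Julia Array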

  I feel sorry about another example:
  https://mxnet.incubator.apache.org/versions/master/api/python/ndarray/linalg.html
  This API was added about a year ago (or half a year ago?).
  It made me anxious.
  Tons of numerical systems have more robust and versatile linear algebra functions.
  But some MXNet developers have to spend their valuable time implementing linalg
  stuff again.

About Julia's ecosystem
  (Although selling Julia is not the point.)
  Let's talk about what the Julia community has done on integrating its ecosystem.
  There is a package named Flux.jl[1]; a minimal sketch of it follows below.
  It fully utilizes Julia's native Array type and runtime.
  For a CPU run, the code is written in pure Julia, and the performance is quite
  competitive[2] given that all the code is written in a high-level language.
  So I can do some small experiments on my FreeBSD desktop
  without compiling any C/C++ extension.
  For GPU runs, there is a crazy package, CUDAnative.jl[3], that lets users write
  kernel code in pure Julia. It leverages LLVM's PTX backend.
  This package is backed by the JuliaGPU[4] community.
  As for AD stuff, it's supported by another group of people from JuliaDiff[5],
  who are doing research on ODEs/PDEs.
  Flux integrates them all and becomes a part of the ecosystem as well.
  If a user wants to use some exotic statistical distributions, just plug in
  another package from JuliaStats[6].
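
  Here is that minimal Flux sketch (illustrative only; the layer sizes are
  arbitrary): a model is ordinary Julia code operating on native arrays.

    using Flux

    # A small MLP built directly on Julia's native Array type.
    model = Chain(
        Dense(784, 128, relu),
        Dense(128, 64, relu),
        Dense(64, 10),
        softmax)

    x = rand(Float32, 784)   # an ordinary Julia Array, no framework-specific type
    y = model(x)             # the forward pass is plain Julia all the way down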

> Any plans to take an approach similar to this for the MxNet library?

  TBH, I'm selfish. My answer is Julia. I only care about Julia stuff.
  I'm trying to make more re-use of interfaces from Julia's stdlib and runtime.
  It's a challenge. I hope the MXNet Julia package is more than a binding and
  connects with the ecosystem.

So... you might ask why I'm here working on MXNet?
  I want to increase the entropy of DL tools in Julia.
  I think freedom is the symbol of the open source world;
  users should always have another choice of software.
  I personally dislike the state of TF -- being a huge, closed ecosystem.
  Many people are porting stuff into TF's system and nothing is fed back
  (<del> the backprop got truncated :p </del>).
  I think Julia can find a balance point between MXNet's ecosystem and the original one.


[1] https://fluxml.ai/
[2] https://github.com/avik-pal/DeepLearningBenchmarks#cpu-used-----intelr-xeonr-silver-4114-cpu--220ghz
[3] https://github.com/JuliaGPU/CUDAnative.jl
[4] https://github.com/JuliaGPU
[5] https://github.com/JuliaDiff
[6] https://github.com/JuliaStats

Iblis Lin
林峻頤

On 2/10/19 4:08 AM, Zach Boldyga wrote:
> Any plans to take an approach similar to this for the MxNet library?
> 
> https://github.com/tensorflow/swift/blob/master/docs/WhySwiftForTensorFlow.md
> 
> -Zach
>