Posted to dev@mxnet.apache.org by Markus Weimer <ma...@weimo.de> on 2017/06/29 18:59:40 UTC

NNVM status?

Hi,

I came across the NNVM GitHub which states

> MXNet is moving to NNVM as its intermediate representation layer for symbolic graphs.

Is this still the plan? And, will NNVM be part of Apache MXNet?

Thanks,

Markus

Re: NNVM status?

Posted by Mu Li <mu...@gmail.com>.
NNVM defines the graph structure and how to write a pass function. What
you referred to also includes a concrete definition of the operator
interface, e.g. the lapack/blas interface.

A similar project is DLPack (https://github.com/dmlc/dlpack); it currently
defines the tensor interface, but not yet the operator level.
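
For readers unfamiliar with DLPack, a tensor interface of this kind can be
sketched roughly as follows. This is an illustrative Python mirror of a
DLPack-style tensor descriptor: the field names loosely follow DLPack's C
`DLTensor` struct, but the class and helper below are hypothetical, not
DLPack's actual API.

```python
from dataclasses import dataclass
from typing import Optional, Tuple

@dataclass
class DLTensorSketch:
    """Illustrative Python mirror of a DLPack-style tensor descriptor."""
    data: int                      # opaque pointer/handle to the buffer
    device_type: str               # e.g. "cpu" or "cuda"
    device_id: int                 # which device holds the buffer
    ndim: int                      # number of dimensions
    dtype: str                     # e.g. "float32"
    shape: Tuple[int, ...]         # extent of each dimension
    strides: Optional[Tuple[int, ...]] = None  # None = compact row-major

def num_elements(t: DLTensorSketch) -> int:
    """Total element count implied by the shape."""
    n = 1
    for d in t.shape:
        n *= d
    return n

t = DLTensorSketch(data=0, device_type="cpu", device_id=0,
                   ndim=2, dtype="float32", shape=(3, 4))
print(num_elements(t))  # 12
```

The point of such an interface is that any framework can describe a tensor
it owns without copying it; consumers only need to agree on the descriptor.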

A complete DL/ML stack will include

1. A frontend that users see
2. Computation graph IR, a serializable format to define the computation
flow, such as c = a*b+1
3. Operator IR, a common operator interface, e.g. +: a:tensor, b:tensor,
res:tensor
4. Compilers to optimize these IRs
5. Executor to run the workloads

Currently we have

1. apache/incubator-mxnet
2. dmlc/NNVM
3. dmlc/DLPack, dmlc/TVM/topi
4. pass on dmlc/NNVM, dmlc/TVM
5. Executor in apache/incubator-mxnet
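
As an illustration of points 2 and 3 above, a minimal graph IR for
c = a*b+1, together with one pass over it, might look like the toy sketch
below. All names here are hypothetical; this is not NNVM's actual API, just
the shape of the abstraction.

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class Node:
    """One node in a tiny symbolic computation graph."""
    op: str                 # "var", "const", "mul", or "add"
    inputs: tuple = ()      # upstream Node operands
    value: float = 0.0      # payload for "const" nodes
    name: str = ""          # payload for "var" nodes

def var(name):   return Node("var", name=name)
def const(v):    return Node("const", value=v)
def mul(x, y):   return Node("mul", (x, y))
def add(x, y):   return Node("add", (x, y))

# Build the graph for c = a*b + 1
a, b = var("a"), var("b")
c = add(mul(a, b), const(1.0))

# A "pass" is just a function that walks the graph. This one evaluates
# it given bindings for the variables; other passes could infer shapes,
# fold constants, or plan memory in the same way.
def eval_pass(node, env):
    if node.op == "var":
        return env[node.name]
    if node.op == "const":
        return node.value
    x, y = (eval_pass(i, env) for i in node.inputs)
    return x * y if node.op == "mul" else x + y

print(eval_pass(c, {"a": 2.0, "b": 3.0}))  # 7.0
```

In NNVM the same ideas appear as a C++ graph structure plus registered pass
functions; the sketch only shows how graph IR and operator interface
separate cleanly.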

On Wed, Aug 23, 2017 at 1:54 PM, Markus Weimer <ma...@weimo.de> wrote:

> Warming up this thread :)
>
> I thought a bit about this and wonder: Would a shared type system of
> all the ML projects in the ASF be a good place to start?
>
> Many data science projects I observe seem to be using several of our
> tools, mixed with other open source and proprietary software. This
> introduces all sorts of inefficiencies and breakages, many of which
> have their root cause in the different type systems used (e.g., "wait,
> -1 means missing value here? I thought we used 0?").
>
> If we could agree on a type system here in the ASF, it could provide a
> north star for the community at large. Maybe we can even collaborate
> with Apache Avro to make sure that the whole type system has a defined
> serialized form.
>
> WDYT? Do you share these observations? Is a shared type
> system feasible and interesting? If so, we can start a cross-project
> thread on it.
>
> Thanks,
>
> Markus
>

Re: NNVM status?

Posted by Markus Weimer <ma...@weimo.de>.
Warming up this thread :)

I thought a bit about this and wonder: Would a shared type system of
all the ML projects in the ASF be a good place to start?

Many data science projects I observe seem to be using several of our
tools, mixed with other open source and proprietary software. This
introduces all sorts of inefficiencies and breakages, many of which
have their root cause in the different type systems used (e.g., "wait,
-1 means missing value here? I thought we used 0?").

If we could agree on a type system here in the ASF, it could provide a
north star for the community at large. Maybe we can even collaborate
with Apache Avro to make sure that the whole type system has a defined
serialized form.
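
To make the sentinel problem above concrete, here is a toy Python sketch of
what an explicit missing-value convention buys. A real shared type system
would pin this down in a schema language instead (e.g. an Avro-style union
["null", "int"]); the function name below is made up for illustration.

```python
from typing import List, Optional

# With an explicit None for "missing", there is no sentinel to disagree on.
def mean_ignoring_missing(xs: List[Optional[int]]) -> float:
    """Mean of the present values, skipping explicit missing entries."""
    present = [x for x in xs if x is not None]
    return sum(present) / len(present)

print(mean_ignoring_missing([3, None, 5]))  # 4.0

# Without a shared convention, the same logical data is ambiguous:
tool_a = [3, -1, 5]   # here -1 means "missing"
tool_b = [3, 0, 5]    # here 0 means "missing" -- silently different semantics
```

The schema-level fix is to make nullability part of the type, so every tool
that reads the data agrees on what "missing" looks like on the wire.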

WDYT? Do you share these observations? Is a shared type
system feasible and interesting? If so, we can start a cross-project
thread on it.

Thanks,

Markus

Re: NNVM status?

Posted by Chris Mattmann <ma...@apache.org>.
Thanks this is a fascinating conversation to follow.

Cheers,
Chris




On 6/29/17, 1:23 PM, "Mu Li" <limu.cn@gmail.com on behalf of muli.cmu@gmail.com> wrote:

    +1 for Apache deep learning stack
    
    There are a few exciting projects such as NNVM, DLPack and TVM going on.
    
    On Thu, Jun 29, 2017 at 1:15 PM, Markus Weimer <ma...@weimo.de> wrote:
    
    > On Thu, Jun 29, 2017 at 1:00 PM, Tianqi Chen <tq...@cs.washington.edu>
    > wrote:
    > > Having a separate repo gives non-mxnet projects the possibility to use
    > > it, e.g. https://github.com/tqchen/tinyflow; NVidia is also considering
    > > using NNVM as their graph IR layer. Putting things into mxnet makes
    > > maintenance with MXNet easier.
    >
    > +1 on keeping it a separate artifact. However, that is not in conflict
    > to moving it into the MXNet project. There doesn't have to be a 1:1
    > relationship between Apache projects and the artifacts they produce.
    > The Apache MXNet community could develop and maintain both mxnet and
    > nnvm.
    >
    > > The ideal case would be to have an Apache deep learning stack that
    > > contains NNVM, MXNet, etc.
    >
    > +1! There is no need for that to be a coordinated effort in one
    > project, though. As long as all the projects involved are somewhat
    > apprised of one another and there is community overlap, this can work
    > well. The example of that is the Apache Big Data stack, which consists
    > of many top- and sublevel projects that all, kinda sorta, work
    > together.
    >
    > Markus
    >
    



Re: NNVM status?

Posted by Mu Li <mu...@gmail.com>.
+1 for Apache deep learning stack

There are a few exciting projects such as NNVM, DLPack and TVM going on.

On Thu, Jun 29, 2017 at 1:15 PM, Markus Weimer <ma...@weimo.de> wrote:

> On Thu, Jun 29, 2017 at 1:00 PM, Tianqi Chen <tq...@cs.washington.edu>
> wrote:
> > Having a separate repo gives non-mxnet projects the possibility to use
> > it, e.g. https://github.com/tqchen/tinyflow; NVidia is also considering
> > using NNVM as their graph IR layer. Putting things into mxnet makes
> > maintenance with MXNet easier.
>
> +1 on keeping it a separate artifact. However, that is not in conflict
> to moving it into the MXNet project. There doesn't have to be a 1:1
> relationship between Apache projects and the artifacts they produce.
> The Apache MXNet community could develop and maintain both mxnet and
> nnvm.
>
> > The ideal case would be to have an Apache deep learning stack that
> > contains NNVM, MXNet, etc.
>
> +1! There is no need for that to be a coordinated effort in one
> project, though. As long as all the projects involved are somewhat
> apprised of one another and there is community overlap, this can work
> well. The example of that is the Apache Big Data stack, which consists
> of many top- and sublevel projects that all, kinda sorta, work
> together.
>
> Markus
>

Re: NNVM status?

Posted by Markus Weimer <ma...@weimo.de>.
On Thu, Jun 29, 2017 at 1:00 PM, Tianqi Chen <tq...@cs.washington.edu> wrote:
> Having a separate repo gives non-mxnet projects the possibility to use it,
> e.g. https://github.com/tqchen/tinyflow; NVidia is also considering using
> NNVM as their graph IR layer. Putting things into mxnet makes maintenance
> with MXNet easier.

+1 on keeping it a separate artifact. However, that is not in conflict
to moving it into the MXNet project. There doesn't have to be a 1:1
relationship between Apache projects and the artifacts they produce.
The Apache MXNet community could develop and maintain both mxnet and
nnvm.

> The ideal case would be to have an Apache deep learning stack that
> contains NNVM, MXNet, etc.

+1! There is no need for that to be a coordinated effort in one
project, though. As long as all the projects involved are somewhat
apprised of one another and there is community overlap, this can work
well. The example of that is the Apache Big Data stack, which consists
of many top- and sublevel projects that all, kinda sorta, work
together.

Markus

Re: NNVM status?

Posted by Tianqi Chen <tq...@cs.washington.edu>.
We can keep an open discussion on this. There are pros and cons of having a
separate nnvm repo vs. merging it into mxnet.

Having a separate repo gives non-mxnet projects the possibility to use it,
e.g. https://github.com/tqchen/tinyflow; NVidia is also considering using
NNVM as their graph IR layer. Putting things into mxnet makes maintenance
with MXNet easier. In terms of project design, the isolation is clear and
there should be no problem having a separate module.

The ideal case would be to have an Apache deep learning stack that
contains NNVM, MXNet, etc. That goes beyond what we might be able to
tackle in this project and will require consensus from more folks.

Tianqi

On Thu, Jun 29, 2017 at 12:48 PM, Markus Weimer <ma...@weimo.de> wrote:

> On Thu, Jun 29, 2017 at 12:41 PM, Markus Weimer <ma...@weimo.de> wrote:
> > Sounds good! What license is NNVM under?
>
> Now that was an unnecessary question :)  Sorry about that. I see that
> it is ASL 2.0 licensed. Would you consider putting it into Apache
> MXNet? There are plenty of players in this space that might be more
> comfortable picking it up that way.
>
> Markus
>

Re: NNVM status?

Posted by Markus Weimer <ma...@weimo.de>.
On Thu, Jun 29, 2017 at 12:41 PM, Markus Weimer <ma...@weimo.de> wrote:
> Sounds good! What license is NNVM under?

Now that was an unnecessary question :)  Sorry about that. I see that
it is ASL 2.0 licensed. Would you consider putting it into Apache
MXNet? There are plenty of players in this space that might be more
comfortable picking it up that way.

Markus

Re: NNVM status?

Posted by Markus Weimer <ma...@weimo.de>.
Sounds good! What license is NNVM under? It is one of the key
dependencies of MXNet, right? -- Markus

On Thu, Jun 29, 2017 at 12:02 PM, Tianqi Chen <tq...@cs.washington.edu> wrote:
> Yes, MXNet is already using NNVM as its IR. As for migration, the current
> migration concerns only MXNet, as NNVM is a relatively isolated part and
> is being used in multiple places.
>
> Tianqi
> On Thu, Jun 29, 2017 at 3:00 PM Markus Weimer <ma...@weimo.de> wrote:
>
>> Hi,
>>
>> I came across the NNVM GitHub which states
>>
>> > MXNet is moving to NNVM as its intermediate representation layer for
>> symbolic graphs.
>>
>> Is this still the plan? And, will NNVM be part of Apache MXNet?
>>
>> Thanks,
>>
>> Markus
>>

Re: NNVM status?

Posted by Tianqi Chen <tq...@cs.washington.edu>.
Yes, MXNet is already using NNVM as its IR. As for migration, the current
migration concerns only MXNet, as NNVM is a relatively isolated part and is
being used in multiple places.

Tianqi
On Thu, Jun 29, 2017 at 3:00 PM Markus Weimer <ma...@weimo.de> wrote:

> Hi,
>
> I came across the NNVM GitHub which states
>
> > MXNet is moving to NNVM as its intermediate representation layer for
> symbolic graphs.
>
> Is this still the plan? And, will NNVM be part of Apache MXNet?
>
> Thanks,
>
> Markus
>