Posted to dev@horn.apache.org by Baran Topal <ba...@barantopal.com> on 2016/10/31 12:22:20 UTC

Re: Use vector instead of Iterable in neuron API

Hi Edward;

I was looking at this for a while but got stuck. I was changing the code a
lot, so I think my approach may not be correct.

For learning purposes, I would appreciate it if another developer could take
the case so that I can see the solution.

Br.

2016-09-07 14:10 GMT+02:00 Baran Topal <ba...@barantopal.com>:

> Thanks Edward.
>
> I just cloned the repo, created a fork, and updated it.
>
> I created the following case:
>
> https://issues.apache.org/jira/browse/HORN-31
>
> Please assign it to me.
>
> Br.
>
> 2016-09-07 1:06 GMT+02:00 Edward J. Yoon <ed...@samsung.com>:
> > If you're using Git, you're probably using pull requests. Here's info
> > about pull requests:
> > https://cwiki.apache.org/confluence/display/HORN/How+To+Contribute
> >
> > And, please feel free to file this issue on JIRA:
> > http://issues.apache.org/jira/browse/HORN
> >
> > Then, I'll assign it to you!
> >
> > --
> > Best Regards, Edward J. Yoon
> >
> >
> > -----Original Message-----
> > From: Baran Topal [mailto:barantopal@barantopal.com]
> > Sent: Wednesday, September 07, 2016 8:02 AM
> > To: dev@horn.incubator.apache.org
> > Subject: Re: Use vector instead of Iterable in neuron API
> >
> > Hi Edward,
> >
> > Many thanks. I will update this as soon as possible.
> >
> > Br.
> >
> > On Wednesday, September 7, 2016, Edward J. Yoon <edward.yoon@samsung.com>
> > wrote:
> >
> >> > directly into FloatVector and get rid of Synapse totally?
> >>
> >> Yes, I think that's clearer.
> >>
> >> Internally, each task processes the neurons' computation on its
> >> assigned data split.
> >> LayeredNeuralNetwork.java contains everything; you can find
> >> "n.forward(msg);" and "n.backward(msg);" there. Before calling the
> >> forward() or backward() method, we can set the weight vector associated
> >> with the neuron using the setWeightVector() method, and set the argument
> >> for the forward() method like below:
> >>
> >> n.setWeightVector(weightMatrix.get(neuronID)); // this neuron's weight vector
> >> n.forward(input); // outputs from the previous layer, as a FloatVector
> >>
> >> Then, the user-side program can be written like below:
> >>
> >> forward(FloatVector input) {
> >>   // getWeightVector() returns the weight vector associated with
> >>   // this neuron.
> >>   this.getWeightVector();
> >>   float vectorSum = 0f;
> >>   for (float element : input) {
> >>     vectorSum += element;
> >>   }
> >> }
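
[Editor's note: as a self-contained sketch of the proposed pattern — using a
hypothetical SimpleFloatVector and SketchNeuron as stand-ins, since Horn's
actual FloatVector and Neuron APIs may differ — the weighted sum a neuron
computes in forward() reduces to a single dot product:]

```java
// Hypothetical stand-in for Horn's FloatVector; the real class differs.
final class SimpleFloatVector {
    private final float[] values;

    SimpleFloatVector(float[] values) { this.values = values; }

    // Dot product with another vector of the same length.
    float dot(SimpleFloatVector other) {
        float sum = 0f;
        for (int i = 0; i < values.length; i++) {
            sum += values[i] * other.values[i];
        }
        return sum;
    }
}

// Sketch of a neuron under the proposed vector-based API: the framework
// calls setWeightVector() before forward(), and forward() needs only a
// dot product instead of iterating over Synapse messages.
final class SketchNeuron {
    private SimpleFloatVector weights;

    void setWeightVector(SimpleFloatVector w) { this.weights = w; }

    SimpleFloatVector getWeightVector() { return weights; }

    float forward(SimpleFloatVector input) {
        return input.dot(getWeightVector());
    }
}

public class ProposedApiSketch {
    static float run() {
        SketchNeuron n = new SketchNeuron();
        // plays the role of weightMatrix.get(neuronID) in the framework
        n.setWeightVector(new SimpleFloatVector(new float[] {1.0f, 0.4f}));
        // outputs from the previous layer
        return n.forward(new SimpleFloatVector(new float[] {1.0f, 0.5f}));
    }

    public static void main(String[] args) {
        System.out.println(run()); // 1.0*1.0 + 0.5*0.4 = 1.2
    }
}
```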
> >>
> >> If you have any trouble, please don't hesitate to ask here.
> >>
> >> --
> >> Best Regards, Edward J. Yoon
> >>
> >>
> >> -----Original Message-----
> >> From: Baran Topal [mailto:barantopal@barantopal.com]
> >> Sent: Tuesday, September 06, 2016 11:42 PM
> >> To: dev@horn.incubator.apache.org
> >> Subject: Re: Use vector instead of Iterable in neuron API
> >>
> >> Hi team and Edward;
> >>
> >> I have been checking this and got stuck on how to convert the list
> >> structure to DenseFloatVector. Can you help with this?
> >>
> >> Let me explain:
> >>
> >> I saw that the concrete FloatVector is essentially an array-backed
> >> structure, which is not really compatible with Synapse and
> >> FloatWritables. Is the aim to convert the Synapse construction logic
> >> directly into FloatVector and get rid of Synapse entirely?
> >>
> >> //
> >>
> >> In the original code, we pass a list of Synapse objects holding
> >> FloatWritables. I can see that a Synapse can be constructed from a
> >> neuron id and two FloatWritables.
> >>
> >> What I tried:
> >>
> >> 1) I added the following function in Neuron.java
> >>
> >>   public DenseFloatVector getWeightVector() {
> >>     DenseFloatVector dfv = new DenseFloatVector(getWeights());
> >>     return dfv;
> >>   }
> >>
> >> 2) I added the following function in NeuronInterface.java
> >>
> >> public void forward2(FloatVector messages) throws IOException;
> >>
> >> 3) I added the following function in TestNeuron.java
> >>
> >>     public void forward2(FloatVector messages) throws IOException {
> >>
> >>       long startTime = System.nanoTime();
> >>
> >>       float sum = messages.dot(this.getWeightVector());
> >>       this.feedforward(this.squashingFunction.apply(sum));
> >>
> >>       long endTime = System.nanoTime();
> >>
> >>       System.out.println("Execution time for the forward2 function is: "
> >>           + (endTime - startTime));
> >>
> >>     }
> >> 4) I tried to refactor testProp() in TestNeuron.java in several ways,
> >> but unfortunately failed with a runtime error because the weight has
> >> no value.
> >>
> >>   MyNeuron n_ = new MyNeuron();
> >>
> >>     FloatVector ds = new DenseFloatVector();
> >>
> >>     Iterator<Synapse<FloatWritable, FloatWritable>> li = x_.iterator();
> >>     // Iterator<FloatVector.FloatVectorElement> ie = ds.iterate();
> >>
> >>     while(li.hasNext()) {
> >>       // ds.add(li.next());
> >>       // ie.
> >>       Synapse<FloatWritable, FloatWritable> ee = li.next();
> >>
> >>       ds.set(ee.getSenderID(), ee.getMessage());
> >>
> >>     }
> >>
> >>     float[] ff = new float[2];
> >>     ff[0] = 1.0f;
> >>     ff[1] = 0.5f;
> >>
> >>     float[] ffa = new float[2];
> >>     ffa[0] = 1.0f;
> >>     ffa[1] = 0.4f;
> >>
> >>     DenseFloatVector dss = new DenseFloatVector(ff);
> >>     DenseFloatVector dssa = new DenseFloatVector(ffa);
> >>
> >>     dss.add(dssa);
> >>
> >>
> >>     FloatWritable a = new FloatWritable(1.0f);
> >>     FloatWritable b = new FloatWritable(0.5f);
> >>
> >>     Synapse s = new Synapse(0, a, b);
> >>
> >>
> >>
> >>   //  dss.set(1, 0.5f);
> >>     //dss.set(1, 0.4f);
> >>
> >>    // DenseFloatVector ds = new DenseFloatVector(ff);
> >>
> >>     n_.forward2(dss); //forward2
> >>
> >>
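
[Editor's note: for reference, one way to do the stuck step above — turning a
list of (senderID, message) pairs into a dense vector — is to size the array
from the message count first and then fill it by sender id. This sketch uses
a hypothetical SimpleSynapse as a stand-in for Horn's
Synapse<FloatWritable, FloatWritable>:]

```java
import java.util.Arrays;
import java.util.List;

// Hypothetical minimal stand-in for Horn's Synapse<FloatWritable, FloatWritable>:
// it carries a sender id and a message value.
final class SimpleSynapse {
    final int senderID;
    final float message;

    SimpleSynapse(int senderID, float message) {
        this.senderID = senderID;
        this.message = message;
    }
}

public class SynapseToVector {
    // Build a dense float[] indexed by sender id. Sizing the array up
    // front avoids writing into an empty, zero-length structure.
    static float[] toDense(List<SimpleSynapse> messages) {
        float[] dense = new float[messages.size()];
        for (SimpleSynapse s : messages) {
            dense[s.senderID] = s.message;
        }
        return dense;
    }

    public static void main(String[] args) {
        List<SimpleSynapse> msgs = Arrays.asList(
            new SimpleSynapse(0, 1.0f),
            new SimpleSynapse(1, 0.5f));
        System.out.println(Arrays.toString(toDense(msgs))); // [1.0, 0.5]
    }
}
```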
> >>
> >> 2016-09-05 0:55 GMT+02:00 Baran Topal <barantopal@barantopal.com>:
> >> > Hi;
> >> >
> >> > Thanks I am on it.
> >> >
> >> > Br.
> >> >
> >> > 2016-09-04 4:16 GMT+02:00 Edward J. Yoon <edwardyoon@apache.org>:
> >> >> P.S., so, if you want to test more, please see FloatVector and
> >> >> DenseFloatVector.
> >> >>
> >> >> On Sun, Sep 4, 2016 at 11:13 AM, Edward J. Yoon <edwardyoon@apache.org>
> >> >> wrote:
> >> >>> Once we change the iterable input messages to the vector, we can
> >> >>> change the legacy code like below:
> >> >>>
> >> >>> public void forward(FloatVector input) {
> >> >>>   float sum = input.dot(this.getWeightVector());
> >> >>>   this.feedforward(this.squashingFunction.apply(sum));
> >> >>> }
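
[Editor's note: to check that this refactor preserves behavior, here is a
small equivalence sketch, using plain float arrays as hypothetical stand-ins
for Horn's message and vector types, comparing the legacy per-message loop
with the dot-product form:]

```java
public class ForwardEquivalence {
    // Legacy style: walk the messages one by one, multiplying each input
    // by its weight and accumulating the sum.
    static float legacySum(float[] inputs, float[] weights) {
        float sum = 0f;
        for (int i = 0; i < inputs.length; i++) {
            sum += inputs[i] * weights[i];
        }
        return sum;
    }

    // Proposed style: the same weighted sum expressed as one dot product,
    // which a vector library (or a GPU kernel) can optimize as a whole.
    static float dot(float[] a, float[] b) {
        float sum = 0f;
        for (int i = 0; i < a.length; i++) {
            sum += a[i] * b[i];
        }
        return sum;
    }

    public static void main(String[] args) {
        float[] inputs  = {1.0f, 0.5f, 0.25f};
        float[] weights = {0.2f, 0.4f, 0.8f};
        // Same accumulation order, so the results are bit-identical.
        System.out.println(legacySum(inputs, weights) == dot(inputs, weights));
    }
}
```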
> >> >>>
> >> >>>
> >> >>>
> >> >>>
> >> >>> On Sat, Sep 3, 2016 at 11:10 PM, Baran Topal <barantopal@barantopal.com>
> >> >>> wrote:
> >> >>>> Sure.
> >> >>>>
> >> >>>> In the attached, TestNeuron.txt,
> >> >>>>
> >> >>>> 1) I put // baran as a comment for the added functions.
> >> >>>>
> >> >>>> 2) The added functions and created objects have _ as suffix
> >> >>>>
> >> >>>> (e.g. backward_)
> >> >>>>
> >> >>>>
> >> >>>> A correction: above test execution time values were via
> >> >>>> System.nanoTime().
> >> >>>>
> >> >>>> Br.
> >> >>>>
> >> >>>> 2016-09-03 14:05 GMT+02:00 Edward J. Yoon <edwardyoon@apache.org>:
> >> >>>>> Interesting. Can you share your test code?
> >> >>>>>
> >> >>>>> On Sat, Sep 3, 2016 at 2:17 AM, Baran Topal <barantopal@barantopal.com>
> >> >>>>> wrote:
> >> >>>>>> Hi Edward and team;
> >> >>>>>>
> >> >>>>>> I had a brief test refactoring Iterable to Vector in
> >> >>>>>> TestNeuron.java, and I can see some improved times. I didn't check
> >> >>>>>> the other existing test methods, but it seems the execution times
> >> >>>>>> improve for both forwarding and backwarding.
> >> >>>>>>
> >> >>>>>> These values are via System.currentTimeMillis().
> >> >>>>>>
> >> >>>>>> E.g.
> >> >>>>>>
> >> >>>>>>
> >> >>>>>> Execution time for the forward function is: 5722329
> >> >>>>>> Execution time for the backward function is: 31825
> >> >>>>>>
> >> >>>>>> Execution time for the refactored forward function is: 72330
> >> >>>>>> Execution time for the refactored backward function is: 4665
> >> >>>>>>
> >> >>>>>> Br.
> >> >>>>>>
> >> >>>>>> 2016-09-02 2:14 GMT+02:00 Yeonhee Lee <ssallys0130@gmail.com>:
> >> >>>>>>> Hi Edward,
> >> >>>>>>>
> >> >>>>>>> If we don't have that kind of method in the neuron, I guess it's
> >> >>>>>>> appropriate to put the method in the neuron.
> >> >>>>>>> That can be one of the distinct features of Horn.
> >> >>>>>>>
> >> >>>>>>> Regards,
> >> >>>>>>> Yeonhee
> >> >>>>>>>
> >> >>>>>>>
> >> >>>>>>> 2016-08-26 9:40 GMT+09:00 Edward J. Yoon <edward.yoon@samsung.com>:
> >> >>>>>>>
> >> >>>>>>>> Hi folks,
> >> >>>>>>>>
> >> >>>>>>>> Our current neuron API is designed like:
> >> >>>>>>>> https://github.com/apache/incubator-horn/blob/master/README.md#programming-model
> >> >>>>>>>>
> >> >>>>>>>> In the forward() method, each neuron receives pairs of inputs
> >> >>>>>>>> x1, x2, ..., xn from other neurons and weights w1, w2, ..., wn,
> >> >>>>>>>> like below:
> >> >>>>>>>>
> >> >>>>>>>>   public void forward(Iterable<M> messages) throws IOException;
> >> >>>>>>>>
> >> >>>>>>>> Instead of this, I suggest that we use just vector like below:
> >> >>>>>>>>
> >> >>>>>>>>   /**
> >> >>>>>>>>    * @param input vector from other neurons
> >> >>>>>>>>    */
> >> >>>>>>>>   public void forward(Vector input) throws IOException;
> >> >>>>>>>>
> >> >>>>>>>> And, the neuron provides a getWeightVector() method that
> >> >>>>>>>> returns the weight vector associated with itself. I think this
> >> >>>>>>>> makes more sense than the current version, and will make it
> >> >>>>>>>> easier to use a GPU in the future.
> >> >>>>>>>>
> >> >>>>>>>> What do you think?
> >> >>>>>>>>
> >> >>>>>>>> Thanks.
> >> >>>>>>>>
> >> >>>>>>>> --
> >> >>>>>>>> Best Regards, Edward J. Yoon
> >> >>>>>>>>
> >> >>>>>>>>
> >> >>>>>>>>
> >> >>>>>>>>
> >> >>>>>>>>
> >> >>>>>
> >> >>>>>
> >> >>>>>
> >> >>>>> --
> >> >>>>> Best Regards, Edward J. Yoon
> >> >>>
> >> >>>
> >> >>>
> >> >>> --
> >> >>> Best Regards, Edward J. Yoon
> >> >>
> >> >>
> >> >>
> >> >> --
> >> >> Best Regards, Edward J. Yoon
> >>
> >>
> >>
> >>
> >
> >
>

Re: Use vector instead of Iterable in neuron API

Posted by Baran Topal <ba...@barantopal.com>.
Hi Edward,

Thanks. I will.

Br.

On Tuesday, November 1, 2016, Edward J. Yoon <ed...@samsung.com> wrote:

> Sure, I'll help with it. If you can try to change the user-side code as we
> discussed, I think I can change the internal parts.
>
> --
> Best Regards, Edward J. Yoon
>

RE: Use vector instead of Iterable in neuron API

Posted by "Edward J. Yoon" <ed...@samsung.com>.
Sure, I'll help with it. If you can try to change the user-side code as we
discussed, I think I can change the internal parts.

--
Best Regards, Edward J. Yoon

