Posted to dev@horn.apache.org by "Edward J. Yoon" <ed...@samsung.com> on 2016/08/26 00:40:36 UTC

Use vector instead of Iterable in neuron API

Hi folks,

Our current neuron API is designed like this:
https://github.com/apache/incubator-horn/blob/master/README.md#programming-model

In the forward() method, each neuron receives pairs of inputs x1, x2,
..., xn from other neurons together with weights w1, w2, ..., wn, like below:

  public void forward(Iterable<M> messages) throws IOException;

Instead of this, I suggest that we use just a vector, like below:

  /**
   * @param input vector from other neurons
   */
  public void forward(Vector input) throws IOException;

In addition, the neuron would provide a getWeightVector() method that returns
the weight vector associated with itself. I think this makes more sense than
the current version, and it would make it easier to use GPUs in the future.
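To make the proposal concrete, here is a rough sketch in plain Java of what a
neuron might look like under the vector-based API. SketchNeuron, the float[]
stand-in for Vector, and the sigmoid squashing function are illustrative
assumptions for this sketch, not HORN code:

```java
// Sketch only: under the proposed API the neuron no longer iterates over
// (input, weight) message pairs; it receives the previous layer's outputs
// as one vector and looks up its own weights via getWeightVector().
class SketchNeuron {
  private float[] weights; // set by the framework before forward() is called

  void setWeightVector(float[] w) { this.weights = w; }
  float[] getWeightVector() { return weights; }

  // Proposed forward(): one weighted sum (a dot product) plus squashing.
  float forward(float[] input) {
    float sum = 0f;
    for (int i = 0; i < input.length; i++) {
      sum += input[i] * weights[i];
    }
    return sigmoid(sum);
  }

  static float sigmoid(float x) {
    return (float) (1.0 / (1.0 + Math.exp(-x)));
  }
}
```

The dot-product form is also what makes the GPU argument plausible: a batched
weighted sum maps naturally onto BLAS-style kernels, whereas an Iterable of
message objects does not.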

What do you think?

Thanks.

--
Best Regards, Edward J. Yoon





RE: Use vector instead of Iterable in neuron API

Posted by "Edward J. Yoon" <ed...@samsung.com>.
Sorry, I never tried to build on Windows. You can build master with the
following command: % mvn install

The Hadoop stack is required only when setting up the fully distributed mode on
a cluster. Local mode doesn't require Hadoop.

--
Best Regards, Edward J. Yoon

-----Original Message-----
From: Baran Topal [mailto:barantopal@barantopal.com]
Sent: Thursday, September 01, 2016 8:18 PM
To: dev@horn.incubator.apache.org
Subject: Re: Use vector instead of Iterable in neuron API





Re: Use vector instead of Iterable in neuron API

Posted by Baran Topal <ba...@barantopal.com>.
Hi Edward and team;

I was checking the code and ran into some trouble while building it with
Maven. I tried master and ended up with different dependency problems. I also
tried incubator-horn-d88a785. The core problem is that, for some reason, the
POM content at the end of the file is not recognized.

<artifactItems>
  <artifactItem>
    <groupId>org.slf4j</groupId>
    <artifactId>slf4j-log4j12</artifactId>
    <outputDirectory>${project.basedir}/lib</outputDirectory>
  </artifactItem>
  <artifactItem>
    <groupId>org.slf4j</groupId>
    <artifactId>slf4j-api</artifactId>
    <outputDirectory>${project.basedir}/lib</outputDirectory>
  </artifactItem>
</artifactItems>
<excludeTransitive>true</excludeTransitive>

<fileMode>755</fileMode>

And possibly due to this, the test case I am writing is failing with
java.lang.NoClassDefFoundError:
org/apache/commons/collections/map/UnmodifiableMap

1) Are you sure that the artifactItem placement is correct? (This is
happening in both d88a785 and master.)

2) I am using IntelliJ 14.1.4 with JDK 1.8.0_45 and the Maven 3 setup
bundled with IntelliJ. In addition, I am on Windows, and the file mode
does not seem right to me.

3) A newbie question: Do I need a Hadoop stack in place to test the code?

4) Should we have a session sometime? Maybe we could record it, covering
these basic questions and setups, so the barrier for newbies would be lower
and the project would get more attention.

Br.


Re: Use vector instead of Iterable in neuron API

Posted by "Edward J. Yoon" <ed...@apache.org>.
Hi,

Good point. My aim was to parallelize neuron objects using
multi-threading. The original purpose of the iterable messages is related
to a memory consumption issue (message serialization). BTW, threads
within threads seem possible [1].

And, in the forward() method, we usually compute the weighted sum. This
computation can be parallelized and made thread safe.

1. http://stackoverflow.com/questions/7224670/threads-within-threads-in-java
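As a hedged illustration of the point above (generic Java streams, not HORN's
actual threading model): each term of the weighted sum is independent, so it
can be computed in parallel and combined with a reduction, with no shared
mutable state to guard.

```java
import java.util.stream.IntStream;

// Sketch only: a parallel weighted sum. Each index term input[i] * weights[i]
// is independent, so the work forks across threads and the partial products
// are combined with an associative reduction.
class ParallelWeightedSum {
  static float dot(float[] input, float[] weights) {
    return (float) IntStream.range(0, input.length)
        .parallel()                                    // fork-join over indices
        .mapToDouble(i -> (double) input[i] * weights[i])
        .sum();                                        // associative reduction
  }
}
```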





-- 
Best Regards, Edward J. Yoon

Re: Use vector instead of Iterable in neuron API

Posted by Baran Topal <ba...@barantopal.com>.
Hi;

First message, please be kind :)

Is thread safety an important aspect in this context? Vectors are
thread-safe, but their complexity (e.g. removal of an element) might not
be very desirable for performant applications. I will try to benchmark the
two solutions and report back once I have time.

Br.
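For what it's worth, if "Vector" here refers to java.util.Vector: its methods
are synchronized, and removing from the middle or front is O(n) because the
tail is shifted left. A minimal sketch of that cost (generic Java, unrelated
to a math-style FloatVector, which is not a collection):

```java
import java.util.Vector;

// Sketch only: remove(0) on a java.util.Vector shifts every remaining
// element one slot left, so repeated front-removal is quadratic overall,
// on top of the per-call synchronization cost.
class VectorRemovalCost {
  static float removeFront(Vector<Float> v) {
    return v.remove(0); // synchronized call; O(n) element shift
  }
}
```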


Re: Use vector instead of Iterable in neuron API

Posted by Baran Topal <ba...@barantopal.com>.
Hi Edward,

Thanks. I will.

Br.

On Tuesday, November 1, 2016, Edward J. Yoon <ed...@samsung.com> wrote:

> Sure, I'll help on it. If you can try to change the user-side code as we
> discussed, I think I can change the internal others.
>
> --
> Best Regards, Edward J. Yoon
>
>
> -----Original Message-----
> From: Baran Topal [mailto:barantopal@barantopal.com]
> Sent: Monday, October 31, 2016 9:22 PM
> To: dev@horn.incubator.apache.org
> Subject: Re: Use vector instead of Iterable in neuron API
>
> Hi Edward;
>
> I was looking at this for a while but got stuck. I was changing the
> code a lot, so I think my approach might not be correct.
>
> For learning purposes, I would appreciate it if another developer could take
> the case so that I can see the solution.
>
> Br.
>
> 2016-09-07 14:10 GMT+02:00 Baran Topal <barantopal@barantopal.com>:
>
> > Thanks Edward.
> >
> > I just cloned the repo and created the fork and update my fork.
> >
> > I created the following case,
> >
> > https://issues.apache.org/jira/browse/HORN-31
> >
> > Please assign me.
> >
> > Br.
> >
> > 2016-09-07 1:06 GMT+02:00 Edward J. Yoon <edward.yoon@samsung.com>:
> > > If you're using Git, you're probably using pull requests. Here's info
> > about
> > > pull request:
> > > https://cwiki.apache.org/confluence/display/HORN/How+To+Contribute
> > >
> > > And, please feel free to file this issue on JIRA:
> > > http://issues.apache.org/jira/browse/HORN
> > >
> > > Then, I'll assign to you!
> > >
> > > --
> > > Best Regards, Edward J. Yoon
> > >
> > >
> > > -----Original Message-----
> > > From: Baran Topal [mailto:barantopal@barantopal.com]
> > > Sent: Wednesday, September 07, 2016 8:02 AM
> > > To: dev@horn.incubator.apache.org
> > > Subject: Re: Use vector instead of Iterable in neuron API
> > >
> > > Hi Edward,
> > >
> > > Many thanks. I will update this as soon as possible.
> > >
> > > Br.
> > >
> > > On Wednesday, September 7, 2016, Edward J. Yoon <edward.yoon@samsung.com> wrote:
> > >
> > >> > directly into FloatVector and get rid of Synapse totally?
> > >>
> > >> Yes, I think that's clearer.
> > >>
> > >> Internally, each task processes neurons' computation on its assigned
> > >> data split. LayeredNeuralNetwork.java contains everything; you can find
> > >> "n.forward(msg);" and "n.backward(msg);" there. Before calling the
> > >> forward() or backward() method, we can set the weight vector associated
> > >> with the neuron using the setWeightVector() method, and set the argument
> > >> for the forward() method like below:
> > >>
> > >> n.setWeightVector(weight vector e.g., weightMatrix.get(neuronID));
> > >> n.forward(outputs from previous layer as a FloatVector);
> > >>
> > >> Then, the user-side program can be done like below:
> > >>
> > >> forward(FloatVector input) {
> > >>   this.getWeightVector(); // returns weight vector associated with itself
> > >>   float vectorSum = 0;
> > >>   for (float element : input) {
> > >>     vectorSum += element;
> > >>   }
> > >> }
> > >>
> > >> If you have any trouble, please don't hesitate to ask here.
> > >>
> > >> --
> > >> Best Regards, Edward J. Yoon
> > >>
> > >>
> > >> -----Original Message-----
> > >> From: Baran Topal [mailto:barantopal@barantopal.com]
> > >> Sent: Tuesday, September 06, 2016 11:42 PM
> > >> To: dev@horn.incubator.apache.org
> > >> Subject: Re: Use vector instead of Iterable in neuron API
> > >>
> > >> Hi team and Edward;
> > >>
> > >> I have been checking this and got stuck on how to convert the list
> > >> structure to DenseFloatVector. Can you help with this?
> > >>
> > >> Let me explain:
> > >>
> > >> I saw that the concrete FloatVector is actually some sort of array
> > >> structure which is not really compatible with Synapse and
> > >> FloatWritables. Is the aim to convert Synapse construction logic
> > >> directly into FloatVector and get rid of Synapse totally?
> > >>
> > >> //
> > >>
> > >> In the original code, we are passing a list structured as Synapses
> > >> holding FloatWritables. I can see that a Synapse can be
> > >> constructed with a neuron id and two FloatWritables.
> > >>
> > >> What I tried;
> > >>
> > >> 1) I added the following function in Neuron.java
> > >>
> > >>   public DenseFloatVector getWeightVector() {
> > >>     DenseFloatVector dfv = new DenseFloatVector(getWeights());
> > >>     return dfv;
> > >>   }
> > >>
> > >> 2) I added the following function in NeuronInterface.java
> > >>
> > >> public void forward2(FloatVector messages) throws IOException;
> > >>
> > >> 3) I added the following function in TestNeuron.java
> > >>
> > >>     public void forward2(FloatVector messages) throws IOException {
> > >>
> > >>       long startTime = System.nanoTime();
> > >>
> > >>       float sum = messages.dot(this.getWeightVector());
> > >>       this.feedforward(this.squashingFunction.apply(sum));
> > >>
> > >>       long endTime = System.nanoTime();
> > >>
> > >>       System.out.println("Execution time for the forward2 function is:
> > >> " + ((endTime - startTime)));
> > >>
> > >>     }
> > >> 4) I tried to refactor testProp() in TestNeuron.java in several ways,
> > >> but unfortunately failed with a runtime error since the weight does
> > >> not have a value.
> > >>
> > >>   MyNeuron n_ = new MyNeuron();
> > >>
> > >>     FloatVector ds = new DenseFloatVector();
> > >>
> > >>     Iterator <Synapse<FloatWritable, FloatWritable>> li =
> x_.iterator();
> > >>     // Iterator<FloatVector.FloatVectorElement> ie = ds.iterate();
> > >>
> > >>     while(li.hasNext()) {
> > >>       // ds.add(li.next());
> > >>       // ie.
> > >>       Synapse<FloatWritable, FloatWritable> ee = li.next();
> > >>
> > >>       ds.set(ee.getSenderID(), ee.getMessage());
> > >>
> > >>     }
> > >>
> > >>     float[] ff = new float[2];
> > >>     ff[0] = 1.0f;
> > >>     ff[1] = 0.5f;
> > >>
> > >>     float[] ffa = new float[2];
> > >>     ffa[0] = 1.0f;
> > >>     ffa[1] = 0.4f;
> > >>
> > >>     DenseFloatVector dss = new DenseFloatVector(ff);
> > >>     DenseFloatVector dssa = new DenseFloatVector(ffa);
> > >>
> > >>     dss.add(dssa);
> > >>
> > >>
> > >>     FloatWritable a = new FloatWritable(1.0f);
> > >>     FloatWritable b = new FloatWritable(0.5f);
> > >>
> > >>     Synapse s = new Synapse(0, a, b);
> > >>
> > >>
> > >>
> > >>   //  dss.set(1, 0.5f);
> > >>     //dss.set(1, 0.4f);
> > >>
> > >>    // DenseFloatVector ds = new DenseFloatVector(ff);
> > >>
> > >>     n_.forward2(dss); //forward2
> > >>
> > >>
> > >>
> > >> 2016-09-05 0:55 GMT+02:00 Baran Topal <barantopal@barantopal.com>:
> > >> > Hi;
> > >> >
> > >> > Thanks I am on it.
> > >> >
> > >> > Br.
> > >> >
> > >> > 2016-09-04 4:16 GMT+02:00 Edward J. Yoon <edwardyoon@apache.org>:
> > >> >> P.S., so, if you want to test more, please see FloatVector and
> > >> >> DenseFloatVector.
> > >> >>
> > >> >> On Sun, Sep 4, 2016 at 11:13 AM, Edward J. Yoon <edwardyoon@apache.org> wrote:
> > >> >>> Once we change the iterable input messages to the vector, we can
> > >> >>> change the legacy code like below:
> > >> >>>
> > >> >>> public void forward(FloatVector input) {
> > >> >>>   float sum = input.dot(this.getWeightVector());
> > >> >>>   this.feedforward(this.squashingFunction.apply(sum));
> > >> >>> }
> > >> >>>
> > >> >>>
> > >> >>>
> > >> >>>
> > >> >>> On Sat, Sep 3, 2016 at 11:10 PM, Baran Topal <barantopal@barantopal.com> wrote:
> > >> >>>> Sure.
> > >> >>>>
> > >> >>>> In the attached, TestNeuron.txt,
> > >> >>>>
> > >> >>>> 1) I put // baran as a comment for the added functions.
> > >> >>>>
> > >> >>>> 2) The added functions and created objects have _ as suffix
> > >> >>>>
> > >> >>>> (e.g. backward_)
> > >> >>>>
> > >> >>>>
> > >> >>>> A correction: above test execution time values were via
> > >> >>>> System.nanoTime().
> > >> >>>>
> > >> >>>> Br.
> > >> >>>>
> > >> >>>> 2016-09-03 14:05 GMT+02:00 Edward J. Yoon <edwardyoon@apache.org
> <javascript:;>
> > >> <javascript:;>>:
> > >> >>>>> Interesting. Can you share your test code?
> > >> >>>>>
> > >> >>>>> On Sat, Sep 3, 2016 at 2:17 AM, Baran Topal <barantopal@barantopal.com> wrote:
> > >> >>>>>> Hi Edward and team;
> > >> >>>>>>
> > >> >>>>>> I ran a brief test refactoring Iterable to Vector in
> > >> >>>>>> TestNeuron.java, and I can see some improved times. I didn't
> > >> >>>>>> check the other existing test methods, but the execution times
> > >> >>>>>> seem to improve for both forwarding and backwarding.
> > >> >>>>>>
> > >> >>>>>> These values are via System.currentTimeMillis().
> > >> >>>>>>
> > >> >>>>>> E.g.
> > >> >>>>>>
> > >> >>>>>>
> > >> >>>>>> Execution time for the forward function is: 5722329
> > >> >>>>>> Execution time for the backward function is: 31825
> > >> >>>>>>
> > >> >>>>>> Execution time for the refactored forward function is: 72330
> > >> >>>>>> Execution time for the refactored backward function is: 4665
> > >> >>>>>>
> > >> >>>>>> Br.
> > >> >>>>>>
> > >> >>>>>> 2016-09-02 2:14 GMT+02:00 Yeonhee Lee <ssallys0130@gmail.com>:
> > >> >>>>>>> Hi Edward,
> > >> >>>>>>>
> > >> >>>>>>> If we don't have that kind of method in the neuron, I guess it's
> > >> >>>>>>> appropriate to put the method in the neuron.
> > >> >>>>>>> That can be one of the distinct features of Horn.
> > >> >>>>>>>
> > >> >>>>>>> Regards,
> > >> >>>>>>> Yeonhee
> > >> >>>>>>>
> > >> >>>>>>>
> > >> >>>>>
> > >> >>>>>
> > >> >>>>>
> > >> >>>>> --
> > >> >>>>> Best Regards, Edward J. Yoon
> > >> >>>
> > >> >>>
> > >> >>>
> > >> >>> --
> > >> >>> Best Regards, Edward J. Yoon
> > >> >>
> > >> >>
> > >> >>
> > >> >> --
> > >> >> Best Regards, Edward J. Yoon
> > >>
> > >>
> > >>
> > >>
> > >
> > >
> >
>
>
>

RE: Use vector instead of Iterable in neuron API

Posted by "Edward J. Yoon" <ed...@samsung.com>.
Sure, I'll help on it. If you can try to change the user-side code as we 
discussed, I think I can change the internal others.

--
Best Regards, Edward J. Yoon


-----Original Message-----
From: Baran Topal [mailto:barantopal@barantopal.com]
Sent: Monday, October 31, 2016 9:22 PM
To: dev@horn.incubator.apache.org
Subject: Re: Use vector instead of Iterable in neuron API

Hi Edward;

I was looking at this for a while but got stuck on this. I was changing the
code a lot so I think my approach might not be really correct.

For learning purposes, I appreciate if another developer can take the case
so that I can see the solution.

Br.

2016-09-07 14:10 GMT+02:00 Baran Topal <ba...@barantopal.com>:

> Thanks Edward.
>
> I just cloned the repo and created the fork and update my fork.
>
> I created the following case,
>
> https://issues.apache.org/jira/browse/HORN-31
>
> Please assign me.
>
> Br.
>
> 2016-09-07 1:06 GMT+02:00 Edward J. Yoon <ed...@samsung.com>:
> > If you're using Git, you're probably using pull requests. Here's info
> about
> > pull request:
> > https://cwiki.apache.org/confluence/display/HORN/How+To+Contribute
> >
> > And, please feel free to file this issue on JIRA:
> > http://issues.apache.org/jira/browse/HORN
> >
> > Then, I'll assign to you!
> >
> > --
> > Best Regards, Edward J. Yoon
> >
> >
> > -----Original Message-----
> > From: Baran Topal [mailto:barantopal@barantopal.com]
> > Sent: Wednesday, September 07, 2016 8:02 AM
> > To: dev@horn.incubator.apache.org
> > Subject: Re: Use vector instead of Iterable in neuron API
> >
> > Hi Edward,
> >
> > Many thanks. I will update this as soon as possible.
> >
> > Br.
> >
> > 7 Eylul 2016 Car?amba tarihinde, Edward J. Yoon <edward.yoon@samsung.com
> >
> > yazd:
> >
> >> > directly into FloatVector and get rid of Synapse totally?
> >>
> >> Yes, I think that's more clear.
> >>
> >> Internally, each task processes neurons' computation on assigned data
> >> split.
> >> The LayeredNeuralNetwork.java contains everything, you can find the
> >> "n.forward(msg);" and "n.backward(msg);" at there. Before calling the
> >> forward() or backward() method, we can set the weight vector associated
> >> with
> >> neuron using setWeightVector() method and set the argument value for the
> >> forward() method like below:
> >>
> >> n.setWeightVector(weight vector e.g., weightMatrix.get(neuronID));
> >> n.forward(outputs from previous layer as a FloatVector);
> >>
> >> Then, user-side program can be done like below:
> >>
> >> forward(FloatVector input) {
> >>   this.getWeightVector(); // returns weight vector associated with
> itself.> 
>  >>   float vectorSum;
> >>   for(float element : input) {
> >>     vectorSum += element;
> >>   }
> >> }
> >>
> >> If you have any trouble, please don't hesitate ask here.
> >>
> >> --
> >> Best Regards, Edward J. Yoon
> >>
> >>
> >> -----Original Message-----
> >> From: Baran Topal [mailto:barantopal@barantopal.com <javascript:;>]
> >> Sent: Tuesday, September 06, 2016 11:42 PM
> >> To: dev@horn.incubator.apache.org <javascript:;>
> >> Subject: Re: Use vector instead of Iterable in neuron API
> >>
> >> Hi team and Edward;
> >>
> >> I have been checking this and got stuck how to convert the list
> >> structure to DenseFloatVector. Can you help on this?
> >>
> >> Let me explain:
> >>
> >> I saw that the concrete FloatVector is actually some sort of array
> >> structure which is not really compatible with Synapse and
> >> FloatWritables. Is the aim to convert Synapse construction logic
> >> directly into FloatVector and get rid of Synapse totally?
> >>
> >> //
> >>
> >> In the original code, we are passing a list with a structure of having
> >> Synapse and FloatWritables. I can see that a synapse can be
> >> constructed with a neuron id and 2 float writables.
> >>
> >> What I tried;
> >>
> >> 1) I added the following function in Neuron.java
> >>
> >>   public DenseFloatVector getWeightVector() {
> >>     DenseFloatVector dfv = new DenseFloatVector(getWeights());
> >>     return dfv;
> >>   }
> >>
> >> 2) I added the following function in NeuronInterface.java
> >>
> >> public void forward2(FloatVector messages) throws IOException;
> >>
> >> 3) I added the following function in TestNeuron.java
> >>
> >>     public void forward2(FloatVector messages) throws IOException {
> >>
> >>       long startTime = System.nanoTime();
> >>
> >>       float sum = messages.dot(this.getWeightVector());
> >>       this.feedforward(this.squashingFunction.apply(sum));
> >>
> >>       long endTime = System.nanoTime();
> >>
> >>       System.out.println("Execution time for the forward2 function is:
> >> " + ((endTime - startTime)));
> >>
> >>     }
> >> 4) I tried to refactor testProp() in TestNeuron.java with several ways
> >> but failed unfortunately with runtime error since weight is not having
> >> a value.
> >>
> >>   MyNeuron n_ = new MyNeuron();
> >>
> >>     FloatVector ds = new DenseFloatVector();
> >>
> >>     Iterator <Synapse<FloatWritable, FloatWritable>> li = x_.iterator();
> >>     // Iterator<FloatVector.FloatVectorElement> ie = ds.iterate();
> >>
> >>     while(li.hasNext()) {
> >>       // ds.add(li.next());
> >>       // ie.
> >>       Synapse<FloatWritable, FloatWritable> ee = li.next();
> >>
> >>       ds.set(ee.getSenderID(), ee.getMessage());
> >>
> >>     }
> >>
> >>     float[] ff = new float[2];
> >>     ff[0] = 1.0f;
> >>     ff[1] = 0.5f;
> >>
> >>     float[] ffa = new float[2];
> >>     ffa[0] = 1.0f;
> >>     ffa[1] = 0.4f;
> >>
> >>     DenseFloatVector dss = new DenseFloatVector(ff);
> >>     DenseFloatVector dssa = new DenseFloatVector(ffa);
> >>
> >>     dss.add(dssa);
> >>
> >>
> >>     FloatWritable a = new FloatWritable(1.0f);
> >>     FloatWritable b = new FloatWritable(0.5f);
> >>
> >>     Synapse s = new Synapse(0, a, b);
> >>
> >>
> >>
> >>   //  dss.set(1, 0.5f);
> >>     //dss.set(1, 0.4f);
> >>
> >>    // DenseFloatVector ds = new DenseFloatVector(ff);
> >>
> >>     n_.forward2(dss); //forward2
> >>
> >>
> >>
> >> 2016-09-05 0:55 GMT+02:00 Baran Topal <barantopal@barantopal.com
> >> <javascript:;>>:
> >> > Hi;
> >> >
> >> > Thanks I am on it.
> >> >
> >> > Br.
> >> >
> >> > 2016-09-04 4:16 GMT+02:00 Edward J. Yoon <edwardyoon@apache.org
> >> <javascript:;>>:
> >> >> P.S., so, if you want to test more, please see FloatVector and
> >> >> DenseFloatVector.
> >> >>
> >> >> On Sun, Sep 4, 2016 at 11:13 AM, Edward J. Yoon <
> edwardyoon@apache.org
> >> <javascript:;>>
> >> >> wrote:
> >> >>> Once we change the iterable input messages to the vector, we can
> >> >>> change the legacy code like below:
> >> >>>
> >> >>> public void forward(FloatVector input) {
> >> >>>   float sum = input.dot(this.getWeightVector());
> >> >>>   this.feedforward(this.squashingFunction.apply(sum));
> >> >>> }
> >> >>>
> >> >>>
> >> >>>
> >> >>>
> >> >>> On Sat, Sep 3, 2016 at 11:10 PM, Baran Topal <
> >> barantopal@barantopal.com <javascript:;>>
> >> >>> wrote:
> >> >>>> Sure.
> >> >>>>
> >> >>>> In the attached, TestNeuron.txt,
> >> >>>>
> >> >>>> 1) I put // baran as a comment for the added functions.
> >> >>>>
> >> >>>> 2) The added functions and created objects have _ as suffix
> >> >>>>
> >> >>>> (e.g. backward_)
> >> >>>>
> >> >>>>
> >> >>>> A correction: above test execution time values were via
> >> >>>> System.nanoTime().
> >> >>>>
> >> >>>> Br.
> >> >>>>
> >> >>>> 2016-09-03 14:05 GMT+02:00 Edward J. Yoon <edwardyoon@apache.org
> >> <javascript:;>>:
> >> >>>>> Interesting. Can you share your test code?
> >> >>>>>
> >> >>>>> On Sat, Sep 3, 2016 at 2:17 AM, Baran Topal <
> >> barantopal@barantopal.com <javascript:;>>
> >> >>>>> wrote:
> >> >>>>>> Hi Edward and team;
> >> >>>>>>
> >> >>>>>> I had a brief test by refactoring Iterable to Vector and on
> >> >>>>>> TestNeuron.java, I can see some improved times. I didn't check
> for
> >> >>>>>> other existing test methods but it seems the execution times are
> >> >>>>>> improving for both forwarding and backwarding.
> >> >>>>>>
> >> >>>>>> These values are via System.currentTimeMillis().
> >> >>>>>>
> >> >>>>>> E.g.
> >> >>>>>>
> >> >>>>>>
> >> >>>>>> Execution time for the forward function is: 5722329
> >> >>>>>> Execution time for the backward function is: 31825
> >> >>>>>>
> >> >>>>>> Execution time for the refactored forward function is: 72330
> >> >>>>>> Execution time for the refactored backward function is: 4665
> >> >>>>>>
> >> >>>>>> Br.
> >> >>>>>>
> >> >>>>>> 2016-09-02 2:14 GMT+02:00 Yeonhee Lee <ssallys0130@gmail.com
> >> <javascript:;>>:
> >> >>>>>>> Hi Edward,
> >> >>>>>>>
> >> >>>>>>> If we don't have that kind of method in the neuron, I guess it's
> >> >>>>>>> appropriate to put the method in the neuron.
> >> >>>>>>> That can be one of the distinct features of Horn.
> >> >>>>>>>
> >> >>>>>>> Regards,
> >> >>>>>>> Yeonhee
> >> >>>>>>>
> >> >>>>>>>
> >> >>>>>>> 2016-08-26 9:40 GMT+09:00 Edward J. Yoon <
> edward.yoon@samsung.com
> >> <javascript:;>>:
> >> >>>>>>>
> >> >>>>>>>> Hi folks,
> >> >>>>>>>>
> >> >>>>>>>> Our current neuron API is designed like:
> >> >>>>>>>> https://github.com/apache/incubator-horn/blob/master/
> >> >>>>>>>> README.md#programming-m
> >> >>>>>>>> odel
> >> >>>>>>>>
> >> >>>>>>>> In forward() method, each neuron receives the pairs of the
> inputs
> >> x1,
> >> >>>>>>>> x2,
> >> >>>>>>>> ... xn from other neurons and weights w1, w2, ... wn like
> below:
> >> >>>>>>>>
> >> >>>>>>>>   public void forward(Iterable<M> messages) throws IOException;
> >> >>>>>>>>
> >> >>>>>>>> Instead of this, I suggest that we use just vector like below:
> >> >>>>>>>>
> >> >>>>>>>>   /**
> >> >>>>>>>>    * @param input vector from other neurons
> >> >>>>>>>>    */
> >> >>>>>>>>   public void forward(Vector input) throws IOException;
> >> >>>>>>>>
> >> >>>>>>>> And, the neuron provides a getWeightVector() method that
> returns
> >> >>>>>>>> the weight
> >> >>>>>>>> vector associated with itself. I think this makes more sense
> >> than
> >> >>>>>>>> the current
> >> >>>>>>>> version, and makes it easier to use GPUs in the future.
> >> >>>>>>>>
> >> >>>>>>>> What do you think?
> >> >>>>>>>>
> >> >>>>>>>> Thanks.
> >> >>>>>>>>
> >> >>>>>>>> --
> >> >>>>>>>> Best Regards, Edward J. Yoon
> >> >>>>>>>>
> >> >>>>>>>>
> >> >>>>>>>>
> >> >>>>>>>>
> >> >>>>>>>>
> >> >>>>>
> >> >>>>>
> >> >>>>>
> >> >>>>> --
> >> >>>>> Best Regards, Edward J. Yoon
> >> >>>
> >> >>>
> >> >>>
> >> >>> --
> >> >>> Best Regards, Edward J. Yoon
> >> >>
> >> >>
> >> >>
> >> >> --
> >> >> Best Regards, Edward J. Yoon
> >>
> >>
> >>
> >>
> >
> >
>
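For contrast with the proposal quoted above, the current Iterable-based forward() accumulates (input, weight) pairs one message at a time. A self-contained sketch, where Pair is a stand-in for Horn's Synapse message type rather than the real class:

```java
import java.util.Arrays;
import java.util.List;

// Sketch of the *current* message-based forward pass, for contrast with
// the proposed vector form. 'Pair' stands in for Horn's Synapse message.
public class IterableForwardSketch {

    // Minimal stand-in for a Synapse<FloatWritable, FloatWritable> message.
    static final class Pair {
        final float input;
        final float weight;
        Pair(float input, float weight) { this.input = input; this.weight = weight; }
    }

    // forward(): iterate over messages, accumulating input * weight.
    static float forward(Iterable<Pair> messages) {
        float sum = 0f;
        for (Pair m : messages) {
            sum += m.input * m.weight;
        }
        return (float) (1.0 / (1.0 + Math.exp(-sum))); // sigmoid squashing
    }

    public static void main(String[] args) {
        List<Pair> messages = Arrays.asList(new Pair(1.0f, 0.4f), new Pair(0.5f, 0.2f));
        // Produces the same activation as the dot-product formulation.
        System.out.println(forward(messages));
    }
}
```

Functionally the two forms agree; the vector form simply replaces the per-message loop with one bulk operation.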



Re: Use vector instead of Iterable in neuron API

Posted by Baran Topal <ba...@barantopal.com>.
Hi Edward;

I have been looking at this for a while but got stuck. I was changing the
code a lot, so I don't think my approach is really correct.

For learning purposes, I would appreciate it if another developer could take
the case so that I can see the solution.

Br.

2016-09-07 14:10 GMT+02:00 Baran Topal <ba...@barantopal.com>:

> Thanks Edward.
>
> I just cloned the repo, created the fork and updated my fork.
>
> I created the following case,
>
> https://issues.apache.org/jira/browse/HORN-31
>
> Please assign me.
>
> Br.
>
> 2016-09-07 1:06 GMT+02:00 Edward J. Yoon <ed...@samsung.com>:
> > If you're using Git, you're probably using pull requests. Here's info
> about
> > pull requests:
> > https://cwiki.apache.org/confluence/display/HORN/How+To+Contribute
> >
> > And, please feel free to file this issue on JIRA:
> > http://issues.apache.org/jira/browse/HORN
> >
> > Then, I'll assign it to you!
> >
> > --
> > Best Regards, Edward J. Yoon
> >
> >
> > -----Original Message-----
> > From: Baran Topal [mailto:barantopal@barantopal.com]
> > Sent: Wednesday, September 07, 2016 8:02 AM
> > To: dev@horn.incubator.apache.org
> > Subject: Re: Use vector instead of Iterable in neuron API
> >
> > Hi Edward,
> >
> > Many thanks. I will update this as soon as possible.
> >
> > Br.
> >
> On Wednesday, 7 September 2016, Edward J. Yoon <edward.yoon@samsung.com>
> wrote:
> >
> >> > directly into FloatVector and get rid of Synapse totally?
> >>
> >> Yes, I think that's clearer.
> >>
> >> Internally, each task processes neurons' computation on assigned data
> >> split.
> >> The LayeredNeuralNetwork.java contains everything; you can find the
> >> "n.forward(msg);" and "n.backward(msg);" calls there. Before calling the
> >> forward() or backward() method, we can set the weight vector associated
> >> with
> >> neuron using setWeightVector() method and set the argument value for the
> >> forward() method like below:
> >>
> >> n.setWeightVector(weight vector e.g., weightMatrix.get(neuronID));
> >> n.forward(outputs from previous layer as a FloatVector);
> >>
> >> Then, user-side program can be done like below:
> >>
> >> forward(FloatVector input) {
> >>   this.getWeightVector(); // returns weight vector associated with
> itself.
> >>   float vectorSum = 0;
> >>   for(float element : input) {
> >>     vectorSum += element;
> >>   }
> >> }
> >>
> >> If you have any trouble, please don't hesitate to ask here.
> >>
> >> --
> >> Best Regards, Edward J. Yoon
> >>
> >>
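The driver-side sequence described above (set the neuron's weight vector, then call forward() with the previous layer's outputs) can be sketched as follows; the classes are simplified stand-ins, not Horn's real LayeredNeuralNetwork or Neuron API:

```java
// Sketch of the framework-side calling sequence described above: install
// the neuron's weight vector, then feed it the previous layer's outputs
// as one vector. Simplified stand-in classes, not Horn's actual API.
public class DriverSketch {

    static class Neuron {
        private float[] weights;
        private float output;

        void setWeightVector(float[] weights) { this.weights = weights; }
        float[] getWeightVector() { return weights; }

        // User-side forward(): dot product with the stored weights.
        void forward(float[] input) {
            float sum = 0f;
            for (int i = 0; i < input.length; i++) {
                sum += input[i] * getWeightVector()[i];
            }
            output = (float) (1.0 / (1.0 + Math.exp(-sum))); // sigmoid
        }

        float getOutput() { return output; }
    }

    // What the framework would do for each neuron in a layer.
    static float runNeuron(float[] weightRow, float[] prevLayerOutputs) {
        Neuron n = new Neuron();
        n.setWeightVector(weightRow);  // e.g. one row of the weight matrix
        n.forward(prevLayerOutputs);   // outputs of the previous layer
        return n.getOutput();
    }

    public static void main(String[] args) {
        float out = runNeuron(new float[]{0.4f, 0.2f}, new float[]{1.0f, 0.5f});
        System.out.println(out);
    }
}
```

The key design point is that the weight vector is installed by the framework before forward() runs, so user code never has to receive weights message by message.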
> >> -----Original Message-----
> >> From: Baran Topal [mailto:barantopal@barantopal.com <javascript:;>]
> >> Sent: Tuesday, September 06, 2016 11:42 PM
> >> To: dev@horn.incubator.apache.org <javascript:;>
> >> Subject: Re: Use vector instead of Iterable in neuron API
> >>
> >> Hi team and Edward;
> >>
> >> I have been checking this and got stuck on how to convert the list
> >> structure to DenseFloatVector. Can you help on this?
> >>
> >> Let me explain:
> >>
> >> I saw that the concrete FloatVector is actually some sort of array
> >> structure which is not really compatible with Synapse and
> >> FloatWritables. Is the aim to convert Synapse construction logic
> >> directly into FloatVector and get rid of Synapse totally?
> >>
> >> //
> >>
> >> In the original code, we are passing a list with a structure of having
> >> Synapse and FloatWritables. I can see that a synapse can be
> >> constructed with a neuron id and 2 float writables.
> >>
> >> What I tried;
> >>
> >> 1) I added the following function in Neuron.java
> >>
> >>   public DenseFloatVector getWeightVector() {
> >>     DenseFloatVector dfv = new DenseFloatVector(getWeights());
> >>     return dfv;
> >>   }
> >>
> >> 2) I added the following function in NeuronInterface.java
> >>
> >> public void forward2(FloatVector messages) throws IOException;
> >>
> >> 3) I added the following function in TestNeuron.java
> >>
> >>     public void forward2(FloatVector messages) throws IOException {
> >>
> >>       long startTime = System.nanoTime();
> >>
> >>       float sum = messages.dot(this.getWeightVector());
> >>       this.feedforward(this.squashingFunction.apply(sum));
> >>
> >>       long endTime = System.nanoTime();
> >>
> >>       System.out.println("Execution time for the forward2 function is:
> >> " + ((endTime - startTime)));
> >>
> >>     }
> >> 4) I tried to refactor testProp() in TestNeuron.java in several ways
> >> but unfortunately failed with a runtime error since the weight does not
> >> have a value.
> >>
> >>   MyNeuron n_ = new MyNeuron();
> >>
> >>     FloatVector ds = new DenseFloatVector();
> >>
> >>     Iterator <Synapse<FloatWritable, FloatWritable>> li = x_.iterator();
> >>     // Iterator<FloatVector.FloatVectorElement> ie = ds.iterate();
> >>
> >>     while(li.hasNext()) {
> >>       // ds.add(li.next());
> >>       // ie.
> >>       Synapse<FloatWritable, FloatWritable> ee = li.next();
> >>
> >>       ds.set(ee.getSenderID(), ee.getMessage());
> >>
> >>     }
> >>
> >>     float[] ff = new float[2];
> >>     ff[0] = 1.0f;
> >>     ff[1] = 0.5f;
> >>
> >>     float[] ffa = new float[2];
> >>     ffa[0] = 1.0f;
> >>     ffa[1] = 0.4f;
> >>
> >>     DenseFloatVector dss = new DenseFloatVector(ff);
> >>     DenseFloatVector dssa = new DenseFloatVector(ffa);
> >>
> >>     dss.add(dssa);
> >>
> >>
> >>     FloatWritable a = new FloatWritable(1.0f);
> >>     FloatWritable b = new FloatWritable(0.5f);
> >>
> >>     Synapse s = new Synapse(0, a, b);
> >>
> >>
> >>
> >>   //  dss.set(1, 0.5f);
> >>     //dss.set(1, 0.4f);
> >>
> >>    // DenseFloatVector ds = new DenseFloatVector(ff);
> >>
> >>     n_.forward2(dss); //forward2
> >>
> >>
> >>
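The conversion the code above attempts (packing per-sender Synapse messages into a dense vector) can be sketched independently of Horn's classes: index by sender ID and copy each message value into a float array. Msg below is a hypothetical stand-in for Synapse, not the real type:

```java
import java.util.Arrays;
import java.util.List;

// Sketch of packing per-sender messages into a dense float vector, the
// conversion discussed above. 'Msg' is a stand-in for Horn's Synapse.
public class PackVectorSketch {

    static final class Msg {
        final int senderId;
        final float value;
        Msg(int senderId, float value) { this.senderId = senderId; this.value = value; }
    }

    // Place each message's value at the index of its sender neuron.
    static float[] toDenseVector(Iterable<Msg> messages, int size) {
        float[] dense = new float[size];
        for (Msg m : messages) {
            dense[m.senderId] = m.value;
        }
        return dense;
    }

    public static void main(String[] args) {
        List<Msg> messages = Arrays.asList(new Msg(0, 1.0f), new Msg(1, 0.5f));
        System.out.println(Arrays.toString(toDenseVector(messages, 2))); // [1.0, 0.5]
    }
}
```

Once the values live in a dense array, wrapping them in something like DenseFloatVector is a single constructor call, and the dot-product forward pass applies directly.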
> >> 2016-09-05 0:55 GMT+02:00 Baran Topal <barantopal@barantopal.com
> >> <javascript:;>>:
> >> > Hi;
> >> >
> >> > Thanks I am on it.
> >> >
> >> > Br.
> >> >
> >> > 2016-09-04 4:16 GMT+02:00 Edward J. Yoon <edwardyoon@apache.org
> >> <javascript:;>>:
> >> >> P.S., so, if you want to test more, please see FloatVector and
> >> >> DenseFloatVector.
> >> >>

Re: Use vector instead of Iterable in neuron API

Posted by Baran Topal <ba...@barantopal.com>.
Thanks Edward.

I just cloned the repo, created the fork and updated my fork.

I created the following case,

https://issues.apache.org/jira/browse/HORN-31

Please assign me.

Br.

>

RE: Use vector instead of Iterable in neuron API

Posted by "Edward J. Yoon" <ed...@samsung.com>.
If you're using Git, you're probably familiar with pull requests. Here's info
about pull requests:
https://cwiki.apache.org/confluence/display/HORN/How+To+Contribute

And, please feel free to file this issue on JIRA: 
http://issues.apache.org/jira/browse/HORN

Then, I'll assign it to you!

--
Best Regards, Edward J. Yoon





Re: Use vector instead of Iterable in neuron API

Posted by Baran Topal <ba...@barantopal.com>.
Hi Edward,

Many thanks. I will update this as soon as possible.

Br.

On Wednesday, 7 September 2016, Edward J. Yoon <ed...@samsung.com>
wrote:

> > directly into FloatVector and get rid of Synapse totally?
>
> Yes, I think that's clearer.
>
> Internally, each task processes neurons' computation on assigned data
> split.
> The LayeredNeuralNetwork.java contains everything; you can find the
> "n.forward(msg);" and "n.backward(msg);" calls there. Before calling the
> forward() or backward() method, we can set the weight vector associated
> with
> neuron using setWeightVector() method and set the argument value for the
> forward() method like below:
>
> n.setWeightVector(weight vector e.g., weightMatrix.get(neuronID));
> n.forward(outputs from previous layer as a FloatVector);
>
> Then, user-side program can be done like below:
>
> forward(FloatVector input) {
>   this.getWeightVector(); // returns weight vector associated with itself.
>   float vectorSum = 0;
>   for(float element : input) {
>     vectorSum += element;
>   }
> }
>
> If you have any trouble, please don't hesitate to ask here.
>
> --
> Best Regards, Edward J. Yoon
>
>
> -----Original Message-----
> From: Baran Topal [mailto:barantopal@barantopal.com <javascript:;>]
> Sent: Tuesday, September 06, 2016 11:42 PM
> To: dev@horn.incubator.apache.org <javascript:;>
> Subject: Re: Use vector instead of Iterable in neuron API
>
> Hi team and Edward;
>
> I have been checking this and got stuck on how to convert the list
> structure to DenseFloatVector. Can you help on this?
>
> Let me explain:
>
> I saw that the concrete FloatVector is actually some sort of array
> structure which is not really compatible with Synapse and
> FloatWritables. Is the aim to convert Synapse construction logic
> directly into FloatVector and get rid of Synapse totally?
>
> //
>
> In the original code, we are passing a list with a structure of having
> Synapse and FloatWritables. I can see that a synapse can be
> constructed with a neuron id and 2 float writables.
>
> What I tried;
>
> 1) I added the following function in Neuron.java
>
>   public DenseFloatVector getWeightVector() {
>     DenseFloatVector dfv = new DenseFloatVector(getWeights());
>     return dfv;
>   }
>
> 2) I added the following function in NeuronInterface.java
>
> public void forward2(FloatVector messages) throws IOException;
>
> 3) I added the following function in TestNeuron.java
>
>     public void forward2(FloatVector messages) throws IOException {
>
>       long startTime = System.nanoTime();
>
>       float sum = messages.dot(this.getWeightVector());
>       this.feedforward(this.squashingFunction.apply(sum));
>
>       long endTime = System.nanoTime();
>
>       System.out.println("Execution time for the forward2 function is:
> " + ((endTime - startTime)));
>
>     }
> 4) I tried to refactor testProp() in TestNeuron.java in several ways
> but unfortunately failed with a runtime error since the weight does not
> have a value.
>
>   MyNeuron n_ = new MyNeuron();
>
>     FloatVector ds = new DenseFloatVector();
>
>     Iterator <Synapse<FloatWritable, FloatWritable>> li = x_.iterator();
>     // Iterator<FloatVector.FloatVectorElement> ie = ds.iterate();
>
>     while(li.hasNext()) {
>       // ds.add(li.next());
>       // ie.
>       Synapse<FloatWritable, FloatWritable> ee = li.next();
>
>       ds.set(ee.getSenderID(), ee.getMessage());
>
>     }
>
>     float[] ff = new float[2];
>     ff[0] = 1.0f;
>     ff[1] = 0.5f;
>
>     float[] ffa = new float[2];
>     ffa[0] = 1.0f;
>     ffa[1] = 0.4f;
>
>     DenseFloatVector dss = new DenseFloatVector(ff);
>     DenseFloatVector dssa = new DenseFloatVector(ffa);
>
>     dss.add(dssa);
>
>
>     FloatWritable a = new FloatWritable(1.0f);
>     FloatWritable b = new FloatWritable(0.5f);
>
>     Synapse s = new Synapse(0, a, b);
>
>
>
>   //  dss.set(1, 0.5f);
>     //dss.set(1, 0.4f);
>
>    // DenseFloatVector ds = new DenseFloatVector(ff);
>
>     n_.forward2(dss); //forward2
>
>
>
> 2016-09-05 0:55 GMT+02:00 Baran Topal <barantopal@barantopal.com>:
> > Hi;
> >
> > Thanks I am on it.
> >
> > Br.
> >
> > 2016-09-04 4:16 GMT+02:00 Edward J. Yoon <edwardyoon@apache.org>:
> >> P.S., so, if you want to test more, please see FloatVector and
> >> DenseFloatVector.
> >>
> >> On Sun, Sep 4, 2016 at 11:13 AM, Edward J. Yoon <edwardyoon@apache.org> wrote:
> >>> Once we change the iterable input messages to the vector, we can
> >>> change the legacy code like below:
> >>>
> >>> public void forward(FloatVector input) {
> >>>   float sum = input.dot(this.getWeightVector());
> >>>   this.feedforward(this.squashingFunction.apply(sum));
> >>> }
> >>>
> >>>
> >>>
> >>>
> >>> On Sat, Sep 3, 2016 at 11:10 PM, Baran Topal <barantopal@barantopal.com> wrote:
> >>>> Sure.
> >>>>
> >>>> In the attached, TestNeuron.txt,
> >>>>
> >>>> 1) I put // baran as a comment for the added functions.
> >>>>
> >>>> 2) The added functions and created objects have _ as suffix
> >>>>
> >>>> (e.g. backward_)
> >>>>
> >>>>
> >>>> A correction: above test execution time values were via
> >>>> System.nanoTime().
> >>>>
> >>>> Br.
> >>>>
> >>>> 2016-09-03 14:05 GMT+02:00 Edward J. Yoon <edwardyoon@apache.org>:
> >>>>> Interesting. Can you share your test code?
> >>>>>
> >>>>> On Sat, Sep 3, 2016 at 2:17 AM, Baran Topal <barantopal@barantopal.com> wrote:
> >>>>>> Hi Edward and team;
> >>>>>>
> >>>>>> I had a brief test by refactoring Iterable to Vector and on
> >>>>>> TestNeuron.java, I can see some improved times. I didn't check for
> >>>>>> other existing test methods but it seems the execution times are
> >>>>>> improving for both the forward and backward passes.
> >>>>>>
> >>>>>> These values are via System.currentTimeMillis().
> >>>>>>
> >>>>>> E.g.
> >>>>>>
> >>>>>>
> >>>>>> Execution time for the forward function is: 5722329
> >>>>>> Execution time for the backward function is: 31825
> >>>>>>
> >>>>>> Execution time for the refactored forward function is: 72330
> >>>>>> Execution time for the refactored backward function is: 4665
> >>>>>>
> >>>>>> Br.
> >>>>>>
> >>>>>> 2016-09-02 2:14 GMT+02:00 Yeonhee Lee <ssallys0130@gmail.com>:
> >>>>>>> Hi Edward,
> >>>>>>>
> >>>>>>> If we don't have that kind of method in the neuron, I guess it's
> >>>>>>> appropriate to put the method into the neuron.
> >>>>>>> That can be one of the distinct features of Horn.
> >>>>>>>
> >>>>>>> Regards,
> >>>>>>> Yeonhee
> >>>>>>>
> >>>>>>>
> >>>>>>> 2016-08-26 9:40 GMT+09:00 Edward J. Yoon <edward.yoon@samsung.com>:
> >>>>>>>
> >>>>>>>> Hi folks,
> >>>>>>>>
> >>>>>>>> Our current neuron API is designed like:
> >>>>>>>> https://github.com/apache/incubator-horn/blob/master/README.md#programming-model
> >>>>>>>>
> >>>>>>>> In forward() method, each neuron receives the pairs of the inputs
> x1,
> >>>>>>>> x2,
> >>>>>>>> ... xn from other neurons and weights w1, w2, ... wn like below:
> >>>>>>>>
> >>>>>>>>   public void forward(Iterable<M> messages) throws IOException;
> >>>>>>>>
> >>>>>>>> Instead of this, I suggest that we use just vector like below:
> >>>>>>>>
> >>>>>>>>   /**
> >>>>>>>>    * @param input vector from other neurons
> >>>>>>>>    */
> >>>>>>>>   public void forward(Vector input) throws IOException;
> >>>>>>>>
> >>>>>>>> And, the neuron provides a getWeightVector() method that returns the
> >>>>>>>> weight vector associated with itself. I think this makes more sense
> >>>>>>>> than the current version, and will make it easier to use GPUs in the
> >>>>>>>> future.
> >>>>>>>>
> >>>>>>>> What do you think?
> >>>>>>>>
> >>>>>>>> Thanks.
> >>>>>>>>
> >>>>>>>> --
> >>>>>>>> Best Regards, Edward J. Yoon
> >>>>>>>>
> >>>>>>>>
> >>>>>>>>
> >>>>>>>>
> >>>>>>>>
> >>>>>
> >>>>>
> >>>>>
> >>>>> --
> >>>>> Best Regards, Edward J. Yoon
> >>>
> >>>
> >>>
> >>> --
> >>> Best Regards, Edward J. Yoon
> >>
> >>
> >>
> >> --
> >> Best Regards, Edward J. Yoon
>
>
>
>

RE: Use vector instead of Iterable in neuron API

Posted by "Edward J. Yoon" <ed...@samsung.com>.
> directly into FloatVector and get rid of Synapse totally?

Yes, I think that's clearer.

Internally, each task processes its neurons' computation on an assigned data
split. LayeredNeuralNetwork.java contains everything; you can find
"n.forward(msg);" and "n.backward(msg);" there. Before calling the
forward() or backward() method, we can set the weight vector associated with
the neuron using the setWeightVector() method, and set the argument value for
the forward() method like below:

n.setWeightVector(weight vector e.g., weightMatrix.get(neuronID));
n.forward(outputs from previous layer as a FloatVector);

Then, user-side program can be done like below:

forward(FloatVector input) {
  this.getWeightVector(); // returns weight vector associated with itself.
  float vectorSum = 0f;
  for(float element : input) {
    vectorSum += element;
  }
}
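
For illustration, the flow above can be sketched as a tiny self-contained class. Plain float[] arrays stand in for FloatVector here, and the NeuronSketch class with its sigmoid squashing function are assumptions made for the example, not the actual Horn API:

```java
// Sketch of the proposed neuron API, assuming a simple float[]-backed vector.
public class NeuronSketch {
  private float[] weights;

  // Framework side: called before forward(), e.g. with weightMatrix.get(neuronID).
  public void setWeightVector(float[] w) { this.weights = w; }

  public float[] getWeightVector() { return weights; }

  // User side: weighted sum of the inputs, squashed by a sigmoid.
  public float forward(float[] input) {
    float sum = 0f;
    for (int i = 0; i < input.length; i++) {
      sum += input[i] * weights[i]; // dot product with the associated weights
    }
    return (float) (1.0 / (1.0 + Math.exp(-sum))); // sigmoid squashing function
  }

  public static void main(String[] args) {
    NeuronSketch n = new NeuronSketch();
    n.setWeightVector(new float[] {1.0f, 0.5f});
    // Weighted sum is 1.0*1.0 + 0.4*0.5 = 1.2, then squashed by the sigmoid.
    System.out.println(n.forward(new float[] {1.0f, 0.4f}));
  }
}
```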

If you have any trouble, please don't hesitate to ask here.

--
Best Regards, Edward J. Yoon


-----Original Message-----
From: Baran Topal [mailto:barantopal@barantopal.com]
Sent: Tuesday, September 06, 2016 11:42 PM
To: dev@horn.incubator.apache.org
Subject: Re: Use vector instead of Iterable in neuron API

Hi team and Edward;

I have been checking this and got stuck on how to convert the list
structure to a DenseFloatVector. Can you help with this?

Let me explain:

I saw that the concrete FloatVector is actually some sort of array
structure which is not really compatible with Synapse and
FloatWritables. Is the aim to convert Synapse construction logic
directly into FloatVector and get rid of Synapse totally?

//

In the original code, we are passing a list with a structure of having
Synapse and FloatWritables. I can see that a synapse can be
constructed with a neuron id and 2 float writables.

What I tried;

1) I added the following function in Neuron.java

  public DenseFloatVector getWeightVector() {
    DenseFloatVector dfv = new DenseFloatVector(getWeights());
    return dfv;
  }

2) I added the following function in NeuronInterface.java

public void forward2(FloatVector messages) throws IOException;

3) I added the following function in TestNeuron.java

    public void forward2(FloatVector messages) throws IOException {

      long startTime = System.nanoTime();

      float sum = messages.dot(this.getWeightVector());
      this.feedforward(this.squashingFunction.apply(sum));

      long endTime = System.nanoTime();

      System.out.println("Execution time for the forward2 function is:
" + ((endTime - startTime)));

    }
4) I tried to refactor testProp() in TestNeuron.java in several ways
but unfortunately failed with a runtime error since the weight does not
have a value.

  MyNeuron n_ = new MyNeuron();

    FloatVector ds = new DenseFloatVector();

    Iterator <Synapse<FloatWritable, FloatWritable>> li = x_.iterator();
    // Iterator<FloatVector.FloatVectorElement> ie = ds.iterate();

    while(li.hasNext()) {
      // ds.add(li.next());
      // ie.
      Synapse<FloatWritable, FloatWritable> ee = li.next();

      ds.set(ee.getSenderID(), ee.getMessage());

    }

    float[] ff = new float[2];
    ff[0] = 1.0f;
    ff[1] = 0.5f;

    float[] ffa = new float[2];
    ffa[0] = 1.0f;
    ffa[1] = 0.4f;

    DenseFloatVector dss = new DenseFloatVector(ff);
    DenseFloatVector dssa = new DenseFloatVector(ffa);

    dss.add(dssa);


    FloatWritable a = new FloatWritable(1.0f);
    FloatWritable b = new FloatWritable(0.5f);

    Synapse s = new Synapse(0, a, b);



  //  dss.set(1, 0.5f);
    //dss.set(1, 0.4f);

   // DenseFloatVector ds = new DenseFloatVector(ff);

    n_.forward2(dss); //forward2



2016-09-05 0:55 GMT+02:00 Baran Topal <ba...@barantopal.com>:
> Hi;
>
> Thanks I am on it.
>
> Br.
>
> 2016-09-04 4:16 GMT+02:00 Edward J. Yoon <ed...@apache.org>:
>> P.S., so, if you want to test more, please see FloatVector and
>> DenseFloatVector.
>>
>> On Sun, Sep 4, 2016 at 11:13 AM, Edward J. Yoon <ed...@apache.org> 
>> wrote:
>>> Once we change the iterable input messages to the vector, we can
>>> change the legacy code like below:
>>>
>>> public void forward(FloatVector input) {
>>>   float sum = input.dot(this.getWeightVector());
>>>   this.feedforward(this.squashingFunction.apply(sum));
>>> }
>>>
>>>
>>>
>>>
>>> On Sat, Sep 3, 2016 at 11:10 PM, Baran Topal <ba...@barantopal.com> 
>>> wrote:
>>>> Sure.
>>>>
>>>> In the attached, TestNeuron.txt,
>>>>
>>>> 1) I put // baran as a comment for the added functions.
>>>>
>>>> 2) The added functions and created objects have _ as suffix
>>>>
>>>> (e.g. backward_)
>>>>
>>>>
>>>> A correction: above test execution time values were via 
>>>> System.nanoTime().
>>>>
>>>> Br.
>>>>
>>>> 2016-09-03 14:05 GMT+02:00 Edward J. Yoon <ed...@apache.org>:
>>>>> Interesting. Can you share your test code?
>>>>>
>>>>> On Sat, Sep 3, 2016 at 2:17 AM, Baran Topal <ba...@barantopal.com> 
>>>>> wrote:
>>>>>> Hi Edward and team;
>>>>>>
>>>>>> I had a brief test by refactoring Iterable to Vector and on
>>>>>> TestNeuron.java, I can see some improved times. I didn't check for
>>>>>> other existing test methods but it seems the execution times are
>>>>>> improving for both the forward and backward passes.
>>>>>>
>>>>>> These values are via System.currentTimeMillis().
>>>>>>
>>>>>> E.g.
>>>>>>
>>>>>>
>>>>>> Execution time for the forward function is: 5722329
>>>>>> Execution time for the backward function is: 31825
>>>>>>
>>>>>> Execution time for the refactored forward function is: 72330
>>>>>> Execution time for the refactored backward function is: 4665
>>>>>>
>>>>>> Br.
>>>>>>
>>>>>> 2016-09-02 2:14 GMT+02:00 Yeonhee Lee <ss...@gmail.com>:
>>>>>>> Hi Edward,
>>>>>>>
>>>>>>> If we don't have that kind of method in the neuron, I guess it's
>>>>>>> appropriate to put the method into the neuron.
>>>>>>> That can be one of the distinct features of Horn.
>>>>>>>
>>>>>>> Regards,
>>>>>>> Yeonhee
>>>>>>>
>>>>>>>
>>>>>>> 2016-08-26 9:40 GMT+09:00 Edward J. Yoon <ed...@samsung.com>:
>>>>>>>
>>>>>>>> Hi folks,
>>>>>>>>
>>>>>>>> Our current neuron API is designed like:
>>>>>>>> https://github.com/apache/incubator-horn/blob/master/README.md#programming-model
>>>>>>>>
>>>>>>>> In forward() method, each neuron receives the pairs of the inputs x1, 
>>>>>>>> x2,
>>>>>>>> ... xn from other neurons and weights w1, w2, ... wn like below:
>>>>>>>>
>>>>>>>>   public void forward(Iterable<M> messages) throws IOException;
>>>>>>>>
>>>>>>>> Instead of this, I suggest that we use just vector like below:
>>>>>>>>
>>>>>>>>   /**
>>>>>>>>    * @param input vector from other neurons
>>>>>>>>    */
>>>>>>>>   public void forward(Vector input) throws IOException;
>>>>>>>>
>>>>>>>> And, the neuron provides a getWeightVector() method that returns the
>>>>>>>> weight vector associated with itself. I think this makes more sense
>>>>>>>> than the current version, and will make it easier to use GPUs in the
>>>>>>>> future.
>>>>>>>>
>>>>>>>> What do you think?
>>>>>>>>
>>>>>>>> Thanks.
>>>>>>>>
>>>>>>>> --
>>>>>>>> Best Regards, Edward J. Yoon
>>>>>>>>
>>>>>>>>
>>>>>>>>
>>>>>>>>
>>>>>>>>
>>>>>
>>>>>
>>>>>
>>>>> --
>>>>> Best Regards, Edward J. Yoon
>>>
>>>
>>>
>>> --
>>> Best Regards, Edward J. Yoon
>>
>>
>>
>> --
>> Best Regards, Edward J. Yoon




Re: Use vector instead of Iterable in neuron API

Posted by Baran Topal <ba...@barantopal.com>.
Hi team and Edward;

I have been checking this and got stuck on how to convert the list
structure to a DenseFloatVector. Can you help with this?

Let me explain:

I saw that the concrete FloatVector is actually some sort of array
structure which is not really compatible with Synapse and
FloatWritables. Is the aim to convert Synapse construction logic
directly into FloatVector and get rid of Synapse totally?

//

In the original code, we are passing a list with a structure of having
Synapse and FloatWritables. I can see that a synapse can be
constructed with a neuron id and 2 float writables.

What I tried;

1) I added the following function in Neuron.java

  public DenseFloatVector getWeightVector() {
    DenseFloatVector dfv = new DenseFloatVector(getWeights());
    return dfv;
  }

2) I added the following function in NeuronInterface.java

public void forward2(FloatVector messages) throws IOException;

3) I added the following function in TestNeuron.java

    public void forward2(FloatVector messages) throws IOException {

      long startTime = System.nanoTime();

      float sum = messages.dot(this.getWeightVector());
      this.feedforward(this.squashingFunction.apply(sum));

      long endTime = System.nanoTime();

      System.out.println("Execution time for the forward2 function is:
" + ((endTime - startTime)));

    }
4) I tried to refactor testProp() in TestNeuron.java in several ways
but unfortunately failed with a runtime error since the weight does not
have a value.

  MyNeuron n_ = new MyNeuron();

    FloatVector ds = new DenseFloatVector();

    Iterator <Synapse<FloatWritable, FloatWritable>> li = x_.iterator();
    // Iterator<FloatVector.FloatVectorElement> ie = ds.iterate();

    while(li.hasNext()) {
      // ds.add(li.next());
      // ie.
      Synapse<FloatWritable, FloatWritable> ee = li.next();

      ds.set(ee.getSenderID(), ee.getMessage());

    }

    float[] ff = new float[2];
    ff[0] = 1.0f;
    ff[1] = 0.5f;

    float[] ffa = new float[2];
    ffa[0] = 1.0f;
    ffa[1] = 0.4f;

    DenseFloatVector dss = new DenseFloatVector(ff);
    DenseFloatVector dssa = new DenseFloatVector(ffa);

    dss.add(dssa);


    FloatWritable a = new FloatWritable(1.0f);
    FloatWritable b = new FloatWritable(0.5f);

    Synapse s = new Synapse(0, a, b);



  //  dss.set(1, 0.5f);
    //dss.set(1, 0.4f);

   // DenseFloatVector ds = new DenseFloatVector(ff);

    n_.forward2(dss); //forward2
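
For reference, the Synapse-to-dense-vector packing that step 4 attempts could look like the sketch below, which indexes each message into a dense float array by its sender ID. SynapseStub is a simplified stand-in for Horn's Synapse with FloatWritables; all class and field names here are assumptions for illustration:

```java
import java.util.ArrayList;
import java.util.List;

// Simplified stand-in for Horn's Synapse<FloatWritable, FloatWritable>.
class SynapseStub {
  final int senderId;   // ID of the sending neuron
  final float message;  // input value carried by the synapse
  final float weight;   // weight on the corresponding edge

  SynapseStub(int senderId, float message, float weight) {
    this.senderId = senderId;
    this.message = message;
    this.weight = weight;
  }
}

public class SynapseToVector {
  // Pack the messages into a dense float[] indexed by sender ID; the
  // resulting array could back a DenseFloatVector-style wrapper.
  static float[] toInputVector(List<SynapseStub> messages, int size) {
    float[] input = new float[size];
    for (SynapseStub s : messages) {
      input[s.senderId] = s.message;
    }
    return input;
  }

  public static void main(String[] args) {
    List<SynapseStub> messages = new ArrayList<>();
    messages.add(new SynapseStub(0, 1.0f, 0.5f));
    messages.add(new SynapseStub(1, 0.4f, 0.1f));
    float[] input = toInputVector(messages, 2);
    System.out.println(input[0] + ", " + input[1]); // prints "1.0, 0.4"
  }
}
```

The same idea would apply to the weights: collect them into a second dense array keyed by sender ID for getWeightVector().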



2016-09-05 0:55 GMT+02:00 Baran Topal <ba...@barantopal.com>:
> Hi;
>
> Thanks I am on it.
>
> Br.
>
> 2016-09-04 4:16 GMT+02:00 Edward J. Yoon <ed...@apache.org>:
>> P.S., so, if you want to test more, please see FloatVector and
>> DenseFloatVector.
>>
>> On Sun, Sep 4, 2016 at 11:13 AM, Edward J. Yoon <ed...@apache.org> wrote:
>>> Once we change the iterable input messages to the vector, we can
>>> change the legacy code like below:
>>>
>>> public void forward(FloatVector input) {
>>>   float sum = input.dot(this.getWeightVector());
>>>   this.feedforward(this.squashingFunction.apply(sum));
>>> }
>>>
>>>
>>>
>>>
>>> On Sat, Sep 3, 2016 at 11:10 PM, Baran Topal <ba...@barantopal.com> wrote:
>>>> Sure.
>>>>
>>>> In the attached, TestNeuron.txt,
>>>>
>>>> 1) I put // baran as a comment for the added functions.
>>>>
>>>> 2) The added functions and created objects have _ as suffix
>>>>
>>>> (e.g. backward_)
>>>>
>>>>
>>>> A correction: above test execution time values were via System.nanoTime().
>>>>
>>>> Br.
>>>>
>>>> 2016-09-03 14:05 GMT+02:00 Edward J. Yoon <ed...@apache.org>:
>>>>> Interesting. Can you share your test code?
>>>>>
>>>>> On Sat, Sep 3, 2016 at 2:17 AM, Baran Topal <ba...@barantopal.com> wrote:
>>>>>> Hi Edward and team;
>>>>>>
>>>>>> I had a brief test by refactoring Iterable to Vector and on
>>>>>> TestNeuron.java, I can see some improved times. I didn't check for
>>>>>> other existing test methods but it seems the execution times are
>>>>>> improving for both the forward and backward passes.
>>>>>>
>>>>>> These values are via System.currentTimeMillis().
>>>>>>
>>>>>> E.g.
>>>>>>
>>>>>>
>>>>>> Execution time for the forward function is: 5722329
>>>>>> Execution time for the backward function is: 31825
>>>>>>
>>>>>> Execution time for the refactored forward function is: 72330
>>>>>> Execution time for the refactored backward function is: 4665
>>>>>>
>>>>>> Br.
>>>>>>
>>>>>> 2016-09-02 2:14 GMT+02:00 Yeonhee Lee <ss...@gmail.com>:
>>>>>>> Hi Edward,
>>>>>>>
>>>>>>> If we don't have that kind of method in the neuron, I guess it's
>>>>>>> appropriate to put the method into the neuron.
>>>>>>> That can be one of the distinct features of Horn.
>>>>>>>
>>>>>>> Regards,
>>>>>>> Yeonhee
>>>>>>>
>>>>>>>
>>>>>>> 2016-08-26 9:40 GMT+09:00 Edward J. Yoon <ed...@samsung.com>:
>>>>>>>
>>>>>>>> Hi folks,
>>>>>>>>
>>>>>>>> Our current neuron API is designed like:
>>>>>>>> https://github.com/apache/incubator-horn/blob/master/README.md#programming-model
>>>>>>>>
>>>>>>>> In forward() method, each neuron receives the pairs of the inputs x1, x2,
>>>>>>>> ... xn from other neurons and weights w1, w2, ... wn like below:
>>>>>>>>
>>>>>>>>   public void forward(Iterable<M> messages) throws IOException;
>>>>>>>>
>>>>>>>> Instead of this, I suggest that we use just vector like below:
>>>>>>>>
>>>>>>>>   /**
>>>>>>>>    * @param input vector from other neurons
>>>>>>>>    */
>>>>>>>>   public void forward(Vector input) throws IOException;
>>>>>>>>
>>>>>>>> And, the neuron provides a getWeightVector() method that returns the
>>>>>>>> weight vector associated with itself. I think this makes more sense
>>>>>>>> than the current version, and will make it easier to use GPUs in the future.
>>>>>>>>
>>>>>>>> What do you think?
>>>>>>>>
>>>>>>>> Thanks.
>>>>>>>>
>>>>>>>> --
>>>>>>>> Best Regards, Edward J. Yoon
>>>>>>>>
>>>>>>>>
>>>>>>>>
>>>>>>>>
>>>>>>>>
>>>>>
>>>>>
>>>>>
>>>>> --
>>>>> Best Regards, Edward J. Yoon
>>>
>>>
>>>
>>> --
>>> Best Regards, Edward J. Yoon
>>
>>
>>
>> --
>> Best Regards, Edward J. Yoon

Re: Use vector instead of Iterable in neuron API

Posted by Baran Topal <ba...@barantopal.com>.
Hi;

Thanks I am on it.

Br.

2016-09-04 4:16 GMT+02:00 Edward J. Yoon <ed...@apache.org>:
> P.S., so, if you want to test more, please see FloatVector and
> DenseFloatVector.
>
> On Sun, Sep 4, 2016 at 11:13 AM, Edward J. Yoon <ed...@apache.org> wrote:
>> Once we change the iterable input messages to the vector, we can
>> change the legacy code like below:
>>
>> public void forward(FloatVector input) {
>>   float sum = input.dot(this.getWeightVector());
>>   this.feedforward(this.squashingFunction.apply(sum));
>> }
>>
>>
>>
>>
>> On Sat, Sep 3, 2016 at 11:10 PM, Baran Topal <ba...@barantopal.com> wrote:
>>> Sure.
>>>
>>> In the attached, TestNeuron.txt,
>>>
>>> 1) I put // baran as a comment for the added functions.
>>>
>>> 2) The added functions and created objects have _ as suffix
>>>
>>> (e.g. backward_)
>>>
>>>
>>> A correction: above test execution time values were via System.nanoTime().
>>>
>>> Br.
>>>
>>> 2016-09-03 14:05 GMT+02:00 Edward J. Yoon <ed...@apache.org>:
>>>> Interesting. Can you share your test code?
>>>>
>>>> On Sat, Sep 3, 2016 at 2:17 AM, Baran Topal <ba...@barantopal.com> wrote:
>>>>> Hi Edward and team;
>>>>>
>>>>> I had a brief test by refactoring Iterable to Vector and on
>>>>> TestNeuron.java, I can see some improved times. I didn't check for
>>>>> other existing test methods but it seems the execution times are
>>>>> improving for both the forward and backward passes.
>>>>>
>>>>> These values are via System.currentTimeMillis().
>>>>>
>>>>> E.g.
>>>>>
>>>>>
>>>>> Execution time for the forward function is: 5722329
>>>>> Execution time for the backward function is: 31825
>>>>>
>>>>> Execution time for the refactored forward function is: 72330
>>>>> Execution time for the refactored backward function is: 4665
>>>>>
>>>>> Br.
>>>>>
>>>>> 2016-09-02 2:14 GMT+02:00 Yeonhee Lee <ss...@gmail.com>:
>>>>>> Hi Edward,
>>>>>>
>>>>>> If we don't have that kind of method in the neuron, I guess it's
>>>>>> appropriate to put the method into the neuron.
>>>>>> That can be one of the distinct features of Horn.
>>>>>>
>>>>>> Regards,
>>>>>> Yeonhee
>>>>>>
>>>>>>
>>>>>> 2016-08-26 9:40 GMT+09:00 Edward J. Yoon <ed...@samsung.com>:
>>>>>>
>>>>>>> Hi folks,
>>>>>>>
>>>>>>> Our current neuron API is designed like:
>>>>>>> https://github.com/apache/incubator-horn/blob/master/README.md#programming-model
>>>>>>>
>>>>>>> In forward() method, each neuron receives the pairs of the inputs x1, x2,
>>>>>>> ... xn from other neurons and weights w1, w2, ... wn like below:
>>>>>>>
>>>>>>>   public void forward(Iterable<M> messages) throws IOException;
>>>>>>>
>>>>>>> Instead of this, I suggest that we use just vector like below:
>>>>>>>
>>>>>>>   /**
>>>>>>>    * @param input vector from other neurons
>>>>>>>    */
>>>>>>>   public void forward(Vector input) throws IOException;
>>>>>>>
>>>>>>> And, the neuron provides a getWeightVector() method that returns the
>>>>>>> weight vector associated with itself. I think this makes more sense
>>>>>>> than the current version, and will make it easier to use GPUs in the future.
>>>>>>>
>>>>>>> What do you think?
>>>>>>>
>>>>>>> Thanks.
>>>>>>>
>>>>>>> --
>>>>>>> Best Regards, Edward J. Yoon
>>>>>>>
>>>>>>>
>>>>>>>
>>>>>>>
>>>>>>>
>>>>
>>>>
>>>>
>>>> --
>>>> Best Regards, Edward J. Yoon
>>
>>
>>
>> --
>> Best Regards, Edward J. Yoon
>
>
>
> --
> Best Regards, Edward J. Yoon

Re: Use vector instead of Iterable in neuron API

Posted by "Edward J. Yoon" <ed...@apache.org>.
P.S., so, if you want to test more, please see FloatVector and
DenseFloatVector.

On Sun, Sep 4, 2016 at 11:13 AM, Edward J. Yoon <ed...@apache.org> wrote:
> Once we change the iterable input messages to the vector, we can
> change the legacy code like below:
>
> public void forward(FloatVector input) {
>   float sum = input.dot(this.getWeightVector());
>   this.feedforward(this.squashingFunction.apply(sum));
> }
>
>
>
>
> On Sat, Sep 3, 2016 at 11:10 PM, Baran Topal <ba...@barantopal.com> wrote:
>> Sure.
>>
>> In the attached, TestNeuron.txt,
>>
>> 1) I put // baran as a comment for the added functions.
>>
>> 2) The added functions and created objects have _ as suffix
>>
>> (e.g. backward_)
>>
>>
>> A correction: above test execution time values were via System.nanoTime().
>>
>> Br.
>>
>> 2016-09-03 14:05 GMT+02:00 Edward J. Yoon <ed...@apache.org>:
>>> Interesting. Can you share your test code?
>>>
>>> On Sat, Sep 3, 2016 at 2:17 AM, Baran Topal <ba...@barantopal.com> wrote:
>>>> Hi Edward and team;
>>>>
>>>> I had a brief test by refactoring Iterable to Vector and on
>>>> TestNeuron.java, I can see some improved times. I didn't check for
>>>> other existing test methods but it seems the execution times are
>>>> improving for both the forward and backward passes.
>>>>
>>>> These values are via System.currentTimeMillis().
>>>>
>>>> E.g.
>>>>
>>>>
>>>> Execution time for the forward function is: 5722329
>>>> Execution time for the backward function is: 31825
>>>>
>>>> Execution time for the refactored forward function is: 72330
>>>> Execution time for the refactored backward function is: 4665
>>>>
>>>> Br.
>>>>
>>>> 2016-09-02 2:14 GMT+02:00 Yeonhee Lee <ss...@gmail.com>:
>>>>> Hi Edward,
>>>>>
>>>>> If we don't have that kind of method in the neuron, I guess it's
>>>>> appropriate to put the method into the neuron.
>>>>> That can be one of the distinct features of Horn.
>>>>>
>>>>> Regards,
>>>>> Yeonhee
>>>>>
>>>>>
>>>>> 2016-08-26 9:40 GMT+09:00 Edward J. Yoon <ed...@samsung.com>:
>>>>>
>>>>>> Hi folks,
>>>>>>
>>>>>> Our current neuron API is designed like:
>>>>>> https://github.com/apache/incubator-horn/blob/master/README.md#programming-model
>>>>>>
>>>>>> In forward() method, each neuron receives the pairs of the inputs x1, x2,
>>>>>> ... xn from other neurons and weights w1, w2, ... wn like below:
>>>>>>
>>>>>>   public void forward(Iterable<M> messages) throws IOException;
>>>>>>
>>>>>> Instead of this, I suggest that we use just vector like below:
>>>>>>
>>>>>>   /**
>>>>>>    * @param input vector from other neurons
>>>>>>    */
>>>>>>   public void forward(Vector input) throws IOException;
>>>>>>
>>>>>> And, the neuron provides a getWeightVector() method that returns the
>>>>>> weight vector associated with itself. I think this makes more sense
>>>>>> than the current version, and will make it easier to use GPUs in the future.
>>>>>>
>>>>>> What do you think?
>>>>>>
>>>>>> Thanks.
>>>>>>
>>>>>> --
>>>>>> Best Regards, Edward J. Yoon
>>>>>>
>>>>>>
>>>>>>
>>>>>>
>>>>>>
>>>
>>>
>>>
>>> --
>>> Best Regards, Edward J. Yoon
>
>
>
> --
> Best Regards, Edward J. Yoon



-- 
Best Regards, Edward J. Yoon

Re: Use vector instead of Iterable in neuron API

Posted by "Edward J. Yoon" <ed...@apache.org>.
Once we change the iterable input messages to the vector, we can
change the legacy code like below:

public void forward(FloatVector input) {
  float sum = input.dot(this.getWeightVector());
  this.feedforward(this.squashingFunction.apply(sum));
}
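
To see why the vector form can pay off, here is a rough, self-contained micro-benchmark comparing an Iterable-style weighted sum against a plain array dot product. This is an illustrative sketch using plain Java arrays rather than Horn's FloatVector, and the numbers it prints are only indicative since there is no JIT warm-up:

```java
import java.util.ArrayList;
import java.util.List;

// Compares an Iterable-style weighted sum with an array-backed dot product,
// in the spirit of the TestNeuron timings discussed in this thread.
public class DotVsIterable {
  static float dot(float[] a, float[] b) {
    float sum = 0f;
    for (int i = 0; i < a.length; i++) sum += a[i] * b[i];
    return sum;
  }

  static float iterableSum(List<float[]> pairs) { // each element: {input, weight}
    float sum = 0f;
    for (float[] p : pairs) sum += p[0] * p[1];
    return sum;
  }

  public static void main(String[] args) {
    int n = 1_000_000;
    float[] in = new float[n], w = new float[n];
    List<float[]> pairs = new ArrayList<>(n);
    for (int i = 0; i < n; i++) {
      in[i] = 1.0f; w[i] = 0.5f;
      pairs.add(new float[] {1.0f, 0.5f});
    }

    long t0 = System.nanoTime();
    float s1 = iterableSum(pairs);
    long t1 = System.nanoTime();
    float s2 = dot(in, w);
    long t2 = System.nanoTime();

    System.out.println("iterable: " + (t1 - t0) + " ns, dot: " + (t2 - t1) + " ns");
    System.out.println(s1 == s2); // same weighted sum either way
  }
}
```

For trustworthy numbers a harness such as JMH would be preferable, but even this sketch shows the boxed, pointer-chasing Iterable path against the flat array path.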




On Sat, Sep 3, 2016 at 11:10 PM, Baran Topal <ba...@barantopal.com> wrote:
> Sure.
>
> In the attached, TestNeuron.txt,
>
> 1) I put // baran as a comment for the added functions.
>
> 2) The added functions and created objects have _ as suffix
>
> (e.g. backward_)
>
>
> A correction: above test execution time values were via System.nanoTime().
>
> Br.
>
> 2016-09-03 14:05 GMT+02:00 Edward J. Yoon <ed...@apache.org>:
>> Interesting. Can you share your test code?
>>
>> On Sat, Sep 3, 2016 at 2:17 AM, Baran Topal <ba...@barantopal.com> wrote:
>>> Hi Edward and team;
>>>
>>> I had a brief test by refactoring Iterable to Vector and on
>>> TestNeuron.java, I can see some improved times. I didn't check for
>>> other existing test methods but it seems the execution times are
>>> improving for both the forward and backward passes.
>>>
>>> These values are via System.currentTimeMillis().
>>>
>>> E.g.
>>>
>>>
>>> Execution time for the forward function is: 5722329
>>> Execution time for the backward function is: 31825
>>>
>>> Execution time for the refactored forward function is: 72330
>>> Execution time for the refactored backward function is: 4665
>>>
>>> Br.
>>>
>>> 2016-09-02 2:14 GMT+02:00 Yeonhee Lee <ss...@gmail.com>:
>>>> Hi Edward,
>>>>
>>>> If we don't have that kind of method in the neuron, I guess it's
>>>> appropriate to put the method into the neuron.
>>>> That can be one of the distinct features of Horn.
>>>>
>>>> Regards,
>>>> Yeonhee
>>>>
>>>>
>>>> 2016-08-26 9:40 GMT+09:00 Edward J. Yoon <ed...@samsung.com>:
>>>>
>>>>> Hi folks,
>>>>>
>>>>> Our current neuron API is designed like:
>>>>> https://github.com/apache/incubator-horn/blob/master/README.md#programming-model
>>>>>
>>>>> In forward() method, each neuron receives the pairs of the inputs x1, x2,
>>>>> ... xn from other neurons and weights w1, w2, ... wn like below:
>>>>>
>>>>>   public void forward(Iterable<M> messages) throws IOException;
>>>>>
>>>>> Instead of this, I suggest that we use just vector like below:
>>>>>
>>>>>   /**
>>>>>    * @param input vector from other neurons
>>>>>    */
>>>>>   public void forward(Vector input) throws IOException;
>>>>>
>>>>> And, the neuron provides a getWeightVector() method that returns the
>>>>> weight vector associated with itself. I think this makes more sense
>>>>> than the current version, and will make it easier to use GPUs in the future.
>>>>>
>>>>> What do you think?
>>>>>
>>>>> Thanks.
>>>>>
>>>>> --
>>>>> Best Regards, Edward J. Yoon
>>>>>
>>>>>
>>>>>
>>>>>
>>>>>
>>
>>
>>
>> --
>> Best Regards, Edward J. Yoon



-- 
Best Regards, Edward J. Yoon

Re: Use vector instead of Iterable in neuron API

Posted by Baran Topal <ba...@barantopal.com>.
Sure.

In the attached, TestNeuron.txt,

1) I put // baran as a comment for the added functions.

2) The added functions and created objects have _ as suffix

(e.g. backward_)


A correction: above test execution time values were via System.nanoTime().

Br.

2016-09-03 14:05 GMT+02:00 Edward J. Yoon <ed...@apache.org>:
> Interesting. Can you share your test code?
>
> On Sat, Sep 3, 2016 at 2:17 AM, Baran Topal <ba...@barantopal.com> wrote:
>> Hi Edward and team;
>>
>> I had a brief test by refactoring Iterable to Vector and on
>> TestNeuron.java, I can see some improved times. I didn't check for
>> other existing test methods but it seems the execution times are
>> improving for both the forward and backward passes.
>>
>> These values are via System.currentTimeMillis().
>>
>> E.g.
>>
>>
>> Execution time for the forward function is: 5722329
>> Execution time for the backward function is: 31825
>>
>> Execution time for the refactored forward function is: 72330
>> Execution time for the refactored backward function is: 4665
>>
>> Br.
>>
>> 2016-09-02 2:14 GMT+02:00 Yeonhee Lee <ss...@gmail.com>:
>>> Hi Edward,
>>>
>>> If we don't have that kind of method in the neuron, I guess it's
>>> appropriate to put the method into the neuron.
>>> That can be one of the distinct features of Horn.
>>>
>>> Regards,
>>> Yeonhee
>>>
>>>
>>> 2016-08-26 9:40 GMT+09:00 Edward J. Yoon <ed...@samsung.com>:
>>>
>>>> Hi folks,
>>>>
>>>> Our current neuron API is designed like:
>>>> https://github.com/apache/incubator-horn/blob/master/README.md#programming-model
>>>>
>>>> In forward() method, each neuron receives the pairs of the inputs x1, x2,
>>>> ... xn from other neurons and weights w1, w2, ... wn like below:
>>>>
>>>>   public void forward(Iterable<M> messages) throws IOException;
>>>>
>>>> Instead of this, I suggest that we use just vector like below:
>>>>
>>>>   /**
>>>>    * @param input vector from other neurons
>>>>    */
>>>>   public void forward(Vector input) throws IOException;
>>>>
>>>> And, the neuron provides a getWeightVector() method that returns the
>>>> weight vector associated with itself. I think this makes more sense
>>>> than the current version, and will make it easier to use GPUs in the future.
>>>>
>>>> What do you think?
>>>>
>>>> Thanks.
>>>>
>>>> --
>>>> Best Regards, Edward J. Yoon
>>>>
>>>>
>>>>
>>>>
>>>>
>
>
>
> --
> Best Regards, Edward J. Yoon

Re: Use vector instead of Iterable in neuron API

Posted by "Edward J. Yoon" <ed...@apache.org>.
Interesting. Can you share your test code?




-- 
Best Regards, Edward J. Yoon

Re: Use vector instead of Iterable in neuron API

Posted by Baran Topal <ba...@barantopal.com>.
Hi Edward and team;

I ran a brief test by refactoring Iterable to Vector in
TestNeuron.java, and I can see some improved times. I didn't check the
other existing test methods, but the execution times improve for both
the forward and backward passes.

These values were measured via System.currentTimeMillis().

E.g.


Execution time for the forward function is: 5722329
Execution time for the backward function is: 31825

Execution time for the refactored forward function is: 72330
Execution time for the refactored backward function is: 4665

Br.
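For reference, a micro-benchmark along these lines could be sketched as
below. This is only an illustration, not the actual TestNeuron code: the
message pairs are modeled as plain double[2] arrays and the class and
method names are hypothetical.

```java
import java.util.ArrayList;
import java.util.List;

// Hypothetical micro-benchmark comparing the two forward() styles:
// iterating over (input, weight) message pairs vs. a dense dot product
// over an input vector and the neuron's own weight vector.
public class ForwardBench {

    // current style: each message carries one (input, weight) pair
    static double forwardIterable(Iterable<double[]> messages) {
        double sum = 0.0;
        for (double[] m : messages) {
            sum += m[0] * m[1]; // m[0] = input, m[1] = weight
        }
        return sum;
    }

    // proposed style: input vector plus the neuron's weight vector
    static double forwardVector(double[] input, double[] weights) {
        double sum = 0.0;
        for (int i = 0; i < input.length; i++) {
            sum += input[i] * weights[i];
        }
        return sum;
    }

    public static void main(String[] args) {
        int n = 1_000_000;
        List<double[]> messages = new ArrayList<>(n);
        double[] input = new double[n];
        double[] weights = new double[n];
        for (int i = 0; i < n; i++) {
            input[i] = (i % 97) * 0.01;
            weights[i] = (i % 13) * 0.1;
            messages.add(new double[] { input[i], weights[i] });
        }

        long t0 = System.currentTimeMillis();
        double a = forwardIterable(messages);
        long t1 = System.currentTimeMillis();
        double b = forwardVector(input, weights);
        long t2 = System.currentTimeMillis();

        System.out.println("iterable: " + (t1 - t0)
            + " ms, vector: " + (t2 - t1) + " ms");
        // both paths compute the same weighted sum
        System.out.println(Math.abs(a - b) < 1e-6);
    }
}
```

A single wall-clock measurement like this is noisy (JIT warmup, GC), so
the numbers above should be read as a rough trend rather than a precise
speedup.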


Re: Use vector instead of Iterable in neuron API

Posted by Yeonhee Lee <ss...@gmail.com>.
Hi Edward,

If we don't have that kind of method in the neuron yet, I guess it's
appropriate to add it to the neuron.
That can be one of the distinct features of Horn.

Regards,
Yeonhee


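To make the proposal concrete, the vector-based API could look roughly
like the sketch below. All names here (DenseVector, SigmoidNeuron,
setWeightVector) are illustrative stand-ins, not the actual Horn
classes; the point is that forward() takes one dense input vector and
the weights are read from the neuron itself via getWeightVector().

```java
import java.io.IOException;

// Hypothetical sketch of the proposed neuron API.
public class VectorApiSketch {

    // minimal dense vector with the one operation forward() needs
    static class DenseVector {
        final double[] v;
        DenseVector(double... v) { this.v = v; }
        double dot(DenseVector o) {
            double s = 0.0;
            for (int i = 0; i < v.length; i++) s += v[i] * o.v[i];
            return s;
        }
    }

    static abstract class Neuron {
        private DenseVector weights;
        double output;

        // the framework would inject the weights; exposed to user code
        void setWeightVector(DenseVector w) { weights = w; }
        DenseVector getWeightVector() { return weights; }

        /** @param input vector of activations from the previous layer */
        abstract void forward(DenseVector input) throws IOException;
    }

    static class SigmoidNeuron extends Neuron {
        @Override
        void forward(DenseVector input) {
            double z = getWeightVector().dot(input); // one dot product
            output = 1.0 / (1.0 + Math.exp(-z));     // sigmoid activation
        }
    }

    public static void main(String[] args) throws IOException {
        SigmoidNeuron n = new SigmoidNeuron();
        n.setWeightVector(new DenseVector(0.5, -0.25, 1.0));
        n.forward(new DenseVector(1.0, 2.0, 0.0));
        System.out.println(n.output); // sigmoid(0.0) = 0.5
    }
}
```

Compared with iterating over message pairs, the whole forward pass
reduces to one dot product, which maps directly onto BLAS-style or GPU
kernels later.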