Posted to dev@singa.apache.org by Greg Stein <gs...@gmail.com> on 2016/01/15 13:23:15 UTC

Re: [ANNOUNCE] Apache SINGA (incubating) 0.2.0 release

In the future, you MUST include the Incubation Disclaimer in your release
announcements. Unfortunately, your announcement email was erroneously
approved in moderation. Please correct your procedures for your next release.

Your download page should also include the disclaimer.

Thx,
-g

On Thu, Jan 14, 2016 at 9:42 PM, Wang Wei <wa...@apache.org> wrote:

> Hi,
>
> We are pleased to announce that Apache SINGA (incubating) 0.2.0 is
> released.
>
> SINGA is a general distributed deep learning platform for training big
> deep learning models over large datasets. It is designed with an intuitive
> programming model based on the layer abstraction. SINGA supports a wide
> variety of popular deep learning models.
>
> The release is available at:
> http://singa.apache.org/downloads.html
>
> The main features of this release include:
>
> * Training on GPU -- enabling training of complex models on a single node
> with multiple GPU cards.
> * Hybrid neural net partitioning -- supporting data and model parallelism
> at the same time.
> * Python wrapper -- making it easier to configure jobs, including the
> neural net and the SGD algorithm.
> * RNN model and BPTT algorithm -- supporting applications based on RNN
> models, e.g., GRU.
> * Cloud software integration, including Mesos, Docker, and HDFS.
> * Visualization of the neural net structure and layer information --
> helpful for debugging.
> * Linear algebra functions and random functions over Blobs and raw data
> pointers.
> * New layers, including SoftmaxLayer, ArgSortLayer, DummyLayer, RNN
> layers, and cuDNN layers.
> * Updated Layer class -- now carries multiple data/grad Blobs.
> * Extract features and test performance on new data by loading previously
> trained model parameters.
> * New Store class for IO operations.
>
> We look forward to hearing your feedback, suggestions, and contributions
> to the project (http://singa.apache.org/develop/schedule.html).
>
> On behalf of the SINGA team,
> Wei Wang
>
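The layer-based programming model described in the announcement can be illustrated with a small sketch. This is a hypothetical, self-contained example: the class and method names below are illustrative only and are not SINGA's actual API.

```python
import numpy as np

# Hypothetical sketch of a layer-abstraction programming model.
# Names are illustrative; they do not reflect SINGA's real classes.

class Layer:
    def forward(self, x):
        """Compute this layer's output feature from its input."""
        raise NotImplementedError

    def backward(self, grad):
        """Propagate the gradient back to this layer's input."""
        raise NotImplementedError

class ReLU(Layer):
    def forward(self, x):
        self.mask = x > 0          # remember which units were active
        return x * self.mask

    def backward(self, grad):
        return grad * self.mask    # gradient flows only through active units

class Net:
    """A net is an ordered list of layers; training walks it twice."""
    def __init__(self, layers):
        self.layers = layers

    def forward(self, x):
        for layer in self.layers:
            x = layer.forward(x)
        return x

    def backward(self, grad):
        for layer in reversed(self.layers):
            grad = layer.backward(grad)
        return grad

net = Net([ReLU(), ReLU()])
out = net.forward(np.array([-1.0, 2.0]))
print(out)  # -> [0. 2.]
```

The point of the abstraction is that a net is just a sequence of layers, each knowing only how to compute its feature and its gradient; hybrid partitioning then amounts to deciding which layers (or which slices of a layer) live on which node.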

Re: [ANNOUNCE] Apache SINGA (incubating) 0.2.0 release

Posted by Wang Wei <wa...@apache.org>.
Hi Greg,

Thanks!
We have added the disclaimer to the download page, and will include it in
the announcement email for the next release.

Best,
Wei

On Fri, Jan 15, 2016 at 8:23 PM, Greg Stein <gs...@gmail.com> wrote:

> In the future, you MUST include the Incubation Disclaimer in your release
> announcements. Unfortunately, your announcement email was erroneously
> approved in moderation. Please correct your procedures for your next release.
>
> Your download page should also include the disclaimer.
>
> Thx,
> -g
>
> On Thu, Jan 14, 2016 at 9:42 PM, Wang Wei <wa...@apache.org> wrote:
>
> > Hi,
> >
> > We are pleased to announce that Apache SINGA (incubating) 0.2.0 is
> > released.
> >
> > SINGA is a general distributed deep learning platform for training big
> > deep learning models over large datasets. It is designed with an
> intuitive
> > programming model based on the layer abstraction. SINGA supports a wide
> > variety of popular deep learning models.
> >
> > The release is available at:
> > http://singa.apache.org/downloads.html
> >
> > The main features of this release include:
> >
> > * Training on GPU -- enabling training of complex models on a single
> > node with multiple GPU cards.
> > * Hybrid neural net partitioning -- supporting data and model
> > parallelism at the same time.
> > * Python wrapper -- making it easier to configure jobs, including the
> > neural net and the SGD algorithm.
> > * RNN model and BPTT algorithm -- supporting applications based on RNN
> > models, e.g., GRU.
> > * Cloud software integration, including Mesos, Docker, and HDFS.
> > * Visualization of the neural net structure and layer information --
> > helpful for debugging.
> > * Linear algebra functions and random functions over Blobs and raw data
> > pointers.
> > * New layers, including SoftmaxLayer, ArgSortLayer, DummyLayer, RNN
> > layers, and cuDNN layers.
> > * Updated Layer class -- now carries multiple data/grad Blobs.
> > * Extract features and test performance on new data by loading
> > previously trained model parameters.
> > * New Store class for IO operations.
> >
> > We look forward to hearing your feedback, suggestions, and contributions
> > to the project (http://singa.apache.org/develop/schedule.html).
> >
> > On behalf of the SINGA team,
> > Wei Wang
> >
>
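The RNN model and BPTT algorithm listed among the release features can be sketched in a few lines of pure NumPy. This is an illustrative sketch of truncated backpropagation through time for a vanilla tanh RNN, not SINGA's implementation; all names and shapes here are made up for the example.

```python
import numpy as np

# Illustrative vanilla RNN with backpropagation through time (BPTT).
# Pure NumPy; nothing here reflects SINGA's actual API.

rng = np.random.default_rng(0)
H, D = 4, 3                               # hidden size, input size
Wx = rng.standard_normal((H, D)) * 0.1    # input-to-hidden weights
Wh = rng.standard_normal((H, H)) * 0.1    # hidden-to-hidden weights

def forward(xs):
    """Run the RNN over a sequence, caching hidden states for BPTT."""
    h = np.zeros(H)
    hs = [h]                              # hs[t] is the state BEFORE input t
    for x in xs:
        h = np.tanh(Wx @ x + Wh @ h)
        hs.append(h)
    return hs

def bptt(xs, hs, dh_last):
    """Backpropagate a gradient on the final hidden state through time."""
    dWx = np.zeros_like(Wx)
    dWh = np.zeros_like(Wh)
    dh = dh_last
    for t in reversed(range(len(xs))):
        da = (1.0 - hs[t + 1] ** 2) * dh  # tanh'(a) = 1 - tanh(a)^2
        dWx += np.outer(da, xs[t])
        dWh += np.outer(da, hs[t])
        dh = Wh.T @ da                    # gradient flows to h_{t-1}
    return dWx, dWh

xs = [rng.standard_normal(D) for _ in range(5)]
hs = forward(xs)
dWx, dWh = bptt(xs, hs, dh_last=np.ones(H))
print(dWx.shape, dWh.shape)  # -> (4, 3) (4, 4)
```

Gated variants such as the GRU mentioned in the announcement follow the same pattern, with a more elaborate per-step update in place of the single tanh.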
