Posted to dev@mahout.apache.org by "Suneel Marthi (JIRA)" <ji...@apache.org> on 2016/04/23 19:24:12 UTC

[jira] [Issue Comment Deleted] (MAHOUT-1791) Automatic threading for java based mmul in the front end and the backend.

     [ https://issues.apache.org/jira/browse/MAHOUT-1791?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]

Suneel Marthi updated MAHOUT-1791:
----------------------------------
    Comment: was deleted

(was: Andrew/Dmitry,
I am wondering if there are any coding-related efforts here that I could help investigate, or are there already enough engineers working on this?)

> Automatic threading for java based mmul in the front end and the backend.
> -------------------------------------------------------------------------
>
>                 Key: MAHOUT-1791
>                 URL: https://issues.apache.org/jira/browse/MAHOUT-1791
>             Project: Mahout
>          Issue Type: Improvement
>    Affects Versions: 0.11.1, 0.12.0, 0.11.2
>            Reporter: Dmitriy Lyubimov
>            Assignee: Andrew Musselman
>             Fix For: 0.12.1
>
>
> As we know, we are still struggling with the decision of which path to take for bare-metal acceleration of in-core math.
> Meanwhile, a simple no-brainer improvement is to add decision paths and apply multithreaded matrix-matrix multiplication (and perhaps other operations as well; but mmul is currently the most prominent beneficiary, being both easy to parallelize and likely to yield a statistically significant improvement).
> So adding multithreaded logic to mmul is one path.
> Another path is automatic adjustment of the degree of multithreading.
> In the front end, we probably want to utilize all available cores.
> In the backend, we can oversubscribe cores, but doing so by more than 2x or 3x is probably inadvisable because of diminishing returns driven by the growing likelihood of context-switching overhead (a sketch of this heuristic follows after the quoted description).
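
Below is a minimal, illustrative sketch in plain Java of the two ideas in the quoted description: a thread-count decision (all cores in the front end, capped oversubscription in the backend) and a row-block-parallel dense mmul. The class name, method names, and the 2x cap are assumptions for illustration only, not Mahout's actual in-core API.

import java.util.ArrayList;
import java.util.List;
import java.util.concurrent.Callable;
import java.util.concurrent.ExecutorService;
import java.util.concurrent.Executors;

/**
 * Illustrative row-block-parallel dense matrix multiply (C = A * B).
 * The thread-count heuristic mirrors the ticket's idea: use all available
 * cores in the front end, cap oversubscription in the backend.
 * Names and the 2x cap are hypothetical, not Mahout API.
 */
public class ParallelMmulSketch {

  /** Pick a thread count: all cores in the front end; at most ~2x
   *  oversubscription in the backend (assumed cap for illustration). */
  static int decideThreads(boolean backend) {
    int cores = Runtime.getRuntime().availableProcessors();
    return backend ? Math.max(1, 2 * cores) : cores;
  }

  /** Multiply row-major dense matrices a (m x k) and b (k x n). */
  static double[][] mmul(double[][] a, double[][] b, int threads) throws Exception {
    int m = a.length, k = b.length, n = b[0].length;
    double[][] c = new double[m][n];
    ExecutorService pool = Executors.newFixedThreadPool(threads);
    try {
      List<Callable<Void>> tasks = new ArrayList<>();
      int block = Math.max(1, (m + threads - 1) / threads);
      for (int start = 0; start < m; start += block) {
        final int lo = start, hi = Math.min(m, start + block);
        tasks.add(() -> {
          // Each task owns a disjoint block of result rows, so no locking is needed.
          for (int i = lo; i < hi; i++)
            for (int p = 0; p < k; p++) {
              double aip = a[i][p];
              for (int j = 0; j < n; j++) c[i][j] += aip * b[p][j];
            }
          return null;
        });
      }
      pool.invokeAll(tasks);  // blocks until all row blocks are done
    } finally {
      pool.shutdown();
    }
    return c;
  }

  public static void main(String[] args) throws Exception {
    double[][] a = {{1, 2}, {3, 4}};
    double[][] b = {{5, 6}, {7, 8}};
    double[][] c = mmul(a, b, decideThreads(false));
    System.out.println(c[0][0] + " " + c[0][1]);  // 19.0 22.0
    System.out.println(c[1][0] + " " + c[1][1]);  // 43.0 50.0
  }
}

In a real integration the decision path would also fall back to the single-threaded kernel for small matrices, since the pool setup and task scheduling overhead dominates below some size threshold.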



--
This message was sent by Atlassian JIRA
(v6.3.4#6332)