Posted to issues@spark.apache.org by "Joseph K. Bradley (JIRA)" <ji...@apache.org> on 2017/02/09 20:07:41 UTC

[jira] [Commented] (SPARK-13857) Feature parity for ALS ML with MLLIB

    [ https://issues.apache.org/jira/browse/SPARK-13857?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=15860108#comment-15860108 ] 

Joseph K. Bradley commented on SPARK-13857:
-------------------------------------------

Hi all, catching up on these many ALS discussions now.  This work to support evaluation and tuning for recommendation is great, but I'm worried that it won't be resolved in time for 2.2.  I've heard a lot of requests for the plain functionality available in spark.mllib for recommendUsers/Products, so I'd recommend we just add those methods for now as a short-term solution.  Let's keep working on the evaluation/tuning plans too.  I'll create a JIRA for adding basic recommendUsers/Products methods.

> Feature parity for ALS ML with MLLIB
> ------------------------------------
>
>                 Key: SPARK-13857
>                 URL: https://issues.apache.org/jira/browse/SPARK-13857
>             Project: Spark
>          Issue Type: Sub-task
>          Components: ML
>            Reporter: Nick Pentreath
>            Assignee: Nick Pentreath
>
> Currently {{mllib.recommendation.MatrixFactorizationModel}} has methods {{recommendProducts/recommendUsers}} for recommending top K to a given user / item, as well as {{recommendProductsForUsers/recommendUsersForProducts}} to recommend top K across all users/items.
> Additionally, SPARK-10802 is for adding the ability to do {{recommendProductsForUsers}} for a subset of users (or vice versa).
> Look at exposing or porting (as appropriate) these methods to ALS in ML. 
> Investigate if efficiency can be improved at the same time (see SPARK-11968).
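
For readers unfamiliar with the spark.mllib methods named above, here is a minimal pure-Python sketch of the computation they perform: a score is the dot product of a user factor vector and a product factor vector, and the top-K products by score are returned. The factor values, IDs, and helper names below are illustrative only, not the Spark API.

```python
# Sketch of recommendProducts / recommendProductsForUsers from
# mllib.recommendation.MatrixFactorizationModel (illustrative, not Spark code).
# Factors are dicts mapping an int ID to a learned latent-factor vector.

def dot(u, v):
    # Inner product of two equal-length factor vectors.
    return sum(a * b for a, b in zip(u, v))

def recommend_products(user_factors, product_factors, user_id, k):
    """Top-k products for one user, as (product_id, score) pairs."""
    u = user_factors[user_id]
    scores = [(pid, dot(u, pf)) for pid, pf in product_factors.items()]
    return sorted(scores, key=lambda x: -x[1])[:k]

def recommend_products_for_users(user_factors, product_factors, k):
    """Top-k products for every user (the all-users variant)."""
    return {uid: recommend_products(user_factors, product_factors, uid, k)
            for uid in user_factors}

# Made-up 2-dimensional factors for two users and three products.
user_factors = {1: [0.9, 0.1], 2: [0.2, 0.8]}
product_factors = {10: [1.0, 0.0], 20: [0.0, 1.0], 30: [0.5, 0.5]}

print(recommend_products(user_factors, product_factors, 1, 2))
# -> [(10, 0.9), (30, 0.5)]
```

The recommendUsers / recommendUsersForProducts variants are the same computation with the roles of the two factor matrices swapped.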



--
This message was sent by Atlassian JIRA
(v6.3.15#6346)

---------------------------------------------------------------------
To unsubscribe, e-mail: issues-unsubscribe@spark.apache.org
For additional commands, e-mail: issues-help@spark.apache.org