Posted to issues@spark.apache.org by "Rahul Palamuttam (JIRA)" <ji...@apache.org> on 2015/07/10 01:37:04 UTC

[jira] [Commented] (SPARK-6442) MLlib Local Linear Algebra Package

    [ https://issues.apache.org/jira/browse/SPARK-6442?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=14621452#comment-14621452 ] 

Rahul Palamuttam commented on SPARK-6442:
-----------------------------------------

Hi!
I'm fairly new here, but I've been dealing with a similar issue concerning matrix operations on Spark.
I noticed the note in the design doc about using the JBLAS API, since Java does not support operator overloading.
That is good, but MLlib should also provide the overloaded operators on the Scala side, as functions wrapping the suggested JBLAS API. I'm suggesting this based on my experience with ND4J, a linear algebra library that lets users switch between the Java API and Scala operators for their linear algebra operations.

Is it feasible to do this in MLlib?
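To make the suggestion concrete, here is a minimal sketch of the pattern ND4J uses: a Java-style named method does the work, and a Scala symbolic operator simply delegates to it. All names here (LocalMatrix, plus) are illustrative, not actual MLlib or JBLAS APIs:

```scala
// Hypothetical sketch: Scala operators wrapping a Java-style matrix API.
// LocalMatrix and plus() are illustrative names, not real MLlib/JBLAS APIs.
class LocalMatrix(val rows: Int, val cols: Int, val values: Array[Double]) {
  // Java-callable named method, as a BLAS-backed implementation might expose
  def plus(other: LocalMatrix): LocalMatrix = {
    require(rows == other.rows && cols == other.cols, "dimension mismatch")
    new LocalMatrix(rows, cols,
      values.zip(other.values).map { case (a, b) => a + b })
  }

  // Scala operator that just delegates to the named method
  def +(other: LocalMatrix): LocalMatrix = plus(other)
}

val a = new LocalMatrix(2, 2, Array(1.0, 2.0, 3.0, 4.0))
val b = new LocalMatrix(2, 2, Array(10.0, 20.0, 30.0, 40.0))
val c = a + b  // identical to a.plus(b)
```

Java users would call a.plus(b); Scala users get a + b for free, with no duplicated logic.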



> MLlib Local Linear Algebra Package
> ----------------------------------
>
>                 Key: SPARK-6442
>                 URL: https://issues.apache.org/jira/browse/SPARK-6442
>             Project: Spark
>          Issue Type: New Feature
>          Components: MLlib
>            Reporter: Burak Yavuz
>            Priority: Critical
>
> MLlib's local linear algebra package currently has no support for matrix operations. With 1.5, we wish to add a complete package of optimized linear algebra operations for Scala/Java users.
> The main goal is to support lazy operations, so that element-wise operations can be implemented in a single for-loop and complex operations can be interfaced through BLAS. 
> The design doc: http://goo.gl/sf5LCE
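
The lazy element-wise goal described above might look roughly like this: expressions build a small tree, and materialization fuses the whole tree into one loop with no intermediate matrices. This is only a sketch; the type and method names are illustrative, not the actual design-doc API:

```scala
// Hypothetical sketch of lazy element-wise fusion; names are illustrative.
sealed trait MatExpr {
  def rows: Int
  def cols: Int
  def apply(i: Int): Double                 // element at flat index i
  def +(other: MatExpr): MatExpr = Add(this, other)
  def *(s: Double): MatExpr = Scale(this, s)
  // Materialize: a single loop evaluates the expression tree per element
  def toArray: Array[Double] = Array.tabulate(rows * cols)(apply)
}
case class Dense(rows: Int, cols: Int, values: Array[Double]) extends MatExpr {
  def apply(i: Int): Double = values(i)
}
case class Add(l: MatExpr, r: MatExpr) extends MatExpr {
  val rows = l.rows; val cols = l.cols
  def apply(i: Int): Double = l(i) + r(i)
}
case class Scale(m: MatExpr, s: Double) extends MatExpr {
  val rows = m.rows; val cols = m.cols
  def apply(i: Int): Double = m(i) * s
}

val x = Dense(2, 2, Array(1.0, 2.0, 3.0, 4.0))
val y = Dense(2, 2, Array(1.0, 1.0, 1.0, 1.0))
val fused = ((x + y) * 2.0).toArray  // one pass, no temporaries
```

Building (x + y) * 2.0 allocates only small expression nodes; the arithmetic happens once, inside toArray.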



--
This message was sent by Atlassian JIRA
(v6.3.4#6332)

---------------------------------------------------------------------
To unsubscribe, e-mail: issues-unsubscribe@spark.apache.org
For additional commands, e-mail: issues-help@spark.apache.org