Posted to issues@spark.apache.org by "Sean Owen (JIRA)" <ji...@apache.org> on 2015/04/11 13:57:13 UTC

[jira] [Resolved] (SPARK-6244) Implement VectorSpace to easily create a complicated feature vector

     [ https://issues.apache.org/jira/browse/SPARK-6244?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]

Sean Owen resolved SPARK-6244.
------------------------------
    Resolution: Won't Fix

This still isn't what a "vector space" means. Naming aside, this sounds like manipulation you can accomplish directly with Scala collections, or at best with a third-party library, before plugging the result directly into MLlib.
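For illustration, the collections-based approach suggested above might look like the following sketch. The `VectorOps` helper and its method names are hypothetical, not Spark API; it works on plain `Array[Double]` values.

```scala
// Hypothetical sketch of the three proposed operations using plain
// Scala collections, per the resolution comment; not part of Spark.
object VectorOps {
  // concat: flatten a sequence of vectors into a single vector
  def concat(vs: Seq[Array[Double]]): Array[Double] =
    vs.flatten.toArray

  // sum: element-wise sum of equally sized vectors
  def sum(vs: Seq[Array[Double]]): Array[Double] =
    vs.reduce((a, b) => a.zip(b).map { case (x, y) => x + y })

  // scaled: multiply every element of every vector by a scalar
  def scaled(vs: Seq[Array[Double]], k: Double): Seq[Array[Double]] =
    vs.map(_.map(_ * k))
}

object VectorOpsDemo extends App {
  val vs = Seq(Array(1.0, 0.0, 3.0))
  // Append a negated copy, then concat and sum, mirroring the proposal below.
  val vs2 = vs ++ VectorOps.scaled(vs, -1.0)
  println(VectorOps.concat(vs2).mkString(", "))
  println(VectorOps.sum(vs2).mkString(", "))
}
```

The resulting `Array[Double]` can be passed straight to `Vectors.dense` to obtain an MLlib vector.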

> Implement VectorSpace to easily create a complicated feature vector
> -------------------------------------------------------------------
>
>                 Key: SPARK-6244
>                 URL: https://issues.apache.org/jira/browse/SPARK-6244
>             Project: Spark
>          Issue Type: New Feature
>          Components: MLlib
>            Reporter: Kirill A. Korinskiy
>            Priority: Minor
>
> VectorSpace is a wrapper that implements three operations:
>  - concat -- concatenate all vectors into a single vector
>  - sum -- element-wise sum of the vectors
>  - scaled -- multiply each vector by a scalar
>  
> Example of usage:
> ```
> import org.apache.spark.mllib.linalg.Vectors
> import org.apache.spark.mllib.linalg.VectorSpace
> // Create a new VectorSpace containing one dense vector.
> val vs = VectorSpace.create(Vectors.dense(1.0, 0.0, 3.0))
> // Add the scaled (negated) vector space to the original one.
> val vs2 = vs.add(vs.scaled(-1.0))
> // Concatenate the vectors; result: Vectors.dense(1.0, 0.0, 3.0, -1.0, 0.0, -3.0)
> val concat = vs2.concat
> // Sum the vectors; result: Vectors.dense(0.0, 0.0, 0.0)
> val sum = vs2.sum
> ```
> This wrapper is very useful for building complicated feature vectors from structured objects.



--
This message was sent by Atlassian JIRA
(v6.3.4#6332)

---------------------------------------------------------------------
To unsubscribe, e-mail: issues-unsubscribe@spark.apache.org
For additional commands, e-mail: issues-help@spark.apache.org