Posted to issues@spark.apache.org by "Hyukjin Kwon (JIRA)" <ji...@apache.org> on 2019/05/21 04:00:26 UTC
[jira] [Updated] (SPARK-19475) (ML|MLlib).linalg.DenseVector method delegation fails for __neg__
[ https://issues.apache.org/jira/browse/SPARK-19475?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]
Hyukjin Kwon updated SPARK-19475:
---------------------------------
Labels: bulk-closed (was: )
> (ML|MLlib).linalg.DenseVector method delegation fails for __neg__
> -----------------------------------------------------------------
>
> Key: SPARK-19475
> URL: https://issues.apache.org/jira/browse/SPARK-19475
> Project: Spark
> Issue Type: Bug
> Components: ML, MLlib, PySpark
> Affects Versions: 2.0.0, 2.1.0, 2.2.0
> Reporter: Maciej Szymkiewicz
> Priority: Minor
> Labels: bulk-closed
>
> {{(ML|MLlib).linalg.DenseVector}} delegates a number of methods to NumPy. By design [it does the same|https://github.com/apache/spark/blob/933a6548d423cf17448207a99299cf36fc1a95f6/python/pyspark/mllib/linalg/__init__.py#L487] for {{__neg__}}, but the current {{_delegate}} method [expects binary operators|https://github.com/apache/spark/blob/933a6548d423cf17448207a99299cf36fc1a95f6/python/pyspark/mllib/linalg/__init__.py#L481], so the generated wrapper requires an {{other}} argument that unary negation never supplies.
> {code}
> Welcome to
> ____ __
> / __/__ ___ _____/ /__
> _\ \/ _ \/ _ `/ __/ '_/
> /__ / .__/\_,_/_/ /_/\_\ version 2.1.0
> /_/
> Using Python version 3.5.2 (default, Jul 2 2016 17:53:06)
> SparkSession available as 'spark'.
> In [1]: from pyspark.ml import linalg
> In [2]: -linalg.DenseVector([1, 2, 3])
> ---------------------------------------------------------------------------
> TypeError Traceback (most recent call last)
> <ipython-input-2-737fe2c5dfd8> in <module>()
> ----> 1 -linalg.DenseVector([1, 2, 3])
> TypeError: func() missing 1 required positional argument: 'other'
> {code}
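> A minimal standalone sketch of the failure mode and one possible fix, assuming a simplified DenseVector-like wrapper around a NumPy array (the class and operator list below are illustrative, not Spark's actual implementation): the delegate factory checks whether the operator is unary before deciding how many arguments the generated wrapper takes.
> {code}
> import numpy as np
>
> class DenseVector:
>     """Toy stand-in for pyspark's DenseVector, delegating operators to NumPy."""
>
>     def __init__(self, values):
>         self.array = np.array(values, dtype=np.float64)
>
>     def _delegate(op):  # runs at class-definition time, like Spark's helper
>         if op in ("__neg__", "__pos__", "__abs__"):
>             # Unary operators take no `other`; the original bug was
>             # generating a two-argument wrapper here as well.
>             def func(self):
>                 return DenseVector(getattr(self.array, op)())
>         else:
>             def func(self, other):
>                 if isinstance(other, DenseVector):
>                     other = other.array
>                 return DenseVector(getattr(self.array, op)(other))
>         return func
>
>     __add__ = _delegate("__add__")
>     __sub__ = _delegate("__sub__")
>     __neg__ = _delegate("__neg__")
>
>     def __repr__(self):
>         return "DenseVector(%s)" % self.array.tolist()
>
> v = DenseVector([1, 2, 3])
> neg = -v        # works once the unary branch exists
> total = v + v
> {code}
> With only the binary branch, {{-v}} raises the {{TypeError: func() missing 1 required positional argument: 'other'}} shown above.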
--
This message was sent by Atlassian JIRA
(v7.6.3#76005)
---------------------------------------------------------------------
To unsubscribe, e-mail: issues-unsubscribe@spark.apache.org
For additional commands, e-mail: issues-help@spark.apache.org