Posted to issues@spark.apache.org by "Nicola (Jira)" <ji...@apache.org> on 2022/02/01 17:44:00 UTC
[jira] [Updated] (SPARK-38083) set the amount of explained variance as parameter of pyspark.ml.feature.PCA
[ https://issues.apache.org/jira/browse/SPARK-38083?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]
Nicola updated SPARK-38083:
---------------------------
Summary: set the amount of explained variance as parameter of pyspark.ml.feature.PCA (was: set the amout of explained variance as parameter of pyspark.ml.feature.PCA)
> set the amount of explained variance as parameter of pyspark.ml.feature.PCA
> ---------------------------------------------------------------------------
>
> Key: SPARK-38083
> URL: https://issues.apache.org/jira/browse/SPARK-38083
> Project: Spark
> Issue Type: Wish
> Components: ML, MLlib
> Affects Versions: 3.2.2
> Reporter: Nicola
> Priority: Major
>
> As in [sklearn.decomposition.PCA|https://scikit-learn.org/stable/modules/generated/sklearn.decomposition.PCA.html], where:
> if {{0 < n_components < 1}}, the number of components is selected such that the amount of variance that needs to be explained is greater than the fraction specified by {{n_components}};
> it would be useful to have a similar behavior for the {{k}} parameter in pyspark.ml.feature.PCA.
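> Until such a parameter exists, the sklearn-style selection can be emulated as a workaround: fit PCA with {{k}} equal to the full input dimension, read the fitted model's {{explainedVariance}} vector, and pick the smallest k whose cumulative explained variance exceeds the requested fraction. A minimal sketch of that selection step ({{choose_k}} is a hypothetical helper, not part of any Spark API):

```python
def choose_k(explained_variance, threshold):
    """Return the smallest k whose cumulative explained variance
    exceeds `threshold`, a fraction in (0, 1).

    `explained_variance` is a sequence of per-component variance
    ratios, e.g. the values of PCAModel.explainedVariance.
    """
    total = 0.0
    for k, ratio in enumerate(explained_variance, start=1):
        total += ratio
        if total > threshold:
            return k
    # Threshold never reached: keep all components.
    return len(explained_variance)

# choose_k([0.6, 0.25, 0.1, 0.05], 0.8) -> 2
```

> One could then refit pyspark.ml.feature.PCA with the chosen k, at the cost of fitting twice.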
--
This message was sent by Atlassian Jira
(v8.20.1#820001)