Posted to issues@spark.apache.org by "Bryan Cutler (JIRA)" <ji...@apache.org> on 2019/04/23 20:55:01 UTC
[jira] [Resolved] (SPARK-26970) Can't load PipelineModel that was created in Scala with Python due to missing Interaction transformer
[ https://issues.apache.org/jira/browse/SPARK-26970?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]
Bryan Cutler resolved SPARK-26970.
----------------------------------
Resolution: Fixed
Fix Version/s: 3.0.0
Issue resolved by pull request 24426
[https://github.com/apache/spark/pull/24426]
> Can't load PipelineModel that was created in Scala with Python due to missing Interaction transformer
> -----------------------------------------------------------------------------------------------------
>
> Key: SPARK-26970
> URL: https://issues.apache.org/jira/browse/SPARK-26970
> Project: Spark
> Issue Type: Improvement
> Components: ML, PySpark
> Affects Versions: 2.4.0
> Reporter: Andrew Crosby
> Priority: Minor
> Fix For: 3.0.0
>
>
> The Interaction transformer ([https://github.com/apache/spark/blob/master/mllib/src/main/scala/org/apache/spark/ml/feature/Interaction.scala]) is missing from the set of pyspark feature transformers ([https://github.com/apache/spark/blob/master/python/pyspark/ml/feature.py]).
> This means that it is impossible to create a model that includes an Interaction transformer in pyspark. It also means that using pyspark to load a PipelineModel that was created in Scala and includes an Interaction transformer fails with the following error:
> {code:java}
> AttributeError: module 'pyspark.ml.feature' has no attribute 'Interaction'
> {code}
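> The failure mode above can be sketched without a Spark installation. The sketch below is an illustrative simplification, not PySpark's actual loading code: when PySpark deserializes a pipeline stage, it maps the stage's Java class name to a wrapper class looked up by name on the `pyspark.ml.feature` module, and a missing wrapper surfaces as exactly this `AttributeError`. The `resolve_stage` helper and the stand-in module here are hypothetical names for illustration.
> {code:java}
> import types
>
> # Stand-in for the real pyspark.ml.feature module; before Spark 3.0.0 it
> # had a VectorAssembler wrapper but no Interaction wrapper.
> feature_module = types.ModuleType("pyspark.ml.feature")
> feature_module.VectorAssembler = type("VectorAssembler", (), {})
>
> def resolve_stage(java_class_name):
>     """Map e.g. 'org.apache.spark.ml.feature.Interaction' to a Python
>     wrapper class by simple attribute lookup (hypothetical helper)."""
>     class_name = java_class_name.rsplit(".", 1)[1]
>     return getattr(feature_module, class_name)  # AttributeError if missing
>
> resolve_stage("org.apache.spark.ml.feature.VectorAssembler")  # resolves fine
> try:
>     resolve_stage("org.apache.spark.ml.feature.Interaction")
> except AttributeError as e:
>     print(e)  # module 'pyspark.ml.feature' has no attribute 'Interaction'
> {code}
> The fix in pull request 24426 adds the missing Interaction wrapper to pyspark.ml.feature, so the lookup succeeds from Spark 3.0.0 onward.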
--
This message was sent by Atlassian JIRA
(v7.6.3#76005)
---------------------------------------------------------------------
To unsubscribe, e-mail: issues-unsubscribe@spark.apache.org
For additional commands, e-mail: issues-help@spark.apache.org