Posted to issues@spark.apache.org by "Jen Darrouzet (JIRA)" <ji...@apache.org> on 2019/04/07 17:43:00 UTC
[jira] [Commented] (SPARK-26970) Can't load PipelineModel that was created in Scala with Python due to missing Interaction transformer
[ https://issues.apache.org/jira/browse/SPARK-26970?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16811932#comment-16811932 ]
Jen Darrouzet commented on SPARK-26970:
---------------------------------------
I am a newbie, but I would very much like to use the Interaction transformer in pyspark, and I am available to help QA test it if/when it becomes available.
> Can't load PipelineModel that was created in Scala with Python due to missing Interaction transformer
> -----------------------------------------------------------------------------------------------------
>
> Key: SPARK-26970
> URL: https://issues.apache.org/jira/browse/SPARK-26970
> Project: Spark
> Issue Type: Improvement
> Components: ML, PySpark
> Affects Versions: 2.4.0
> Reporter: Andrew Crosby
> Priority: Minor
>
> The Interaction transformer [https://github.com/apache/spark/blob/master/mllib/src/main/scala/org/apache/spark/ml/feature/Interaction.scala] is missing from the set of pyspark feature transformers [https://github.com/apache/spark/blob/master/python/pyspark/ml/feature.py]
> This means that it is impossible to create a model that includes an Interaction transformer in pyspark. It also means that attempting to load, in pyspark, a PipelineModel that was created in Scala and includes an Interaction transformer fails with the following error:
> {code:python}
> AttributeError: module 'pyspark.ml.feature' has no attribute 'Interaction'
> {code}
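For context, the Interaction transformer takes multiple numeric or vector input columns and emits a single vector column containing the product of every combination of one component drawn from each input. A minimal plain-Python sketch of that cross-product semantics (no Spark involved; the function name `interact` and the exact output ordering here are this sketch's assumptions, not Spark's API):

```python
from itertools import product
from functools import reduce
from operator import mul

def interact(*vectors):
    """Cross the input vectors: one output element per combination of
    components, each equal to the product of the chosen components.
    This only illustrates the idea behind Spark's Interaction
    transformer; the real transformer operates on DataFrame columns."""
    return [reduce(mul, combo) for combo in product(*vectors)]

# Two inputs of sizes 2 and 2 yield an output of size 2 * 2 = 4.
print(interact([2.0, 3.0], [4.0, 5.0]))  # -> [8.0, 10.0, 12.0, 15.0]
```

A single input passes through unchanged, and k inputs of sizes n1..nk yield an output vector of size n1 * ... * nk, which is why interacted feature spaces grow quickly.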
--
This message was sent by Atlassian JIRA
(v7.6.3#76005)
---------------------------------------------------------------------
To unsubscribe, e-mail: issues-unsubscribe@spark.apache.org
For additional commands, e-mail: issues-help@spark.apache.org