Posted to user@spark.apache.org by HARSH TAKKAR <ta...@gmail.com> on 2021/02/16 05:22:39 UTC

Using Custom Scala Spark ML Estimator in PySpark

Hi,

I have created a custom Estimator in Scala, which I can use successfully by
creating a pipeline model in Java and Scala. However, when I try to load a
pipeline model saved with the Scala API in PySpark, I get a "module not
found" error.

I have included my custom model jar on the classpath using "spark.jars".

Can you please help if I am missing something?

Kind Regards
Harsh Takkar
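
For context, a minimal sketch of the PySpark-side loading being described,
with placeholder jar and model paths (the custom Estimator's package name is
not given in the thread):

from pyspark.sql import SparkSession
from pyspark.ml import PipelineModel

# Start a session with the custom Estimator's jar shipped via spark.jars.
# Both paths below are placeholders.
spark = (
    SparkSession.builder
    .appName("load-custom-pipeline")
    .config("spark.jars", "/path/to/custom-estimator.jar")
    .getOrCreate()
)

# Load a PipelineModel that was saved from the Scala API; this is the step
# that reportedly fails with a "module not found" error, because PySpark
# looks for a Python class corresponding to each JVM stage.
model = PipelineModel.load("/path/to/saved/pipeline_model")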

Re: Using Custom Scala Spark ML Estimator in PySpark

Posted by Mich Talebzadeh <mi...@gmail.com>.
Hi,

Specifically, is this a runtime error or a compilation error?

I gather that by classpath you mean something like the below:

spark-submit --master yarn --deploy-mode client --driver-class-path
<full_path_to_custom_jar>  --jars ......
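
When launching from a PySpark script rather than spark-submit, roughly the
same settings can be passed through the SparkSession builder. A minimal
sketch with placeholder paths:

from pyspark.sql import SparkSession

# spark.jars ships the jar to the executors; spark.driver.extraClassPath
# plays the role of --driver-class-path above (it must take effect before
# the driver JVM starts). Both paths are placeholders.
spark = (
    SparkSession.builder
    .config("spark.jars", "/full/path/to/custom_estimator.jar")
    .config("spark.driver.extraClassPath", "/full/path/to/custom_estimator.jar")
    .getOrCreate()
)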

HTH

Re: Using Custom Scala Spark ML Estimator in PySpark

Posted by HARSH TAKKAR <ta...@gmail.com>.
Hello Sean,

Thanks for the advice. Can you please point me to an example of a custom
Python wrapper?


Kind Regards
Harsh Takkar


Re: Using Custom Scala Spark ML Estimator in PySpark

Posted by Sean Owen <sr...@gmail.com>.
You won't be able to use it in Python if it is implemented in Java/Scala; it
needs a Python wrapper too.
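
A rough sketch of what such a wrapper can look like, following the pattern
used by pyspark.ml's own classes (JavaEstimator/JavaModel from
pyspark.ml.wrapper). All names below are hypothetical; the Scala class is
assumed to live at com.example.ml.MyEstimator:

from pyspark import keyword_only
from pyspark.ml.param.shared import HasInputCol, HasOutputCol
from pyspark.ml.util import JavaMLReadable, JavaMLWritable
from pyspark.ml.wrapper import JavaEstimator, JavaModel


class MyEstimator(JavaEstimator, HasInputCol, HasOutputCol,
                  JavaMLReadable, JavaMLWritable):
    """Python-side wrapper around a hypothetical Scala Estimator."""

    @keyword_only
    def __init__(self, inputCol=None, outputCol=None):
        super(MyEstimator, self).__init__()
        # Instantiate the JVM-side Estimator; its jar must be on the classpath.
        self._java_obj = self._new_java_obj(
            "com.example.ml.MyEstimator", self.uid)
        kwargs = self._input_kwargs
        self.setParams(**kwargs)

    @keyword_only
    def setParams(self, inputCol=None, outputCol=None):
        kwargs = self._input_kwargs
        return self._set(**kwargs)

    def _create_model(self, java_model):
        # Wrap the fitted JVM model in its Python counterpart.
        return MyModel(java_model)


class MyModel(JavaModel, JavaMLReadable, JavaMLWritable):
    """Python-side wrapper around the fitted Scala model."""

One wrinkle to be aware of: when PySpark loads a pipeline saved from the
Scala API, it resolves each stage by turning the JVM class name into a
Python class name (replacing org.apache.spark with pyspark), so the wrapper
generally has to be importable under a module path matching the Scala
package, which is consistent with the "module not found" error reported
above.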
