Posted to dev@spark.apache.org by Mohit Jaggi <mo...@gmail.com> on 2018/07/15 06:01:18 UTC

Re: Pyspark access to scala/java libraries

Trying again… anyone know how to make this work?

> On Jul 9, 2018, at 3:45 PM, Mohit Jaggi <mo...@gmail.com> wrote:
> 
> Folks,
> I am writing some Scala/Java code and want it to be usable from pyspark.
> 
> For example:
> class MyStuff(addend: Int)  {
> 	def myMapFunction(x: Int) = x + addend
> }
> 
> I want to call it from pyspark as:
> 
> df = ...
> mystuff = sc._jvm.MyStuff(5)
> df['x'].map(lambda x: mystuff.myMapFunction(x))
> 
> How can I do this?
> 
> Mohit.
> 
> 


---------------------------------------------------------------------
To unsubscribe e-mail: dev-unsubscribe@spark.apache.org


Re: Pyspark access to scala/java libraries

Posted by HARSH TAKKAR <ta...@gmail.com>.
Hi

You can access your Java packages from PySpark using the following:

obj = sc._jvm.yourPackage.className()
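
For example, a minimal driver-side sketch; the package and class names
below are illustrative, and the jar is assumed to be on the classpath
(e.g. started with: pyspark --jars mystuff.jar):

from pyspark.sql import SparkSession

spark = SparkSession.builder.getOrCreate()
sc = spark.sparkContext

# Py4J instantiates the object in the driver's JVM; primitive arguments
# (int, float, str, bool) are converted automatically.
mystuff = sc._jvm.com.example.MyStuff(5)
print(mystuff.myMapFunction(3))  # prints 8, computed in the JVM

Note that the returned Py4J handle lives only on the driver, so it cannot
be captured in a lambda that runs on the executors; for that case see the
Stack Overflow link quoted below.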


Kind Regards
Harsh Takkar

On Wed, Jul 18, 2018 at 4:00 AM Mohit Jaggi <mo...@gmail.com> wrote:

> Thanks 0xF0F0F0 and Ashutosh for the pointers.
>
> Holden,
> I am trying to look into sparklingml... what am I looking for? Also, which
> chapter/page of your book should I look at?
>
> Mohit.
>
> On Sun, Jul 15, 2018 at 3:02 AM Holden Karau <ho...@gmail.com>
> wrote:
>
>> If you want to see some examples, a library that shows one way to do it is
>> https://github.com/sparklingpandas/sparklingml, and High Performance
>> Spark also talks about it.
>>
>> On Sun, Jul 15, 2018, 11:57 AM <0x...@protonmail.com.invalid> wrote:
>>
>>> Check
>>> https://stackoverflow.com/questions/31684842/calling-java-scala-function-from-a-task
>>>
>>> Sent with ProtonMail Secure Email.
>>>
>>> ‐‐‐‐‐‐‐ Original Message ‐‐‐‐‐‐‐
>>>
>>> On July 15, 2018 8:01 AM, Mohit Jaggi <mo...@gmail.com> wrote:
>>>
>>> > Trying again…anyone know how to make this work?
>>> >
>>> > > On Jul 9, 2018, at 3:45 PM, Mohit Jaggi mohitjaggi@gmail.com wrote:
>>> > >
>>> > > Folks,
>>> > >
>>> > > I am writing some Scala/Java code and want it to be usable from
>>> pyspark.
>>> > >
>>> > > For example:
>>> > >
>>> > > class MyStuff(addend: Int) {
>>> > >
>>> > > def myMapFunction(x: Int) = x + addend
>>> > >
>>> > > }
>>> > >
>>> > > I want to call it from pyspark as:
>>> > >
>>> > > df = ...
>>> > >
>>> > > mystuff = sc._jvm.MyStuff(5)
>>> > >
>>> > > df['x'].map(lambda x: mystuff.myMapFunction(x))
>>> > >
>>> > > How can I do this?
>>> > >
>>> > > Mohit.
>>> >
>>> > --
>>> >
>>> > To unsubscribe e-mail: dev-unsubscribe@spark.apache.org
>>>
>>>
>>>
>>> ---------------------------------------------------------------------
>>> To unsubscribe e-mail: dev-unsubscribe@spark.apache.org
>>>
>>>

Re: Pyspark access to scala/java libraries

Posted by Mohit Jaggi <mo...@gmail.com>.
Thanks 0xF0F0F0 and Ashutosh for the pointers.

Holden,
I am trying to look into sparklingml... what am I looking for? Also, which
chapter/page of your book should I look at?

Mohit.

On Sun, Jul 15, 2018 at 3:02 AM Holden Karau <ho...@gmail.com> wrote:

> If you want to see some examples, a library that shows one way to do it is
> https://github.com/sparklingpandas/sparklingml, and High Performance Spark
> also talks about it.
>
> On Sun, Jul 15, 2018, 11:57 AM <0x...@protonmail.com.invalid> wrote:
>
>> Check
>> https://stackoverflow.com/questions/31684842/calling-java-scala-function-from-a-task
>>
>> Sent with ProtonMail Secure Email.
>>
>> ‐‐‐‐‐‐‐ Original Message ‐‐‐‐‐‐‐
>>
>> On July 15, 2018 8:01 AM, Mohit Jaggi <mo...@gmail.com> wrote:
>>
>> > Trying again…anyone know how to make this work?
>> >
>> > > On Jul 9, 2018, at 3:45 PM, Mohit Jaggi mohitjaggi@gmail.com wrote:
>> > >
>> > > Folks,
>> > >
>> > > I am writing some Scala/Java code and want it to be usable from
>> pyspark.
>> > >
>> > > For example:
>> > >
>> > > class MyStuff(addend: Int) {
>> > >
>> > > def myMapFunction(x: Int) = x + addend
>> > >
>> > > }
>> > >
>> > > I want to call it from pyspark as:
>> > >
>> > > df = ...
>> > >
>> > > mystuff = sc._jvm.MyStuff(5)
>> > >
>> > > df['x'].map(lambda x: mystuff.myMapFunction(x))
>> > >
>> > > How can I do this?
>> > >
>> > > Mohit.
>> >
>> > --
>> >
>> > To unsubscribe e-mail: dev-unsubscribe@spark.apache.org
>>
>>
>>
>> ---------------------------------------------------------------------
>> To unsubscribe e-mail: dev-unsubscribe@spark.apache.org
>>
>>

Re: Pyspark access to scala/java libraries

Posted by Holden Karau <ho...@gmail.com>.
If you want to see some examples, a library that shows one way to do it is
https://github.com/sparklingpandas/sparklingml, and High Performance Spark
also talks about it.
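
For reference, the wrapper pattern such libraries use is, roughly, a thin
Python class holding a Py4J handle to the Scala implementation. A sketch
of that style for an ML Transformer, assuming a Scala
org.apache.spark.ml.Transformer published as com.example.MyTransformer
(an illustrative name):

from pyspark.ml.wrapper import JavaTransformer

class MyTransformer(JavaTransformer):
    """Python-side handle for a JVM Transformer (illustrative name)."""

    def __init__(self):
        super(MyTransformer, self).__init__()
        # instantiate the backing Scala object; its jar must be on the
        # classpath of both the driver and the executors
        self._java_obj = self._new_java_obj("com.example.MyTransformer")

# used like any built-in transformer:
# transformed_df = MyTransformer().transform(df)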

On Sun, Jul 15, 2018, 11:57 AM <0x...@protonmail.com.invalid> wrote:

> Check
> https://stackoverflow.com/questions/31684842/calling-java-scala-function-from-a-task
>
> Sent with ProtonMail Secure Email.
>
> ‐‐‐‐‐‐‐ Original Message ‐‐‐‐‐‐‐
>
> On July 15, 2018 8:01 AM, Mohit Jaggi <mo...@gmail.com> wrote:
>
> > Trying again…anyone know how to make this work?
> >
> > > On Jul 9, 2018, at 3:45 PM, Mohit Jaggi mohitjaggi@gmail.com wrote:
> > >
> > > Folks,
> > >
> > > I am writing some Scala/Java code and want it to be usable from
> pyspark.
> > >
> > > For example:
> > >
> > > class MyStuff(addend: Int) {
> > >
> > > def myMapFunction(x: Int) = x + addend
> > >
> > > }
> > >
> > > I want to call it from pyspark as:
> > >
> > > df = ...
> > >
> > > mystuff = sc._jvm.MyStuff(5)
> > >
> > > df['x'].map(lambda x: mystuff.myMapFunction(x))
> > >
> > > How can I do this?
> > >
> > > Mohit.
> >
> > --
> >
> > To unsubscribe e-mail: dev-unsubscribe@spark.apache.org
>
>
>
> ---------------------------------------------------------------------
> To unsubscribe e-mail: dev-unsubscribe@spark.apache.org
>
>

Re: Pyspark access to scala/java libraries

Posted by 0x...@protonmail.com.INVALID.
Check https://stackoverflow.com/questions/31684842/calling-java-scala-function-from-a-task
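
For the case the original question actually hits (calling the JVM code from
inside a distributed operation, where Py4J objects are unavailable on the
executors), one commonly suggested workaround, also covered at that link,
is exposing the Scala logic as a Java UDF so executors invoke it entirely
inside the JVM. A rough sketch, with illustrative class and jar names:

// Scala side: implement Spark's Java UDF interface
package com.example

import org.apache.spark.sql.api.java.UDF1

class MyMapFunction extends UDF1[Int, Int] {
  // the addend is fixed here because Spark creates UDF classes
  // through a no-argument constructor
  override def call(x: Int): Int = x + 5
}

# Python side (Spark 2.3+; older releases expose this as
# sqlContext.registerJavaFunction), assuming the jar was passed via --jars
from pyspark.sql import SparkSession
from pyspark.sql.types import IntegerType

spark = SparkSession.builder.getOrCreate()
spark.udf.registerJavaFunction("myMapFunction", "com.example.MyMapFunction",
                               IntegerType())

df = spark.createDataFrame([(1,), (2,)], ["x"])
df.selectExpr("myMapFunction(x) AS y").show()  # y = x + 5, computed in the JVM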

Sent with ProtonMail Secure Email.

‐‐‐‐‐‐‐ Original Message ‐‐‐‐‐‐‐

On July 15, 2018 8:01 AM, Mohit Jaggi <mo...@gmail.com> wrote:

> Trying again…anyone know how to make this work?
> 
> > On Jul 9, 2018, at 3:45 PM, Mohit Jaggi mohitjaggi@gmail.com wrote:
> > 
> > Folks,
> > 
> > I am writing some Scala/Java code and want it to be usable from pyspark.
> > 
> > For example:
> > 
> > class MyStuff(addend: Int) {
> > 
> > def myMapFunction(x: Int) = x + addend
> > 
> > }
> > 
> > I want to call it from pyspark as:
> > 
> > df = ...
> > 
> > mystuff = sc._jvm.MyStuff(5)
> > 
> > df['x'].map(lambda x: mystuff.myMapFunction(x))
> > 
> > How can I do this?
> > 
> > Mohit.
> 
> --
> 
> To unsubscribe e-mail: dev-unsubscribe@spark.apache.org



---------------------------------------------------------------------
To unsubscribe e-mail: dev-unsubscribe@spark.apache.org