Posted to dev@spark.apache.org by vineet chadha <st...@gmail.com> on 2016/11/25 22:33:09 UTC

Third party library

Hi,

I am trying to invoke a C library from the Spark stack using the JNI interface
(here is sample application code):


import org.apache.spark.{SparkConf, SparkContext}

class SimpleApp {
  // --- Native methods
  @native def foo(Top: String): String
}

object SimpleApp {
  def main(args: Array[String]) {
    val conf = new SparkConf()
      .setAppName("SimpleApplication")
      .set("SPARK_LIBRARY_PATH", "lib")
    val sc = new SparkContext(conf)
    System.loadLibrary("foolib")
    // instantiate the class
    val SimpleAppInstance = new SimpleApp
    // String passing - working
    val ret = SimpleAppInstance.foo("fooString")
  }
}

The above code works fine.

I have set up LD_LIBRARY_PATH, spark.executor.extraClassPath, and
spark.executor.extraLibraryPath on the worker node.

How can I invoke the JNI library from a worker node? Where should I load it
in the executor?
Calling System.loadLibrary("foolib") on the worker node gives me the
following error:

Exception in thread "main" java.lang.UnsatisfiedLinkError:

Any help would be really appreciated.

Re: Third party library

Posted by vineet chadha <st...@gmail.com>.
Thanks Jakob for sharing the link. Will try it out.

Regards,
Vineet


Re: Third party library

Posted by Jakob Odersky <ja...@odersky.com>.
Hi Vineet,
great to see you solved the problem! Since this just appeared in my
inbox, I wanted to take the opportunity for a shameless plug:
https://github.com/jodersky/sbt-jni. In case you're using sbt and also
developing the native library, this plugin may help with the pains of
building and packaging JNI applications.
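
For instance, a plugins.sbt entry along these lines (the exact organization
and version here are from memory and may differ; check the project README):

    // build coordinates are an assumption -- verify against the sbt-jni README
    addSbtPlugin("ch.jodersky" % "sbt-jni" % "1.2.6")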

cheers,
--Jakob



Re: Third party library

Posted by vineet chadha <st...@gmail.com>.
Thanks Steve and Kant. Apologies for the late reply; I was out on vacation.
Got it working. For other users:

def loadResources() {
  System.loadLibrary("foolib")
  val MyInstance = new MyClass
  val retstr = MyInstance.foo("mystring") // native method being invoked
}

val conf = new SparkConf()
  .setAppName("SimpleApplication")
  .set("SPARK_LIBRARY_PATH", "/lib/location")

val sc = new SparkContext(conf)

// Call the loader inside a partition-level task so the library is
// loaded in each executor JVM, not just on the driver
sc.parallelize(1 to 10, 2).mapPartitions { iter =>
  MySimpleApp.loadResources()
  iter
}.count
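
One way to get the native library onto each executor in the first place is
sketched below; --files and spark.executor.extraLibraryPath are standard
spark-submit options, but the paths and names are placeholders, and whether
the relative library path resolves this way depends on your cluster manager:

    spark-submit \
      --files /local/path/libfoolib.so \
      --conf spark.executor.extraLibraryPath=. \
      --class MySimpleApp myapp.jar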


Regards,
Vineet


Re: Third party library

Posted by Steve Loughran <st...@hortonworks.com>.
On 27 Nov 2016, at 02:55, kant kodali <ka...@gmail.com> wrote:

> I would say instead of LD_LIBRARY_PATH you might want to use java.library.path
>
> in the following way:
>
> java -Djava.library.path=/path/to/my/library or pass java.library.path along with spark-submit


This is only going to set up paths on the submitting system; to load JNI code in the executors, the binary needs to be sent to the far end and then put on the Java load path there.

Copy the relevant binary to somewhere on the PATH of the destination machine. Do that and you shouldn't have to worry about other JVM options (though it's been a few years since I did any JNI).

One trick: write a simple main() object/entry point which calls the JNI method and doesn't attempt to use any Spark libraries; have it log any exception and return an error code if the call failed. This will let you use it as a link test after deployment: if you can't run that class, then things are broken before you go near Spark.
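
A minimal sketch of such a link test (the class and library names are the
placeholders used elsewhere in this thread, not a real library):

    object JniLinkTest {
      def main(args: Array[String]): Unit = {
        try {
          System.loadLibrary("foolib")             // must resolve via java.library.path
          val result = new SimpleApp().foo("ping") // the @native method from the thread
          println(s"JNI call succeeded: $result")
        } catch {
          case t: Throwable =>
            t.printStackTrace() // surfaces UnsatisfiedLinkError with its message
            sys.exit(1)         // non-zero exit marks the link test as failed
        }
      }
    }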


Re: Third party library

Posted by kant kodali <ka...@gmail.com>.
I would say instead of LD_LIBRARY_PATH you might want to use
java.library.path,

in the following way:

java -Djava.library.path=/path/to/my/library

or pass java.library.path along with spark-submit.
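
Passing it through spark-submit might look like this (a sketch;
spark.driver.extraJavaOptions and spark.executor.extraJavaOptions are
standard Spark confs, the path is a placeholder, and in client mode the
driver option may need to be --driver-java-options instead):

    spark-submit \
      --conf "spark.driver.extraJavaOptions=-Djava.library.path=/path/to/my/library" \
      --conf "spark.executor.extraJavaOptions=-Djava.library.path=/path/to/my/library" \
      --class SimpleApp myapp.jar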


Re: Third party library

Posted by Gmail <vg...@gmail.com>.
Maybe you've already checked these out. Some basic questions that come to my mind are:
1) Is this library ("foolib" or "foo-C-library") available on the worker node?
2) If yes, is it accessible (rwx) by the user/program? Quick checks are sketched below.
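
For instance, from a shell on the worker (the path is a placeholder):

    ls -l /path/on/worker/libfoolib.so   # 1) is the file there, and readable?
    ldd /path/on/worker/libfoolib.so     # do its own dependencies resolve?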

Thanks,
Vasu. 


Re: Third party library

Posted by kant kodali <ka...@gmail.com>.
If it is working for the standalone program, I would think you can apply the
same settings across all the Spark worker and client machines and give that
a try. Let's start with that.


Re: Third party library

Posted by vineet chadha <st...@gmail.com>.
Just subscribed to  Spark User.  So, forwarding message again.

On Sat, Nov 26, 2016 at 11:50 AM, vineet chadha <st...@gmail.com>
wrote:

> Thanks Kant. Can you give me a sample program which allows me to call JNI
> from an executor task? I have JNI working in a standalone program in
> Scala/Java.
>
> Regards,
> Vineet

Re: Third party library

Posted by kant kodali <ka...@gmail.com>.
Yes, this is a Java JNI question. Nothing to do with Spark really.

java.lang.UnsatisfiedLinkError typically would mean the way you set up
LD_LIBRARY_PATH is wrong, unless you tell us that it is working for other
cases but not this one.


Re: Third party library

Posted by Reynold Xin <rx...@databricks.com>.
That's just standard JNI and has nothing to do with Spark, does it?


On Sat, Nov 26, 2016 at 11:19 AM, vineet chadha <st...@gmail.com>
wrote:

> Thanks Reynold for the quick reply.
>
> I have tried the following:
>
> class MySimpleApp {
>   // --- Native methods
>   @native def fooMethod(foo: String): String
> }
>
> object MySimpleApp {
>   val flag = false
>   def loadResources() {
>     System.loadLibrary("foo-C-library")
>     val flag = true
>   }
>   def main() {
>     sc.parallelize(1 to 10).mapPartitions ( iter => {
>       if (flag == false) {
>         MySimpleApp.loadResources()
>         val SimpleInstance = new MySimpleApp
>       }
>       SimpleInstance.fooMethod("fooString")
>       iter
>     })
>   }
> }
>
> I don't see a way to invoke fooMethod, which is implemented in
> foo-C-library. Am I missing something? If possible, can you point me to an
> existing implementation which I can refer to.
>
> Thanks again.
>
> ~

Re: Third party library

Posted by Reynold Xin <rx...@databricks.com>.
bcc dev@ and add user@


This is more a user@ list question rather than a dev@ list question. You
can do something like this:

object MySimpleApp {
  def loadResources(): Unit = // define some idempotent way to load
resources, e.g. with a flag or lazy val

  def main() = {
    ...

    sc.parallelize(1 to 10).mapPartitions { iter =>
      MySimpleApp.loadResources()

      // do whatever you want with the iterator
    }
  }
}
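
A concrete sketch of one such idempotent loader, using a lazy val as
suggested above ("foolib" is the placeholder name from this thread):

    object NativeLib {
      // Scala lazy vals are initialized at most once, thread-safely,
      // so the library is loaded exactly once per executor JVM
      lazy val loaded: Boolean = {
        System.loadLibrary("foolib")
        true
      }
    }

    // inside the task:
    //   sc.parallelize(1 to 10).mapPartitions { iter =>
    //     NativeLib.loaded   // forces the load on this executor
    //     iter
    //   }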





