Posted to user@spark.apache.org by Jaonary Rabarisoa <ja...@gmail.com> on 2014/03/11 12:06:50 UTC

OpenCV + Spark : Where to put System.loadLibrary ?

Hi all,

I'm trying to build a standalone Scala Spark application that uses OpenCV
for image processing.
To get OpenCV working with Scala, one needs to call

 System.loadLibrary(Core.NATIVE_LIBRARY_NAME)

once per JVM process. How do I call it inside a Spark application
distributed across several nodes?

Best regards,

Jaonary

Re: OpenCV + Spark : Where to put System.loadLibrary ?

Posted by Jishnu Prathap <ji...@wipro.com>.
Hi Jaonary Rabarisoa,
      Were you able to fix this issue? I am also trying to integrate
OpenCV with Spark. It would be very helpful if you could share your
experience integrating OpenCV with Spark, and some code showing how to
use Mat, IplImage, and Spark RDDs. I am relatively new to both Spark
and OpenCV, so any help would be much appreciated.

Thanks in Advance!!
			



--
View this message in context: http://apache-spark-user-list.1001560.n3.nabble.com/OpenCV-Spark-Where-to-put-System-loadLibrary-tp2523p21132.html
Sent from the Apache Spark User List mailing list archive at Nabble.com.

---------------------------------------------------------------------
To unsubscribe, e-mail: user-unsubscribe@spark.apache.org
For additional commands, e-mail: user-help@spark.apache.org


Re: OpenCV + Spark : Where to put System.loadLibrary ?

Posted by Tobias Pfeiffer <tg...@preferred.jp>.
Hi,

please see this post:


http://apache-spark-user-list.1001560.n3.nabble.com/Keep-state-inside-map-function-tp10968p11009.html

Where it says "some setup code here", you can add your code to load the
library. Note, however, that this runs not once per node but once per
partition, so it might be called more than once on each node.
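
The shape of that per-partition setup can be checked without a cluster; this sketch stands in for rdd.mapPartitions with plain Scala collections and replaces the library load with a counter (all names here are illustrative):

```scala
object PartitionSetupSketch {
  // Processes "partitions" of ints, running setup once per partition,
  // the way mapPartitions { iter => setup(); iter.map(...) } would.
  def processPartitions(partitions: List[List[Int]]): (List[Int], Int) = {
    var setupCalls = 0
    def setup(): Unit = { setupCalls += 1 }  // stand-in for System.loadLibrary

    val result = partitions.map { part =>
      setup()            // once per partition, not once per element
      part.map(_ * 2)
    }
    (result.flatten, setupCalls)
  }

  def main(args: Array[String]): Unit = {
    val (result, calls) = processPartitions(List(List(1, 2, 3), List(4, 5)))
    println(result.mkString(","))  // 2,4,6,8,10
    println(calls)                 // 2 -- setup ran once per partition
  }
}
```

With two partitions the setup runs twice, which is why a guard against repeated loading (or the singleton object below in this thread) is still needed.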

If you want to have setup code that is run once per node, put it in a Scala
"object" (as Matei pointed out back then). For example:

    object JvmLocalResource {
      val resource = {
        someInitFunction()
        new SomeResource()
      }
    }

Now if you use JvmLocalResource.resource, someInitFunction() will be
called exactly once in each JVM (so once per executor). If the library
loading is synchronous (i.e., it doesn't kick off some background action
that is still unfinished when you want to start processing), that should
do it.
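
The once-per-JVM guarantee of a Scala object's initializer can be verified without Spark or OpenCV at all; this sketch substitutes a counter for System.loadLibrary (names are illustrative):

```scala
object NativeLoaderSketch {
  // Counts how often the initializer below actually runs.
  var loads = 0

  // Stand-in for System.loadLibrary(Core.NATIVE_LIBRARY_NAME).
  // Runs the first time the object is touched, exactly once per JVM
  // (more precisely, once per classloader).
  val loaded: Boolean = {
    loads += 1
    true
  }
}

object LoaderDemo {
  def main(args: Array[String]): Unit = {
    // Touch the object from several "tasks"; init still happens only once.
    for (_ <- 1 to 5) assert(NativeLoaderSketch.loaded)
    println(NativeLoaderSketch.loads)  // 1
  }
}
```

Each task that references NativeLoaderSketch.loaded gets the already-initialized object, so the load is naturally idempotent per JVM.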

Tobias

Re: OpenCV + Spark : Where to put System.loadLibrary ?

Posted by kmatzen <km...@gmail.com>.
Reviving this thread in the hope of getting an exact snippet for the
correct way to do this in Scala. I had a solution for OpenCV that I
thought was correct, but half the time the library was not loaded by the
time it was needed.

Keep in mind that I am completely new to Scala, so you're going to have to
be pretty explicit.





Re: OpenCV + Spark : Where to put System.loadLibrary ?

Posted by Jaonary Rabarisoa <ja...@gmail.com>.
Do you have a snippet showing how to do this? I'm relatively new to Spark
and Scala, and for now my code is just a single file inspired by the Spark
examples:

import org.apache.spark.{SparkConf, SparkContext}
import org.opencv.core.Core

object SparkOpencv {
  def main(args: Array[String]) {
    val conf = new SparkConf()
             .setMaster("local[8]")
             .setAppName("SparkPi")
             .set("spark.executor.memory", "1g")
             .set("SPARK_LIBRARY_PATH", "/home/opencv/build/lib")
             .set("SPARK_PRINT_LAUNCH_COMMAND", "1")
             .set("SPARK_CLASSPATH", "/home/opencv/build/bin/opencv-300.jar")

    val spark = new SparkContext(conf)
    spark.addJar("/home/jrabarisoa/github/opencv/build/bin/opencv-300.jar")

    // This runs in the driver JVM only; the executors never execute it.
    System.loadLibrary(Core.NATIVE_LIBRARY_NAME)
  }
}





Re: OpenCV + Spark : Where to put System.loadLibrary ?

Posted by Matei Zaharia <ma...@gmail.com>.
In short, you should add it to a static initializer or singleton object that you call before accessing your library.

Also add your library to SPARK_LIBRARY_PATH so it can find the .so / .dll.
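
In later Spark versions the SPARK_LIBRARY_PATH environment variable was superseded by configuration properties; a hedged sketch of passing a native library directory at submit time (all paths are illustrative):

```shell
# Point both driver and executors at the directory containing the
# OpenCV .so/.dll, and ship the JNI wrapper jar alongside the app.
spark-submit \
  --jars /opt/opencv/build/bin/opencv-300.jar \
  --conf spark.driver.extraLibraryPath=/opt/opencv/build/lib \
  --conf spark.executor.extraLibraryPath=/opt/opencv/build/lib \
  --class SparkOpencv myapp.jar
```

The library path only makes the .so findable; the singleton-object call to System.loadLibrary is still what actually loads it in each JVM.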

Matei



Re: OpenCV + Spark : Where to put System.loadLibrary ?

Posted by Debasish Das <de...@gmail.com>.
Look at the jblas operations inside MLlib... jblas calls JNILoader
internally, which loads up native code when available....