Posted to dev@spark.apache.org by Lochana Menikarachchi <lo...@gmail.com> on 2014/12/01 11:58:50 UTC

packaging spark run time with osgi service

I have spark core and mllib as dependencies for a Spark-based OSGi 
service. When I call the model-building method through a unit test 
(without OSGi) it works OK. When I call it through the OSGi service, 
nothing happens. I tried adding the spark assembly jar; now it throws 
the following error:

An error occurred while building supervised machine learning model: No 
configuration setting found for key 'akka.version'
com.typesafe.config.ConfigException$Missing: No configuration setting 
found for key 'akka.version'
     at com.typesafe.config.impl.SimpleConfig.findKey(SimpleConfig.java:115)
     at com.typesafe.config.impl.SimpleConfig.find(SimpleConfig.java:136)
     at com.typesafe.config.impl.SimpleConfig.find(SimpleConfig.java:142)
     at com.typesafe.config.impl.SimpleConfig.find(SimpleConfig.java:150)
     at com.typesafe.config.impl.SimpleConfig.find(SimpleConfig.java:155)
     at 
com.typesafe.config.impl.SimpleConfig.getString(SimpleConfig.java:197)

What is the correct way to include Spark runtime dependencies in an 
OSGi service? Thanks.

Lochana

---------------------------------------------------------------------
To unsubscribe, e-mail: dev-unsubscribe@spark.apache.org
For additional commands, e-mail: dev-help@spark.apache.org


Re: packaging spark run time with osgi service

Posted by Lochana Menikarachchi <lo...@gmail.com>.
I already tried the solutions suggested there; they did not work out.
On 12/2/14 8:17 AM, Dinesh J. Weerakkody wrote:
> Hi Lochana,
>
> can you please go through this mail thread [1]? I haven't tried it, 
> but it may be useful.
>
> [1] 
> http://apache-spark-user-list.1001560.n3.nabble.com/Packaging-a-spark-job-using-maven-td5615.html 
>
>
> -- 
> Thanks & Best Regards,
>
> *Dinesh J. Weerakkody*
> /www.dineshjweerakkody.com <http://www.dineshjweerakkody.com>/


Re: packaging spark run time with osgi service

Posted by Lochana Menikarachchi <lo...@gmail.com>.
I think the problem is that Akka is not picking up the 
reference.conf file in the assembly jar.
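One quick way to confirm this (the class name here is just a stand-in; in the bundle it would be JavaSparkContext or one of our own classes) is to ask a class loader directly whether it can see the resource:

```java
// Diagnostic sketch: check whether a given class loader can see Akka's
// reference.conf. Prints a URL if the resource is visible, "null" otherwise.
public class ResourceCheck {
    public static void main(String[] args) {
        ClassLoader cl = ResourceCheck.class.getClassLoader();
        System.out.println("reference.conf -> " + cl.getResource("reference.conf"));
    }
}
```

If this prints null inside the OSGi container but a URL in the unit test, the bundle class loader simply cannot reach the file.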

We managed to make Akka pick up the conf file by temporarily switching 
the thread context class loader:

Thread.currentThread().setContextClassLoader(JavaSparkContext.class.getClassLoader());
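Spelled out, the workaround looks roughly like this (a sketch with a local stand-in class as the anchor; in the service the anchor is JavaSparkContext, and restoring the previous loader in a finally block is the part that matters):

```java
// Sketch of the context-class-loader workaround. The anchor class here is
// a stand-in for JavaSparkContext.
public class ClassLoaderSwap {
    // Run `action` with the anchor class's loader as the thread context
    // loader, restoring the previous loader afterwards.
    static void withContextClassLoader(Class<?> anchor, Runnable action) {
        Thread t = Thread.currentThread();
        ClassLoader previous = t.getContextClassLoader();
        t.setContextClassLoader(anchor.getClassLoader());
        try {
            action.run();
        } finally {
            t.setContextClassLoader(previous); // always restore
        }
    }

    public static void main(String[] args) {
        ClassLoader before = Thread.currentThread().getContextClassLoader();
        withContextClassLoader(ClassLoaderSwap.class, () -> {
            // model building would happen here, inside the Spark calls
            System.out.println("inside: "
                + (Thread.currentThread().getContextClassLoader()
                   == ClassLoaderSwap.class.getClassLoader()));
        });
        System.out.println("restored: "
            + (Thread.currentThread().getContextClassLoader() == before));
    }
}
```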

The model gets built, but execution fails at a later stage with a Snappy error:

14/12/04 08:07:44 ERROR Executor: Exception in task 0.0 in stage 105.0 (TID 104)
java.lang.UnsatisfiedLinkError: org.xerial.snappy.SnappyNative.maxCompressedLength(I)I
	at org.xerial.snappy.SnappyNative.maxCompressedLength(Native Method)
	at org.xerial.snappy.Snappy.maxCompressedLength(Snappy.java:320)
	at org.xerial.snappy.SnappyOutputStream.<init>(SnappyOutputStream.java:79)
	at org.apache.spark.io.SnappyCompressionCodec.compressedOutputStream(CompressionCodec.scala:125)

According to the Akka documentation, a conf file can be passed with -Dconfig.file=, but we couldn't get it to work.
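For reference, this is the sort of invocation we tried (the path and launcher jar are placeholders for our setup); the property has to reach the JVM that actually hosts the OSGi container, since Typesafe Config reads it at load time:

```shell
# Point Typesafe Config / Akka at an external config file.
# Path and launcher jar are illustrative.
java -Dconfig.file=/opt/service/conf/akka.conf -jar osgi-container.jar
```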

Any ideas how to do this?

Lochana






Re: packaging spark run time with osgi service

Posted by "Dinesh J. Weerakkody" <di...@gmail.com>.
Hi Lochana,

Can you please go through this mail thread [1]? I haven't tried it, but it may
be useful.

[1]
http://apache-spark-user-list.1001560.n3.nabble.com/Packaging-a-spark-job-using-maven-td5615.html
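One approach that might also be worth trying is embedding the Spark jars directly into the bundle with the Felix maven-bundle-plugin. A rough, untested sketch (the instruction values are illustrative and would need tuning for your dependency tree):

```xml
<!-- Sketch: embed the bundle's compile-scope dependencies (including
     spark-core and spark-mllib) inside the OSGi bundle. Untested. -->
<plugin>
  <groupId>org.apache.felix</groupId>
  <artifactId>maven-bundle-plugin</artifactId>
  <extensions>true</extensions>
  <configuration>
    <instructions>
      <Embed-Dependency>*;scope=compile;inline=false</Embed-Dependency>
      <Embed-Transitive>true</Embed-Transitive>
    </instructions>
  </configuration>
</plugin>
```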



-- 
Thanks & Best Regards,

*Dinesh J. Weerakkody*
*www.dineshjweerakkody.com <http://www.dineshjweerakkody.com>*