Posted to user@spark.apache.org by sp...@orbit-x.de on 2014/09/11 16:34:33 UTC

Re[2]: HBase 0.96+ with Spark 1.0+

Hi guys,

any luck with this issue, anyone?

I have also tried all the possible exclusion combos, to no avail.

thanks for your ideas
reinis

-----Original Message----- 
> From: "Stephen Boesch" <ja...@gmail.com> 
> To: user <us...@spark.apache.org> 
> Date: 28-06-2014 15:12 
> Subject: Re: HBase 0.96+ with Spark 1.0+ 
> 
> Hi Siyuan,
 Thanks for the input. We prefer to use SparkBuild.scala instead of Maven. I did not see any protobuf.version-related settings in that file. But - as noted by Sean Owen - in any case the issue we are facing presently is the duplicate, incompatible javax.servlet entries, apparently from the org.mortbay artifacts.
 
 
> 
> 2014-06-28 6:01 GMT-07:00 Siyuan he <hs...@gmail.com>:
> Hi Stephen,
> 
I am using Spark 1.0 + HBase 0.96.2. This is what I did:
1) rebuild Spark using: mvn -Dhadoop.version=2.3.0 -Dprotobuf.version=2.5.0 -DskipTests clean package
2) In spark-env.sh, set SPARK_CLASSPATH=/path-to/hbase-protocol-0.96.2-hadoop2.jar 

> 
Hopefully it can help.
Siyuan
 
 
> 
> On Sat, Jun 28, 2014 at 8:52 AM, Stephen Boesch <ja...@gmail.com> wrote:
>  
> 
Thanks Sean. I had actually already added an exclusion rule for org.mortbay.jetty, and that had not resolved it.
> 
Just in case, I used your precise formulation:

> 
val excludeMortbayJetty = ExclusionRule(organization = "org.mortbay.jetty")
..
 
  ,("org.apache.spark" % "spark-core_2.10" % sparkVersion  withSources()).excludeAll(excludeMortbayJetty)
  ,("org.apache.spark" % "spark-sql_2.10" % sparkVersion  withSources()).excludeAll(excludeMortbayJetty)

> 
However the same error still recurs:

> 
14/06/28 05:48:35 INFO HttpServer: Starting HTTP Server
[error] (run-main-0) java.lang.SecurityException: class "javax.servlet.FilterRegistration"'s signer information does not match signer information of other classes in the same package
java.lang.SecurityException: class "javax.servlet.FilterRegistration"'s signer information does not match signer information of other classes in the same package
 
 

> 

> 

> 
> 2014-06-28 4:22 GMT-07:00 Sean Owen <so...@cloudera.com>:

> This sounds like an instance of roughly the same item as in
> https://issues.apache.org/jira/browse/SPARK-1949  Have a look at
> adding that exclude to see if it works.
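> 
> A minimal sketch of that kind of exclude, matching the rule Reinis formulates later in this thread (the exact rule attached to SPARK-1949 may differ):
> 
>     ExclusionRule(organization = "org.mortbay.jetty", name = "servlet-api-2.5")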
> 

> On Fri, Jun 27, 2014 at 10:21 PM, Stephen Boesch <ja...@gmail.com> wrote:
> > The present trunk is built and tested against HBase 0.94.
> >
> >
> > I have tried various combinations of versions of HBase 0.96+ and Spark 1.0+
> > and all end up with
> >
> > 14/06/27 20:11:15 INFO HttpServer: Starting HTTP Server
> > [error] (run-main-0) java.lang.SecurityException: class
> > "javax.servlet.FilterRegistration"'s signer information does not match
> > signer information of other classes in the same package
> > java.lang.SecurityException: class "javax.servlet.FilterRegistration"'s
> > signer information does not match signer information of other classes in the
> > same package
> >         at java.lang.ClassLoader.checkCerts(ClassLoader.java:952)
> >
> >
> > I have tried a number of different ways to exclude javax.servlet related
> > jars. But none have avoided this error.
> >
> > Anyone have a (small-ish) build.sbt that works with later versions of HBase?
> >
> >
>  
 
 
>  
 
 
>  
 
>  




---------------------------------------------------------------------
To unsubscribe, e-mail: user-unsubscribe@spark.apache.org
For additional commands, e-mail: user-help@spark.apache.org


Re: HBase 0.96+ with Spark 1.0+

Posted by Ted Yu <yu...@gmail.com>.
The stack trace mentions an OutOfMemory error (PermGen space). 
See:
http://stackoverflow.com/questions/3003855/increase-permgen-space
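
A minimal sketch of raising PermGen for tests forked by sbt (the trace below shows sbt.ForkMain), assuming sbt 0.13-style keys; the 256m size is an arbitrary starting point:

    fork in Test := true
    javaOptions in Test ++= Seq("-XX:MaxPermSize=256m")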

On Sep 18, 2014, at 1:59 AM, Reinis Vicups <sp...@orbit-x.de> wrote:

> I am humbly bumping this since, even after another week of trying, I haven't had any luck fixing this yet.
> 
> On 14.09.2014 19:21, Reinis Vicups wrote:
>> I did actually try Sean's suggestion just before I posted for the first time in this thread. I got an error when doing this and thought that I was not understanding what Sean was suggesting.
>> 
>> Now I have re-attempted your suggestions with the Spark 1.0.0-cdh5.1.0, HBase 0.98.1-cdh5.1.0 and Hadoop 2.3.0-cdh5.1.0 that I am currently using.
>> 
>> I used the following:
>> 
>>   val mortbayEnforce = "org.mortbay.jetty" % "servlet-api" % "3.0.20100224"
>>   val mortbayExclusion = ExclusionRule(organization = "org.mortbay.jetty", name = "servlet-api-2.5")
>> 
>> and applied this to hadoop and hbase dependencies e.g. like this:
>> 
>> val hbase = Seq(HBase.server, HBase.common, HBase.compat, HBase.compat2, HBase.protocol, mortbayEnforce).map(_.excludeAll(HBase.exclusions: _*))
>> 
>> private object HBase {
>>     val server = "org.apache.hbase"  % "hbase-server" % Version.HBase
>>     ...
>>     val exclusions = Seq(ExclusionRule("org.apache.ant"), mortbayExclusion)
>> }
>> 
>> I still get the error I got the last time I tried this experiment:
>> 
>> 14/09/14 18:28:09 ERROR metrics.MetricsSystem: Sink class org.apache.spark.metrics.sink.MetricsServlet cannot be instantialized
>> java.lang.reflect.InvocationTargetException
>>     at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native         Method)
>>     at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:57)
>>     at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
>>     at java.lang.reflect.Constructor.newInstance(Constructor.java:525)
>>     at org.apache.spark.metrics.MetricsSystem$$anonfun$registerSinks$1.apply(MetricsSystem.scala:136)
>>     at org.apache.spark.metrics.MetricsSystem$$anonfun$registerSinks$1.apply(MetricsSystem.scala:130)
>>     at scala.collection.mutable.HashMap$$anonfun$foreach$1.apply(HashMap.scala:98)
>>     at scala.collection.mutable.HashMap$$anonfun$foreach$1.apply(HashMap.scala:98)
>>     at scala.collection.mutable.HashTable$class.foreachEntry(HashTable.scala:226)
>>     at scala.collection.mutable.HashMap.foreachEntry(HashMap.scala:39)
>>     at scala.collection.mutable.HashMap.foreach(HashMap.scala:98)
>>     at org.apache.spark.metrics.MetricsSystem.registerSinks(MetricsSystem.scala:130)
>>     at org.apache.spark.metrics.MetricsSystem.<init>(MetricsSystem.scala:84)
>>     at org.apache.spark.metrics.MetricsSystem$.createMetricsSystem(MetricsSystem.scala:167)
>>     at org.apache.spark.SparkEnv$.create(SparkEnv.scala:230)
>>     at org.apache.spark.SparkContext.<init>(SparkContext.scala:202)
>>     at d.s.f.s.t.SimpleTicketTextSimilaritySparkJobSpec$$anonfun$1.apply$mcV$sp(SimpleTicketTextSimilaritySparkJobSpec.scala:29)
>>     at d.s.f.s.t.SimpleTicketTextSimilaritySparkJobSpec$$anonfun$1.apply(SimpleTicketTextSimilaritySparkJobSpec.scala:21)
>>     at d.s.f.s.t.SimpleTicketTextSimilaritySparkJobSpec$$anonfun$1.apply(SimpleTicketTextSimilaritySparkJobSpec.scala:21)
>>     at org.scalatest.Transformer$$anonfun$apply$1.apply$mcV$sp(Transformer.scala:22)
>>     at org.scalatest.OutcomeOf$class.outcomeOf(OutcomeOf.scala:85)
>>     at org.scalatest.OutcomeOf$.outcomeOf(OutcomeOf.scala:104)
>>     at org.scalatest.Transformer.apply(Transformer.scala:22)
>>     at org.scalatest.Transformer.apply(Transformer.scala:20)
>>     at org.scalatest.FlatSpecLike$$anon$1.apply(FlatSpecLike.scala:1647)
>>     at org.scalatest.Suite$class.withFixture(Suite.scala:1122)
>>     at org.scalatest.FlatSpec.withFixture(FlatSpec.scala:1683)
>>     at org.scalatest.FlatSpecLike$class.invokeWithFixture$1(FlatSpecLike.scala:1644)
>>     at org.scalatest.FlatSpecLike$$anonfun$runTest$1.apply(FlatSpecLike.scala:1656)
>>     at org.scalatest.FlatSpecLike$$anonfun$runTest$1.apply(FlatSpecLike.scala:1656)
>>     at org.scalatest.SuperEngine.runTestImpl(Engine.scala:306)
>>     at org.scalatest.FlatSpecLike$class.runTest(FlatSpecLike.scala:1656)
>>     at org.scalatest.FlatSpec.runTest(FlatSpec.scala:1683)
>>     at org.scalatest.FlatSpecLike$$anonfun$runTests$1.apply(FlatSpecLike.scala:1714)
>>     at org.scalatest.FlatSpecLike$$anonfun$runTests$1.apply(FlatSpecLike.scala:1714)
>>     at org.scalatest.SuperEngine$$anonfun$traverseSubNodes$1$1.apply(Engine.scala:413)
>>     at org.scalatest.SuperEngine$$anonfun$traverseSubNodes$1$1.apply(Engine.scala:401)
>>     at scala.collection.immutable.List.foreach(List.scala:318)
>>     at org.scalatest.SuperEngine.traverseSubNodes$1(Engine.scala:401)
>>     at org.scalatest.SuperEngine.org$scalatest$SuperEngine$$runTestsInBranch(Engine.scala:396)
>>     at org.scalatest.SuperEngine.runTestsImpl(Engine.scala:483)
>>     at org.scalatest.FlatSpecLike$class.runTests(FlatSpecLike.scala:1714)
>>     at org.scalatest.FlatSpec.runTests(FlatSpec.scala:1683)
>>     at org.scalatest.Suite$class.run(Suite.scala:1424)
>>     at org.scalatest.FlatSpec.org$scalatest$FlatSpecLike$$super$run(FlatSpec.scala:1683)
>>     at org.scalatest.FlatSpecLike$$anonfun$run$1.apply(FlatSpecLike.scala:1760)
>>     at org.scalatest.FlatSpecLike$$anonfun$run$1.apply(FlatSpecLike.scala:1760)
>>     at org.scalatest.SuperEngine.runImpl(Engine.scala:545)
>>     at org.scalatest.FlatSpecLike$class.run(FlatSpecLike.scala:1760)
>>     at d.s.f.s.t.SimpleTicketTextSimilaritySparkJobSpec.org$scalatest$BeforeAndAfterAll$$super$run(SimpleTicketTextSimilaritySparkJobSpec.scala:12)
>>     at org.scalatest.BeforeAndAfterAll$class.liftedTree1$1(BeforeAndAfterAll.scala:257)
>>     at org.scalatest.BeforeAndAfterAll$class.run(BeforeAndAfterAll.scala:256)
>>     at d.s.f.s.t.SimpleTicketTextSimilaritySparkJobSpec.run(SimpleTicketTextSimilaritySparkJobSpec.scala:12)
>>     at org.scalatest.tools.Framework.org$scalatest$tools$Framework$$runSuite(Framework.scala:462)
>>     at org.scalatest.tools.Framework$ScalaTestTask.execute(Framework.scala:671)
>>     at sbt.ForkMain$Run$2.call(ForkMain.java:294)
>>     at sbt.ForkMain$Run$2.call(ForkMain.java:284)
>>     at java.util.concurrent.FutureTask$Sync.innerRun(FutureTask.java:334)
>>     at java.util.concurrent.FutureTask.run(FutureTask.java:166)
>>     at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1145)
>>     at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:615)
>>     at java.lang.Thread.run(Thread.java:722)
>> Caused by: java.lang.OutOfMemoryError: PermGen space
>>     at java.lang.ClassLoader.defineClass1(Native Method)
>>     at java.lang.ClassLoader.defineClass(ClassLoader.java:791)
>>     at java.security.SecureClassLoader.defineClass(SecureClassLoader.java:142)
>>     at java.net.URLClassLoader.defineClass(URLClassLoader.java:449)
>>     at java.net.URLClassLoader.access$100(URLClassLoader.java:71)
>>     at java.net.URLClassLoader$1.run(URLClassLoader.java:361)
>>     at java.net.URLClassLoader$1.run(URLClassLoader.java:355)
>>     at java.security.AccessController.doPrivileged(Native Method)
>>     at java.net.URLClassLoader.findClass(URLClassLoader.java:354)
>>     at java.lang.ClassLoader.loadClass(ClassLoader.java:423)
>>     at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:308)
>>     at java.lang.ClassLoader.loadClass(ClassLoader.java:356)
>>     at com.fasterxml.jackson.databind.ObjectMapper.<clinit>(ObjectMapper.java:191)
>>     at org.apache.spark.metrics.sink.MetricsServlet.<init>(MetricsServlet.scala:45)
>>     at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native         Method)
>>     at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:57)
>>     at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
>>     at java.lang.reflect.Constructor.newInstance(Constructor.java:525)
>>     at org.apache.spark.metrics.MetricsSystem$$anonfun$registerSinks$1.apply(MetricsSystem.scala:136)
>>     at org.apache.spark.metrics.MetricsSystem$$anonfun$registerSinks$1.apply(MetricsSystem.scala:130)
>>     at scala.collection.mutable.HashMap$$anonfun$foreach$1.apply(HashMap.scala:98)
>>     at scala.collection.mutable.HashMap$$anonfun$foreach$1.apply(HashMap.scala:98)
>>     at scala.collection.mutable.HashTable$class.foreachEntry(HashTable.scala:226)
>>     at scala.collection.mutable.HashMap.foreachEntry(HashMap.scala:39)
>>     at scala.collection.mutable.HashMap.foreach(HashMap.scala:98)
>>     at org.apache.spark.metrics.MetricsSystem.registerSinks(MetricsSystem.scala:130)
>>     at org.apache.spark.metrics.MetricsSystem.<init>(MetricsSystem.scala:84)
>>     at org.apache.spark.metrics.MetricsSystem$.createMetricsSystem(MetricsSystem.scala:167)
>>     at org.apache.spark.SparkEnv$.create(SparkEnv.scala:230)
>>     at org.apache.spark.SparkContext.<init>(SparkContext.scala:202)
>>     at d.s.f.s.t.SimpleTicketTextSimilaritySparkJobSpec$$anonfun$1.apply$mcV$sp(SimpleTicketTextSimilaritySparkJobSpec.scala:29)
>>     at d.s.f.s.t.SimpleTicketTextSimilaritySparkJobSpec$$anonfun$1.apply(SimpleTicketTextSimilaritySparkJobSpec.scala:21)
>> 
>> My SimpleTicketTextSimilaritySparkJobSpec is basically this:
>> 
>>     val sc = new SparkContext(sConf)    // <-- OOM occurs in this line
>>     @transient val hConf = HBaseConfiguration.create() // this is actually a little bit more elaborate, since I get the config from HBaseTestingUtility
>>     val fullInputTableName = TestClient.testClient + ":" + HBaseModel.Quantify.termDictionary
>>     hConf.set(TableInputFormat.INPUT_TABLE, fullInputTableName)
>>     val rdd = sc.newAPIHadoopRDD(hConf, classOf[TableInputFormat], classOf[ImmutableBytesWritable], classOf[Result])
>>     val rddCount = rdd.count()
>> 
>> Since the issue comes from the initialization of MetricsServlet, and because my Spark task is as primitive as it can be, I am wondering whether the issue is related to my replacing the servlet-api.
>> 
>> Thank you guys for your patience and valuable input!
>> reinis
>> 
>> On 12.09.2014 14:30, Aniket Bhatnagar wrote:
>>> Hi Reinis
>>> 
>>> Try whether the exclude suggestions from me and Sean work for you. If not, can you turn on verbose class loading to see where javax.servlet.ServletRegistration is loaded from? The class should load from "org.mortbay.jetty" % "servlet-api" % jettyVersion. If it loads from some other jar, you would have to exclude that jar from your build.
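>>> 
>>> A minimal sketch of turning on verbose class loading for tests forked by sbt (assuming sbt 0.13-style keys; -verbose:class is a standard JVM flag):
>>> 
>>>     fork in Test := true
>>>     javaOptions in Test += "-verbose:class"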
>>> 
>>> Hope it helps.
>>> 
>>> Thanks,
>>> Aniket
>>> 
>>> On 12 September 2014 02:21, <sp...@orbit-x.de> wrote:
>>>> Thank you, Aniket for your hint!
>>>> 
>>>> Alas, I am facing a really "hellish" situation, it seems, because I have integration tests using BOTH Spark and HBase (Minicluster). Thus I get either:
>>>> 
>>>> class "javax.servlet.ServletRegistration"'s signer information does not match signer information of other classes in the same package
>>>> java.lang.SecurityException: class "javax.servlet.ServletRegistration"'s signer information does not match signer information of other classes in the same package
>>>>     at java.lang.ClassLoader.checkCerts(ClassLoader.java:943)
>>>>     at java.lang.ClassLoader.preDefineClass(ClassLoader.java:657)
>>>>     at java.lang.ClassLoader.defineClass(ClassLoader.java:785)
>>>> 
>>>> or:
>>>> 
>>>> [info]   Cause: java.lang.ClassNotFoundException: org.mortbay.jetty.servlet.Context
>>>> [info]   at java.net.URLClassLoader$1.run(URLClassLoader.java:366)
>>>> [info]   at java.net.URLClassLoader$1.run(URLClassLoader.java:355)
>>>> [info]   at java.security.AccessController.doPrivileged(Native Method)
>>>> [info]   at java.net.URLClassLoader.findClass(URLClassLoader.java:354)
>>>> [info]   at java.lang.ClassLoader.loadClass(ClassLoader.java:423)
>>>> [info]   at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:308)
>>>> [info]   at java.lang.ClassLoader.loadClass(ClassLoader.java:356)
>>>> [info]   at org.apache.hadoop.hdfs.server.namenode.NameNode.startHttpServer(NameNode.java:661)
>>>> [info]   at org.apache.hadoop.hdfs.server.namenode.NameNode.initialize(NameNode.java:552)
>>>> [info]   at org.apache.hadoop.hdfs.server.namenode.NameNode.<init>(NameNode.java:720)
>>>> 
>>>> I have been searching the web for a week already, trying to figure out how to make this work :-/
>>>> 
>>>> Any help or hints are greatly appreciated.
>>>> reinis
>>>> 
>>>> 
>>>> -----Original Message-----
>>>> From: "Aniket Bhatnagar" <an...@gmail.com>
>>>> To: spark@orbit-x.de
>>>> Cc: user <us...@spark.apache.org>
>>>> Date: 11-09-2014 20:00
>>>> Subject: Re: Re[2]: HBase 0.96+ with Spark 1.0+
>>>> 
>>>> 
>>>> Dependency hell... My fav problem :).
>>>> 
>>>> I had run into a similar issue with HBase and Jetty. I can't remember the exact fix, but here are excerpts from my dependencies that may be relevant:
>>>> 
>>>> val hadoop2Common = "org.apache.hadoop" % "hadoop-common" % hadoop2Version excludeAll(
>>>>   ExclusionRule(organization = "javax.servlet"),
>>>>   ExclusionRule(organization = "javax.servlet.jsp"),
>>>>   ExclusionRule(organization = "org.mortbay.jetty")
>>>> )
>>>> 
>>>> val hadoop2MapRedClient = "org.apache.hadoop" % "hadoop-mapreduce-client-core" % hadoop2Version
>>>> 
>>>> val hbase = "org.apache.hbase" % "hbase" % hbaseVersion excludeAll(
>>>>   ExclusionRule(organization = "org.apache.maven.wagon"),
>>>>   ExclusionRule(organization = "org.jboss.netty"),
>>>>   ExclusionRule(organization = "org.mortbay.jetty"),
>>>>   ExclusionRule(organization = "org.jruby") // Don't need HBase's jruby. It pulls in a whole lot of other dependencies like joda-time.
>>>> )
>>>> 
>>>> val sparkCore = "org.apache.spark" %% "spark-core" % sparkVersion
>>>> val sparkStreaming = "org.apache.spark" %% "spark-streaming" % sparkVersion
>>>> val sparkSQL = "org.apache.spark" %% "spark-sql" % sparkVersion
>>>> val sparkHive = "org.apache.spark" %% "spark-hive" % sparkVersion
>>>> val sparkRepl = "org.apache.spark" %% "spark-repl" % sparkVersion
>>>> 
>>>> val sparkAll = Seq(
>>>>   sparkCore excludeAll(
>>>>     ExclusionRule(organization = "org.apache.hadoop")), // We assume Hadoop 2 and hence omit Hadoop 1 dependencies
>>>>   sparkSQL,
>>>>   sparkStreaming,
>>>>   hadoop2MapRedClient,
>>>>   hadoop2Common,
>>>>   "org.mortbay.jetty" % "servlet-api" % "3.0.20100224"
>>>> )
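>>>> 
>>>> A hedged sketch of wiring these into the build (libraryDependencies is the standard sbt key; the surrounding project definition is assumed):
>>>> 
>>>>     libraryDependencies ++= sparkAll ++ Seq(hbase)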
>>>> 
>>>> On Sep 11, 2014 8:05 PM, <sp...@orbit-x.de> wrote:
>>>>> Hi guys,
>>>>> 
>>>>> any luck with this issue, anyone?
>>>>> 
>>>>> I have also tried all the possible exclusion combos, to no avail.
>>>>> 
>>>>> thanks for your ideas
>>>>> reinis
>>>>> 
>>>>> -----Original Message-----
>>>>> > From: "Stephen Boesch" <ja...@gmail.com>
>>>>> > To: user <us...@spark.apache.org>
>>>>> > Date: 28-06-2014 15:12
>>>>> > Subject: Re: HBase 0.96+ with Spark 1.0+
>>>>> >
>>>>> > Hi Siyuan,
>>>>> Thanks for the input. We prefer to use SparkBuild.scala instead of Maven. I did not see any protobuf.version-related settings in that file. But - as noted by Sean Owen - in any case the issue we are facing presently is the duplicate, incompatible javax.servlet entries, apparently from the org.mortbay artifacts.
>>>>> 
>>>>> 
>>>>> >
>>>>> > 2014-06-28 6:01 GMT-07:00 Siyuan he <hs...@gmail.com>:
>>>>> > Hi Stephen,
>>>>> >
>>>>> I am using Spark 1.0 + HBase 0.96.2. This is what I did:
>>>>> 1) rebuild Spark using: mvn -Dhadoop.version=2.3.0 -Dprotobuf.version=2.5.0 -DskipTests clean package
>>>>> 2) In spark-env.sh, set SPARK_CLASSPATH=/path-to/hbase-protocol-0.96.2-hadoop2.jar 
>>>>> 
>>>>> >
>>>>> Hopefully it can help.
>>>>> Siyuan
>>>>> 
>>>>> 
>>>>> >
>>>>> > On Sat, Jun 28, 2014 at 8:52 AM, Stephen Boesch <ja...@gmail.com> wrote:
>>>>> > 
>>>>> >
>>>>> Thanks Sean. I had actually already added an exclusion rule for org.mortbay.jetty, and that had not resolved it.
>>>>> >
>>>>> Just in case, I used your precise formulation:
>>>>> 
>>>>> >
>>>>> val excludeMortbayJetty = ExclusionRule(organization = "org.mortbay.jetty")
>>>>> ..
>>>>> 
>>>>> ,("org.apache.spark" % "spark-core_2.10" % sparkVersion withSources()).excludeAll(excludeMortbayJetty)
>>>>> ,("org.apache.spark" % "spark-sql_2.10" % sparkVersion withSources()).excludeAll(excludeMortbayJetty)
>>>>> 
>>>>> >
>>>>> However the same error still recurs:
>>>>> 
>>>>> >
>>>>> 14/06/28 05:48:35 INFO HttpServer: Starting HTTP Server
>>>>> [error] (run-main-0) java.lang.SecurityException: class "javax.servlet.FilterRegistration"'s signer information does not match signer information of other classes in the same package
>>>>> java.lang.SecurityException: class "javax.servlet.FilterRegistration"'s signer information does not match signer information of other classes in the same package
>>>>> 
>>>>> 
>>>>> 
>>>>> >
>>>>> 
>>>>> >
>>>>> 
>>>>> >
>>>>> > 2014-06-28 4:22 GMT-07:00 Sean Owen <so...@cloudera.com>:
>>>>> 
>>>>> > This sounds like an instance of roughly the same item as in
>>>>> > https://issues.apache.org/jira/browse/SPARK-1949 Have a look at
>>>>> > adding that exclude to see if it works.
>>>>> >
>>>>> 
>>>>> > On Fri, Jun 27, 2014 at 10:21 PM, Stephen Boesch <ja...@gmail.com> wrote:
>>>>> > > The present trunk is built and tested against HBase 0.94.
>>>>> > >
>>>>> > >
>>>>> > > I have tried various combinations of versions of HBase 0.96+ and Spark 1.0+
>>>>> > > and all end up with
>>>>> > >
>>>>> > > 14/06/27 20:11:15 INFO HttpServer: Starting HTTP Server
>>>>> > > [error] (run-main-0) java.lang.SecurityException: class
>>>>> > > "javax.servlet.FilterRegistration"'s signer information does not match
>>>>> > > signer information of other classes in the same package
>>>>> > > java.lang.SecurityException: class "javax.servlet.FilterRegistration"'s
>>>>> > > signer information does not match signer information of other classes in the
>>>>> > > same package
>>>>> > > at java.lang.ClassLoader.checkCerts(ClassLoader.java:952)
>>>>> > >
>>>>> > >
>>>>> > > I have tried a number of different ways to exclude javax.servlet related
>>>>> > > jars. But none have avoided this error.
>>>>> > >
>>>>> > > Anyone have a (small-ish) build.sbt that works with later versions of HBase?
>>>>> > >
>>>>> > >
>>>>> > 
>>>>> 
>>>>> 
>>>>> > 
>>>>> 
>>>>> 
>>>>> > 
>>>>> 
>>>>> > 
>>>>> 
>>>>> 
>>>>> 
>>>>> 
>>>>> ---------------------------------------------------------------------
>>>>> To unsubscribe, e-mail: user-unsubscribe@spark.apache.org
>>>>> For additional commands, e-mail: user-help@spark.apache.org

Re: HBase 0.96+ with Spark 1.0+

Posted by Reinis Vicups <sp...@orbit-x.de>.
I am humbly bumping this since even after another week of trying I 
haven't had luck to fix this yet.

On 14.09.2014 19:21, Reinis Vicups wrote:
> I did actually try Seans suggestion just before I posted for the first 
> time in this thread. I got an error when doing this and thought that I 
> am not understanding what Sean was suggesting.
>
> Now I re-attempted your suggestions with spark 1.0.0-cdh5.1.0, hbase 
> 0.98.1-cdh5.1.0 and hadoop 2.3.0-cdh5.1.0 I am currently using.
>
> I used following:
>
>   val mortbayEnforce = "org.mortbay.jetty" % "servlet-api" % 
> "3.0.20100224"
>   val mortbayExclusion = ExclusionRule(organization = 
> "org.mortbay.jetty", name = "servlet-api-2.5")
>
> and applied this to hadoop and hbase dependencies e.g. like this:
>
> val hbase = Seq(HBase.server, HBase.common, HBase.compat, 
> HBase.compat2, HBase.protocol, 
> mortbayEnforce).map(_.excludeAll(HBase.exclusions: _*))
>
> private object HBase {
>     val server = "org.apache.hbase"  % "hbase-server" % Version.HBase
>     ...
>     val exclusions = Seq(ExclusionRule("org.apache.ant"), 
> mortbayExclusion)
> }
>
> I still get the error I got the last time I tried this experiment:
>
> 14/09/14 18:28:09 ERROR metrics.MetricsSystem: Sink class 
> org.apache.spark.metrics.sink.MetricsServlet cannot be instantialized
> java.lang.reflect.InvocationTargetException
>     at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native 
> Method)
>     at 
> sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:57)
>     at 
> sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
>     at java.lang.reflect.Constructor.newInstance(Constructor.java:525)
>     at 
> org.apache.spark.metrics.MetricsSystem$$anonfun$registerSinks$1.apply(MetricsSystem.scala:136)
>     at 
> org.apache.spark.metrics.MetricsSystem$$anonfun$registerSinks$1.apply(MetricsSystem.scala:130)
>     at 
> scala.collection.mutable.HashMap$$anonfun$foreach$1.apply(HashMap.scala:98)
>     at 
> scala.collection.mutable.HashMap$$anonfun$foreach$1.apply(HashMap.scala:98)
>     at 
> scala.collection.mutable.HashTable$class.foreachEntry(HashTable.scala:226)
>     at scala.collection.mutable.HashMap.foreachEntry(HashMap.scala:39)
>     at scala.collection.mutable.HashMap.foreach(HashMap.scala:98)
>     at 
> org.apache.spark.metrics.MetricsSystem.registerSinks(MetricsSystem.scala:130)
>     at 
> org.apache.spark.metrics.MetricsSystem.<init>(MetricsSystem.scala:84)
>     at 
> org.apache.spark.metrics.MetricsSystem$.createMetricsSystem(MetricsSystem.scala:167)
>     at org.apache.spark.SparkEnv$.create(SparkEnv.scala:230)
>     at org.apache.spark.SparkContext.<init>(SparkContext.scala:202)
>     at 
> d.s.f.s.t.SimpleTicketTextSimilaritySparkJobSpec$$anonfun$1.apply$mcV$sp(SimpleTicketTextSimilaritySparkJobSpec.scala:29)
>     at 
> d.s.f.s.t.SimpleTicketTextSimilaritySparkJobSpec$$anonfun$1.apply(SimpleTicketTextSimilaritySparkJobSpec.scala:21)
>     at 
> d.s.f.s.t.SimpleTicketTextSimilaritySparkJobSpec$$anonfun$1.apply(SimpleTicketTextSimilaritySparkJobSpec.scala:21)
>     at 
> org.scalatest.Transformer$$anonfun$apply$1.apply$mcV$sp(Transformer.scala:22)
>     at org.scalatest.OutcomeOf$class.outcomeOf(OutcomeOf.scala:85)
>     at org.scalatest.OutcomeOf$.outcomeOf(OutcomeOf.scala:104)
>     at org.scalatest.Transformer.apply(Transformer.scala:22)
>     at org.scalatest.Transformer.apply(Transformer.scala:20)
>     at org.scalatest.FlatSpecLike$$anon$1.apply(FlatSpecLike.scala:1647)
>     at org.scalatest.Suite$class.withFixture(Suite.scala:1122)
>     at org.scalatest.FlatSpec.withFixture(FlatSpec.scala:1683)
>     at 
> org.scalatest.FlatSpecLike$class.invokeWithFixture$1(FlatSpecLike.scala:1644)
>     at 
> org.scalatest.FlatSpecLike$$anonfun$runTest$1.apply(FlatSpecLike.scala:1656)
>     at 
> org.scalatest.FlatSpecLike$$anonfun$runTest$1.apply(FlatSpecLike.scala:1656)
>     at org.scalatest.SuperEngine.runTestImpl(Engine.scala:306)
>     at org.scalatest.FlatSpecLike$class.runTest(FlatSpecLike.scala:1656)
>     at org.scalatest.FlatSpec.runTest(FlatSpec.scala:1683)
>     at 
> org.scalatest.FlatSpecLike$$anonfun$runTests$1.apply(FlatSpecLike.scala:1714)
>     at 
> org.scalatest.FlatSpecLike$$anonfun$runTests$1.apply(FlatSpecLike.scala:1714)
>     at 
> org.scalatest.SuperEngine$$anonfun$traverseSubNodes$1$1.apply(Engine.scala:413)
>     at 
> org.scalatest.SuperEngine$$anonfun$traverseSubNodes$1$1.apply(Engine.scala:401)
>     at scala.collection.immutable.List.foreach(List.scala:318)
>     at org.scalatest.SuperEngine.traverseSubNodes$1(Engine.scala:401)
>     at 
> org.scalatest.SuperEngine.org$scalatest$SuperEngine$$runTestsInBranch(Engine.scala:396)
>     at org.scalatest.SuperEngine.runTestsImpl(Engine.scala:483)
>     at org.scalatest.FlatSpecLike$class.runTests(FlatSpecLike.scala:1714)
>     at org.scalatest.FlatSpec.runTests(FlatSpec.scala:1683)
>     at org.scalatest.Suite$class.run(Suite.scala:1424)
>     at 
> org.scalatest.FlatSpec.org$scalatest$FlatSpecLike$$super$run(FlatSpec.scala:1683)
>     at 
> org.scalatest.FlatSpecLike$$anonfun$run$1.apply(FlatSpecLike.scala:1760)
>     at 
> org.scalatest.FlatSpecLike$$anonfun$run$1.apply(FlatSpecLike.scala:1760)
>     at org.scalatest.SuperEngine.runImpl(Engine.scala:545)
>     at org.scalatest.FlatSpecLike$class.run(FlatSpecLike.scala:1760)
>     at 
> d.s.f.s.t.SimpleTicketTextSimilaritySparkJobSpec.org$scalatest$BeforeAndAfterAll$$super$run(SimpleTicketTextSimilaritySparkJobSpec.scala:12)
>     at 
> org.scalatest.BeforeAndAfterAll$class.liftedTree1$1(BeforeAndAfterAll.scala:257)
>     at 
> org.scalatest.BeforeAndAfterAll$class.run(BeforeAndAfterAll.scala:256)
>     at 
> d.s.f.s.t.SimpleTicketTextSimilaritySparkJobSpec.run(SimpleTicketTextSimilaritySparkJobSpec.scala:12)
>     at 
> org.scalatest.tools.Framework.org$scalatest$tools$Framework$$runSuite(Framework.scala:462)
>     at 
> org.scalatest.tools.Framework$ScalaTestTask.execute(Framework.scala:671)
>     at sbt.ForkMain$Run$2.call(ForkMain.java:294)
>     at sbt.ForkMain$Run$2.call(ForkMain.java:284)
>     at java.util.concurrent.FutureTask$Sync.innerRun(FutureTask.java:334)
>     at java.util.concurrent.FutureTask.run(FutureTask.java:166)
>     at 
> java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1145)
>     at 
> java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:615)
>     at java.lang.Thread.run(Thread.java:722)
> Caused by: java.lang.OutOfMemoryError: PermGen space
>     at java.lang.ClassLoader.defineClass1(Native Method)
>     at java.lang.ClassLoader.defineClass(ClassLoader.java:791)
>     at 
> java.security.SecureClassLoader.defineClass(SecureClassLoader.java:142)
>     at java.net.URLClassLoader.defineClass(URLClassLoader.java:449)
>     at java.net.URLClassLoader.access$100(URLClassLoader.java:71)
>     at java.net.URLClassLoader$1.run(URLClassLoader.java:361)
>     at java.net.URLClassLoader$1.run(URLClassLoader.java:355)
>     at java.security.AccessController.doPrivileged(Native Method)
>     at java.net.URLClassLoader.findClass(URLClassLoader.java:354)
>     at java.lang.ClassLoader.loadClass(ClassLoader.java:423)
>     at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:308)
>     at java.lang.ClassLoader.loadClass(ClassLoader.java:356)
>     at 
> com.fasterxml.jackson.databind.ObjectMapper.<clinit>(ObjectMapper.java:191)
>     at 
> org.apache.spark.metrics.sink.MetricsServlet.<init>(MetricsServlet.scala:45)
>     at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native 
> Method)
>     at 
> sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:57)
>     at 
> sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
>     at java.lang.reflect.Constructor.newInstance(Constructor.java:525)
>     at 
> org.apache.spark.metrics.MetricsSystem$$anonfun$registerSinks$1.apply(MetricsSystem.scala:136)
>     at 
> org.apache.spark.metrics.MetricsSystem$$anonfun$registerSinks$1.apply(MetricsSystem.scala:130)
>     at 
> scala.collection.mutable.HashMap$$anonfun$foreach$1.apply(HashMap.scala:98)
>     at 
> scala.collection.mutable.HashMap$$anonfun$foreach$1.apply(HashMap.scala:98)
>     at 
> scala.collection.mutable.HashTable$class.foreachEntry(HashTable.scala:226)
>     at scala.collection.mutable.HashMap.foreachEntry(HashMap.scala:39)
>     at scala.collection.mutable.HashMap.foreach(HashMap.scala:98)
>     at 
> org.apache.spark.metrics.MetricsSystem.registerSinks(MetricsSystem.scala:130)
>     at 
> org.apache.spark.metrics.MetricsSystem.<init>(MetricsSystem.scala:84)
>     at 
> org.apache.spark.metrics.MetricsSystem$.createMetricsSystem(MetricsSystem.scala:167)
>     at org.apache.spark.SparkEnv$.create(SparkEnv.scala:230)
>     at org.apache.spark.SparkContext.<init>(SparkContext.scala:202)
>     at 
> d.s.f.s.t.SimpleTicketTextSimilaritySparkJobSpec$$anonfun$1.apply$mcV$sp(SimpleTicketTextSimilaritySparkJobSpec.scala:29)
>     at 
> d.s.f.s.t.SimpleTicketTextSimilaritySparkJobSpec$$anonfun$1.apply(SimpleTicketTextSimilaritySparkJobSpec.scala:21)
>
> My SimpleTicketTextSimilaritySparkJobSpec is basically this:
>
>     val sc = new SparkContext(sConf)    // <-- OOM occurs in this line
>     @transient val hConf = HBaseConfiguration.create() // this is 
> actually a little bit more elaborated since I get config from 
> HBaseTestingUtility
>     val fullInputTableName = TestClient.testClient + ":" + 
> HBaseModel.Quantify.termDictionary
>     hConf.set(TableInputFormat.INPUT_TABLE, fullInputTableName)
>     val rdd = sc.newAPIHadoopRDD(hConf, classOf[TableInputFormat], 
> classOf[ImmutableBytesWritable], classOf[Result])
>     val rddCount = rdd.count()
>
> Since the issue comes from initializing of MetricsServlet and because 
> my spark task is as primitive as it can be, I am considering if the 
> issue is related to me replacing the servlet-api?
>
> Thank you guys for your patience and valuable input!
> reinis
>
> On 12.09.2014 14:30, Aniket Bhatnagar wrote:
>> Hi Reinis
>>
>> Try if the exclude suggestion from me and Sean works for you. If not, 
>> can you turn on verbose class loading to see from where 
>> javax.servlet.ServletRegistration is loaded? The class should load 
>> from "org.mortbay.jetty" % "servlet-api" % jettyVersion. If it loads 
>> from some other jar, you would have to exclude it from your build.
>>
>> Hope it helps.
>>
>> Thanks,
>> Aniket
>>
>> On 12 September 2014 02:21, <spark@orbit-x.de 
>> <ma...@orbit-x.de>> wrote:
>>
>>     Thank you, Aniket for your hint!
>>
>>     Alas, I am facing really "hellish" situation as it seems, because
>>     I have integration tests using BOTH spark and HBase
>>     (Minicluster). Thus I get either:
>>
>>     class "javax.servlet.ServletRegistration"'s signer information
>>     does not match signer information of other classes in the same
>>     package
>>     java.lang.SecurityException: class
>>     "javax.servlet.ServletRegistration"'s signer information does not
>>     match signer information of other classes in the same package
>>         at java.lang.ClassLoader.checkCerts(ClassLoader.java:943)
>>         at java.lang.ClassLoader.preDefineClass(ClassLoader.java:657)
>>         at java.lang.ClassLoader.defineClass(ClassLoader.java:785)
>>
>>     or:
>>
>>     [info]   Cause: java.lang.ClassNotFoundException:
>>     org.mortbay.jetty.servlet.Context
>>     [info]   at java.net.URLClassLoader$1.run(URLClassLoader.java:366)
>>     [info]   at java.net.URLClassLoader$1.run(URLClassLoader.java:355)
>>     [info]   at java.security.AccessController.doPrivileged(Native
>>     Method)
>>     [info]   at
>>     java.net.URLClassLoader.findClass(URLClassLoader.java:354)
>>     [info]   at java.lang.ClassLoader.loadClass(ClassLoader.java:423)
>>     [info]   at
>>     sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:308)
>>     [info]   at java.lang.ClassLoader.loadClass(ClassLoader.java:356)
>>     [info]   at
>>     org.apache.hadoop.hdfs.server.namenode.NameNode.startHttpServer(NameNode.java:661)
>>     [info]   at
>>     org.apache.hadoop.hdfs.server.namenode.NameNode.initialize(NameNode.java:552)
>>     [info]   at
>>     org.apache.hadoop.hdfs.server.namenode.NameNode.<init>(NameNode.java:720)
>>
>>     I am searching the web already for a week trying to figure out
>>     how to make this work :-/
>>
>>     all the help or hints are greatly appreciated
>>     reinis
>>
>>
>>         ------------------------------------------------------------------------
>>         -----Original-Nachricht-----
>>         Von: "Aniket Bhatnagar" <aniket.bhatnagar@gmail.com
>>         <ma...@gmail.com>>
>>         An: spark@orbit-x.de <ma...@orbit-x.de>
>>         Cc: user <user@spark.apache.org <ma...@spark.apache.org>>
>>         Datum: 11-09-2014 20:00
>>         Betreff: Re: Re[2]: HBase 0.96+ with Spark 1.0+
>>
>>
>>         Dependency hell... My fav problem :).
>>
>>         I had run into a similar issue with hbase and jetty. I cant
>>         remember thw exact fix, but is are excerpts from my
>>         dependencies that may be relevant:
>>
>>         val hadoop2Common = "org.apache.hadoop" % "hadoop-common" %
>>         hadoop2Version excludeAll(
>>
>>           ExclusionRule(organization = "javax.servlet"),
>>
>>           ExclusionRule(organization = "javax.servlet.jsp"),
>>
>>         ExclusionRule(organization = "org.mortbay.jetty")
>>
>>           )
>>
>>           val hadoop2MapRedClient = "org.apache.hadoop" %
>>         "hadoop-mapreduce-client-core" % hadoop2Version
>>
>>           val hbase = "org.apache.hbase" % "hbase" % hbaseVersion
>>         excludeAll(
>>
>>           ExclusionRule(organization = "org.apache.maven.wagon"),
>>
>>           ExclusionRule(organization = "org.jboss.netty"),
>>
>>         ExclusionRule(organization = "org.mortbay.jetty"),
>>
>>           ExclusionRule(organization = "org.jruby") // Don't need
>>         HBASE's jruby. It pulls in whole lot of other dependencies
>>         like joda-time.
>>
>>           )
>>
>>         val sparkCore = "org.apache.spark" %% "spark-core" % sparkVersion
>>
>>           val sparkStreaming = "org.apache.spark" %%
>>         "spark-streaming" % sparkVersion
>>
>>           val sparkSQL = "org.apache.spark" %% "spark-sql" % sparkVersion
>>
>>           val sparkHive = "org.apache.spark" %% "spark-hive" %
>>         sparkVersion
>>
>>           val sparkRepl = "org.apache.spark" %% "spark-repl" %
>>         sparkVersion
>>
>>           val sparkAll = Seq (
>>
>>           sparkCore excludeAll(
>>
>>           ExclusionRule(organization = "org.apache.hadoop")), // We
>>         assume hadoop 2 and hence omit hadoop 1 dependencies
>>
>>           sparkSQL,
>>
>>           sparkStreaming,
>>
>>           hadoop2MapRedClient,
>>
>>           hadoop2Common,
>>
>>           "org.mortbay.jetty" % "servlet-api" % "3.0.20100224"
>>
>>           )
>>
>>         On Sep 11, 2014 8:05 PM, <spark@orbit-x.de
>>         <ma...@orbit-x.de>> wrote:
>>
>>             Hi guys,
>>
>>             any luck with this issue, anyone?
>>
>>             I aswell tried all the possible exclusion combos to a no
>>             avail.
>>
>>             thanks for your ideas
>>             reinis
>>
>>             -----Original-Nachricht-----
>>             > Von: "Stephen Boesch" <javadba@gmail.com
>>             <ma...@gmail.com>>
>>             > An: user <user@spark.apache.org
>>             <ma...@spark.apache.org>>
>>             > Datum: 28-06-2014 15:12
>>             > Betreff: Re: HBase 0.96+ with Spark 1.0+
>>             >
>>             > Hi Siyuan,
>>             Thanks for the input. We are preferring to use the
>>             SparkBuild.scala instead of maven. I did not see any
>>             protobuf.version related settings in that file. But - as
>>             noted by Sean Owen - in any case the issue we are facing
>>             presently is about the duplicate incompatible
>>             javax.servlet entries - apparently from the org.mortbay
>>             artifacts.
>>
>>
>>             >
>>             > 2014-06-28 6:01 GMT-07:00 Siyuan he <hsy008@gmail.com
>>             <ma...@gmail.com>>:
>>             > Hi Stephen,
>>             >
>>             I am using spark1.0+ HBase0.96.2. This is what I did:
>>             1) rebuild spark using: mvn -Dhadoop.version=2.3.0
>>             -Dprotobuf.version=2.5.0 -DskipTests clean package
>>             2) In spark-env.sh, set SPARK_CLASSPATH =
>>             /path-to/hbase-protocol-0.96.2-hadoop2.jar
>>
>>             >
>>             Hopefully it can help.
>>             Siyuan
>>
>>
>>             >
>>             > On Sat, Jun 28, 2014 at 8:52 AM, Stephen Boesch
>>             <javadba@gmail.com <ma...@gmail.com>> wrote:
>>             >
>>             >
>>             Thanks Sean. I had actually already added exclusion rule
>>             for org.mortbay.jetty - and that had not resolved it.
>>             >
>>             Just in case I used your precise formulation:
>>
>>             >
>>             val excludeMortbayJetty = ExclusionRule(organization =
>>             "org.mortbay.jetty")
>>             ..
>>
>>             ,("org.apache.spark" % "spark-core_2.10" % sparkVersion
>>             withSources()).excludeAll(excludeMortbayJetty)
>>             ,("org.apache.spark" % "spark-sql_2.10" % sparkVersion
>>             withSources()).excludeAll(excludeMortbayJetty)
>>
>>             >
>>             However the same error still recurs:
>>
>>             >
>>             14/06/28 05:48:35 INFO HttpServer: Starting HTTP Server
>>             [error] (run-main-0) java.lang.SecurityException: class
>>             "javax.servlet.FilterRegistration"'s signer information
>>             does not match signer information of other classes in the
>>             same package
>>             java.lang.SecurityException: class
>>             "javax.servlet.FilterRegistration"'s signer information
>>             does not match signer information of other classes in the
>>             same package
>>
>>
>>
>>             >
>>
>>             >
>>
>>             >
>>             > 2014-06-28 4:22 GMT-07:00 Sean Owen <sowen@cloudera.com
>>             <ma...@cloudera.com>>:
>>
>>             > This sounds like an instance of roughly the same item as in
>>             > https://issues.apache.org/jira/browse/SPARK-1949 Have a
>>             look at
>>             > adding that exclude to see if it works.
>>             >
>>
>>             > On Fri, Jun 27, 2014 at 10:21 PM, Stephen Boesch
>>             <javadba@gmail.com <ma...@gmail.com>> wrote:
>>             > > The present trunk is built and tested against HBase 0.94.
>>             > >
>>             > >
>>             > > I have tried various combinations of versions of
>>             HBase 0.96+ and Spark 1.0+
>>             > > and all end up with
>>             > >
>>             > > 14/06/27 20:11:15 INFO HttpServer: Starting HTTP Server
>>             > > [error] (run-main-0) java.lang.SecurityException: class
>>             > > "javax.servlet.FilterRegistration"'s signer
>>             information does not match
>>             > > signer information of other classes in the same package
>>             > > java.lang.SecurityException: class
>>             "javax.servlet.FilterRegistration"'s
>>             > > signer information does not match signer information
>>             of other classes in the
>>             > > same package
>>             > > at java.lang.ClassLoader.checkCerts(ClassLoader.java:952)
>>             > >
>>             > >
>>             > > I have tried a number of different ways to exclude
>>             javax.servlet related
>>             > > jars. But none have avoided this error.
>>             > >
>>             > > Anyone have a (small-ish) build.sbt that works with
>>             later versions of HBase?
>>             > >
>>             > >
>>             >
>>
>>
>>             >
>>
>>
>>             >
>>
>>             >
>>
>>
>>
>>
>>             ---------------------------------------------------------------------
>>             To unsubscribe, e-mail: user-unsubscribe@spark.apache.org
>>             <ma...@spark.apache.org>
>>             For additional commands, e-mail:
>>             user-help@spark.apache.org
>>             <ma...@spark.apache.org>
>>
>>
>

Re: HBase 0.96+ with Spark 1.0+

Posted by Reinis Vicups <sp...@orbit-x.de>.
I did actually try Seans suggestion just before I posted for the first 
time in this thread. I got an error when doing this and thought that I 
am not understanding what Sean was suggesting.

Now I re-attempted your suggestions with spark 1.0.0-cdh5.1.0, hbase 
0.98.1-cdh5.1.0 and hadoop 2.3.0-cdh5.1.0 I am currently using.

I used following:

   val mortbayEnforce = "org.mortbay.jetty" % "servlet-api" % "3.0.20100224"
   val mortbayExclusion = ExclusionRule(organization = 
"org.mortbay.jetty", name = "servlet-api-2.5")

and applied this to hadoop and hbase dependencies e.g. like this:

val hbase = Seq(HBase.server, HBase.common, HBase.compat, HBase.compat2, 
HBase.protocol, mortbayEnforce).map(_.excludeAll(HBase.exclusions: _*))

private object HBase {
     val server = "org.apache.hbase"  % "hbase-server" % Version.HBase
     ...
     val exclusions = Seq(ExclusionRule("org.apache.ant"), mortbayExclusion)
}

I still get the error I got the last time I tried this experiment:

14/09/14 18:28:09 ERROR metrics.MetricsSystem: Sink class 
org.apache.spark.metrics.sink.MetricsServlet cannot be instantialized
java.lang.reflect.InvocationTargetException
     at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native 
Method)
     at 
sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:57)
     at 
sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
     at java.lang.reflect.Constructor.newInstance(Constructor.java:525)
     at 
org.apache.spark.metrics.MetricsSystem$$anonfun$registerSinks$1.apply(MetricsSystem.scala:136)
     at 
org.apache.spark.metrics.MetricsSystem$$anonfun$registerSinks$1.apply(MetricsSystem.scala:130)
     at 
scala.collection.mutable.HashMap$$anonfun$foreach$1.apply(HashMap.scala:98)
     at 
scala.collection.mutable.HashMap$$anonfun$foreach$1.apply(HashMap.scala:98)
     at 
scala.collection.mutable.HashTable$class.foreachEntry(HashTable.scala:226)
     at scala.collection.mutable.HashMap.foreachEntry(HashMap.scala:39)
     at scala.collection.mutable.HashMap.foreach(HashMap.scala:98)
     at 
org.apache.spark.metrics.MetricsSystem.registerSinks(MetricsSystem.scala:130)
     at 
org.apache.spark.metrics.MetricsSystem.<init>(MetricsSystem.scala:84)
     at 
org.apache.spark.metrics.MetricsSystem$.createMetricsSystem(MetricsSystem.scala:167)
     at org.apache.spark.SparkEnv$.create(SparkEnv.scala:230)
     at org.apache.spark.SparkContext.<init>(SparkContext.scala:202)
     at 
d.s.f.s.t.SimpleTicketTextSimilaritySparkJobSpec$$anonfun$1.apply$mcV$sp(SimpleTicketTextSimilaritySparkJobSpec.scala:29)
     at 
d.s.f.s.t.SimpleTicketTextSimilaritySparkJobSpec$$anonfun$1.apply(SimpleTicketTextSimilaritySparkJobSpec.scala:21)
     at 
d.s.f.s.t.SimpleTicketTextSimilaritySparkJobSpec$$anonfun$1.apply(SimpleTicketTextSimilaritySparkJobSpec.scala:21)
     at 
org.scalatest.Transformer$$anonfun$apply$1.apply$mcV$sp(Transformer.scala:22)
     at org.scalatest.OutcomeOf$class.outcomeOf(OutcomeOf.scala:85)
     at org.scalatest.OutcomeOf$.outcomeOf(OutcomeOf.scala:104)
     at org.scalatest.Transformer.apply(Transformer.scala:22)
     at org.scalatest.Transformer.apply(Transformer.scala:20)
     at org.scalatest.FlatSpecLike$$anon$1.apply(FlatSpecLike.scala:1647)
     at org.scalatest.Suite$class.withFixture(Suite.scala:1122)
     at org.scalatest.FlatSpec.withFixture(FlatSpec.scala:1683)
     at 
org.scalatest.FlatSpecLike$class.invokeWithFixture$1(FlatSpecLike.scala:1644)
     at 
org.scalatest.FlatSpecLike$$anonfun$runTest$1.apply(FlatSpecLike.scala:1656)
     at 
org.scalatest.FlatSpecLike$$anonfun$runTest$1.apply(FlatSpecLike.scala:1656)
     at org.scalatest.SuperEngine.runTestImpl(Engine.scala:306)
     at org.scalatest.FlatSpecLike$class.runTest(FlatSpecLike.scala:1656)
     at org.scalatest.FlatSpec.runTest(FlatSpec.scala:1683)
     at 
org.scalatest.FlatSpecLike$$anonfun$runTests$1.apply(FlatSpecLike.scala:1714)
     at 
org.scalatest.FlatSpecLike$$anonfun$runTests$1.apply(FlatSpecLike.scala:1714)
     at 
org.scalatest.SuperEngine$$anonfun$traverseSubNodes$1$1.apply(Engine.scala:413)
     at 
org.scalatest.SuperEngine$$anonfun$traverseSubNodes$1$1.apply(Engine.scala:401)
     at scala.collection.immutable.List.foreach(List.scala:318)
     at org.scalatest.SuperEngine.traverseSubNodes$1(Engine.scala:401)
     at 
org.scalatest.SuperEngine.org$scalatest$SuperEngine$$runTestsInBranch(Engine.scala:396)
     at org.scalatest.SuperEngine.runTestsImpl(Engine.scala:483)
     at org.scalatest.FlatSpecLike$class.runTests(FlatSpecLike.scala:1714)
     at org.scalatest.FlatSpec.runTests(FlatSpec.scala:1683)
     at org.scalatest.Suite$class.run(Suite.scala:1424)
     at 
org.scalatest.FlatSpec.org$scalatest$FlatSpecLike$$super$run(FlatSpec.scala:1683)
     at 
org.scalatest.FlatSpecLike$$anonfun$run$1.apply(FlatSpecLike.scala:1760)
     at 
org.scalatest.FlatSpecLike$$anonfun$run$1.apply(FlatSpecLike.scala:1760)
     at org.scalatest.SuperEngine.runImpl(Engine.scala:545)
     at org.scalatest.FlatSpecLike$class.run(FlatSpecLike.scala:1760)
     at 
d.s.f.s.t.SimpleTicketTextSimilaritySparkJobSpec.org$scalatest$BeforeAndAfterAll$$super$run(SimpleTicketTextSimilaritySparkJobSpec.scala:12)
     at 
org.scalatest.BeforeAndAfterAll$class.liftedTree1$1(BeforeAndAfterAll.scala:257)
     at 
org.scalatest.BeforeAndAfterAll$class.run(BeforeAndAfterAll.scala:256)
     at 
d.s.f.s.t.SimpleTicketTextSimilaritySparkJobSpec.run(SimpleTicketTextSimilaritySparkJobSpec.scala:12)
     at 
org.scalatest.tools.Framework.org$scalatest$tools$Framework$$runSuite(Framework.scala:462)
     at 
org.scalatest.tools.Framework$ScalaTestTask.execute(Framework.scala:671)
     at sbt.ForkMain$Run$2.call(ForkMain.java:294)
     at sbt.ForkMain$Run$2.call(ForkMain.java:284)
     at java.util.concurrent.FutureTask$Sync.innerRun(FutureTask.java:334)
     at java.util.concurrent.FutureTask.run(FutureTask.java:166)
     at 
java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1145)
     at 
java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:615)
     at java.lang.Thread.run(Thread.java:722)
Caused by: java.lang.OutOfMemoryError: PermGen space
     at java.lang.ClassLoader.defineClass1(Native Method)
     at java.lang.ClassLoader.defineClass(ClassLoader.java:791)
     at 
java.security.SecureClassLoader.defineClass(SecureClassLoader.java:142)
     at java.net.URLClassLoader.defineClass(URLClassLoader.java:449)
     at java.net.URLClassLoader.access$100(URLClassLoader.java:71)
     at java.net.URLClassLoader$1.run(URLClassLoader.java:361)
     at java.net.URLClassLoader$1.run(URLClassLoader.java:355)
     at java.security.AccessController.doPrivileged(Native Method)
     at java.net.URLClassLoader.findClass(URLClassLoader.java:354)
     at java.lang.ClassLoader.loadClass(ClassLoader.java:423)
     at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:308)
     at java.lang.ClassLoader.loadClass(ClassLoader.java:356)
     at 
com.fasterxml.jackson.databind.ObjectMapper.<clinit>(ObjectMapper.java:191)
     at 
org.apache.spark.metrics.sink.MetricsServlet.<init>(MetricsServlet.scala:45)
     at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native 
Method)
     at 
sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:57)
     at 
sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
     at java.lang.reflect.Constructor.newInstance(Constructor.java:525)
     at 
org.apache.spark.metrics.MetricsSystem$$anonfun$registerSinks$1.apply(MetricsSystem.scala:136)
     at 
org.apache.spark.metrics.MetricsSystem$$anonfun$registerSinks$1.apply(MetricsSystem.scala:130)
     at 
scala.collection.mutable.HashMap$$anonfun$foreach$1.apply(HashMap.scala:98)
     at 
scala.collection.mutable.HashMap$$anonfun$foreach$1.apply(HashMap.scala:98)
     at 
scala.collection.mutable.HashTable$class.foreachEntry(HashTable.scala:226)
     at scala.collection.mutable.HashMap.foreachEntry(HashMap.scala:39)
     at scala.collection.mutable.HashMap.foreach(HashMap.scala:98)
     at 
org.apache.spark.metrics.MetricsSystem.registerSinks(MetricsSystem.scala:130)
     at 
org.apache.spark.metrics.MetricsSystem.<init>(MetricsSystem.scala:84)
     at 
org.apache.spark.metrics.MetricsSystem$.createMetricsSystem(MetricsSystem.scala:167)
     at org.apache.spark.SparkEnv$.create(SparkEnv.scala:230)
     at org.apache.spark.SparkContext.<init>(SparkContext.scala:202)
     at 
d.s.f.s.t.SimpleTicketTextSimilaritySparkJobSpec$$anonfun$1.apply$mcV$sp(SimpleTicketTextSimilaritySparkJobSpec.scala:29)
     at 
d.s.f.s.t.SimpleTicketTextSimilaritySparkJobSpec$$anonfun$1.apply(SimpleTicketTextSimilaritySparkJobSpec.scala:21)

My SimpleTicketTextSimilaritySparkJobSpec is basically this:

     val sc = new SparkContext(sConf)    // <-- OOM occurs in this line
     @transient val hConf = HBaseConfiguration.create() // this is 
actually a little bit more elaborated since I get config from 
HBaseTestingUtility
     val fullInputTableName = TestClient.testClient + ":" + 
HBaseModel.Quantify.termDictionary
     hConf.set(TableInputFormat.INPUT_TABLE, fullInputTableName)
     val rdd = sc.newAPIHadoopRDD(hConf, classOf[TableInputFormat], 
classOf[ImmutableBytesWritable], classOf[Result])
     val rddCount = rdd.count()

Since the issue comes from initializing of MetricsServlet and because my 
spark task is as primitive as it can be, I am considering if the issue 
is related to me replacing the servlet-api?

Thank you guys for your patience and valuable input!
reinis

On 12.09.2014 14:30, Aniket Bhatnagar wrote:
> Hi Reinis
>
> Try if the exclude suggestion from me and Sean works for you. If not, 
> can you turn on verbose class loading to see from where 
> javax.servlet.ServletRegistration is loaded? The class should load 
> from "org.mortbay.jetty" % "servlet-api" % jettyVersion. If it loads 
> from some other jar, you would have to exclude it from your build.
>
> Hope it helps.
>
> Thanks,
> Aniket
>
> On 12 September 2014 02:21, <spark@orbit-x.de 
> <ma...@orbit-x.de>> wrote:
>
>     Thank you, Aniket for your hint!
>
>     Alas, I am facing really "hellish" situation as it seems, because
>     I have integration tests using BOTH spark and HBase (Minicluster).
>     Thus I get either:
>
>     class "javax.servlet.ServletRegistration"'s signer information
>     does not match signer information of other classes in the same package
>     java.lang.SecurityException: class
>     "javax.servlet.ServletRegistration"'s signer information does not
>     match signer information of other classes in the same package
>         at java.lang.ClassLoader.checkCerts(ClassLoader.java:943)
>         at java.lang.ClassLoader.preDefineClass(ClassLoader.java:657)
>         at java.lang.ClassLoader.defineClass(ClassLoader.java:785)
>
>     or:
>
>     [info]   Cause: java.lang.ClassNotFoundException:
>     org.mortbay.jetty.servlet.Context
>     [info]   at java.net.URLClassLoader$1.run(URLClassLoader.java:366)
>     [info]   at java.net.URLClassLoader$1.run(URLClassLoader.java:355)
>     [info]   at java.security.AccessController.doPrivileged(Native Method)
>     [info]   at java.net.URLClassLoader.findClass(URLClassLoader.java:354)
>     [info]   at java.lang.ClassLoader.loadClass(ClassLoader.java:423)
>     [info]   at
>     sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:308)
>     [info]   at java.lang.ClassLoader.loadClass(ClassLoader.java:356)
>     [info]   at
>     org.apache.hadoop.hdfs.server.namenode.NameNode.startHttpServer(NameNode.java:661)
>     [info]   at
>     org.apache.hadoop.hdfs.server.namenode.NameNode.initialize(NameNode.java:552)
>     [info]   at
>     org.apache.hadoop.hdfs.server.namenode.NameNode.<init>(NameNode.java:720)
>
>     I am searching the web already for a week trying to figure out how
>     to make this work :-/
>
>     all the help or hints are greatly appreciated
>     reinis
>
>
>         ------------------------------------------------------------------------
>         -----Original-Nachricht-----
>         Von: "Aniket Bhatnagar" <aniket.bhatnagar@gmail.com
>         <ma...@gmail.com>>
>         An: spark@orbit-x.de <ma...@orbit-x.de>
>         Cc: user <user@spark.apache.org <ma...@spark.apache.org>>
>         Datum: 11-09-2014 20:00
>         Betreff: Re: Re[2]: HBase 0.96+ with Spark 1.0+
>
>
>         Dependency hell... My fav problem :).
>
>         I had run into a similar issue with hbase and jetty. I cant
>         remember thw exact fix, but is are excerpts from my
>         dependencies that may be relevant:
>
>         val hadoop2Common = "org.apache.hadoop" % "hadoop-common" % hadoop2Version excludeAll(
>           ExclusionRule(organization = "javax.servlet"),
>           ExclusionRule(organization = "javax.servlet.jsp"),
>           ExclusionRule(organization = "org.mortbay.jetty")
>         )
>
>         val hadoop2MapRedClient = "org.apache.hadoop" % "hadoop-mapreduce-client-core" % hadoop2Version
>
>         val hbase = "org.apache.hbase" % "hbase" % hbaseVersion excludeAll(
>           ExclusionRule(organization = "org.apache.maven.wagon"),
>           ExclusionRule(organization = "org.jboss.netty"),
>           ExclusionRule(organization = "org.mortbay.jetty"),
>           ExclusionRule(organization = "org.jruby") // Don't need HBase's jruby; it pulls in a whole lot of other dependencies like joda-time.
>         )
>
>         val sparkCore = "org.apache.spark" %% "spark-core" % sparkVersion
>         val sparkStreaming = "org.apache.spark" %% "spark-streaming" % sparkVersion
>         val sparkSQL = "org.apache.spark" %% "spark-sql" % sparkVersion
>         val sparkHive = "org.apache.spark" %% "spark-hive" % sparkVersion
>         val sparkRepl = "org.apache.spark" %% "spark-repl" % sparkVersion
>
>         val sparkAll = Seq(
>           sparkCore excludeAll(
>             ExclusionRule(organization = "org.apache.hadoop")), // We assume hadoop 2 and hence omit hadoop 1 dependencies
>           sparkSQL,
>           sparkStreaming,
>           hadoop2MapRedClient,
>           hadoop2Common,
>           "org.mortbay.jetty" % "servlet-api" % "3.0.20100224"
>         )
>
>         On Sep 11, 2014 8:05 PM, <spark@orbit-x.de
>         <ma...@orbit-x.de>> wrote:
>
>             Hi guys,
>
>             any luck with this issue, anyone?
>
>             I tried all the possible exclusion combos as well, to no
>             avail.
>
>             thanks for your ideas
>             reinis
>
>             -----Original Message-----
>             > From: "Stephen Boesch" <javadba@gmail.com
>             <ma...@gmail.com>>
>             > To: user <user@spark.apache.org
>             <ma...@spark.apache.org>>
>             > Date: 28-06-2014 15:12
>             > Subject: Re: HBase 0.96+ with Spark 1.0+
>             >
>             > Hi Siyuan,
>             Thanks for the input. We prefer to use SparkBuild.scala
>             instead of maven. I did not see any protobuf.version
>             related settings in that file. But - as
>             noted by Sean Owen - in any case the issue we are facing
>             presently is about the duplicate incompatible
>             javax.servlet entries - apparently from the org.mortbay
>             artifacts.
>
>
>             >
>             > 2014-06-28 6:01 GMT-07:00 Siyuan he <hsy008@gmail.com
>             <ma...@gmail.com>>:
>             > Hi Stephen,
>             >
>             I am using Spark 1.0 + HBase 0.96.2. This is what I did:
>             1) rebuild spark using: mvn -Dhadoop.version=2.3.0
>             -Dprotobuf.version=2.5.0 -DskipTests clean package
>             2) In spark-env.sh, set SPARK_CLASSPATH =
>             /path-to/hbase-protocol-0.96.2-hadoop2.jar
>
>             >
>             Hopefully it can help.
>             Siyuan
>
>
>             >
>             > On Sat, Jun 28, 2014 at 8:52 AM, Stephen Boesch
>             <javadba@gmail.com <ma...@gmail.com>> wrote:
>             >
>             >
>             Thanks Sean. I had actually already added an exclusion rule
>             for org.mortbay.jetty - and that had not resolved it.
>             >
>             Just in case, I used your precise formulation:
>
>             >
>             val excludeMortbayJetty = ExclusionRule(organization = "org.mortbay.jetty")
>             ..
>
>             ,("org.apache.spark" % "spark-core_2.10" % sparkVersion withSources()).excludeAll(excludeMortbayJetty)
>             ,("org.apache.spark" % "spark-sql_2.10" % sparkVersion withSources()).excludeAll(excludeMortbayJetty)
>
>             >
>             However, the same error still recurs:
>
>             >
>             14/06/28 05:48:35 INFO HttpServer: Starting HTTP Server
>             [error] (run-main-0) java.lang.SecurityException: class
>             "javax.servlet.FilterRegistration"'s signer information
>             does not match signer information of other classes in the
>             same package
>             java.lang.SecurityException: class
>             "javax.servlet.FilterRegistration"'s signer information
>             does not match signer information of other classes in the
>             same package
>
>             > 2014-06-28 4:22 GMT-07:00 Sean Owen <sowen@cloudera.com
>             <ma...@cloudera.com>>:
>
>             > This sounds like an instance of roughly the same item as in
>             > https://issues.apache.org/jira/browse/SPARK-1949 Have a
>             look at
>             > adding that exclude to see if it works.
>             >
>
>             > On Fri, Jun 27, 2014 at 10:21 PM, Stephen Boesch
>             <javadba@gmail.com <ma...@gmail.com>> wrote:
>             > > The present trunk is built and tested against HBase 0.94.
>             > >
>             > >
>             > > I have tried various combinations of versions of HBase
>             0.96+ and Spark 1.0+
>             > > and all end up with
>             > >
>             > > 14/06/27 20:11:15 INFO HttpServer: Starting HTTP Server
>             > > [error] (run-main-0) java.lang.SecurityException: class
>             > > "javax.servlet.FilterRegistration"'s signer
>             information does not match
>             > > signer information of other classes in the same package
>             > > java.lang.SecurityException: class
>             "javax.servlet.FilterRegistration"'s
>             > > signer information does not match signer information
>             of other classes in the
>             > > same package
>             > > at java.lang.ClassLoader.checkCerts(ClassLoader.java:952)
>             > >
>             > >
>             > > I have tried a number of different ways to exclude
>             javax.servlet related
>             > > jars. But none have avoided this error.
>             > >
>             > > Anyone have a (small-ish) build.sbt that works with
>             later versions of HBase?
>             > >
>             > >
>
>


Re: Re[2]: HBase 0.96+ with Spark 1.0+

Posted by Aniket Bhatnagar <an...@gmail.com>.
Hi Reinis

Try whether the exclude suggestions from me and Sean work for you. If not, can
you turn on verbose class loading to see where
javax.servlet.ServletRegistration is loaded from? The class should load from
"org.mortbay.jetty" % "servlet-api" % jettyVersion. If it loads from some
other jar, you would have to exclude it from your build.
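
A minimal sketch of how to turn that on from sbt (assuming sbt 0.13-style
keys and a forked test JVM; adjust to your build):

  // build.sbt: run tests in a separate JVM and log every class as it loads
  fork in Test := true
  javaOptions in Test += "-verbose:class"

Or, to check just the one class from test code (getCodeSource can be null
for bootstrap classes, hence the Option):

  // prints the jar that javax.servlet.ServletRegistration was loaded from
  val src = Option(classOf[javax.servlet.ServletRegistration]
    .getProtectionDomain.getCodeSource)
  println(src.map(_.getLocation.toString).getOrElse("bootstrap classpath"))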

Hope it helps.

Thanks,
Aniket

On 12 September 2014 02:21, <sp...@orbit-x.de> wrote:

> Thank you, Aniket, for your hint!
>
> Alas, I am facing a really "hellish" situation, as it seems, because I have
> integration tests using BOTH Spark and HBase (minicluster). Thus I get
> either:
>
> class "javax.servlet.ServletRegistration"'s signer information does not
> match signer information of other classes in the same package
> java.lang.SecurityException: class "javax.servlet.ServletRegistration"'s
> signer information does not match signer information of other classes in
> the same package
>     at java.lang.ClassLoader.checkCerts(ClassLoader.java:943)
>     at java.lang.ClassLoader.preDefineClass(ClassLoader.java:657)
>     at java.lang.ClassLoader.defineClass(ClassLoader.java:785)
>
> or:
>
> [info]   Cause: java.lang.ClassNotFoundException:
> org.mortbay.jetty.servlet.Context
> [info]   at java.net.URLClassLoader$1.run(URLClassLoader.java:366)
> [info]   at java.net.URLClassLoader$1.run(URLClassLoader.java:355)
> [info]   at java.security.AccessController.doPrivileged(Native Method)
> [info]   at java.net.URLClassLoader.findClass(URLClassLoader.java:354)
> [info]   at java.lang.ClassLoader.loadClass(ClassLoader.java:423)
> [info]   at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:308)
> [info]   at java.lang.ClassLoader.loadClass(ClassLoader.java:356)
> [info]   at
> org.apache.hadoop.hdfs.server.namenode.NameNode.startHttpServer(NameNode.java:661)
> [info]   at
> org.apache.hadoop.hdfs.server.namenode.NameNode.initialize(NameNode.java:552)
> [info]   at
> org.apache.hadoop.hdfs.server.namenode.NameNode.<init>(NameNode.java:720)
>
> I have been searching the web for a week already, trying to figure out how
> to make this work :-/
>
> Any help or hints are greatly appreciated.
> reinis
>
>
> ------------------------------
> -----Original Message-----
> From: "Aniket Bhatnagar" <an...@gmail.com>
> To: spark@orbit-x.de
> Cc: user <us...@spark.apache.org>
> Date: 11-09-2014 20:00
> Subject: Re: Re[2]: HBase 0.96+ with Spark 1.0+
>
>
> Dependency hell... My fav problem :).
>
> I had run into a similar issue with hbase and jetty. I can't remember the
> exact fix, but here are excerpts from my dependencies that may be relevant:
>
> val hadoop2Common = "org.apache.hadoop" % "hadoop-common" % hadoop2Version excludeAll(
>   ExclusionRule(organization = "javax.servlet"),
>   ExclusionRule(organization = "javax.servlet.jsp"),
>   ExclusionRule(organization = "org.mortbay.jetty")
> )
>
> val hadoop2MapRedClient = "org.apache.hadoop" % "hadoop-mapreduce-client-core" % hadoop2Version
>
> val hbase = "org.apache.hbase" % "hbase" % hbaseVersion excludeAll(
>   ExclusionRule(organization = "org.apache.maven.wagon"),
>   ExclusionRule(organization = "org.jboss.netty"),
>   ExclusionRule(organization = "org.mortbay.jetty"),
>   ExclusionRule(organization = "org.jruby") // Don't need HBase's jruby; it pulls in a whole lot of other dependencies like joda-time.
> )
>
> val sparkCore = "org.apache.spark" %% "spark-core" % sparkVersion
> val sparkStreaming = "org.apache.spark" %% "spark-streaming" % sparkVersion
> val sparkSQL = "org.apache.spark" %% "spark-sql" % sparkVersion
> val sparkHive = "org.apache.spark" %% "spark-hive" % sparkVersion
> val sparkRepl = "org.apache.spark" %% "spark-repl" % sparkVersion
>
> val sparkAll = Seq(
>   sparkCore excludeAll(
>     ExclusionRule(organization = "org.apache.hadoop")), // We assume hadoop 2 and hence omit hadoop 1 dependencies
>   sparkSQL,
>   sparkStreaming,
>   hadoop2MapRedClient,
>   hadoop2Common,
>   "org.mortbay.jetty" % "servlet-api" % "3.0.20100224"
> )
>
> On Sep 11, 2014 8:05 PM, <sp...@orbit-x.de> wrote:
>
>> Hi guys,
>>
>> any luck with this issue, anyone?
>>
>> I tried all the possible exclusion combos as well, to no avail.
>>
>> thanks for your ideas
>> reinis
>>
>> -----Original Message-----
>> > From: "Stephen Boesch" <ja...@gmail.com>
>> > To: user <us...@spark.apache.org>
>> > Date: 28-06-2014 15:12
>> > Subject: Re: HBase 0.96+ with Spark 1.0+
>> >
>> > Hi Siyuan,
>> Thanks for the input. We prefer to use SparkBuild.scala
>> instead of maven. I did not see any protobuf.version related settings in
>> that file. But - as noted by Sean Owen - in any case the issue we are
>> facing presently is about the duplicate incompatible javax.servlet entries
>> - apparently from the org.mortbay artifacts.
>>
>>
>> >
>> > 2014-06-28 6:01 GMT-07:00 Siyuan he <hs...@gmail.com>:
>> > Hi Stephen,
>> >
>> I am using Spark 1.0 + HBase 0.96.2. This is what I did:
>> 1) rebuild spark using: mvn -Dhadoop.version=2.3.0
>> -Dprotobuf.version=2.5.0 -DskipTests clean package
>> 2) In spark-env.sh, set SPARK_CLASSPATH =
>> /path-to/hbase-protocol-0.96.2-hadoop2.jar
>>
>> >
>> Hopefully it can help.
>> Siyuan
>>
>>
>> >
>> > On Sat, Jun 28, 2014 at 8:52 AM, Stephen Boesch <ja...@gmail.com>
>> wrote:
>> >
>> >
>> Thanks Sean. I had actually already added an exclusion rule for
>> org.mortbay.jetty - and that had not resolved it.
>> >
>> Just in case, I used your precise formulation:
>>
>> >
>> val excludeMortbayJetty = ExclusionRule(organization = "org.mortbay.jetty")
>> ..
>>
>> ,("org.apache.spark" % "spark-core_2.10" % sparkVersion withSources()).excludeAll(excludeMortbayJetty)
>> ,("org.apache.spark" % "spark-sql_2.10" % sparkVersion withSources()).excludeAll(excludeMortbayJetty)
>>
>> >
>> However, the same error still recurs:
>>
>> >
>> 14/06/28 05:48:35 INFO HttpServer: Starting HTTP Server
>> [error] (run-main-0) java.lang.SecurityException: class
>> "javax.servlet.FilterRegistration"'s signer information does not match
>> signer information of other classes in the same package
>> java.lang.SecurityException: class "javax.servlet.FilterRegistration"'s
>> signer information does not match signer information of other classes in
>> the same package
>>
>> > 2014-06-28 4:22 GMT-07:00 Sean Owen <so...@cloudera.com>:
>>
>> > This sounds like an instance of roughly the same item as in
>> > https://issues.apache.org/jira/browse/SPARK-1949 Have a look at
>> > adding that exclude to see if it works.
>> >
>>
>> > On Fri, Jun 27, 2014 at 10:21 PM, Stephen Boesch <ja...@gmail.com>
>> wrote:
>> > > The present trunk is built and tested against HBase 0.94.
>> > >
>> > >
>> > > I have tried various combinations of versions of HBase 0.96+ and
>> Spark 1.0+
>> > > and all end up with
>> > >
>> > > 14/06/27 20:11:15 INFO HttpServer: Starting HTTP Server
>> > > [error] (run-main-0) java.lang.SecurityException: class
>> > > "javax.servlet.FilterRegistration"'s signer information does not match
>> > > signer information of other classes in the same package
>> > > java.lang.SecurityException: class
>> "javax.servlet.FilterRegistration"'s
>> > > signer information does not match signer information of other classes
>> in the
>> > > same package
>> > > at java.lang.ClassLoader.checkCerts(ClassLoader.java:952)
>> > >
>> > >
>> > > I have tried a number of different ways to exclude javax.servlet
>> related
>> > > jars. But none have avoided this error.
>> > >
>> > > Anyone have a (small-ish) build.sbt that works with later versions of
>> HBase?
>> > >
>
>

Re: Re[2]: HBase 0.96+ with Spark 1.0+

Posted by Sean Owen <so...@cloudera.com>.
This was already answered at the bottom of this same thread -- read below.

On Thu, Sep 11, 2014 at 9:51 PM,  <sp...@orbit-x.de> wrote:
> class "javax.servlet.ServletRegistration"'s signer information does not
> match signer information of other classes in the same package
> java.lang.SecurityException: class "javax.servlet.ServletRegistration"'s
> signer information does not match signer information of other classes in the
> same package
>     at java.lang.ClassLoader.checkCerts(ClassLoader.java:943)
>     at java.lang.ClassLoader.preDefineClass(ClassLoader.java:657)
>     at java.lang.ClassLoader.defineClass(ClassLoader.java:785)

>> > 2014-06-28 4:22 GMT-07:00 Sean Owen <so...@cloudera.com>:
>>
>> > This sounds like an instance of roughly the same item as in
>> > https://issues.apache.org/jira/browse/SPARK-1949 Have a look at
>> > adding that exclude to see if it works.
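
For concreteness, that exclude rendered as an sbt sketch (the version shown
is illustrative, and treating the signed servlet-api jar from
org.eclipse.jetty.orbit as the offender is an assumption to verify against
your own dependency graph, not something quoted from the JIRA):

  // build.sbt fragment: keep spark-core but drop the signed servlet-api
  // jar that clashes with unsigned javax.servlet classes on the classpath
  libraryDependencies += ("org.apache.spark" %% "spark-core" % "1.0.2")
    .excludeAll(ExclusionRule(organization = "org.eclipse.jetty.orbit"))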

---------------------------------------------------------------------
To unsubscribe, e-mail: user-unsubscribe@spark.apache.org
For additional commands, e-mail: user-help@spark.apache.org


Re[2]: HBase 0.96+ with Spark 1.0+

Posted by sp...@orbit-x.de.
Thank you, Aniket, for your hint!

> Alas, I am facing a really "hellish" situation, as it seems, because I have integration tests using BOTH Spark and HBase (minicluster). Thus I get either:

class "javax.servlet.ServletRegistration"'s signer information does not match signer information of other classes in the same package
java.lang.SecurityException: class "javax.servlet.ServletRegistration"'s signer information does not match signer information of other classes in the same package
    at java.lang.ClassLoader.checkCerts(ClassLoader.java:943)
    at java.lang.ClassLoader.preDefineClass(ClassLoader.java:657)
    at java.lang.ClassLoader.defineClass(ClassLoader.java:785)

or:

[info]   Cause: java.lang.ClassNotFoundException: org.mortbay.jetty.servlet.Context
[info]   at java.net.URLClassLoader$1.run(URLClassLoader.java:366)
[info]   at java.net.URLClassLoader$1.run(URLClassLoader.java:355)
[info]   at java.security.AccessController.doPrivileged(Native Method)
[info]   at java.net.URLClassLoader.findClass(URLClassLoader.java:354)
[info]   at java.lang.ClassLoader.loadClass(ClassLoader.java:423)
[info]   at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:308)
[info]   at java.lang.ClassLoader.loadClass(ClassLoader.java:356)
[info]   at org.apache.hadoop.hdfs.server.namenode.NameNode.startHttpServer(NameNode.java:661)
[info]   at org.apache.hadoop.hdfs.server.namenode.NameNode.initialize(NameNode.java:552)
[info]   at org.apache.hadoop.hdfs.server.namenode.NameNode.<init>(NameNode.java:720)
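
(The second trace is worth a note: Hadoop 2.x's minicluster serves the
NameNode web UI through Jetty 6, i.e. org.mortbay.jetty, so excluding that
organization globally removes a class the minicluster needs. A sketch of one
way out, assuming the stock Hadoop 2.x Jetty version, is to add it back for
tests only:)

  // build.sbt sketch: restore Jetty 6 on the test classpath after the
  // global org.mortbay.jetty excludes; 6.1.26 is an assumed version
  libraryDependencies ++= Seq(
    "org.mortbay.jetty" % "jetty" % "6.1.26" % "test",
    "org.mortbay.jetty" % "jetty-util" % "6.1.26" % "test"
  )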

I have been searching the web for a week already, trying to figure out how to make this work :-/

Any help or hints are greatly appreciated.
reinis



-----Original Message-----
From: "Aniket Bhatnagar" <an...@gmail.com>
To: spark@orbit-x.de
Cc: user <us...@spark.apache.org>
Date: 11-09-2014 20:00
Subject: Re: Re[2]: HBase 0.96+ with Spark 1.0+


Dependency hell... My fav problem :).
I had run into a similar issue with hbase and jetty. I can't remember the exact fix, but here are excerpts from my dependencies that may be relevant:
val hadoop2Common = "org.apache.hadoop" % "hadoop-common" % hadoop2Version excludeAll(
  ExclusionRule(organization = "javax.servlet"),
  ExclusionRule(organization = "javax.servlet.jsp"),
  ExclusionRule(organization = "org.mortbay.jetty")
)

val hadoop2MapRedClient = "org.apache.hadoop" % "hadoop-mapreduce-client-core" % hadoop2Version

val hbase = "org.apache.hbase" % "hbase" % hbaseVersion excludeAll(
  ExclusionRule(organization = "org.apache.maven.wagon"),
  ExclusionRule(organization = "org.jboss.netty"),
  ExclusionRule(organization = "org.mortbay.jetty"),
  ExclusionRule(organization = "org.jruby") // Don't need HBase's jruby; it pulls in a whole lot of other dependencies like joda-time.
)

val sparkCore = "org.apache.spark" %% "spark-core" % sparkVersion
val sparkStreaming = "org.apache.spark" %% "spark-streaming" % sparkVersion
val sparkSQL = "org.apache.spark" %% "spark-sql" % sparkVersion
val sparkHive = "org.apache.spark" %% "spark-hive" % sparkVersion
val sparkRepl = "org.apache.spark" %% "spark-repl" % sparkVersion

val sparkAll = Seq(
  sparkCore excludeAll(
    ExclusionRule(organization = "org.apache.hadoop")), // We assume hadoop 2 and hence omit hadoop 1 dependencies
  sparkSQL,
  sparkStreaming,
  hadoop2MapRedClient,
  hadoop2Common,
  "org.mortbay.jetty" % "servlet-api" % "3.0.20100224"
)

On Sep 11, 2014 8:05 PM, <sp...@orbit-x.de> wrote:
Hi guys,

any luck with this issue, anyone?

I tried all the possible exclusion combos as well, to no avail.

thanks for your ideas
reinis

-----Original Message-----
> From: "Stephen Boesch" <ja...@gmail.com>
> To: user <us...@spark.apache.org>
> Date: 28-06-2014 15:12
> Subject: Re: HBase 0.96+ with Spark 1.0+
>
> Hi Siyuan,
 Thanks for the input. We prefer to use SparkBuild.scala instead of maven. I did not see any protobuf.version related settings in that file. But - as noted by Sean Owen - in any case the issue we are facing presently is about the duplicate incompatible javax.servlet entries - apparently from the org.mortbay artifacts.
 
 
>
> 2014-06-28 6:01 GMT-07:00 Siyuan he <hs...@gmail.com>:
> Hi Stephen,
>
I am using Spark 1.0 + HBase 0.96.2. This is what I did:
1) rebuild spark using: mvn -Dhadoop.version=2.3.0 -Dprotobuf.version=2.5.0 -DskipTests clean package
2) In spark-env.sh, set SPARK_CLASSPATH = /path-to/hbase-protocol-0.96.2-hadoop2.jar 

>
Hopefully it can help.
Siyuan
 
 
>
> On Sat, Jun 28, 2014 at 8:52 AM, Stephen Boesch <ja...@gmail.com> wrote:
>  
>
Thanks Sean. I had actually already added an exclusion rule for org.mortbay.jetty - and that had not resolved it.
>
Just in case, I used your precise formulation:

>
val excludeMortbayJetty = ExclusionRule(organization = "org.mortbay.jetty")
..
 
  ,("org.apache.spark" % "spark-core_2.10" % sparkVersion  withSources()).excludeAll(excludeMortbayJetty)
  ,("org.apache.spark" % "spark-sql_2.10" % sparkVersion  withSources()).excludeAll(excludeMortbayJetty)

>
However, the same error still recurs:

>
14/06/28 05:48:35 INFO HttpServer: Starting HTTP Server
[error] (run-main-0) java.lang.SecurityException: class "javax.servlet.FilterRegistration"'s signer information does not match signer information of other classes in the same package
java.lang.SecurityException: class "javax.servlet.FilterRegistration"'s signer information does not match signer information of other classes in the same package
 
 

> 2014-06-28 4:22 GMT-07:00 Sean Owen <so...@cloudera.com>:

> This sounds like an instance of roughly the same item as in
> https://issues.apache.org/jira/browse/SPARK-1949  Have a look at
> adding that exclude to see if it works.
>

> On Fri, Jun 27, 2014 at 10:21 PM, Stephen Boesch <ja...@gmail.com> wrote:
> > The present trunk is built and tested against HBase 0.94.
> >
> >
> > I have tried various combinations of versions of HBase 0.96+ and Spark 1.0+
> > and all end up with
> >
> > 14/06/27 20:11:15 INFO HttpServer: Starting HTTP Server
> > [error] (run-main-0) java.lang.SecurityException: class
> > "javax.servlet.FilterRegistration"'s signer information does not match
> > signer information of other classes in the same package
> > java.lang.SecurityException: class "javax.servlet.FilterRegistration"'s
> > signer information does not match signer information of other classes in the
> > same package
> >         at java.lang.ClassLoader.checkCerts(ClassLoader.java:952)
> >
> >
> > I have tried a number of different ways to exclude javax.servlet related
> > jars. But none have avoided this error.
> >
> > Anyone have a (small-ish) build.sbt that works with later versions of HBase?
> >
> >




---------------------------------------------------------------------
To unsubscribe, e-mail: user-unsubscribe@spark.apache.org
For additional commands, e-mail: user-help@spark.apache.org

 





Re: Re[2]: HBase 0.96+ with Spark 1.0+

Posted by Aniket Bhatnagar <an...@gmail.com>.
Dependency hell... My fav problem :).

I had run into a similar issue with hbase and jetty. I can't remember the
exact fix, but here are excerpts from my dependencies that may be relevant:

val hadoop2Common = "org.apache.hadoop" % "hadoop-common" % hadoop2Version excludeAll(
  ExclusionRule(organization = "javax.servlet"),
  ExclusionRule(organization = "javax.servlet.jsp"),
  ExclusionRule(organization = "org.mortbay.jetty")
)

val hadoop2MapRedClient = "org.apache.hadoop" % "hadoop-mapreduce-client-core" % hadoop2Version

val hbase = "org.apache.hbase" % "hbase" % hbaseVersion excludeAll(
  ExclusionRule(organization = "org.apache.maven.wagon"),
  ExclusionRule(organization = "org.jboss.netty"),
  ExclusionRule(organization = "org.mortbay.jetty"),
  ExclusionRule(organization = "org.jruby") // Don't need HBase's jruby; it pulls in a whole lot of other dependencies like joda-time.
)

val sparkCore = "org.apache.spark" %% "spark-core" % sparkVersion

val sparkStreaming = "org.apache.spark" %% "spark-streaming" % sparkVersion

val sparkSQL = "org.apache.spark" %% "spark-sql" % sparkVersion

val sparkHive = "org.apache.spark" %% "spark-hive" % sparkVersion

val sparkRepl = "org.apache.spark" %% "spark-repl" % sparkVersion

val sparkAll = Seq(
  sparkCore excludeAll(
    ExclusionRule(organization = "org.apache.hadoop")), // We assume hadoop 2 and hence omit hadoop 1 dependencies
  sparkSQL,
  sparkStreaming,
  hadoop2MapRedClient,
  hadoop2Common,
  "org.mortbay.jetty" % "servlet-api" % "3.0.20100224"
)
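
(To make the excerpt above self-contained: one way these values could be
wired into a build, sketched with placeholder versions you would pin
yourself; nothing below is quoted from the original build:)

  // build.sbt sketch: pin versions and feed the vals above to sbt
  // (in a real build these version vals would precede the definitions above)
  val sparkVersion   = "1.0.2"          // illustrative
  val hadoop2Version = "2.3.0"          // illustrative
  val hbaseVersion   = "0.96.2-hadoop2" // illustrative

  libraryDependencies ++= sparkAll :+ hbase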

On Sep 11, 2014 8:05 PM, <sp...@orbit-x.de> wrote:

> Hi guys,
>
> any luck with this issue, anyone?
>
> I tried all the possible exclusion combos as well, to no avail.
>
> thanks for your ideas
> reinis
>
> -----Original Message-----
> > From: "Stephen Boesch" <ja...@gmail.com>
> > To: user <us...@spark.apache.org>
> > Date: 28-06-2014 15:12
> > Subject: Re: HBase 0.96+ with Spark 1.0+
> >
> > Hi Siyuan,
>  Thanks for the input. We prefer to use SparkBuild.scala
> instead of maven. I did not see any protobuf.version related settings in
> that file. But - as noted by Sean Owen - in any case the issue we are
> facing presently is about the duplicate incompatible javax.servlet entries
> - apparently from the org.mortbay artifacts.
>
>
> >
> > 2014-06-28 6:01 GMT-07:00 Siyuan he <hs...@gmail.com>:
> > Hi Stephen,
> >
> I am using Spark 1.0 + HBase 0.96.2. This is what I did:
> 1) rebuild spark using: mvn -Dhadoop.version=2.3.0
> -Dprotobuf.version=2.5.0 -DskipTests clean package
> 2) In spark-env.sh, set SPARK_CLASSPATH =
> /path-to/hbase-protocol-0.96.2-hadoop2.jar
>
> >
> Hopefully it can help.
> Siyuan
>
>
> >
> > On Sat, Jun 28, 2014 at 8:52 AM, Stephen Boesch <ja...@gmail.com>
> wrote:
> >
> >
> Thanks Sean. I had actually already added an exclusion rule for
> org.mortbay.jetty - and that had not resolved it.
> >
> Just in case, I used your precise formulation:
>
> >
> val excludeMortbayJetty = ExclusionRule(organization = "org.mortbay.jetty")
> ..
>
>   ,("org.apache.spark" % "spark-core_2.10" % sparkVersion withSources()).excludeAll(excludeMortbayJetty)
>   ,("org.apache.spark" % "spark-sql_2.10" % sparkVersion withSources()).excludeAll(excludeMortbayJetty)
>
> >
> However, the same error still recurs:
>
> >
> 14/06/28 05:48:35 INFO HttpServer: Starting HTTP Server
> [error] (run-main-0) java.lang.SecurityException: class
> "javax.servlet.FilterRegistration"'s signer information does not match
> signer information of other classes in the same package
> java.lang.SecurityException: class "javax.servlet.FilterRegistration"'s
> signer information does not match signer information of other classes in
> the same package
>
> > 2014-06-28 4:22 GMT-07:00 Sean Owen <so...@cloudera.com>:
>
> > This sounds like an instance of roughly the same item as in
> > https://issues.apache.org/jira/browse/SPARK-1949  Have a look at
> > adding that exclude to see if it works.
> >
>
> > On Fri, Jun 27, 2014 at 10:21 PM, Stephen Boesch <ja...@gmail.com>
> wrote:
> > > The present trunk is built and tested against HBase 0.94.
> > >
> > >
> > > I have tried various combinations of versions of HBase 0.96+ and Spark
> 1.0+
> > > and all end up with
> > >
> > > 14/06/27 20:11:15 INFO HttpServer: Starting HTTP Server
> > > [error] (run-main-0) java.lang.SecurityException: class
> > > "javax.servlet.FilterRegistration"'s signer information does not match
> > > signer information of other classes in the same package
> > > java.lang.SecurityException: class "javax.servlet.FilterRegistration"'s
> > > signer information does not match signer information of other classes
> in the
> > > same package
> > >         at java.lang.ClassLoader.checkCerts(ClassLoader.java:952)
> > >
> > >
> > > I have tried a number of different ways to exclude javax.servlet
> related
> > > jars. But none have avoided this error.
> > >
> > > Anyone have a (small-ish) build.sbt that works with later versions of
> HBase?
> > >
> > >
> ---------------------------------------------------------------------
> To unsubscribe, e-mail: user-unsubscribe@spark.apache.org
> For additional commands, e-mail: user-help@spark.apache.org
>
>