Posted to user@spark.apache.org by Mohit Nayak <wi...@gmail.com> on 2014/06/03 01:21:15 UTC
Fwd: SecurityException when running tests with Spark 1.0.0
Hi,
I've upgraded to Spark 1.0.0. I'm not able to run any tests. They throw a
*java.lang.SecurityException: class
"javax.servlet.FilterRegistration"'s signer information does not match
signer information of other classes in the same package*
I'm using Hadoop-core 1.0.4 and running this locally.
I noticed that there was an issue regarding this, which was marked as resolved
[https://issues.apache.org/jira/browse/SPARK-1693]
Please guide.
--
-Mohit
wizardm@gmail.com
Re: SecurityException when running tests with Spark 1.0.0
Posted by Matei Zaharia <ma...@gmail.com>.
You can just use the Maven build for now, even for Spark 1.0.0.
Matei
On Jun 2, 2014, at 5:30 PM, Mohit Nayak <wi...@gmail.com> wrote:
> Hey,
> Yup that fixed it. Thanks so much!
>
> Is this the only solution, or could this be resolved in future versions of Spark ?
>
>
> On Mon, Jun 2, 2014 at 5:14 PM, Sean Owen <so...@cloudera.com> wrote:
> If it's the SBT build, I suspect you are hitting
> https://issues.apache.org/jira/browse/SPARK-1949
>
> Can you try to apply the excludes you see at
> https://github.com/apache/spark/pull/906/files to your build to see if
> it resolves it?
>
> If so I think this could be helpful to commit.
>
> On Tue, Jun 3, 2014 at 1:01 AM, Mohit Nayak <wi...@gmail.com> wrote:
> > Hey,
> > Thanks for the reply.
> >
> > I am using SBT. Here is a list of my dependencies:
> > val sparkCore     = "org.apache.spark" % "spark-core_2.10" % V.spark
> > val hadoopCore    = "org.apache.hadoop" % "hadoop-core" % V.hadoop % "provided"
> > val jodaTime      = "com.github.nscala-time" %% "nscala-time" % "0.8.0"
> > val scalaUtil     = "com.twitter" %% "util-collection" % V.util
> > val logback       = "ch.qos.logback" % "logback-classic" % "1.0.6" % "runtime"
> > var openCsv       = "net.sf.opencsv" % "opencsv" % "2.1"
> > var scalaTest     = "org.scalatest" % "scalatest_2.10" % "2.1.0" % "test"
> > var scalaIOCore   = "com.github.scala-incubator.io" %% "scala-io-core" % V.scalaIO
> > var scalaIOFile   = "com.github.scala-incubator.io" %% "scala-io-file" % V.scalaIO
> > var kryo          = "com.esotericsoftware.kryo" % "kryo" % "2.16"
> > var spray         = "io.spray" %% "spray-json" % "1.2.5"
> > var scala_reflect = "org.scala-lang" % "scala-reflect" % "2.10.3"
> >
> >
> >
> > On Mon, Jun 2, 2014 at 4:23 PM, Sean Owen <so...@cloudera.com> wrote:
> >>
> >> This ultimately means you have a couple copies of the servlet APIs in
> >> the build. What is your build like (SBT? Maven?) and what exactly are
> >> you depending on?
> >>
> >> On Tue, Jun 3, 2014 at 12:21 AM, Mohit Nayak <wi...@gmail.com> wrote:
> >> > Hi,
> >> > I've upgraded to Spark 1.0.0. I'm not able to run any tests. They throw
> >> > a
> >> >
> >> > java.lang.SecurityException: class "javax.servlet.FilterRegistration"'s
> >> > signer information does not match signer information of other classes in
> >> > the
> >> > same package
> >> >
> >> >
> >> > I'm using Hadoop-core 1.0.4 and running this locally.
> >> > I noticed that there was an issue regarding this, which was marked as
> >> > resolved
> >> > [https://issues.apache.org/jira/browse/SPARK-1693]
> >> > Please guide.
> >> >
> >> > --
> >> > -Mohit
> >> > wizardm@gmail.com
> >
> >
> >
> >
> > --
> > -Mohit
> > wizardm@gmail.com
>
>
>
> --
> -Mohit
> wizardm@gmail.com
Re: SecurityException when running tests with Spark 1.0.0
Posted by Mohit Nayak <wi...@gmail.com>.
Hey,
Yup that fixed it. Thanks so much!
Is this the only solution, or could this be resolved in future versions of
Spark ?
On Mon, Jun 2, 2014 at 5:14 PM, Sean Owen <so...@cloudera.com> wrote:
> If it's the SBT build, I suspect you are hitting
> https://issues.apache.org/jira/browse/SPARK-1949
>
> Can you try to apply the excludes you see at
> https://github.com/apache/spark/pull/906/files to your build to see if
> it resolves it?
>
> If so I think this could be helpful to commit.
>
> On Tue, Jun 3, 2014 at 1:01 AM, Mohit Nayak <wi...@gmail.com> wrote:
> > Hey,
> > Thanks for the reply.
> >
> > I am using SBT. Here is a list of my dependencies:
> > val sparkCore     = "org.apache.spark" % "spark-core_2.10" % V.spark
> > val hadoopCore    = "org.apache.hadoop" % "hadoop-core" % V.hadoop % "provided"
> > val jodaTime      = "com.github.nscala-time" %% "nscala-time" % "0.8.0"
> > val scalaUtil     = "com.twitter" %% "util-collection" % V.util
> > val logback       = "ch.qos.logback" % "logback-classic" % "1.0.6" % "runtime"
> > var openCsv       = "net.sf.opencsv" % "opencsv" % "2.1"
> > var scalaTest     = "org.scalatest" % "scalatest_2.10" % "2.1.0" % "test"
> > var scalaIOCore   = "com.github.scala-incubator.io" %% "scala-io-core" % V.scalaIO
> > var scalaIOFile   = "com.github.scala-incubator.io" %% "scala-io-file" % V.scalaIO
> > var kryo          = "com.esotericsoftware.kryo" % "kryo" % "2.16"
> > var spray         = "io.spray" %% "spray-json" % "1.2.5"
> > var scala_reflect = "org.scala-lang" % "scala-reflect" % "2.10.3"
> >
> >
> >
> > On Mon, Jun 2, 2014 at 4:23 PM, Sean Owen <so...@cloudera.com> wrote:
> >>
> >> This ultimately means you have a couple copies of the servlet APIs in
> >> the build. What is your build like (SBT? Maven?) and what exactly are
> >> you depending on?
> >>
> >> On Tue, Jun 3, 2014 at 12:21 AM, Mohit Nayak <wi...@gmail.com> wrote:
> >> > Hi,
> >> > I've upgraded to Spark 1.0.0. I'm not able to run any tests. They throw a
> >> >
> >> > java.lang.SecurityException: class "javax.servlet.FilterRegistration"'s
> >> > signer information does not match signer information of other classes in
> >> > the same package
> >> >
> >> >
> >> > I'm using Hadoop-core 1.0.4 and running this locally.
> >> > I noticed that there was an issue regarding this, which was marked as
> >> > resolved
> >> > [https://issues.apache.org/jira/browse/SPARK-1693]
> >> > Please guide.
> >> >
> >> > --
> >> > -Mohit
> >> > wizardm@gmail.com
> >
> >
> >
> >
> > --
> > -Mohit
> > wizardm@gmail.com
>
--
-Mohit
wizardm@gmail.com
Re: SecurityException when running tests with Spark 1.0.0
Posted by Sean Owen <so...@cloudera.com>.
If it's the SBT build, I suspect you are hitting
https://issues.apache.org/jira/browse/SPARK-1949
Can you try to apply the excludes you see at
https://github.com/apache/spark/pull/906/files to your build to see if
it resolves it?
If so I think this could be helpful to commit.
On Tue, Jun 3, 2014 at 1:01 AM, Mohit Nayak <wi...@gmail.com> wrote:
> Hey,
> Thanks for the reply.
>
> I am using SBT. Here is a list of my dependencies:
> val sparkCore     = "org.apache.spark" % "spark-core_2.10" % V.spark
> val hadoopCore    = "org.apache.hadoop" % "hadoop-core" % V.hadoop % "provided"
> val jodaTime      = "com.github.nscala-time" %% "nscala-time" % "0.8.0"
> val scalaUtil     = "com.twitter" %% "util-collection" % V.util
> val logback       = "ch.qos.logback" % "logback-classic" % "1.0.6" % "runtime"
> var openCsv       = "net.sf.opencsv" % "opencsv" % "2.1"
> var scalaTest     = "org.scalatest" % "scalatest_2.10" % "2.1.0" % "test"
> var scalaIOCore   = "com.github.scala-incubator.io" %% "scala-io-core" % V.scalaIO
> var scalaIOFile   = "com.github.scala-incubator.io" %% "scala-io-file" % V.scalaIO
> var kryo          = "com.esotericsoftware.kryo" % "kryo" % "2.16"
> var spray         = "io.spray" %% "spray-json" % "1.2.5"
> var scala_reflect = "org.scala-lang" % "scala-reflect" % "2.10.3"
>
>
>
> On Mon, Jun 2, 2014 at 4:23 PM, Sean Owen <so...@cloudera.com> wrote:
>>
>> This ultimately means you have a couple copies of the servlet APIs in
>> the build. What is your build like (SBT? Maven?) and what exactly are
>> you depending on?
>>
>> On Tue, Jun 3, 2014 at 12:21 AM, Mohit Nayak <wi...@gmail.com> wrote:
>> > Hi,
>> > I've upgraded to Spark 1.0.0. I'm not able to run any tests. They throw
>> > a
>> >
>> > java.lang.SecurityException: class "javax.servlet.FilterRegistration"'s
>> > signer information does not match signer information of other classes in
>> > the
>> > same package
>> >
>> >
>> > I'm using Hadoop-core 1.0.4 and running this locally.
>> > I noticed that there was an issue regarding this, which was marked as
>> > resolved
>> > [https://issues.apache.org/jira/browse/SPARK-1693]
>> > Please guide.
>> >
>> > --
>> > -Mohit
>> > wizardm@gmail.com
>
>
>
>
> --
> -Mohit
> wizardm@gmail.com
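[Editor's note: for readers applying the same fix, the excludes in SPARK-1949 / PR 906 amount to making sure only one copy of the servlet API is left on the classpath. A rough sketch of what that could look like for the hadoop-core dependency above — the exclusion coordinates here are assumptions for illustration; the authoritative list is in the PR linked by Sean:]

```scala
// Sketch only: strip the servlet API that hadoop-core pulls in transitively,
// so the unsigned copy never clashes with a signed copy from another jar.
// ExclusionRule coordinates are illustrative assumptions; consult
// https://github.com/apache/spark/pull/906/files for the exact list.
val hadoopCore = "org.apache.hadoop" % "hadoop-core" % V.hadoop % "provided" excludeAll(
  ExclusionRule(organization = "javax.servlet"),
  ExclusionRule(organization = "org.mortbay.jetty", name = "servlet-api")
)
```

`excludeAll` with `ExclusionRule` removes the matching artifacts from this dependency's transitive graph without affecting other dependencies that may legitimately need their own copies.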
Re: SecurityException when running tests with Spark 1.0.0
Posted by Mohit Nayak <wi...@gmail.com>.
Hey,
Thanks for the reply.
I am using SBT. Here is a list of my dependencies:
val sparkCore     = "org.apache.spark" % "spark-core_2.10" % V.spark
val hadoopCore    = "org.apache.hadoop" % "hadoop-core" % V.hadoop % "provided"
val jodaTime      = "com.github.nscala-time" %% "nscala-time" % "0.8.0"
val scalaUtil     = "com.twitter" %% "util-collection" % V.util
val logback       = "ch.qos.logback" % "logback-classic" % "1.0.6" % "runtime"
var openCsv       = "net.sf.opencsv" % "opencsv" % "2.1"
var scalaTest     = "org.scalatest" % "scalatest_2.10" % "2.1.0" % "test"
var scalaIOCore   = "com.github.scala-incubator.io" %% "scala-io-core" % V.scalaIO
var scalaIOFile   = "com.github.scala-incubator.io" %% "scala-io-file" % V.scalaIO
var kryo          = "com.esotericsoftware.kryo" % "kryo" % "2.16"
var spray         = "io.spray" %% "spray-json" % "1.2.5"
var scala_reflect = "org.scala-lang" % "scala-reflect" % "2.10.3"
On Mon, Jun 2, 2014 at 4:23 PM, Sean Owen <so...@cloudera.com> wrote:
> This ultimately means you have a couple copies of the servlet APIs in
> the build. What is your build like (SBT? Maven?) and what exactly are
> you depending on?
>
> On Tue, Jun 3, 2014 at 12:21 AM, Mohit Nayak <wi...@gmail.com> wrote:
> > Hi,
> > I've upgraded to Spark 1.0.0. I'm not able to run any tests. They throw a
> >
> > java.lang.SecurityException: class "javax.servlet.FilterRegistration"'s
> > signer information does not match signer information of other classes in
> > the same package
> >
> >
> > I'm using Hadoop-core 1.0.4 and running this locally.
> > I noticed that there was an issue regarding this, which was marked as
> > resolved
> > [https://issues.apache.org/jira/browse/SPARK-1693]
> > Please guide.
> >
> > --
> > -Mohit
> > wizardm@gmail.com
>
--
-Mohit
wizardm@gmail.com
Re: SecurityException when running tests with Spark 1.0.0
Posted by Sean Owen <so...@cloudera.com>.
This ultimately means you have a couple copies of the servlet APIs in
the build. What is your build like (SBT? Maven?) and what exactly are
you depending on?
On Tue, Jun 3, 2014 at 12:21 AM, Mohit Nayak <wi...@gmail.com> wrote:
> Hi,
> I've upgraded to Spark 1.0.0. I'm not able to run any tests. They throw a
>
> java.lang.SecurityException: class "javax.servlet.FilterRegistration"'s
> signer information does not match signer information of other classes in the
> same package
>
>
> I'm using Hadoop-core 1.0.4 and running this locally.
> I noticed that there was an issue regarding this, which was marked as resolved
> [https://issues.apache.org/jira/browse/SPARK-1693]
> Please guide.
>
> --
> -Mohit
> wizardm@gmail.com
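[Editor's note: to confirm the diagnosis Sean describes on a concrete classpath, one can ask the classloader which locations provide the offending class; two or more hits means two copies of the servlet API are present, and a signed/unsigned mix is what triggers the SecurityException. A minimal sketch, not from the thread — run it on the same classpath the tests use:]

```scala
// Sketch: print every classpath location that provides the class named in
// the SecurityException. More than one line of output means duplicate
// copies of the servlet API are on the classpath.
object FindDuplicateClasses {
  def main(args: Array[String]): Unit = {
    val resource = "javax/servlet/FilterRegistration.class"
    val urls = getClass.getClassLoader.getResources(resource)
    while (urls.hasMoreElements) {
      println(urls.nextElement())
    }
  }
}
```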