Posted to user@spark.apache.org by Emre Sevinc <em...@gmail.com> on 2014/12/24 13:02:26 UTC

Why does consuming a RESTful web service (using javax.ws.rs.* and Jersey) work in a unit test but not when submitted to Spark?

Hello,

I have a piece of code that runs inside Spark Streaming and tries to get
some data from a RESTful web service (that runs locally on my machine). The
code snippet in question is:

     Client client = ClientBuilder.newClient();
     WebTarget target = client.target("http://localhost:2222/rest");
     target = target.path("annotate")
                 .queryParam("text",
UrlEscapers.urlFragmentEscaper().escape(spotlightSubmission))
                 .queryParam("confidence", "0.3");

      logger.warn("!!! DEBUG !!! target: {}", target.getUri().toString());

      String response =
target.request().accept(MediaType.APPLICATION_JSON_TYPE).get(String.class);

      logger.warn("!!! DEBUG !!! Spotlight response: {}", response);

When run inside a unit test as follows:

     mvn clean test -Dtest=SpotlightTest#testCountWords

it contacts the RESTful web service and retrieves some data as expected.
But when the same code is run as part of the application submitted to
Spark via the spark-submit script, I receive the following error:

      java.lang.NoSuchMethodError:
javax.ws.rs.core.MultivaluedMap.addAll(Ljava/lang/Object;[Ljava/lang/Object;)V

I'm using Spark 1.1.0, and to consume the web service I'm using Jersey,
declared in my project's pom.xml:

     <dependency>
      <groupId>org.glassfish.jersey.containers</groupId>
      <artifactId>jersey-container-servlet-core</artifactId>
      <version>2.14</version>
    </dependency>

So I suspect that when the application is submitted to Spark, somehow
there's a different JAR in the environment that uses a different version of
Jersey / javax.ws.rs.*

Does anybody know which version of Jersey / javax.ws.rs.*  is used in the
Spark environment, or how to solve this conflict?
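In case it helps with diagnosis, here is a small probe class I could run on the
driver and executors to see which JAR the JAX-RS classes are actually loaded
from (the class name and structure below are mine, purely illustrative):

```java
// Illustrative probe (not part of the project): print where each conflicting
// class is loaded from, to see whether Jersey 1.x or 2.x wins at runtime.
public class ClasspathProbe {

    // Returns the code source (usually a JAR path) a class was loaded from,
    // or an explanation of why it could not be resolved.
    static String describe(String className) {
        try {
            Class<?> c = Class.forName(className);
            java.security.CodeSource src = c.getProtectionDomain().getCodeSource();
            return className + " -> "
                + (src == null ? "bootstrap/unknown" : src.getLocation());
        } catch (ClassNotFoundException e) {
            return className + " -> not on classpath";
        }
    }

    public static void main(String[] args) {
        // The type named in the NoSuchMethodError, plus the JAX-RS 2 client
        // entry point used in the snippet above.
        System.out.println(describe("javax.ws.rs.core.MultivaluedMap"));
        System.out.println(describe("javax.ws.rs.client.ClientBuilder"));
    }
}
```

Logging the same two lines from inside the streaming job on the cluster should
show exactly which JAR supplies the old API.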


-- 
Emre Sevinç
https://be.linkedin.com/in/emresevinc/

Re: Why does consuming a RESTful web service (using javax.ws.rs.* and Jersey) work in a unit test but not when submitted to Spark?

Posted by Emre Sevinc <em...@gmail.com>.
On Wed, Dec 24, 2014 at 1:46 PM, Sean Owen <so...@cloudera.com> wrote:

> I'd take a look with 'mvn dependency:tree' on your own code first.
> Maybe you are including JavaEE 6 for example?
>

For reference, my complete pom.xml looks like:

<project xmlns="http://maven.apache.org/POM/4.0.0" xmlns:xsi="
http://www.w3.org/2001/XMLSchema-instance"
  xsi:schemaLocation="http://maven.apache.org/POM/4.0.0
http://maven.apache.org/xsd/maven-4.0.0.xsd">
  <modelVersion>4.0.0</modelVersion>

  <groupId>bigcontent</groupId>
  <artifactId>bigcontent</artifactId>
  <version>1.0-SNAPSHOT</version>
  <packaging>jar</packaging>

  <name>bigcontent</name>
  <url>http://maven.apache.org</url>

  <properties>
    <project.build.sourceEncoding>UTF-8</project.build.sourceEncoding>
  </properties>

  <build>
    <plugins>
      <plugin>
        <groupId>org.apache.maven.plugins</groupId>
        <artifactId>maven-shade-plugin</artifactId>
        <version>2.3</version>
        <configuration>
          <!-- put your configurations here -->
        </configuration>
        <executions>
          <execution>
            <phase>package</phase>
            <goals>
              <goal>shade</goal>
            </goals>
          </execution>
        </executions>
      </plugin>

      <plugin>
        <groupId>org.apache.maven.plugins</groupId>
        <artifactId>maven-compiler-plugin</artifactId>
        <version>3.2</version>
        <configuration>
          <source>1.7</source>
          <target>1.7</target>
        </configuration>
      </plugin>
    </plugins>
  </build>

  <dependencies>
    <dependency>
      <groupId>org.apache.spark</groupId>
      <artifactId>spark-streaming_2.10</artifactId>
      <version>1.1.1</version>
      <scope>provided</scope>
    </dependency>

    <dependency>
      <groupId>org.glassfish.jersey.containers</groupId>
      <artifactId>jersey-container-servlet-core</artifactId>
      <version>2.14</version>
    </dependency>

    <dependency>
      <groupId>org.apache.hadoop</groupId>
      <artifactId>hadoop-client</artifactId>
      <version>2.4.0</version>
    </dependency>

    <dependency>
      <groupId>com.google.guava</groupId>
      <artifactId>guava</artifactId>
      <version>16.0</version>
    </dependency>

    <dependency>
      <groupId>org.apache.hadoop</groupId>
      <artifactId>hadoop-mapreduce-client-core</artifactId>
      <version>2.4.0</version>
    </dependency>

    <dependency>
      <groupId>json-mapreduce</groupId>
      <artifactId>json-mapreduce</artifactId>
      <version>1.0-SNAPSHOT</version>
      <exclusions>
      <exclusion>
        <groupId>javax.servlet</groupId>
        <artifactId>*</artifactId>
      </exclusion>
        <exclusion>
          <groupId>commons-io</groupId>
          <artifactId>*</artifactId>
          </exclusion>
      <exclusion>
          <groupId>commons-lang</groupId>
          <artifactId>*</artifactId>
      </exclusion>
        <exclusion>
          <groupId>org.apache.hadoop</groupId>
          <artifactId>hadoop-common</artifactId>
        </exclusion>
      </exclusions>
    </dependency>

    <dependency>
      <groupId>org.apache.avro</groupId>
      <artifactId>avro-mapred</artifactId>
      <version>1.7.7</version>
      <exclusions>
        <exclusion>
          <groupId>javax.servlet</groupId>
          <artifactId>*</artifactId>
        </exclusion>
        <exclusion>
          <groupId>org.apache.hadoop</groupId>
          <artifactId>hadoop-common</artifactId>
        </exclusion>
      </exclusions>
    </dependency>

    <dependency>
      <groupId>junit</groupId>
      <artifactId>junit</artifactId>
      <version>4.11</version>
      <scope>test</scope>
    </dependency>

    <dependency>
      <groupId>org.apache.avro</groupId>
      <artifactId>avro</artifactId>
      <version>1.7.7</version>
      <exclusions>
        <exclusion>
          <groupId>javax.servlet</groupId>
          <artifactId>*</artifactId>
        </exclusion>
      </exclusions>
    </dependency>

    <dependency>
      <groupId>org.apache.hadoop</groupId>
      <artifactId>hadoop-common</artifactId>
      <version>2.4.0</version>
      <scope>provided</scope>
      <exclusions>
        <exclusion>
          <groupId>javax.servlet</groupId>
          <artifactId>*</artifactId>
        </exclusion>
        <exclusion>
          <groupId>com.google.guava</groupId>
          <artifactId>*</artifactId>
        </exclusion>
      </exclusions>
    </dependency>

    <dependency>
      <groupId>org.slf4j</groupId>
      <artifactId>slf4j-log4j12</artifactId>
      <version>1.7.7</version>
    </dependency>
  </dependencies>
</project>

And 'mvn dependency:tree' produces the following output:



[INFO] Scanning for projects...
[INFO]

[INFO]
------------------------------------------------------------------------
[INFO] Building bigcontent 1.0-SNAPSHOT
[INFO]
------------------------------------------------------------------------
[INFO]
[INFO] --- maven-dependency-plugin:2.8:tree (default-cli) @ bigcontent ---
[INFO] bigcontent:bigcontent:jar:1.0-SNAPSHOT
[INFO] +- org.apache.spark:spark-streaming_2.10:jar:1.1.1:provided
[INFO] |  +- org.apache.spark:spark-core_2.10:jar:1.1.1:provided
[INFO] |  |  +- org.apache.curator:curator-recipes:jar:2.4.0:provided
[INFO] |  |  |  \- org.apache.curator:curator-framework:jar:2.4.0:provided
[INFO] |  |  |     \- org.apache.curator:curator-client:jar:2.4.0:provided
[INFO] |  |  +- org.eclipse.jetty:jetty-plus:jar:8.1.14.v20131031:provided
[INFO] |  |  |  +- org.eclipse.jetty.orbit:javax.transaction:jar:1.1.1.v201105210645:provided
[INFO] |  |  |  +- org.eclipse.jetty:jetty-webapp:jar:8.1.14.v20131031:provided
[INFO] |  |  |  |  +- org.eclipse.jetty:jetty-xml:jar:8.1.14.v20131031:provided
[INFO] |  |  |  |  \- org.eclipse.jetty:jetty-servlet:jar:8.1.14.v20131031:provided
[INFO] |  |  |  \- org.eclipse.jetty:jetty-jndi:jar:8.1.14.v20131031:provided
[INFO] |  |  |     \- org.eclipse.jetty.orbit:javax.mail.glassfish:jar:1.4.1.v201005082020:provided
[INFO] |  |  |        \- org.eclipse.jetty.orbit:javax.activation:jar:1.1.0.v201105071233:provided
[INFO] |  |  +- org.eclipse.jetty:jetty-security:jar:8.1.14.v20131031:provided
[INFO] |  |  +- org.eclipse.jetty:jetty-util:jar:8.1.14.v20131031:provided
[INFO] |  |  +- org.apache.commons:commons-lang3:jar:3.3.2:provided
[INFO] |  |  +- org.slf4j:jul-to-slf4j:jar:1.7.5:provided
[INFO] |  |  +- org.slf4j:jcl-over-slf4j:jar:1.7.5:provided
[INFO] |  |  +- com.ning:compress-lzf:jar:1.0.0:provided
[INFO] |  |  +- net.jpountz.lz4:lz4:jar:1.2.0:provided
[INFO] |  |  +- com.twitter:chill_2.10:jar:0.3.6:provided
[INFO] |  |  |  \- com.esotericsoftware.kryo:kryo:jar:2.21:provided
[INFO] |  |  |     +- com.esotericsoftware.reflectasm:reflectasm:jar:shaded:1.07:provided
[INFO] |  |  |     +- com.esotericsoftware.minlog:minlog:jar:1.2:provided
[INFO] |  |  |     \- org.objenesis:objenesis:jar:1.2:provided
[INFO] |  |  +- com.twitter:chill-java:jar:0.3.6:provided
[INFO] |  |  +- org.spark-project.akka:akka-remote_2.10:jar:2.2.3-shaded-protobuf:provided
[INFO] |  |  |  +- org.spark-project.akka:akka-actor_2.10:jar:2.2.3-shaded-protobuf:provided
[INFO] |  |  |  |  \- com.typesafe:config:jar:1.0.2:provided
[INFO] |  |  |  +- org.spark-project.protobuf:protobuf-java:jar:2.4.1-shaded:provided
[INFO] |  |  |  \- org.uncommons.maths:uncommons-maths:jar:1.2.2a:provided
[INFO] |  |  +- org.spark-project.akka:akka-slf4j_2.10:jar:2.2.3-shaded-protobuf:provided
[INFO] |  |  +- org.json4s:json4s-jackson_2.10:jar:3.2.10:provided
[INFO] |  |  |  +- org.json4s:json4s-core_2.10:jar:3.2.10:provided
[INFO] |  |  |  |  +- org.json4s:json4s-ast_2.10:jar:3.2.10:provided
[INFO] |  |  |  |  \- org.scala-lang:scalap:jar:2.10.0:provided
[INFO] |  |  |  |     \- org.scala-lang:scala-compiler:jar:2.10.0:provided
[INFO] |  |  |  |        \- org.scala-lang:scala-reflect:jar:2.10.0:provided
[INFO] |  |  |  \- com.fasterxml.jackson.core:jackson-databind:jar:2.3.1:provided
[INFO] |  |  |     +- com.fasterxml.jackson.core:jackson-annotations:jar:2.3.0:provided
[INFO] |  |  |     \- com.fasterxml.jackson.core:jackson-core:jar:2.3.1:provided
[INFO] |  |  +- colt:colt:jar:1.2.0:provided
[INFO] |  |  |  \- concurrent:concurrent:jar:1.3.4:provided
[INFO] |  |  +- org.apache.mesos:mesos:jar:shaded-protobuf:0.18.1:provided
[INFO] |  |  +- io.netty:netty-all:jar:4.0.23.Final:provided
[INFO] |  |  +- com.clearspring.analytics:stream:jar:2.7.0:provided
[INFO] |  |  +- com.codahale.metrics:metrics-core:jar:3.0.0:provided
[INFO] |  |  +- com.codahale.metrics:metrics-jvm:jar:3.0.0:provided
[INFO] |  |  +- com.codahale.metrics:metrics-json:jar:3.0.0:provided
[INFO] |  |  +- com.codahale.metrics:metrics-graphite:jar:3.0.0:provided
[INFO] |  |  +- org.tachyonproject:tachyon-client:jar:0.5.0:provided
[INFO] |  |  |  \- org.tachyonproject:tachyon:jar:0.5.0:provided
[INFO] |  |  +- org.spark-project:pyrolite:jar:2.0.1:provided
[INFO] |  |  \- net.sf.py4j:py4j:jar:0.8.2.1:provided
[INFO] |  +- org.eclipse.jetty:jetty-server:jar:8.1.14.v20131031:provided
[INFO] |  |  +- org.eclipse.jetty.orbit:javax.servlet:jar:3.0.0.v201112011016:provided
[INFO] |  |  +- org.eclipse.jetty:jetty-continuation:jar:8.1.14.v20131031:provided
[INFO] |  |  \- org.eclipse.jetty:jetty-http:jar:8.1.14.v20131031:provided
[INFO] |  |     \- org.eclipse.jetty:jetty-io:jar:8.1.14.v20131031:provided
[INFO] |  \- org.scala-lang:scala-library:jar:2.10.4:provided
[INFO] +- org.glassfish.jersey.containers:jersey-container-servlet-core:jar:2.14:compile
[INFO] |  +- org.glassfish.hk2.external:javax.inject:jar:2.4.0-b06:compile
[INFO] |  +- org.glassfish.jersey.core:jersey-common:jar:2.14:compile
[INFO] |  |  +- javax.annotation:javax.annotation-api:jar:1.2:compile
[INFO] |  |  +- org.glassfish.jersey.bundles.repackaged:jersey-guava:jar:2.14:compile
[INFO] |  |  +- org.glassfish.hk2:hk2-api:jar:2.4.0-b06:compile
[INFO] |  |  |  +- org.glassfish.hk2:hk2-utils:jar:2.4.0-b06:compile
[INFO] |  |  |  \- org.glassfish.hk2.external:aopalliance-repackaged:jar:2.4.0-b06:compile
[INFO] |  |  +- org.glassfish.hk2:hk2-locator:jar:2.4.0-b06:compile
[INFO] |  |  |  \- org.javassist:javassist:jar:3.18.1-GA:compile
[INFO] |  |  \- org.glassfish.hk2:osgi-resource-locator:jar:1.0.1:compile
[INFO] |  +- org.glassfish.jersey.core:jersey-server:jar:2.14:compile
[INFO] |  |  +- org.glassfish.jersey.core:jersey-client:jar:2.14:compile
[INFO] |  |  \- javax.validation:validation-api:jar:1.1.0.Final:compile
[INFO] |  \- javax.ws.rs:javax.ws.rs-api:jar:2.0.1:compile
[INFO] +- org.apache.hadoop:hadoop-client:jar:2.4.0:compile
[INFO] |  +- org.apache.hadoop:hadoop-hdfs:jar:2.4.0:compile
[INFO] |  +- org.apache.hadoop:hadoop-mapreduce-client-app:jar:2.4.0:compile
[INFO] |  |  +- org.apache.hadoop:hadoop-mapreduce-client-common:jar:2.4.0:compile
[INFO] |  |  |  +- org.apache.hadoop:hadoop-yarn-client:jar:2.4.0:compile
[INFO] |  |  |  |  \- com.sun.jersey:jersey-client:jar:1.9:compile
[INFO] |  |  |  \- org.apache.hadoop:hadoop-yarn-server-common:jar:2.4.0:compile
[INFO] |  |  \- org.apache.hadoop:hadoop-mapreduce-client-shuffle:jar:2.4.0:compile
[INFO] |  +- org.apache.hadoop:hadoop-yarn-api:jar:2.4.0:compile
[INFO] |  +- org.apache.hadoop:hadoop-mapreduce-client-jobclient:jar:2.4.0:compile
[INFO] |  \- org.apache.hadoop:hadoop-annotations:jar:2.4.0:compile
[INFO] +- com.google.guava:guava:jar:16.0:compile
[INFO] +- org.apache.hadoop:hadoop-mapreduce-client-core:jar:2.4.0:compile
[INFO] |  +- org.apache.hadoop:hadoop-yarn-common:jar:2.4.0:compile
[INFO] |  |  +- javax.xml.bind:jaxb-api:jar:2.2.2:compile
[INFO] |  |  |  +- javax.xml.stream:stax-api:jar:1.0-2:compile
[INFO] |  |  |  \- javax.activation:activation:jar:1.1:compile
[INFO] |  |  +- javax.servlet:servlet-api:jar:2.5:compile
[INFO] |  |  +- com.google.inject:guice:jar:3.0:compile
[INFO] |  |  |  +- javax.inject:javax.inject:jar:1:compile
[INFO] |  |  |  \- aopalliance:aopalliance:jar:1.0:compile
[INFO] |  |  \- com.sun.jersey.contribs:jersey-guice:jar:1.9:compile
[INFO] |  +- com.google.protobuf:protobuf-java:jar:2.5.0:compile
[INFO] |  +- org.slf4j:slf4j-api:jar:1.7.5:compile
[INFO] |  +- com.google.inject.extensions:guice-servlet:jar:3.0:compile
[INFO] |  \- io.netty:netty:jar:3.6.2.Final:compile
[INFO] +- json-mapreduce:json-mapreduce:jar:1.0-SNAPSHOT:compile
[INFO] +- org.apache.avro:avro-mapred:jar:1.7.7:compile
[INFO] |  +- org.apache.avro:avro-ipc:jar:1.7.7:compile
[INFO] |  |  +- org.apache.velocity:velocity:jar:1.7:compile
[INFO] |  |  \- org.mortbay.jetty:servlet-api:jar:2.5-20081211:compile
[INFO] |  +- org.apache.avro:avro-ipc:jar:tests:1.7.7:compile
[INFO] |  +- org.codehaus.jackson:jackson-core-asl:jar:1.9.13:compile
[INFO] |  \- org.codehaus.jackson:jackson-mapper-asl:jar:1.9.13:compile
[INFO] +- junit:junit:jar:4.11:test
[INFO] |  \- org.hamcrest:hamcrest-core:jar:1.3:test
[INFO] +- org.apache.avro:avro:jar:1.7.7:compile
[INFO] |  +- com.thoughtworks.paranamer:paranamer:jar:2.3:compile
[INFO] |  +- org.xerial.snappy:snappy-java:jar:1.0.5:compile
[INFO] |  \- org.apache.commons:commons-compress:jar:1.4.1:compile
[INFO] |     \- org.tukaani:xz:jar:1.0:compile
[INFO] +- org.apache.hadoop:hadoop-common:jar:2.4.0:provided
[INFO] |  +- commons-cli:commons-cli:jar:1.2:compile
[INFO] |  +- org.apache.commons:commons-math3:jar:3.1.1:provided
[INFO] |  +- xmlenc:xmlenc:jar:0.52:compile
[INFO] |  +- commons-httpclient:commons-httpclient:jar:3.1:provided
[INFO] |  +- commons-codec:commons-codec:jar:1.4:compile
[INFO] |  +- commons-io:commons-io:jar:2.4:compile
[INFO] |  +- commons-net:commons-net:jar:3.1:provided
[INFO] |  +- commons-collections:commons-collections:jar:3.2.1:compile
[INFO] |  +- org.mortbay.jetty:jetty:jar:6.1.26:compile
[INFO] |  +- org.mortbay.jetty:jetty-util:jar:6.1.26:compile
[INFO] |  +- com.sun.jersey:jersey-core:jar:1.9:compile
[INFO] |  +- com.sun.jersey:jersey-json:jar:1.9:compile
[INFO] |  |  +- org.codehaus.jettison:jettison:jar:1.1:compile
[INFO] |  |  +- com.sun.xml.bind:jaxb-impl:jar:2.2.3-1:compile
[INFO] |  |  +- org.codehaus.jackson:jackson-jaxrs:jar:1.8.3:compile
[INFO] |  |  \- org.codehaus.jackson:jackson-xc:jar:1.8.3:compile
[INFO] |  +- com.sun.jersey:jersey-server:jar:1.9:compile
[INFO] |  |  \- asm:asm:jar:3.1:compile
[INFO] |  +- tomcat:jasper-compiler:jar:5.5.23:provided
[INFO] |  +- tomcat:jasper-runtime:jar:5.5.23:provided
[INFO] |  +- javax.servlet.jsp:jsp-api:jar:2.1:provided
[INFO] |  +- commons-el:commons-el:jar:1.0:provided
[INFO] |  +- commons-logging:commons-logging:jar:1.1.3:compile
[INFO] |  +- log4j:log4j:jar:1.2.17:compile
[INFO] |  +- net.java.dev.jets3t:jets3t:jar:0.9.0:provided
[INFO] |  |  +- org.apache.httpcomponents:httpclient:jar:4.1.2:provided
[INFO] |  |  +- org.apache.httpcomponents:httpcore:jar:4.1.2:provided
[INFO] |  |  \- com.jamesmurty.utils:java-xmlbuilder:jar:0.4:provided
[INFO] |  +- commons-lang:commons-lang:jar:2.6:compile
[INFO] |  +- commons-configuration:commons-configuration:jar:1.6:provided
[INFO] |  |  +- commons-digester:commons-digester:jar:1.8:provided
[INFO] |  |  |  \- commons-beanutils:commons-beanutils:jar:1.7.0:provided
[INFO] |  |  \- commons-beanutils:commons-beanutils-core:jar:1.8.0:provided
[INFO] |  +- org.apache.hadoop:hadoop-auth:jar:2.4.0:provided
[INFO] |  +- com.jcraft:jsch:jar:0.1.42:provided
[INFO] |  +- com.google.code.findbugs:jsr305:jar:1.3.9:provided
[INFO] |  \- org.apache.zookeeper:zookeeper:jar:3.4.5:compile
[INFO] \- org.slf4j:slf4j-log4j12:jar:1.7.7:compile
[INFO]
------------------------------------------------------------------------
[INFO] BUILD SUCCESS
[INFO]
------------------------------------------------------------------------
[INFO] Total time: 3.231 s
[INFO] Finished at: 2014-12-24T13:54:02+01:00
[INFO] Final Memory: 17M/173M
[INFO]
------------------------------------------------------------------------
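The tree above actually shows both Jersey lines at once: javax.ws.rs:javax.ws.rs-api:jar:2.0.1
via jersey-container-servlet-core, and com.sun.jersey:*:jar:1.9 via the Hadoop
artifacts. Since I'm already using the maven-shade-plugin, one option I'm
considering (untested, and the shaded package names below are made up) is to
relocate the Jersey 2 / JAX-RS packages inside my fat JAR so they cannot
collide with whatever the cluster provides:

```
<!-- Hypothetical sketch: would go inside the existing maven-shade-plugin
     <configuration> block. Untested. -->
<relocations>
  <relocation>
    <pattern>javax.ws.rs</pattern>
    <shadedPattern>shaded.javax.ws.rs</shadedPattern>
  </relocation>
  <relocation>
    <pattern>org.glassfish.jersey</pattern>
    <shadedPattern>shaded.org.glassfish.jersey</shadedPattern>
  </relocation>
</relocations>
```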



--
Emre

Re: Why does consuming a RESTful web service (using javax.ws.rs.* and Jersey) work in a unit test but not when submitted to Spark?

Posted by Emre Sevinc <em...@gmail.com>.
Sean,

Thanks a lot for the important information, especially  userClassPathFirst.

Cheers,
Emre

On Wed, Dec 24, 2014 at 3:38 PM, Sean Owen <so...@cloudera.com> wrote:

> That could well be it -- oops, I forgot to run with the YARN profile
> and so didn't see the YARN dependencies. Try the userClassPathFirst
> option to try to make your app's copy take precedence.
>
> The second error is really a JVM bug, but, is from having too little
> memory available for the unit tests.
>
>
> http://spark.apache.org/docs/latest/building-spark.html#setting-up-mavens-memory-usage
>
> On Wed, Dec 24, 2014 at 1:56 PM, Emre Sevinc <em...@gmail.com>
> wrote:
> > It seems like YARN depends an older version of Jersey, that is 1.9:
> >
> >   https://github.com/apache/spark/blob/master/yarn/pom.xml
> >
> > When I've modified my dependencies to have only:
> >
> >   <dependency>
> >       <groupId>com.sun.jersey</groupId>
> >       <artifactId>jersey-core</artifactId>
> >       <version>1.9.1</version>
> >     </dependency>
> >
> > And then modified the code to use the old Jersey API:
> >
> >     Client c = Client.create();
> >     WebResource r = c.resource("http://localhost:2222/rest")
> >                      .path("annotate")
> >                      .queryParam("text",
> > UrlEscapers.urlFragmentEscaper().escape(spotlightSubmission))
> >                      .queryParam("confidence", "0.3");
> >
> >     logger.warn("!!! DEBUG !!! target: {}", r.getURI());
> >
> >     String response = r.accept(MediaType.APPLICATION_JSON_TYPE)
> >                        //.header("")
> >                        .get(String.class);
> >
> >     logger.warn("!!! DEBUG !!! Spotlight response: {}", response);
> >
> > It seems to work when I use spark-submit to submit the application that
> > includes this code.
> >
> > Funny thing is, now my relevant unit test does not run, complaining about
> > not having enough memory:
> >
> > Java HotSpot(TM) 64-Bit Server VM warning: INFO:
> > os::commit_memory(0x00000000c4900000, 25165824, 0) failed; error='Cannot
> > allocate memory' (errno=12)
> > #
> > # There is insufficient memory for the Java Runtime Environment to
> continue.
> > # Native memory allocation (mmap) failed to map 25165824 bytes for
> > committing reserved memory.
> >
> > --
> > Emre
> >
> >
> > On Wed, Dec 24, 2014 at 1:46 PM, Sean Owen <so...@cloudera.com> wrote:
> >>
> >> Your guess is right, that there are two incompatible versions of
> >> Jersey (or really, JAX-RS) in your runtime. Spark doesn't use Jersey,
> >> but its transitive dependencies may, or your transitive dependencies
> >> may.
> >>
> >> I don't see Jersey in Spark's dependency tree except from HBase tests,
> >> which in turn only appear in examples, so that's unlikely to be it.
> >> I'd take a look with 'mvn dependency:tree' on your own code first.
> >> Maybe you are including JavaEE 6 for example?
> >>
> >> On Wed, Dec 24, 2014 at 12:02 PM, Emre Sevinc <em...@gmail.com>
> >> wrote:
> >> > Hello,
> >> >
> >> > I have a piece of code that runs inside Spark Streaming and tries to
> get
> >> > some data from a RESTful web service (that runs locally on my
> machine).
> >> > The
> >> > code snippet in question is:
> >> >
> >> >      Client client = ClientBuilder.newClient();
> >> >      WebTarget target = client.target("http://localhost:2222/rest");
> >> >      target = target.path("annotate")
> >> >                  .queryParam("text",
> >> > UrlEscapers.urlFragmentEscaper().escape(spotlightSubmission))
> >> >                  .queryParam("confidence", "0.3");
> >> >
> >> >       logger.warn("!!! DEBUG !!! target: {}",
> >> > target.getUri().toString());
> >> >
> >> >       String response =
> >> >
> >> >
> target.request().accept(MediaType.APPLICATION_JSON_TYPE).get(String.class);
> >> >
> >> >       logger.warn("!!! DEBUG !!! Spotlight response: {}", response);
> >> >
> >> > When run inside a unit test as follows:
> >> >
> >> >      mvn clean test -Dtest=SpotlightTest#testCountWords
> >> >
> >> > it contacts the RESTful web service and retrieves some data as
> expected.
> >> > But
> >> > when the same code is run as part of the application that is submitted
> >> > to
> >> > Spark, using spark-submit script I receive the following error:
> >> >
> >> >       java.lang.NoSuchMethodError:
> >> >
> >> >
> javax.ws.rs.core.MultivaluedMap.addAll(Ljava/lang/Object;[Ljava/lang/Object;)V
> >> >
> >> > I'm using Spark 1.1.0 and for consuming the web service I'm using
> Jersey
> >> > in
> >> > my project's pom.xml:
> >> >
> >> >      <dependency>
> >> >       <groupId>org.glassfish.jersey.containers</groupId>
> >> >       <artifactId>jersey-container-servlet-core</artifactId>
> >> >       <version>2.14</version>
> >> >     </dependency>
> >> >
> >> > So I suspect that when the application is submitted to Spark, somehow
> >> > there's a different JAR in the environment that uses a different
> version
> >> > of
> >> > Jersey / javax.ws.rs.*
> >> >
> >> > Does anybody know which version of Jersey / javax.ws.rs.*  is used in
> >> > the
> >> > Spark environment, or how to solve this conflict?
> >> >
> >> >
> >> > --
> >> > Emre Sevinç
> >> > https://be.linkedin.com/in/emresevinc/
> >> >
> >
> >
> >
> >
> > --
> > Emre Sevinc
>



-- 
Emre Sevinc

Re: Why does consuming a RESTful web service (using javax.ws.rs.* and Jersey) work in a unit test but not when submitted to Spark?

Posted by Sean Owen <so...@cloudera.com>.
That could well be it -- oops, I forgot to run with the YARN profile
and so didn't see the YARN dependencies. Try the userClassPathFirst
option to try to make your app's copy take precedence.
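Something along these lines when submitting (the property name is experimental
and depends on your Spark version and deploy mode, so check the configuration
docs; class and JAR names below are placeholders):

```
spark-submit \
  --conf spark.files.userClassPathFirst=true \
  --class yourpackage.YourStreamingJob \
  your-assembly.jar
```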

The second error is really a JVM bug, but, is from having too little
memory available for the unit tests.

http://spark.apache.org/docs/latest/building-spark.html#setting-up-mavens-memory-usage
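Concretely, something like this before running the tests (the sizes here are
illustrative, per the link above, not a recommendation):

```shell
# Give Maven and its forked test JVMs more memory; values are illustrative.
export MAVEN_OPTS="-Xmx2g -XX:MaxPermSize=512m -XX:ReservedCodeCacheSize=512m"
```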

On Wed, Dec 24, 2014 at 1:56 PM, Emre Sevinc <em...@gmail.com> wrote:
> It seems like YARN depends an older version of Jersey, that is 1.9:
>
>   https://github.com/apache/spark/blob/master/yarn/pom.xml
>
> When I've modified my dependencies to have only:
>
>   <dependency>
>       <groupId>com.sun.jersey</groupId>
>       <artifactId>jersey-core</artifactId>
>       <version>1.9.1</version>
>     </dependency>
>
> And then modified the code to use the old Jersey API:
>
>     Client c = Client.create();
>     WebResource r = c.resource("http://localhost:2222/rest")
>                      .path("annotate")
>                      .queryParam("text",
> UrlEscapers.urlFragmentEscaper().escape(spotlightSubmission))
>                      .queryParam("confidence", "0.3");
>
>     logger.warn("!!! DEBUG !!! target: {}", r.getURI());
>
>     String response = r.accept(MediaType.APPLICATION_JSON_TYPE)
>                        //.header("")
>                        .get(String.class);
>
>     logger.warn("!!! DEBUG !!! Spotlight response: {}", response);
>
> It seems to work when I use spark-submit to submit the application that
> includes this code.
>
> Funny thing is, now my relevant unit test does not run, complaining about
> not having enough memory:
>
> Java HotSpot(TM) 64-Bit Server VM warning: INFO:
> os::commit_memory(0x00000000c4900000, 25165824, 0) failed; error='Cannot
> allocate memory' (errno=12)
> #
> # There is insufficient memory for the Java Runtime Environment to continue.
> # Native memory allocation (mmap) failed to map 25165824 bytes for
> committing reserved memory.
>
> --
> Emre
>
>
> On Wed, Dec 24, 2014 at 1:46 PM, Sean Owen <so...@cloudera.com> wrote:
>>
>> Your guess is right, that there are two incompatible versions of
>> Jersey (or really, JAX-RS) in your runtime. Spark doesn't use Jersey,
>> but its transitive dependencies may, or your transitive dependencies
>> may.
>>
>> I don't see Jersey in Spark's dependency tree except from HBase tests,
>> which in turn only appear in examples, so that's unlikely to be it.
>> I'd take a look with 'mvn dependency:tree' on your own code first.
>> Maybe you are including JavaEE 6 for example?
>>
>> On Wed, Dec 24, 2014 at 12:02 PM, Emre Sevinc <em...@gmail.com>
>> wrote:
>> > Hello,
>> >
>> > I have a piece of code that runs inside Spark Streaming and tries to get
>> > some data from a RESTful web service (that runs locally on my machine).
>> > The
>> > code snippet in question is:
>> >
>> >      Client client = ClientBuilder.newClient();
>> >      WebTarget target = client.target("http://localhost:2222/rest");
>> >      target = target.path("annotate")
>> >                  .queryParam("text",
>> > UrlEscapers.urlFragmentEscaper().escape(spotlightSubmission))
>> >                  .queryParam("confidence", "0.3");
>> >
>> >       logger.warn("!!! DEBUG !!! target: {}",
>> > target.getUri().toString());
>> >
>> >       String response =
>> >
>> > target.request().accept(MediaType.APPLICATION_JSON_TYPE).get(String.class);
>> >
>> >       logger.warn("!!! DEBUG !!! Spotlight response: {}", response);
>> >
>> > When run inside a unit test as follows:
>> >
>> >      mvn clean test -Dtest=SpotlightTest#testCountWords
>> >
>> > it contacts the RESTful web service and retrieves some data as expected.
>> > But
>> > when the same code is run as part of the application that is submitted
>> > to
>> > Spark, using spark-submit script I receive the following error:
>> >
>> >       java.lang.NoSuchMethodError:
>> >
>> > javax.ws.rs.core.MultivaluedMap.addAll(Ljava/lang/Object;[Ljava/lang/Object;)V
>> >
>> > I'm using Spark 1.1.0 and for consuming the web service I'm using Jersey
>> > in
>> > my project's pom.xml:
>> >
>> >      <dependency>
>> >       <groupId>org.glassfish.jersey.containers</groupId>
>> >       <artifactId>jersey-container-servlet-core</artifactId>
>> >       <version>2.14</version>
>> >     </dependency>
>> >
>> > So I suspect that when the application is submitted to Spark, somehow
>> > there's a different JAR in the environment that uses a different version
>> > of
>> > Jersey / javax.ws.rs.*
>> >
>> > Does anybody know which version of Jersey / javax.ws.rs.*  is used in
>> > the
>> > Spark environment, or how to solve this conflict?
>> >
>> >
>> > --
>> > Emre Sevinç
>> > https://be.linkedin.com/in/emresevinc/
>> >
>
>
>
>
> --
> Emre Sevinc

---------------------------------------------------------------------
To unsubscribe, e-mail: user-unsubscribe@spark.apache.org
For additional commands, e-mail: user-help@spark.apache.org


Re: Why does consuming a RESTful web service (using javax.ws.rs.* and Jersey) work in a unit test but not when submitted to Spark?

Posted by Emre Sevinc <em...@gmail.com>.
It seems that YARN depends on an older version of Jersey, namely 1.9:

  https://github.com/apache/spark/blob/master/yarn/pom.xml

When I modified my dependencies to include only:

  <dependency>
      <groupId>com.sun.jersey</groupId>
      <artifactId>jersey-core</artifactId>
      <version>1.9.1</version>
    </dependency>

And then modified the code to use the old Jersey API:

    Client c = Client.create();
    WebResource r = c.resource("http://localhost:2222/rest")
                     .path("annotate")
                     .queryParam("text",
UrlEscapers.urlFragmentEscaper().escape(spotlightSubmission))
                     .queryParam("confidence", "0.3");

    logger.warn("!!! DEBUG !!! target: {}", r.getURI());

    String response = r.accept(MediaType.APPLICATION_JSON_TYPE)
                       //.header("")
                       .get(String.class);

    logger.warn("!!! DEBUG !!! Spotlight response: {}", response);

It seems to work when I use spark-submit to submit the application that
includes this code.

The funny thing is that now the relevant unit test won't run, complaining
about insufficient memory:

Java HotSpot(TM) 64-Bit Server VM warning: INFO:
os::commit_memory(0x00000000c4900000, 25165824, 0) failed; error='Cannot
allocate memory' (errno=12)
#
# There is insufficient memory for the Java Runtime Environment to continue.
# Native memory allocation (mmap) failed to map 25165824 bytes for
committing reserved memory.

--
Emre


On Wed, Dec 24, 2014 at 1:46 PM, Sean Owen <so...@cloudera.com> wrote:

> Your guess is right, that there are two incompatible versions of
> Jersey (or really, JAX-RS) in your runtime. Spark doesn't use Jersey,
> but its transitive dependencies may, or your transitive dependencies
> may.
>
> I don't see Jersey in Spark's dependency tree except from HBase tests,
> which in turn only appear in examples, so that's unlikely to be it.
> I'd take a look with 'mvn dependency:tree' on your own code first.
> Maybe you are including JavaEE 6 for example?
>
> On Wed, Dec 24, 2014 at 12:02 PM, Emre Sevinc <em...@gmail.com>
> wrote:
> > Hello,
> >
> > I have a piece of code that runs inside Spark Streaming and tries to get
> > some data from a RESTful web service (that runs locally on my machine).
> The
> > code snippet in question is:
> >
> >      Client client = ClientBuilder.newClient();
> >      WebTarget target = client.target("http://localhost:2222/rest");
> >      target = target.path("annotate")
> >                  .queryParam("text",
> > UrlEscapers.urlFragmentEscaper().escape(spotlightSubmission))
> >                  .queryParam("confidence", "0.3");
> >
> >       logger.warn("!!! DEBUG !!! target: {}",
> target.getUri().toString());
> >
> >       String response =
> >
> target.request().accept(MediaType.APPLICATION_JSON_TYPE).get(String.class);
> >
> >       logger.warn("!!! DEBUG !!! Spotlight response: {}", response);
> >
> > When run inside a unit test as follows:
> >
> >      mvn clean test -Dtest=SpotlightTest#testCountWords
> >
> > it contacts the RESTful web service and retrieves some data as expected.
> But
> > when the same code is run as part of the application that is submitted to
> > Spark, using spark-submit script I receive the following error:
> >
> >       java.lang.NoSuchMethodError:
> >
> javax.ws.rs.core.MultivaluedMap.addAll(Ljava/lang/Object;[Ljava/lang/Object;)V
> >
> > I'm using Spark 1.1.0 and for consuming the web service I'm using Jersey
> in
> > my project's pom.xml:
> >
> >      <dependency>
> >       <groupId>org.glassfish.jersey.containers</groupId>
> >       <artifactId>jersey-container-servlet-core</artifactId>
> >       <version>2.14</version>
> >     </dependency>
> >
> > So I suspect that when the application is submitted to Spark, somehow
> > there's a different JAR in the environment that uses a different version
> of
> > Jersey / javax.ws.rs.*
> >
> > Does anybody know which version of Jersey / javax.ws.rs.*  is used in the
> > Spark environment, or how to solve this conflict?
> >
> >
> > --
> > Emre Sevinç
> > https://be.linkedin.com/in/emresevinc/
> >
>



-- 
Emre Sevinc

Re: Why does consuming a RESTful web service (using javax.ws.rs.* and Jersey) work in a unit test but not when submitted to Spark?

Posted by Sean Owen <so...@cloudera.com>.
Your guess is right, that there are two incompatible versions of
Jersey (or really, JAX-RS) in your runtime. Spark doesn't use Jersey,
but its transitive dependencies may, or your transitive dependencies
may.

I don't see Jersey in Spark's dependency tree except from HBase tests,
which in turn only appear in examples, so that's unlikely to be it.
I'd take a look with 'mvn dependency:tree' on your own code first.
Maybe you are including JavaEE 6 for example?
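You can also narrow the output to the suspects with the plugin's -Dincludes
filter (the patterns below are examples; the syntax is
groupId[:artifactId[:type[:version]]] with '*' wildcards):

```
mvn dependency:tree -Dincludes=com.sun.jersey
mvn dependency:tree -Dincludes=org.glassfish.jersey*
mvn dependency:tree -Dincludes=javax.ws.rs
```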

On Wed, Dec 24, 2014 at 12:02 PM, Emre Sevinc <em...@gmail.com> wrote:
> Hello,
>
> I have a piece of code that runs inside Spark Streaming and tries to get
> some data from a RESTful web service (that runs locally on my machine). The
> code snippet in question is:
>
>      Client client = ClientBuilder.newClient();
>      WebTarget target = client.target("http://localhost:2222/rest");
>      target = target.path("annotate")
>                  .queryParam("text",
> UrlEscapers.urlFragmentEscaper().escape(spotlightSubmission))
>                  .queryParam("confidence", "0.3");
>
>       logger.warn("!!! DEBUG !!! target: {}", target.getUri().toString());
>
>       String response =
> target.request().accept(MediaType.APPLICATION_JSON_TYPE).get(String.class);
>
>       logger.warn("!!! DEBUG !!! Spotlight response: {}", response);
>
> When run inside a unit test as follows:
>
>      mvn clean test -Dtest=SpotlightTest#testCountWords
>
> it contacts the RESTful web service and retrieves some data as expected. But
> when the same code is run as part of the application that is submitted to
> Spark, using spark-submit script I receive the following error:
>
>       java.lang.NoSuchMethodError:
> javax.ws.rs.core.MultivaluedMap.addAll(Ljava/lang/Object;[Ljava/lang/Object;)V
>
> I'm using Spark 1.1.0 and for consuming the web service I'm using Jersey in
> my project's pom.xml:
>
>      <dependency>
>       <groupId>org.glassfish.jersey.containers</groupId>
>       <artifactId>jersey-container-servlet-core</artifactId>
>       <version>2.14</version>
>     </dependency>
>
> So I suspect that when the application is submitted to Spark, somehow
> there's a different JAR in the environment that uses a different version of
> Jersey / javax.ws.rs.*
>
> Does anybody know which version of Jersey / javax.ws.rs.*  is used in the
> Spark environment, or how to solve this conflict?
>
>
> --
> Emre Sevinç
> https://be.linkedin.com/in/emresevinc/
>
