Posted to issues@spark.apache.org by "Ryan Compton (JIRA)" <ji...@apache.org> on 2014/05/28 23:21:02 UTC
[jira] [Updated] (SPARK-1952) slf4j version conflicts with pig
[ https://issues.apache.org/jira/browse/SPARK-1952?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]
Ryan Compton updated SPARK-1952:
--------------------------------
Description:
Upgrading from Spark-0.9.1 to Spark-1.0.0 causes all Pig scripts to fail when they REGISTER a jar containing Spark. The failure is a NoSuchMethodError on org.slf4j.spi.LocationAwareLogger.log, which suggests conflicting slf4j-api versions on the classpath.
{code}
Caused by: java.lang.RuntimeException: Could not resolve error that
occured when launching map reduce job: java.lang.NoSuchMethodError:
org.slf4j.spi.LocationAwareLogger.log(Lorg/slf4j/Marker;Ljava/lang/String;ILjava/lang/String;[Ljava/lang/Object;Ljava/lang/Throwable;)V
at org.apache.pig.backend.hadoop.executionengine.mapReduceLayer.MapReduceLauncher$JobControlThreadExceptionHandler.uncaughtException(MapReduceLauncher.java:598)
at java.lang.Thread.dispatchUncaughtException(Thread.java:1874)
{code}
To reproduce: build Spark with {{SPARK_HADOOP_VERSION=0.20.2-cdh3u4 sbt/sbt assembly}} and register the resulting assembly jar in a Pig script, e.g.:
{code}
REGISTER /usr/share/spark-1.0.0/assembly/target/scala-2.10/spark-assembly-1.0.0-SNAPSHOT-hadoop0.20.2-cdh3u4.jar;
data0 = LOAD 'data' USING PigStorage();
ttt = LIMIT data0 10;
DUMP ttt;
{code}
The Spark-1.0 assembly jar bundles slf4j-related classes that were not present in the 0.9.1 assembly:
{code}
rfcompton@node19 /u/s/o/s/a/t/scala-2.10> jar tvf spark-assembly-1.0.0-SNAPSHOT-hadoop0.20.2-cdh3u4.jar | grep -i "slf" | grep LocationAware
3259 Mon Mar 25 21:49:34 PDT 2013 org/apache/commons/logging/impl/SLF4JLocationAwareLog.class
455 Mon Mar 25 21:49:22 PDT 2013 org/slf4j/spi/LocationAwareLogger.class
479 Fri Dec 13 16:44:40 PST 2013 parquet/org/slf4j/spi/LocationAwareLogger.class
{code}
vs.
{code}
rfcompton@node19 /u/s/o/s/a/t/scala-2.10> jar tvf spark-assembly-0.9.1-hadoop0.20.2-cdh3u3.jar | grep -i "slf" | grep LocationAware
455 Mon Mar 25 21:49:22 PDT 2013 org/slf4j/spi/LocationAwareLogger.class
{code}
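The two listings above can be reproduced programmatically. A minimal sketch (Python; the jar path is illustrative) that scans an assembly jar for bundled or shaded copies of LocationAwareLogger, mirroring the {{jar tvf ... | grep}} pipeline:

```python
import zipfile

def slf4j_logger_entries(jar_path):
    """Return jar entries that bundle (or shade) slf4j's LocationAwareLogger.

    Matches both the plain org/slf4j/spi/... entry and shaded copies such as
    parquet/org/slf4j/spi/..., like the grep pipeline above.
    """
    with zipfile.ZipFile(jar_path) as jar:
        return sorted(name for name in jar.namelist()
                      if name.endswith("org/slf4j/spi/LocationAwareLogger.class"))
```

More than one match, or a match whose slf4j-api version differs from the one Pig ships, is a hint that the REGISTERed jar can shadow Pig's own slf4j classes at runtime.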
was:
Upgrading from Spark-0.9.1 to Spark-1.0.0 causes all Pig scripts to fail when they "register" a jar containing Spark. The error appears to be related to org.slf4j.spi.LocationAwareLogger.log.
```
Caused by: java.lang.RuntimeException: Could not resolve error that
occured when launching map reduce job: java.lang.NoSuchMethodError:
org.slf4j.spi.LocationAwareLogger.log(Lorg/slf4j/Marker;Ljava/lang/String;ILjava/lang/String;[Ljava/lang/Object;Ljava/lang/Throwable;)V
at org.apache.pig.backend.hadoop.executionengine.mapReduceLayer.MapReduceLauncher$JobControlThreadExceptionHandler.uncaughtException(MapReduceLauncher.java:598)
at java.lang.Thread.dispatchUncaughtException(Thread.java:1874)
```
To reproduce: compile Spark via ```$ SPARK_HADOOP_VERSION=0.20.2-cdh3u4 sbt/sbt assembly``` and register the resulting jar into a pig script. E.g.
```
REGISTER /usr/share/spark-1.0.0/assembly/target/scala-2.10/spark-assembly-1.0.0-SNAPSHOT-hadoop0.20.2-cdh3u4.jar;
data0 = LOAD 'data' USING PigStorage();
ttt = LIMIT data0 10;
DUMP ttt;
```
The Spark-1.0 jar includes some slf4j dependencies that were not present in 0.9.1
```
rfcompton@node19 /u/s/o/s/a/t/scala-2.10> jar tvf spark-assembly-1.0.0-SNAPSHOT-hadoop0.20.2-cdh3u4.jar | grep -i "slf" | grep LocationAware
3259 Mon Mar 25 21:49:34 PDT 2013 org/apache/commons/logging/impl/SLF4JLocationAwareLog.class
455 Mon Mar 25 21:49:22 PDT 2013 org/slf4j/spi/LocationAwareLogger.class
479 Fri Dec 13 16:44:40 PST 2013 parquet/org/slf4j/spi/LocationAwareLogger.class
```
vs.
```
rfcompton@node19 /u/s/o/s/a/t/scala-2.10> jar tvf spark-assembly-0.9.1-hadoop0.20.2-cdh3u3.jar | grep -i "slf" | grep LocationAware
455 Mon Mar 25 21:49:22 PDT 2013 org/slf4j/spi/LocationAwareLogger.class
```
> slf4j version conflicts with pig
> --------------------------------
>
> Key: SPARK-1952
> URL: https://issues.apache.org/jira/browse/SPARK-1952
> Project: Spark
> Issue Type: Bug
> Components: Spark Core
> Affects Versions: 1.0.0
> Environment: pig 12.1 on Cloudera Hadoop, CDH3
> Reporter: Ryan Compton
> Labels: pig, slf4j
> Fix For: 1.0.0
>
>
--
This message was sent by Atlassian JIRA
(v6.2#6252)