Posted to user@sqoop.apache.org by Jarek Jarcec Cecho <ja...@apache.org> on 2013/02/15 02:33:26 UTC

Re: Incompatible Class Change Error

Hi sir,
Hadoop went through a huge code refactoring between Hadoop 1.0 and Hadoop 2.0. One side effect is that code compiled against Hadoop 1.0 is not binary compatible with Hadoop 2.0, and vice versa. The source code, however, is mostly compatible, so one usually just needs to recompile it against the target Hadoop distribution.
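
If you build Sqoop from source, the Hadoop flavor is chosen at compile time. As a rough sketch (the build property name and values here are from memory of the 1.4.x ant build and may differ in your checkout, so please verify them against build.xml):

    # build Sqoop against the Hadoop 1.0 line
    ant package -Dhadoopversion=100

    # build Sqoop against the Hadoop 2.0 line
    ant package -Dhadoopversion=200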

The exception "Found interface X, but class was expected" (or its mirror image, "Found class X, but interface was expected", which is the form in your log) is very common when you run code compiled against Hadoop 1.0 on Hadoop 2.0, or vice versa.
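
You can check which flavor a given Hadoop jar is with javap (the jar path below is just an example for a Hadoop 1.0.4 install; adjust it to yours):

    javap -classpath $HADOOP_HOME/hadoop-core-1.0.4.jar \
        org.apache.hadoop.mapreduce.JobContext | head -2

The first couple of lines of output show whether JobContext is a class or an interface on your cluster. A Sqoop build that expects the other kind will fail at link time with exactly the IncompatibleClassChangeError in your log.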

One of the solutions is to go to our download page [1] and download the file ending with "hadoop-1.0.0.tar.gz", to ensure that your Sqoop distribution matches your Hadoop distribution.
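
For example (the exact artifact name on the page may differ slightly; just look for the hadoop-1.0.0 suffix):

    wget http://www.apache.org/dist/sqoop/1.4.2/sqoop-1.4.2.bin__hadoop-1.0.0.tar.gz
    tar xzf sqoop-1.4.2.bin__hadoop-1.0.0.tar.gz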

Jarcec

Links:
1: http://www.apache.org/dist/sqoop/1.4.2/

On Thu, Feb 14, 2013 at 12:11:07PM -0500, cd@agnik.com wrote:
> Hi, intern here.
> 
> My setup is the following: Hadoop 1.0.4, Sqoop 1.4.2, Hive 0.9.0.
> 
> Hardware: Memory 495.6 MiB
> 	  Processor Intel(R) Pentium(R) 4 CPU 2.66GHz
> 
> Ubuntu 10.04 Lucid
> 
> I am trying to access a SQL Server instance located on another computer
> on the local network.
> If the issue is compatibility with Hadoop, which version would be
> recommended?
> 
> I am new to these tools, so I am trying to get familiar with them on a
> single node.
> The file importCyr.txt contains the command for an import from a table
> located on the server I am trying to connect to.
> 
> Here is the output of what I am getting:
> 
> user@user-7:~/sqoop$ bin/sqoop --options-file importCyr.txt
> Warning: /usr/lib/hbase does not exist! HBase imports will fail.
> Please set $HBASE_HOME to the root of your HBase installation.
> Warning: $HADOOP_HOME is deprecated.
> 
> 13/02/14 12:01:29 INFO SqlServer.MSSQLServerManagerFactory: Using
> Microsoft's SQL Server - Hadoop Connector
> 13/02/14 12:01:29 INFO manager.SqlManager: Using default fetchSize of 1000
> 13/02/14 12:01:29 INFO tool.CodeGenTool: Beginning code generation
> 13/02/14 12:01:30 INFO manager.SqlManager: Executing SQL statement: SELECT
> TOP 1 * FROM [client]
> 13/02/14 12:01:30 INFO manager.SqlManager: Executing SQL statement: SELECT
> TOP 1 * FROM [client]
> 13/02/14 12:01:30 INFO orm.CompilationManager: HADOOP_HOME is
> /home/user/hadoop
> Note: /tmp/sqoop-user/compile/2e523bd98b10ffdd1cd99a796f2f54fd/client.java
> uses or overrides a deprecated API.
> Note: Recompile with -Xlint:deprecation for details.
> 13/02/14 12:01:34 INFO orm.CompilationManager: Writing jar file:
> /tmp/sqoop-user/compile/2e523bd98b10ffdd1cd99a796f2f54fd/client.jar
> 13/02/14 12:01:34 INFO mapreduce.ImportJobBase: Beginning import of client
> 13/02/14 12:01:35 INFO manager.SqlManager: Executing SQL statement: SELECT
> TOP 1 * FROM [client]
> 13/02/14 12:01:39 INFO mapred.JobClient: Cleaning up the staging area
> hdfs://localhost:9000/tmp/hadoop-user/mapred/staging/user/.staging/job_201302141200_0002
> Exception in thread "main" java.lang.IncompatibleClassChangeError: Found
> class org.apache.hadoop.mapreduce.JobContext, but interface was expected
> 	at
> org.apache.sqoop.config.ConfigurationHelper.getJobNumMaps(ConfigurationHelper.java:53)
> 	at
> com.cloudera.sqoop.config.ConfigurationHelper.getJobNumMaps(ConfigurationHelper.java:36)
> 	at
> org.apache.sqoop.mapreduce.db.DataDrivenDBInputFormat.getSplits(DataDrivenDBInputFormat.java:121)
> 	at org.apache.hadoop.mapred.JobClient.writeNewSplits(JobClient.java:962)
> 	at org.apache.hadoop.mapred.JobClient.writeSplits(JobClient.java:979)
> 	at org.apache.hadoop.mapred.JobClient.access$600(JobClient.java:174)
> 	at org.apache.hadoop.mapred.JobClient$2.run(JobClient.java:897)
> 	at org.apache.hadoop.mapred.JobClient$2.run(JobClient.java:850)
> 	at java.security.AccessController.doPrivileged(Native Method)
> 	at javax.security.auth.Subject.doAs(Subject.java:396)
> 	at
> org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1121)
> 	at org.apache.hadoop.mapred.JobClient.submitJobInternal(JobClient.java:850)
> 	at org.apache.hadoop.mapreduce.Job.submit(Job.java:500)
> 	at org.apache.hadoop.mapreduce.Job.waitForCompletion(Job.java:530)
> 	at org.apache.sqoop.mapreduce.ImportJobBase.runJob(ImportJobBase.java:141)
> 	at
> org.apache.sqoop.mapreduce.ImportJobBase.runImport(ImportJobBase.java:202)
> 	at org.apache.sqoop.manager.SqlManager.importTable(SqlManager.java:465)
> 	at
> com.microsoft.sqoop.SqlServer.MSSQLServerManager.importTable(MSSQLServerManager.java:145)
> 	at org.apache.sqoop.tool.ImportTool.importTable(ImportTool.java:403)
> 	at org.apache.sqoop.tool.ImportTool.run(ImportTool.java:476)
> 	at org.apache.sqoop.Sqoop.run(Sqoop.java:145)
> 	at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:65)
> 	at org.apache.sqoop.Sqoop.runSqoop(Sqoop.java:181)
> 	at org.apache.sqoop.Sqoop.runTool(Sqoop.java:220)
> 	at org.apache.sqoop.Sqoop.runTool(Sqoop.java:229)
> 	at org.apache.sqoop.Sqoop.main(Sqoop.java:238)
> 	at com.cloudera.sqoop.Sqoop.main(Sqoop.java:57)
> user@user-7:~/sqoop$
> 
> Thank you in advance for pointers.
> Regards,
> 
> Cyrille
> 
>