Posted to user@sqoop.apache.org by sam liu <li...@gmail.com> on 2013/09/06 11:39:16 UTC

Error occurs on TestAllTables using Hadoop-2.1.0-beta jar files

Hi,

For the Sqoop-1.4.3 project, I updated the build.xml and ivy.xml files to try to
run the build and unit tests with the Hadoop-2.1.0-beta jar files. The compile
succeeds; however, I hit the following error when running the unit tests. Any
comments? Thanks!

 <system-err><![CDATA[[Server@260f260f]: [Thread[HSQLDB Server @260f260f,5,main]]: run()/openServerSocket():
java.net.BindException: Address already in use
        at java.net.PlainSocketImpl.socketBind(Native Method)
        at java.net.PlainSocketImpl.bind(PlainSocketImpl.java:413)
        at java.net.ServerSocket.bind(ServerSocket.java:339)
        at java.net.ServerSocket.<init>(ServerSocket.java:205)
        at java.net.ServerSocket.<init>(ServerSocket.java:117)
        at org.hsqldb.HsqlSocketFactory.createServerSocket(Unknown Source)
        at org.hsqldb.Server.openServerSocket(Unknown Source)
        at org.hsqldb.Server.run(Unknown Source)
        at org.hsqldb.Server.access$000(Unknown Source)
        at org.hsqldb.Server$ServerThread.run(Unknown Source)
Note: /home/hadoop/sqoop/build/test/data/sqoop-hadoop/compile/cd361bd4917aa57041c693c16e0b1ff0/IMPORT_TABLE_1.java uses or overrides a deprecated API.
Note: Recompile with -Xlint:deprecation for details.
java.lang.UnsupportedOperationException: This is supposed to be overridden by subclasses.
        at com.google.protobuf.GeneratedMessage.getUnknownFields(GeneratedMessage.java:180)
        at org.apache.hadoop.yarn.proto.YarnProtos$URLProto.hashCode(YarnProtos.java:5487)
        at org.apache.hadoop.yarn.proto.YarnProtos$LocalResourceProto.hashCode(YarnProtos.java:6167)
        at org.apache.hadoop.yarn.api.records.impl.pb.LocalResourcePBImpl.hashCode(LocalResourcePBImpl.java:62)
        at java.util.HashMap.hash(HashMap.java:132)
        at java.util.HashMap.putImpl(HashMap.java:695)
        at java.util.HashMap.put(HashMap.java:680)
        at org.apache.hadoop.mapred.LocalDistributedCacheManager.setup(LocalDistributedCacheManager.java:139)
        at org.apache.hadoop.mapred.LocalJobRunner$Job.<init>(LocalJobRunner.java:155)
        at org.apache.hadoop.mapred.LocalJobRunner.submitJob(LocalJobRunner.java:634)
        at org.apache.hadoop.mapreduce.JobSubmitter.submitJobInternal(JobSubmitter.java:415)
        at org.apache.hadoop.mapreduce.Job$10.run(Job.java:1268)
        at org.apache.hadoop.mapreduce.Job$10.run(Job.java:1265)
        at java.security.AccessController.doPrivileged(AccessController.java:310)
        at javax.security.auth.Subject.doAs(Subject.java:573)
        at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1494)
        at org.apache.hadoop.mapreduce.Job.submit(Job.java:1265)
        at org.apache.hadoop.mapreduce.Job.waitForCompletion(Job.java:1286)
        at org.apache.sqoop.mapreduce.ImportJobBase.doSubmitJob(ImportJobBase.java:173)
        at org.apache.sqoop.mapreduce.ImportJobBase.runJob(ImportJobBase.java:151)
        at org.apache.sqoop.mapreduce.ImportJobBase.runImport(ImportJobBase.java:221)
        at org.apache.sqoop.manager.SqlManager.importTable(SqlManager.java:548)
        at org.apache.sqoop.tool.ImportTool.importTable(ImportTool.java:403)
        at org.apache.sqoop.tool.ImportAllTablesTool.run(ImportAllTablesTool.java:64)
        at org.apache.sqoop.Sqoop.run(Sqoop.java:145)
        at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:70)
        at org.apache.sqoop.Sqoop.runSqoop(Sqoop.java:181)
        at com.cloudera.sqoop.Sqoop.runSqoop(Sqoop.java:45)
        at com.cloudera.sqoop.testutil.ImportJobTestCase.runImport(ImportJobTestCase.java:215)
        at com.cloudera.sqoop.TestAllTables.testMultiTableImport(TestAllTables.java:110)
        at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)

-- 

Sam Liu

Re: Error occurs on TestAllTables using Hadoop-2.1.0-beta jar files

Posted by sam liu <li...@gmail.com>.
Hi Jarek,

I have uploaded a new patch, 'SQOOP-1197-trunk.patch', for trunk to
SQOOP-1197. Could you take another look? Thanks!


2013/9/10 Jarek Jarcec Cecho <ja...@apache.org>

> Thank you for following up on the issue Sam, greatly appreciated! I've
> taken a look at SQOOP-1197 and left some comments there. Thank you for
> contributing that work!
>
> Jarcec



-- 

Sam Liu

Re: Error occurs on TestAllTables using Hadoop-2.1.0-beta jar files

Posted by Jarek Jarcec Cecho <ja...@apache.org>.
Thank you for following up on the issue, Sam, greatly appreciated! I've taken a look at SQOOP-1197 and left some comments there; thank you for contributing that work!

Jarcec

On Tue, Sep 10, 2013 at 11:22:15AM +0800, sam liu wrote:
> Hi Jarcec,
> 
> Hadoop-2.1.0-beta upgraded protobuf from 2.4.1 to 2.5, but the version of
> protobuf in my env was still 2.4.1, so the sqoop unit tests failed on my
> env. After I upgraded my protobuf to 2.5, all sqoop unit tests passed. So I
> closed SQOOP-1195 and added comment on it. Hope others who encounter
> similar issues could benefit from it.
> 
> For your another comment 'create a JIRA to upgrade the dependency on hadoop
> 2 to 2.1.0-beta', I opened SQOOP-1197 for it, and attached a patch on it.
> Could you please help review it?
> 
> Thanks!

Re: Error occurs on TestAllTables using Hadoop-2.1.0-beta jar files

Posted by sam liu <li...@gmail.com>.
Hi Jarcec,

Hadoop-2.1.0-beta upgraded protobuf from 2.4.1 to 2.5, but the version of
protobuf in my env was still 2.4.1, so the Sqoop unit tests failed on my
env. After I upgraded my protobuf to 2.5, all Sqoop unit tests passed, so I
closed SQOOP-1195 and added a comment on it. I hope others who encounter
similar issues can benefit from it.

For your other comment, 'create a JIRA to upgrade the dependency on hadoop
2 to 2.1.0-beta', I opened SQOOP-1197 and attached a patch to it.
Could you please help review it?

Thanks!
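[Editor's note: the runtime/generated-code mismatch described above can be illustrated with a small, self-contained sketch. The class names below are illustrative, not the real protobuf sources: the protobuf 2.4.1 runtime ships getUnknownFields() only as a throwing stub, while code generated by protoc 2.5 calls it and relies on the newer runtime to supply a real implementation.]

```java
// Hypothetical sketch of the failure mode -- these are NOT the real
// protobuf classes, just a minimal reproduction of the pattern.
abstract class OldRuntimeMessage {
    // protobuf 2.4.1's GeneratedMessage only provides a throwing stub
    // and expects generated subclasses of its own era to override it.
    public Object getUnknownFields() {
        throw new UnsupportedOperationException(
                "This is supposed to be overridden by subclasses.");
    }
}

// A message generated by the newer protoc: its hashCode() calls
// getUnknownFields() but never overrides it, because the 2.5 runtime
// is expected to implement it.
class NewGeneratedMessage extends OldRuntimeMessage {
    @Override
    public int hashCode() {
        // Mirrors YarnProtos$URLProto.hashCode(), which mixes the
        // unknown-field set into the hash.
        return getUnknownFields().hashCode();
    }
}

public class ProtobufMismatchDemo {
    public static void main(String[] args) {
        OldRuntimeMessage msg = new NewGeneratedMessage();
        try {
            // Putting the message into a HashMap triggers hashCode(),
            // just as LocalDistributedCacheManager.setup() does.
            new java.util.HashMap<OldRuntimeMessage, String>().put(msg, "x");
        } catch (UnsupportedOperationException e) {
            // prints: caught: This is supposed to be overridden by subclasses.
            System.out.println("caught: " + e.getMessage());
        }
    }
}
```

When Hadoop 2.1.0-beta's protoc-2.5-generated YarnProtos classes run against a protobuf-java 2.4.1 jar, the same pattern plays out, which is why upgrading the protobuf jar on the classpath to 2.5 resolves the test failures.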


2013/9/9 Jarek Jarcec Cecho <ja...@apache.org>

> Thank you Sam!
>
> Jarcec



-- 

Sam Liu

Re: Error occurs on TestAllTables using Hadoop-2.1.0-beta jar files

Posted by Jarek Jarcec Cecho <ja...@apache.org>.
Thank you Sam!

Jarcec

On Mon, Sep 09, 2013 at 01:33:59PM +0800, sam liu wrote:
> Hi Jarek,
> 
> I opened JIRA SQOOP-1195 (UnsupportedOperationException occurs on
> TestAllTables of Sqoop-1.4.3 with Hadoop-2.1.0-beta jar files)
> <https://issues.apache.org/jira/browse/SQOOP-1195> for this issue.
> 
> Thanks!

Re: Error occurs on TestAllTables using Hadoop-2.1.0-beta jar files

Posted by sam liu <li...@gmail.com>.
Hi Jarek,

I opened JIRA SQOOP-1195 (UnsupportedOperationException occurs on
TestAllTables of Sqoop-1.4.3 with Hadoop-2.1.0-beta jar files)
<https://issues.apache.org/jira/browse/SQOOP-1195> for this issue.

Thanks!


2013/9/8 Jarek Jarcec Cecho <ja...@apache.org>

> Hi Sam,
> thank you very much for trying this out! Hadoop 2.1.0 does not necessarily
> have to be API compatible with 2.0.0. I would suggest you create a
> JIRA to upgrade the hadoop 2 dependency to 2.1.0-beta and solve the
> failing test cases there.
>
> Jarcec



-- 

Sam Liu

Re: Error occurs on TestAllTables using Hadoop-2.1.0-beta jar files

Posted by Jarek Jarcec Cecho <ja...@apache.org>.
Hi Sam,
thank you very much for trying this out! Hadoop 2.1.0 does not necessarily have to be API compatible with 2.0.0. I would suggest creating a JIRA to upgrade the Hadoop 2 dependency to 2.1.0-beta and resolving the failing test cases there.

Jarcec
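
For anyone picking up such a JIRA, the ivy.xml change Sam describes might look roughly like the sketch below. This is only an illustration: the module and conf names here are hypothetical and may not match Sqoop's actual ivy.xml, so check them against the real file before using.

```xml
<!-- Hypothetical sketch: bump the Hadoop 2 dependencies to 2.1.0-beta.
     The conf mappings ("hadoop2->default") are illustrative. -->
<dependency org="org.apache.hadoop" name="hadoop-common"
            rev="2.1.0-beta" conf="hadoop2->default"/>
<dependency org="org.apache.hadoop" name="hadoop-mapreduce-client-core"
            rev="2.1.0-beta" conf="hadoop2->default"/>
<!-- Hadoop 2.1.0-beta moved to protobuf 2.5.0; mixing protobuf-java
     versions (or protoc-generated classes from different versions) on
     the classpath is a classic cause of the getUnknownFields()
     UnsupportedOperationException seen in the trace below, so aligning
     this dependency is worth checking as well. -->
<dependency org="com.google.protobuf" name="protobuf-java"
            rev="2.5.0" conf="hadoop2->default"/>
```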

On Fri, Sep 06, 2013 at 05:39:16PM +0800, sam liu wrote:
> Hi,
> 
> For the Sqoop-1.4.3 project, I updated the build.xml and ivy.xml files to try
> to run the build/unit tests with the Hadoop-2.1.0-beta jar files. The compile
> succeeds; however, the following error occurs when running the unit tests.
> Any comments? Thanks!
> 
>  <system-err><![CDATA[[Server@260f260f]: [Thread[HSQLDB Server
> @260f260f,5,main]]: run()/openServerSocket():
> java.net.BindException: Address already in use
>         at java.net.PlainSocketImpl.socketBind(Native Method)
>         at java.net.PlainSocketImpl.bind(PlainSocketImpl.java:413)
>         at java.net.ServerSocket.bind(ServerSocket.java:339)
>         at java.net.ServerSocket.<init>(ServerSocket.java:205)
>         at java.net.ServerSocket.<init>(ServerSocket.java:117)
>         at org.hsqldb.HsqlSocketFactory.createServerSocket(Unknown Source)
>         at org.hsqldb.Server.openServerSocket(Unknown Source)
>         at org.hsqldb.Server.run(Unknown Source)
>         at org.hsqldb.Server.access$000(Unknown Source)
>         at org.hsqldb.Server$ServerThread.run(Unknown Source)
> Note: /home/hadoop/sqoop/build/test/data/sqoop-hadoop/compile/cd361bd4917aa57041c693c16e0b1ff0/IMPORT_TABLE_1.java uses or overrides a deprecated API.
> Note: Recompile with -Xlint:deprecation for details.
> java.lang.UnsupportedOperationException: This is supposed to be overridden by subclasses.
>         at com.google.protobuf.GeneratedMessage.getUnknownFields(GeneratedMessage.java:180)
>         at org.apache.hadoop.yarn.proto.YarnProtos$URLProto.hashCode(YarnProtos.java:5487)
>         at org.apache.hadoop.yarn.proto.YarnProtos$LocalResourceProto.hashCode(YarnProtos.java:6167)
>         at org.apache.hadoop.yarn.api.records.impl.pb.LocalResourcePBImpl.hashCode(LocalResourcePBImpl.java:62)
>         at java.util.HashMap.hash(HashMap.java:132)
>         at java.util.HashMap.putImpl(HashMap.java:695)
>         at java.util.HashMap.put(HashMap.java:680)
>         at org.apache.hadoop.mapred.LocalDistributedCacheManager.setup(LocalDistributedCacheManager.java:139)
>         at org.apache.hadoop.mapred.LocalJobRunner$Job.<init>(LocalJobRunner.java:155)
>         at org.apache.hadoop.mapred.LocalJobRunner.submitJob(LocalJobRunner.java:634)
>         at org.apache.hadoop.mapreduce.JobSubmitter.submitJobInternal(JobSubmitter.java:415)
>         at org.apache.hadoop.mapreduce.Job$10.run(Job.java:1268)
>         at org.apache.hadoop.mapreduce.Job$10.run(Job.java:1265)
>         at java.security.AccessController.doPrivileged(AccessController.java:310)
>         at javax.security.auth.Subject.doAs(Subject.java:573)
>         at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1494)
>         at org.apache.hadoop.mapreduce.Job.submit(Job.java:1265)
>         at org.apache.hadoop.mapreduce.Job.waitForCompletion(Job.java:1286)
>         at org.apache.sqoop.mapreduce.ImportJobBase.doSubmitJob(ImportJobBase.java:173)
>         at org.apache.sqoop.mapreduce.ImportJobBase.runJob(ImportJobBase.java:151)
>         at org.apache.sqoop.mapreduce.ImportJobBase.runImport(ImportJobBase.java:221)
>         at org.apache.sqoop.manager.SqlManager.importTable(SqlManager.java:548)
>         at org.apache.sqoop.tool.ImportTool.importTable(ImportTool.java:403)
>         at org.apache.sqoop.tool.ImportAllTablesTool.run(ImportAllTablesTool.java:64)
>         at org.apache.sqoop.Sqoop.run(Sqoop.java:145)
>         at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:70)
>         at org.apache.sqoop.Sqoop.runSqoop(Sqoop.java:181)
>         at com.cloudera.sqoop.Sqoop.runSqoop(Sqoop.java:45)
>         at com.cloudera.sqoop.testutil.ImportJobTestCase.runImport(ImportJobTestCase.java:215)
>         at com.cloudera.sqoop.TestAllTables.testMultiTableImport(TestAllTables.java:110)
>         at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
> 
> -- 
> 
> Sam Liu