Posted to user@sqoop.apache.org by "arvind@cloudera.com" <ar...@cloudera.com> on 2011/08/29 17:45:26 UTC

Re: [sqoop-user] Sqoop with Teradata

[Moving the thread to sqoop-user@incubator.apache.org]

Hi Srini,

You should be able to use the generic JDBC connector to import/export
from Teradata. There is also a specialized connector that is available
for use with Teradata if you are interested. This connector is not a
part of Sqoop and can be obtained from Cloudera by going to:

http://www.cloudera.com/partners/connectors/

Thanks,
Arvind
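For readers finding this thread later: a generic-JDBC invocation against Teradata might look like the sketch below. The host, database, table name, and jar paths are placeholders, not values from this thread.

```shell
# Sketch only: put the Teradata JDBC driver (and its GSS config jar) on the
# classpath, then force the generic JDBC code path with --driver.
export HADOOP_CLASSPATH=/path/to/terajdbc4.jar:/path/to/tdgssconfig.jar

sqoop import \
  --driver com.teradata.jdbc.TeraDriver \
  --connect jdbc:teradata://dbhost/mydb \
  --username srini -P \
  --table MY_TABLE \
  --target-dir /user/hadoop/my_table
```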

On Mon, Aug 29, 2011 at 8:17 AM, SRINIVAS SURASANI <va...@gmail.com> wrote:
> I have a CSV file in Hadoop and am looking to load it into Teradata. I was
> wondering whether Sqoop works with Teradata (with the JDBC jar placed
> in Sqoop's lib dir).
>
> Regards
> Srini
>
> --
> NOTE: The mailing list sqoop-user@cloudera.org is deprecated in favor of Apache Sqoop mailing list sqoop-user@incubator.apache.org. Please subscribe to it by sending an email to incubator-sqoop-user-subscribe@apache.org.
>

Re: [sqoop-user] Sqoop with Teradata

Posted by "arvind@cloudera.com" <ar...@cloudera.com>.
Hi Srinivas,

I am sorry that you are running into these issues. I think you may be
hitting [SQOOP-314]. We have recently fixed it on trunk, so could you
please try a build from trunk to see if it resolves your problem?
You should run the export with the --batch option.

Also, to subscribe to sqoop-user@incubator.apache.org, you would first
need to send a mail to incubator-sqoop-user-subscribe@apache.org and
follow the instructions that come back.

[SQOOP-314] https://issues.apache.org/jira/browse/SQOOP-314

Thanks,
Arvind
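As a sketch, the export with the --batch workaround might be invoked as follows. The host, database, table, and export path are placeholders, not values from this thread.

```shell
# --batch makes the export use JDBC batched statements, the workaround
# suggested for SQOOP-314. sqoop.export.records.per.statement=1 (which
# Srini tried) additionally limits each INSERT to a single record.
sqoop export \
  -D sqoop.export.records.per.statement=1 \
  --driver com.teradata.jdbc.TeraDriver \
  --connect jdbc:teradata://dbhost/mydb \
  --username srini -P \
  --table MY_TABLE \
  --export-dir /user/hadoop/my_table.csv \
  --batch
```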

On Thu, Sep 8, 2011 at 8:30 PM, SRINIVAS SURASANI <va...@gmail.com> wrote:
> Arvind,
> I have tried sending a reply to sqoop-user@incubator.apache.org but it's
> giving me a failure notification.
>
> Sqoop export hangs after the map phase reaches 100%. Even with
> "sqoop.export.records.per.statement"=1, the job goes idle after the maps
> complete. Importing into HDFS from Teradata is working fine.
>
> Thanks,
> Srini
> On Wed, Aug 31, 2011 at 12:19 AM, SRINIVAS SURASANI <va...@gmail.com>
> wrote:
>>
>> Arvind,
>> Here we have two scenarios.
>> 1. The XML data needs to be loaded into Teradata as it is processed (this
>> is for intra-day analysis). So we will have around 3500 files/day to be
>> loaded into Teradata. This is semi-real-time loading: the duration between
>> source landing (XML data into HDFS) and loading (CSV files into Teradata)
>> is a couple of minutes.
>> 2. As a batch, meaning processing an entire day's data together (XML to
>> CSV using Pig code) and sending this single directory as a batch.
>>
>> The scenarios might change, as the project is in its early stages.
>> Regards,
>> Srini
>>
>> On Tue, Aug 30, 2011 at 11:48 PM, arvind@cloudera.com
>> <ar...@cloudera.com> wrote:
>>>
>>> Thanks Srini. What kind of workload do you have?
>>>
>>> - Arvind
>>>
>>> On Tue, Aug 30, 2011 at 5:47 PM, SRINIVAS SURASANI <va...@gmail.com>
>>> wrote:
>>> > Arvind,
>>> > I'm working as a contractor at Morgan Stanley.
>>> >
>>> > Srini
>>> > On Tue, Aug 30, 2011 at 5:41 PM, arvind@cloudera.com
>>> > <ar...@cloudera.com>
>>> > wrote:
>>> >>
>>> >> Hi Srini,
>>> >>
>>> >> I am glad I was able to help. Do you mind sharing some more
>>> >> details on how and where you are using Sqoop? I keep track of
>>> >> various deployments in order to help prioritize feature
>>> >> development so that it is aligned with what users need rather than
>>> >> what we want to do first.
>>> >>
>>> >> It would be great if you could give me an idea of which company this
>>> >> project was for and what the high-level workload was like, as long as
>>> >> it is public information.
>>> >>
>>> >> Thanks,
>>> >> Arvind
>>> >>
>>> >>
>>> >> ---------- Forwarded message ----------
>>> >> From: SRINIVAS SURASANI <va...@gmail.com>
>>> >> Date: Mon, Aug 29, 2011 at 3:06 PM
>>> >> Subject: Re: [sqoop-user] Sqoop-with Terradata
>>> >> To: sqoop-user@cloudera.org
>>> >>
>>> >>
>>> >> Arvind,
>>> >>
>>> >> Thank you very much. I'm able to connect to Teradata.
>>> >>
>>> >>
>>> >> Srini
>>> >>
>>> >> On Mon, Aug 29, 2011 at 2:58 PM, arvind@cloudera.com
>>> >> <ar...@cloudera.com> wrote:
>>> >> >
>>> >> > This may be related to not having the GSS configuration classes in
>>> >> > your classpath. I would suggest you add tdgssconfig.jar to the
>>> >> > classpath and try again.
>>> >> >
>>> >> > Thanks,
>>> >> > Arvind
>>> >> >
>>> >> > On Mon, Aug 29, 2011 at 11:19 AM, SRINIVAS SURASANI
>>> >> > <va...@gmail.com>
>>> >> > wrote:
>>> >> > > Arvind,
>>> >> > >
>>> >> > > I have subscribed at sqoop-user@incubator.apache.org and posted the
>>> >> > > question. Sorry for the inconvenience; since I'm close to a
>>> >> > > deadline, I'm taking up your valuable time.
>>> >> > >
>>> >> > > sqoop list-tables --driver com.teradata.jdbc.TeraDriver --connect
>>> >> > > jdbc:teradata://PKTD/E1_CMS_WORK --username srini -P
>>> >> > > I am getting the following error:
>>> >> > > 11/08/29 13:08:03 INFO manager.SqlManager: Using default fetchSize of 1000
>>> >> > > GSSException: Failure unspecified at GSS-API level (Mechanism level: UserFile parameter null)
>>> >> > >         at com.teradata.tdgss.jtdgss.TdgssParseXml.<init>(DashoA1*..)
>>> >> > >         at com.teradata.tdgss.jtdgss.TdgssConfigApi.<init>(DashoA1*..)
>>> >> > >         at com.teradata.tdgss.jtdgss.TdgssManager.<init>(DashoA1*..)
>>> >> > >         at com.teradata.tdgss.jtdgss.TdgssManager.getInstance(DashoA1*..)
>>> >> > >         at com.teradata.jdbc.jdbc.GenericTeraEncrypt.getGSSM(GenericTeraEncrypt.java:612)
>>> >> > >         at com.teradata.jdbc.jdbc.GenericTeraEncrypt.getConfig(GenericTeraEncrypt.java:630)
>>> >> > >         at com.teradata.jdbc.jdbc.GenericTeraEncrypt.getUserNameForOid(GenericTeraEncrypt.java:723)
>>> >> > >         at com.teradata.jdbc.AuthMechanism.<init>(AuthMechanism.java:50)
>>> >> > >         at com.teradata.jdbc.jdbc.GenericInitDBConfigState.action(GenericInitDBConfigState.java:105)
>>> >> > >         at com.teradata.jdbc.jdbc.GenericLogonController.run(GenericLogonController.java:49)
>>> >> > >         at com.teradata.jdbc.jdbc_4.TDSession.<init>(TDSession.java:199)
>>> >> > >         at com.teradata.jdbc.jdbc_3.ifjdbc_4.TeraLocalConnection.<init>(TeraLocalConnection.java:95)
>>> >> > >         at com.teradata.jdbc.jdbc.ConnectionFactory.createConnection(ConnectionFactory.java:54)
>>> >> > >         at com.teradata.jdbc.TeraDriver.doConnect(TeraDriver.java:217)
>>> >> > >         at com.teradata.jdbc.TeraDriver.connect(TeraDriver.java:150)
>>> >> > >         at java.sql.DriverManager.getConnection(DriverManager.java:582)
>>> >> > >         at java.sql.DriverManager.getConnection(DriverManager.java:185)
>>> >> > >         at com.cloudera.sqoop.manager.SqlManager.makeConnection(SqlManager.java:643)
>>> >> > >         at com.cloudera.sqoop.manager.GenericJdbcManager.getConnection(GenericJdbcManager.java:51)
>>> >> > >         at com.cloudera.sqoop.manager.SqlManager.listTables(SqlManager.java:270)
>>> >> > >         at com.cloudera.sqoop.tool.ListTablesTool.run(ListTablesTool.java:49)
>>> >> > >         at com.cloudera.sqoop.Sqoop.run(Sqoop.java:144)
>>> >> > >         at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:65)
>>> >> > >         at com.cloudera.sqoop.Sqoop.runSqoop(Sqoop.java:180)
>>> >> > >         at com.cloudera.sqoop.Sqoop.runTool(Sqoop.java:219)
>>> >> > >         at com.cloudera.sqoop.Sqoop.runTool(Sqoop.java:228)
>>> >> > >         at com.cloudera.sqoop.Sqoop.main(Sqoop.java:237)
>>> >> > > 11/08/29 13:08:04 ERROR sqoop.Sqoop: Got exception running Sqoop: java.lang.NullPointerException
>>> >> > > java.lang.NullPointerException
>>> >> > >         at com.teradata.tdgss.jtdgss.TdgssConfigApi.GetMechanisms(DashoA1*..)
>>> >> > >         at com.teradata.tdgss.jtdgss.TdgssManager.<init>(DashoA1*..)
>>> >> > >         at com.teradata.tdgss.jtdgss.TdgssManager.getInstance(DashoA1*..)
>>> >> > >         at com.teradata.jdbc.jdbc.GenericTeraEncrypt.getGSSM(GenericTeraEncrypt.java:612)
>>> >> > >         at com.teradata.jdbc.jdbc.GenericTeraEncrypt.getConfig(GenericTeraEncrypt.java:630)
>>> >> > >         at com.teradata.jdbc.jdbc.GenericTeraEncrypt.getUserNameForOid(GenericTeraEncrypt.java:723)
>>> >> > >         at com.teradata.jdbc.AuthMechanism.<init>(AuthMechanism.java:50)
>>> >> > >         at com.teradata.jdbc.jdbc.GenericInitDBConfigState.action(GenericInitDBConfigState.java:105)
>>> >> > >         at com.teradata.jdbc.jdbc.GenericLogonController.run(GenericLogonController.java:49)
>>> >> > >         at com.teradata.jdbc.jdbc_4.TDSession.<init>(TDSession.java:199)
>>> >> > >         at com.teradata.jdbc.jdbc_3.ifjdbc_4.TeraLocalConnection.<init>(TeraLocalConnection.java:95)
>>> >> > >         at com.teradata.jdbc.jdbc.ConnectionFactory.createConnection(ConnectionFactory.java:54)
>>> >> > >         at com.teradata.jdbc.TeraDriver.doConnect(TeraDriver.java:217)
>>> >> > >         at com.teradata.jdbc.TeraDriver.connect(TeraDriver.java:150)
>>> >> > >         at java.sql.DriverManager.getConnection(DriverManager.java:582)
>>> >> > >         at java.sql.DriverManager.getConnection(DriverManager.java:185)
>>> >> > >         at com.cloudera.sqoop.manager.SqlManager.makeConnection(SqlManager.java:643)
>>> >> > >         at com.cloudera.sqoop.manager.GenericJdbcManager.getConnection(GenericJdbcManager.java:51)
>>> >> > >         at com.cloudera.sqoop.manager.SqlManager.listTables(SqlManager.java:270)
>>> >> > >         at com.cloudera.sqoop.tool.ListTablesTool.run(ListTablesTool.java:49)
>>> >> > >         at com.cloudera.sqoop.Sqoop.run(Sqoop.java:144)
>>> >> > >         at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:65)
>>> >> > >         at com.cloudera.sqoop.Sqoop.runSqoop(Sqoop.java:180)
>>> >> > >         at com.cloudera.sqoop.Sqoop.runTool(Sqoop.java:219)
>>> >> > >         at com.cloudera.sqoop.Sqoop.runTool(Sqoop.java:228)
>>> >> > >         at com.cloudera.sqoop.Sqoop.main(Sqoop.java:237)
>>> >> > > Thanks,
>>> >> > > Srini
>>> >> > >
>>> >> > >
>>> >> > >
>>> >> > > On Mon, Aug 29, 2011 at 12:58 PM, arvind@cloudera.com
>>> >> > > <ar...@cloudera.com>
>>> >> > > wrote:
>>> >> > >>
>>> >> > >> [Please subscribe and respond to sqoop-user@incubator.apache.org]
>>> >> > >>
>>> >> > >> Please use HADOOP_CLASSPATH instead of CLASSPATH. Also, in order
>>> >> > >> to
>>> >> > >> use the generic JDBC connector, you will have to specify the
>>> >> > >> driver
>>> >> > >> class explicitly via the command line option --driver
>>> >> > >> com.teradata.jdbc.TeraDriver.
>>> >> > >>
>>> >> > >> Thanks,
>>> >> > >> Arvind
>>> >> > >>
>>> >> > >> On Mon, Aug 29, 2011 at 9:53 AM, SRINIVAS SURASANI
>>> >> > >> <va...@gmail.com>
>>> >> > >> wrote:
>>> >> > >> > Arvind,
>>> >> > >> > I have set the classpath to teradata4.jar [I have not placed
>>> >> > >> > teradata4.jar in the sqoop lib dir, as I don't have permissions].
>>> >> > >> > I'm getting the following error:
>>> >> > >> >
>>> >> > >> > sqoop list-tables --connect jdbc:teradata://PKTD/E1_CMS_WORK
>>> >> > >> > --username srini -P
>>> >> > >> > ERROR: tool.BaseSqoopTool: Got error creating database manager:
>>> >> > >> > java.io.IOException: No manager for connect string:
>>> >> > >> > jdbc:teradata:///PKTD/E1_CMS_WORK
>>> >> > >> >    at com.cloudera.sqoop.ConnFactory.getManager(ConnFactory.java:119)
>>> >> > >> >    at com.cloudera.sqoop.tool.BaseSqoopTool.init(BaseSqoopTool.java:186)
>>> >> > >> >    ...
>>> >> > >> >    ...
>>> >> > >> > Thanks,
>>> >> > >> > Srini
>>> >> > >> >
>>> >> > >> >
>>> >> > >> >
>>> >> > >> > On Mon, Aug 29, 2011 at 8:52 AM, SRINIVAS SURASANI
>>> >> > >> > <va...@gmail.com>
>>> >> > >> > wrote:
>>> >> > >> >>
>>> >> > >> >> Thanks a lot, Arvind.
>>> >> > >> >>
>>> >> > >> >> On Mon, Aug 29, 2011 at 8:45 AM, arvind@cloudera.com
>>> >> > >> >> <ar...@cloudera.com>
>>> >> > >> >> wrote:
>>> >> > >> >>>
>>> >> > >> >>> [Moving the thread to sqoop-user@incubator.apache.org]
>>> >> > >> >>>
>>> >> > >> >>> Hi Srini,
>>> >> > >> >>>
>>> >> > >> >>> You should be able to use the generic JDBC connector to
>>> >> > >> >>> import/export
>>> >> > >> >>> from Teradata. There is also a specialized connector that is
>>> >> > >> >>> available
>>> >> > >> >>> for use with Teradata if you are interested. This connector
>>> >> > >> >>> is
>>> >> > >> >>> not a
>>> >> > >> >>> part of Sqoop and can be obtained from Cloudera by going to:
>>> >> > >> >>>
>>> >> > >> >>> http://www.cloudera.com/partners/connectors/
>>> >> > >> >>>
>>> >> > >> >>> Thanks,
>>> >> > >> >>> Arvind
>>> >> > >> >>>
>>> >> > >> >>> On Mon, Aug 29, 2011 at 8:17 AM, SRINIVAS SURASANI
>>> >> > >> >>> <va...@gmail.com>
>>> >> > >> >>> wrote:
>>> >> > >> >>> > I have a CSV file in Hadoop and am looking to load it into
>>> >> > >> >>> > Teradata. I was wondering whether Sqoop works with Teradata
>>> >> > >> >>> > (with the JDBC jar placed in Sqoop's lib dir).
>>> >> > >> >>> >
>>> >> > >> >>> > Regards
>>> >> > >> >>> > Srini
>>> >> > >> >>> >
>>> >> > >> >>>
>>> >> > >> >>
>>> >> > >> >
>>> >> > >>
>>> >> > >
>>> >> >
>>> >>
>>> >
>>> >
>>
>
>

Re: [sqoop-user] Sqoop with Teradata

Posted by "arvind@cloudera.com" <ar...@cloudera.com>.
This may be related to not having the GSS configuration classes in
your classpath. I would suggest you add tdgssconfig.jar to the
classpath and try again.

Thanks,
Arvind
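Concretely, the classpath fix might look like the following sketch; the jar paths are placeholders, not values from this thread.

```shell
# The Teradata driver (terajdbc4.jar) needs tdgssconfig.jar, which carries
# the GSS security configuration it parses at logon; without it the driver
# fails with the GSSException/NullPointerException quoted below.
export HADOOP_CLASSPATH=/path/to/terajdbc4.jar:/path/to/tdgssconfig.jar

sqoop list-tables \
  --driver com.teradata.jdbc.TeraDriver \
  --connect jdbc:teradata://PKTD/E1_CMS_WORK \
  --username srini -P
```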

On Mon, Aug 29, 2011 at 11:19 AM, SRINIVAS SURASANI <va...@gmail.com> wrote:
> Arvind,
>
> I have subscribed at sqoop-user@incubator.apache.org and posted the
> question. Sorry for the inconvenience; since I'm close to a deadline, I'm
> taking up your valuable time.
>
> sqoop list-tables --driver com.teradata.jdbc.TeraDriver --connect
> jdbc:teradata://PKTD/E1_CMS_WORK --username srini -P
> I am getting the following error:
> 11/08/29 13:08:03 INFO manager.SqlManager: Using default fetchSize of 1000
> GSSException: Failure unspecified at GSS-API level (Mechanism level:
> UserFile parameter null)
>         at com.teradata.tdgss.jtdgss.TdgssParseXml.<init>(DashoA1*..)
>         at com.teradata.tdgss.jtdgss.TdgssConfigApi.<init>(DashoA1*..)
>         at com.teradata.tdgss.jtdgss.TdgssManager.<init>(DashoA1*..)
>         at com.teradata.tdgss.jtdgss.TdgssManager.getInstance(DashoA1*..)
>         at
> com.teradata.jdbc.jdbc.GenericTeraEncrypt.getGSSM(GenericTeraEncrypt.java:612)
>         at
> com.teradata.jdbc.jdbc.GenericTeraEncrypt.getConfig(GenericTeraEncrypt.java:630)
>         at
> com.teradata.jdbc.jdbc.GenericTeraEncrypt.getUserNameForOid(GenericTeraEncrypt.java:723)
>         at com.teradata.jdbc.AuthMechanism.<init>(AuthMechanism.java:50)
>         at
> com.teradata.jdbc.jdbc.GenericInitDBConfigState.action(GenericInitDBConfigState.java:105)
>         at
> com.teradata.jdbc.jdbc.GenericLogonController.run(GenericLogonController.java:49)
>         at com.teradata.jdbc.jdbc_4.TDSession.<init>(TDSession.java:199)
>         at
> com.teradata.jdbc.jdbc_3.ifjdbc_4.TeraLocalConnection.<init>(TeraLocalConnection.java:95)
>         at
> com.teradata.jdbc.jdbc.ConnectionFactory.createConnection(ConnectionFactory.java:54)
>         at com.teradata.jdbc.TeraDriver.doConnect(TeraDriver.java:217)
>         at com.teradata.jdbc.TeraDriver.connect(TeraDriver.java:150)
>         at java.sql.DriverManager.getConnection(DriverManager.java:582)
>         at java.sql.DriverManager.getConnection(DriverManager.java:185)
>         at
> com.cloudera.sqoop.manager.SqlManager.makeConnection(SqlManager.java:643)
>         at
> com.cloudera.sqoop.manager.GenericJdbcManager.getConnection(GenericJdbcManager.java:51)
>         at
> com.cloudera.sqoop.manager.SqlManager.listTables(SqlManager.java:270)
>         at
> com.cloudera.sqoop.tool.ListTablesTool.run(ListTablesTool.java:49)
>         at com.cloudera.sqoop.Sqoop.run(Sqoop.java:144)
>         at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:65)
>         at com.cloudera.sqoop.Sqoop.runSqoop(Sqoop.java:180)
>         at com.cloudera.sqoop.Sqoop.runTool(Sqoop.java:219)
>         at com.cloudera.sqoop.Sqoop.runTool(Sqoop.java:228)
>         at com.cloudera.sqoop.Sqoop.main(Sqoop.java:237)
> 11/08/29 13:08:04 ERROR sqoop.Sqoop: Got exception running Sqoop:
> java.lang.NullPointerException
> java.lang.NullPointerException
>         at
> com.teradata.tdgss.jtdgss.TdgssConfigApi.GetMechanisms(DashoA1*..)
>         at com.teradata.tdgss.jtdgss.TdgssManager.<init>(DashoA1*..)
>         at com.teradata.tdgss.jtdgss.TdgssManager.getInstance(DashoA1*..)
>         at
> com.teradata.jdbc.jdbc.GenericTeraEncrypt.getGSSM(GenericTeraEncrypt.java:612)
>         at
> com.teradata.jdbc.jdbc.GenericTeraEncrypt.getConfig(GenericTeraEncrypt.java:630)
>         at
> com.teradata.jdbc.jdbc.GenericTeraEncrypt.getUserNameForOid(GenericTeraEncrypt.java:723)
>         at com.teradata.jdbc.AuthMechanism.<init>(AuthMechanism.java:50)
>         at
> com.teradata.jdbc.jdbc.GenericInitDBConfigState.action(GenericInitDBConfigState.java:105)
>         at
> com.teradata.jdbc.jdbc.GenericLogonController.run(GenericLogonController.java:49)
>         at com.teradata.jdbc.jdbc_4.TDSession.<init>(TDSession.java:199)
>         at
> com.teradata.jdbc.jdbc_3.ifjdbc_4.TeraLocalConnection.<init>(TeraLocalConnection.java:95)
>         at
> com.teradata.jdbc.jdbc.ConnectionFactory.createConnection(ConnectionFactory.java:54)
>         at com.teradata.jdbc.TeraDriver.doConnect(TeraDriver.java:217)
>         at com.teradata.jdbc.TeraDriver.connect(TeraDriver.java:150)
>         at java.sql.DriverManager.getConnection(DriverManager.java:582)
>         at java.sql.DriverManager.getConnection(DriverManager.java:185)
>         at
> com.cloudera.sqoop.manager.SqlManager.makeConnection(SqlManager.java:643)
>         at
> com.cloudera.sqoop.manager.GenericJdbcManager.getConnection(GenericJdbcManager.java:51)
>         at
> com.cloudera.sqoop.manager.SqlManager.listTables(SqlManager.java:270)
>         at
> com.cloudera.sqoop.tool.ListTablesTool.run(ListTablesTool.java:49)
>         at com.cloudera.sqoop.Sqoop.run(Sqoop.java:144)
>         at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:65)
>         at com.cloudera.sqoop.Sqoop.runSqoop(Sqoop.java:180)
>         at com.cloudera.sqoop.Sqoop.runTool(Sqoop.java:219)
>         at com.cloudera.sqoop.Sqoop.runTool(Sqoop.java:228)
>         at com.cloudera.sqoop.Sqoop.main(Sqoop.java:237)
> Thanks,
> Srini
>
>
>
> On Mon, Aug 29, 2011 at 12:58 PM, arvind@cloudera.com <ar...@cloudera.com>
> wrote:
>>
>> [Please subscribe and respond to sqoop-user@incubator.apache.org]
>>
>> Please use HADOOP_CLASSPATH instead of CLASSPATH. Also, in order to
>> use the generic JDBC connector, you will have to specify the driver
>> class explicitly via the command line option --driver
>> com.teradata.jdbc.TeraDriver.
>>
>> Thanks,
>> Arvind
>>
>> On Mon, Aug 29, 2011 at 9:53 AM, SRINIVAS SURASANI <va...@gmail.com>
>> wrote:
>> > Arvind,
>> > I have set the classpath to teradata4.jar [I have not placed
>> > teradata4.jar in the sqoop lib dir, as I don't have permissions].
>> > I'm getting the following error:
>> >
>> > sqoop list-tables --connect jdbc:teradata://PKTD/E1_CMS_WORK --username
>> > srini -P
>> > ERROR: tool.BaseSqoopTool: Got error creating database manager:
>> > java.io.IOException: No manager for connect string:
>> > jdbc:teradata:///PKTD/E1_CMS_WORK
>> >    at com.cloudera.sqoop.ConnFactory.getManager(ConnFactory.java:119)
>> >    at com.cloudera.sqoop.tool.BaseSqoopTool.init(BaseSqoopTool.java:186)
>> >    ...
>> >    ...
>> > Thanks,
>> > Srini
>> >
>> >
>> >
>> > On Mon, Aug 29, 2011 at 8:52 AM, SRINIVAS SURASANI <va...@gmail.com>
>> > wrote:
>> >>
>> >> Thanks a lot, Arvind.
>> >>
>> >> On Mon, Aug 29, 2011 at 8:45 AM, arvind@cloudera.com
>> >> <ar...@cloudera.com>
>> >> wrote:
>> >>>
>> >>> [Moving the thread to sqoop-user@incubator.apache.org]
>> >>>
>> >>> Hi Srini,
>> >>>
>> >>> You should be able to use the generic JDBC connector to import/export
>> >>> from Teradata. There is also a specialized connector that is available
>> >>> for use with Teradata if you are interested. This connector is not a
>> >>> part of Sqoop and can be obtained from Cloudera by going to:
>> >>>
>> >>> http://www.cloudera.com/partners/connectors/
>> >>>
>> >>> Thanks,
>> >>> Arvind
>> >>>
>> >>> On Mon, Aug 29, 2011 at 8:17 AM, SRINIVAS SURASANI <va...@gmail.com>
>> >>> wrote:
>> >>> > I have a CSV file in Hadoop and am looking to load it into Teradata.
>> >>> > I was wondering whether Sqoop works with Teradata (with the JDBC jar
>> >>> > placed in Sqoop's lib dir).
>> >>> >
>> >>> > Regards
>> >>> > Srini
>> >>> >
>> >>>
>> >>
>> >
>>
>
>

Re: [sqoop-user] Sqoop with Teradata

Posted by Srinivas <va...@gmail.com>.

Sent from my iPhone

On Aug 31, 2011, at 1:44 PM, Arvind Prabhakar <ar...@apache.org> wrote:

> Srini,
> 
> This is happening because the GSS config Jar is not getting put in
> Distributed Cache. Sqoop only puts certain jars in the cache as
> opposed to putting every jar that exists in its classpath. In order to
> force any Jar to be put in the Distributed Cache, you must copy it
> over to Sqoop's lib directory.
> 
> Thanks,
> Arvind
> 
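The fix Arvind describes above might look like this sketch; /usr/lib/sqoop is an assumed install location (the CDH default), not a path confirmed in this thread.

```shell
# Copy the GSS config jar into Sqoop's lib directory; jars in that
# directory are the ones Sqoop ships to the Distributed Cache for the
# map-reduce job. Paths are assumptions; adjust for your install.
cp /path/to/tdgssconfig.jar /usr/lib/sqoop/lib/
```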
> On Tue, Aug 30, 2011 at 9:36 PM, SRINIVAS SURASANI <va...@gmail.com> wrote:
>> I'm getting the error below while exporting. From my observation, while
>> compiling the .java file Sqoop sets the classpath for terajdbc4.jar and
>> tdgssconfig.jar (as I marked in bold letters below), but just before
>> launching the map-reduce job it adds only terajdbc4.jar to the job
>> classpath, not tdgssconfig.jar.
>> I set HADOOP_CLASSPATH=<path to>terajdbc4.jar:<path to>tdgssconfig.jar
>> Any help appreciated.
>> $ sqoop export --verbose --driver com.teradata.jdbc.TeraDriver --connect
>> jdbc:teradata://TD/DB --username WBD -P --table DB.Temp_Table --export-dir
>> /user/hadoop/sqoop_test.txt --fields-terminated-by , --lines-terminated-by
>> \n -m 1>
>> 11/08/30 22:59:43 DEBUG tool.BaseSqoopTool: Enabled debug logging.
>> Enter password:
>> 11/08/30 22:59:50 DEBUG sqoop.ConnFactory: Loaded manager factory:
>> com.cloudera.sqoop.manager.DefaultManagerFactory
>> 11/08/30 22:59:50 DEBUG sqoop.ConnFactory: Trying ManagerFactory:
>> com.cloudera.sqoop.manager.DefaultManagerFactory
>> 11/08/30 22:59:50 INFO manager.SqlManager: Using default fetchSize of 1000
>> 11/08/30 22:59:50 DEBUG sqoop.ConnFactory: Instantiated ConnManager
>> com.cloudera.sqoop.manager.GenericJdbcManager@2b76e552
>> 11/08/30 22:59:50 INFO tool.CodeGenTool: Beginning code generation
>> 11/08/30 22:59:51 DEBUG manager.SqlManager: Using fetchSize for next query:
>> 1000
>> 11/08/30 22:59:51 INFO manager.SqlManager: Executing SQL statement: SELECT
>> t.* FROM DB.Temp_Table AS t WHERE 1=0
>> 11/08/30 22:59:51 DEBUG manager.SqlManager: Using fetchSize for next query:
>> 1000
>> 11/08/30 22:59:51 INFO manager.SqlManager: Executing SQL statement: SELECT
>> t.* FROM DB.Temp_Table AS t WHERE 1=0
>> 11/08/30 22:59:51 DEBUG orm.ClassWriter: selected columns:
>> 11/08/30 22:59:51 DEBUG orm.ClassWriter:   NAME
>> 11/08/30 22:59:51 DEBUG orm.ClassWriter:   SALARY
>> 11/08/30 22:59:51 DEBUG orm.ClassWriter: Writing source file:
>> /tmp/sqoop-hadoop/compile/cc68f6e38603d6705ad75a3d8ea1bf3a/DB_Temp_Table.java
>> 11/08/30 22:59:51 DEBUG orm.ClassWriter: Table name:DB.Temp_Table
>> 11/08/30 22:59:51 DEBUG orm.ClassWriter: Columns: NAME:12, SALARY:3,
>> 11/08/30 22:59:51 DEBUG orm.ClassWriter: sourceFilename is
>> DB_Temp_Table.java
>> 11/08/30 22:59:51 DEBUG orm.CompilationManager: Found existing
>> /tmp/sqoop-hadoop/compile/cc68f6e38603d6705ad75a3d8ea1bf3a/
>> 11/08/30 22:59:51 INFO orm.CompilationManager: HADOOP_HOME is
>> /usr/lib/hadoop
>> 11/08/30 22:59:51 INFO orm.CompilationManager: Found hadoop core jar at:
>> /usr/lib/hadoop/hadoop-0.20.2-cdh3u0-core.jar
>> 11/08/30 22:59:51 DEBUG orm.CompilationManager: Adding source file:
>> /tmp/sqoop-haoop/compile/cc68f6e38603d6705ad75a3d8ea1bf3a/DB_Temp_Table.java
>> 11/08/30 22:59:51 DEBUG orm.CompilationManager: Invoking javac with args:
>> 11/08/30 22:59:51 DEBUG orm.CompilationManager:   -sourcepath
>> 11/08/30 22:59:51 DEBUG orm.CompilationManager:
>> /tmp/sqoop-hadoop/compile/cc68f6e38603d6705ad75a3d8ea1bf3a/
>> 11/08/30 22:59:51 DEBUG orm.CompilationManager:   -d
>> 11/08/30 22:59:51 DEBUG orm.CompilationManager:
>> /tmp/sqoop-hadoop/compile/cc68f6e38603d6705ad75a3d8ea1bf3a/
>> 11/08/30 22:59:51 DEBUG orm.CompilationManager:   -classpath
>> 11/08/30 22:59:51 DEBUG orm.CompilationManager:
>> /usr/lib/hadoop:/usr/lib/hadoop/hadoop-core-0.20.2-cdh3u0.jar:/usr/lib/hadoop/lib/ant-contrib-1.0b3.jar:/usr/lib/hadoop/lib/aspectjrt-1.6.5.jar:/usr/lib/hadoop/lib/aspectjtools-1.6.5.jar:/usr/lib/hadoop/lib/commons-cli-1.2.jar:/usr/lib/hadoop/lib/commons-codec-1.4.jar:/usr/lib/hadoop/lib/commons-daemon-1.0.1.jar:/usr/lib/hadoop/lib/commons-el-1.0.jar:/usr/lib/hadoop/lib/commons-httpclient-3.0.1.jar:/usr/lib/hadoop/lib/commons-logging-1.0.4.jar:/usr/lib/hadoop/lib/commons-logging-api-1.0.4.jar:/usr/lib/hadoop/lib/commons-net-1.4.1.jar:/usr/lib/hadoop/lib/core-3.1.1.jar:/usr/lib/hadoop/lib/elephant-bird-1.0.jar:/usr/lib/hadoop/lib/hadoop-fairscheduler-0.20.2-cdh3u0.jar:/usr/lib/hadoop/lib/hadoop-lzo-0.4.8.jar:/usr/lib/hadoop/lib/hsqldb-1.8.0.10.jar:/usr/lib/hadoop/lib/jackson-core-asl-1.5.2.jar:/usr/lib/hadoop/lib/jackson-mapper-asl-1.5.2.jar:/usr/lib/hadoop/lib/jasper-compiler-5.5.12.jar:/usr/lib/hadoop/lib/jasper-runtime-5.5.12.jar:/usr/lib/hadoop/lib/jets3t-0.6.1.jar:/usr/lib/hadoop/lib/jetty-6.1.26.jar:/usr/lib/hadoop/lib/jetty-servlet-tester-6.1.26.jar:/usr/lib/hadoop/lib/jetty-util-6.1.26.jar:/usr/lib/hadoop/lib/jsch-0.1.42.jar:/usr/lib/hadoop/lib/junit-4.5.jar:/usr/lib/hadoop/lib/kfs-0.2.2.jar:/usr/lib/hadoop/lib/log4j-1.2.15.jar:/usr/lib/hadoop/lib/mockito-all-1.8.2.jar:/usr/lib/hadoop/lib/oro-2.0.8.jar:/usr/lib/hadoop/lib/protobuf-java-2.3.0.jar:/usr/lib/hadoop/lib/servlet-api-2.5-20081211.jar:/usr/lib/hadoop/lib/servlet-api-2.5-6.1.14.jar:/usr/lib/hadoop/lib/slf4j-api-1.4.3.jar:/usr/lib/hadoop/lib/slf4j-api-1.5.8.jar:/usr/lib/hadoop/lib/slf4j-log4j12-1.4.3.jar:/usr/lib/hadoop/lib/slf4j-log4j12-1.5.10.jar:/usr/lib/hadoop/lib/xmlenc-0.52.jar:/usr/lib/hadoop/lib/yamlbeans-0.9.3.jar:/usr/lib/hadoop/lib/jsp-2.1/jsp-2.1.jar:/usr/lib/hadoop/lib/jsp-2.1/jsp-api-2.1.jar:/usr/lib/sqoop/conf:/usr/lib/hbase/conf::/usr/lib/sqoop/lib/ant-contrib-1.0b3.jar:/usr/lib/sqoop/lib/ant-eclipse-1.0-jvm1.2.jar:/usr/lib/sqoop/lib/commons-io-1.4.jar:/usr/lib/sqoop/lib/hadoop-mruni
t-0.20.2-CDH3b2-SNAPSHOT.jar:/usr/lib/sqoop/lib/ivy-2.0.0-rc2.jar:/usr/lib/sqoop/lib/sqljdbc4.jar:/usr/lib/hbase/hbase-0.90.1-cdh3u0-tests.jar:/usr/lib/hbase/hbase-0.90.1-cdh3u0.jar:/usr/lib/hbase/lib/activation-1.1.jar:/usr/lib/hbase/lib/asm-3.1.jar:/usr/lib/hbase/lib/avro-1.3.3.jar:/usr/lib/hbase/lib/commons-cli-1.2.jar:/usr/lib/hbase/lib/commons-codec-1.4.jar:/usr/lib/hbase/lib/commons-el-1.0.jar:/usr/lib/hbase/lib/commons-httpclient-3.1.jar:/usr/lib/hbase/lib/commons-lang-2.5.jar:/usr/lib/hbase/lib/commons-logging-1.1.1.jar:/usr/lib/hbase/lib/commons-net-1.4.1.jar:/usr/lib/hbase/lib/core-3.1.1.jar:/usr/lib/hbase/lib/guava-r06.jar:/usr/lib/hbase/lib/hadoop-core.jar:/usr/lib/hbase/lib/hbase-0.90.1-cdh3u0.jar:/usr/lib/hbase/lib/jackson-core-asl-1.5.2.jar:/usr/lib/hbase/lib/jackson-jaxrs-1.5.5.jar:/usr/lib/hbase/lib/jackson-mapper-asl-1.5.2.jar:/usr/lib/hbase/lib/jackson-xc-1.5.5.jar:/usr/lib/hbase/lib/jasper-compiler-5.5.23.jar:/usr/lib/hbase/lib/jasper-runtime-5.5.23.jar:/usr/lib/hbase/lib/jaxb-api-2.1.jar:/usr/lib/hbase/lib/jaxb-impl-2.1.12.jar:/usr/lib/hbase/lib/jersey-core-1.4.jar:/usr/lib/hbase/lib/jersey-json-1.4.jar:/usr/lib/hbase/lib/jersey-server-1.4.jar:/usr/lib/hbase/lib/jettison-1.1.jar:/usr/lib/hbase/lib/jetty-6.1.26.jar:/usr/lib/hbase/lib/jetty-util-6.1.26.jar:/usr/lib/hbase/lib/jruby-complete-1.0.3.jar:/usr/lib/hbase/lib/jsp-2.1-6.1.14.jar:/usr/lib/hbase/lib/jsp-api-2.1-6.1.14.jar:/usr/lib/hbase/lib/jsp-api-2.1.jar:/usr/lib/hbase/lib/jsr311-api-1.1.1.jar:/usr/lib/hbase/lib/log4j-1.2.16.jar:/usr/lib/hbase/lib/protobuf-java-2.3.0.jar:/usr/lib/hbase/lib/servlet-api-2.5-6.1.14.jar:/usr/lib/hbase/lib/servlet-api-2.5.jar:/usr/lib/hbase/lib/slf4j-api-1.5.8.jar:/usr/lib/hbase/lib/slf4j-log4j12-1.5.8.jar:/usr/lib/hbase/lib/stax-api-1.0.1.jar:/usr/lib/hbase/lib/thrift-0.2.0.jar:/usr/lib/hbase/lib/xmlenc-0.52.jar:/usr/lib/hbase/lib/zookeeper.jar:/usr/lib/zookeeper/zookeeper-3.3.3-cdh3u0.jar:/usr/lib/zookeeper/zookeeper.jar:/usr/lib/zookeeper/lib/jline-0.9.94.ja
r:/usr/lib/zookeeper/lib/log4j-1.2.15.jar:/usr/lib/sqoop/sqoop-1.2.0-cdh3u0.jar:/usr/lib/sqoop/sqoop-test-1.2.0-cdh3u0.jar:<somepath>/lib/tdgssconfig.jar:<somepath>/lib/terajdbc4.jar:/usr/lib/hadoop/hadoop-0.20.2-cdh3u0-core.jar:/usr/lib/sqoop/sqoop-1.2.0-cdh3u0.jar
>> Note:
>> /tmp/sqoop-hadoop/compile/cc68f6e38603d6705ad75a3d8ea1bf3a/DB_Temp_Table.java
>> uses or overrides a deprecated API.
>> Note: Recompile with -Xlint:deprecation for details.
>> 11/08/30 22:59:52 INFO orm.CompilationManager: Writing jar file:
>> /tmp/sqoop-hadoop/compile/cc68f6e38603d6705ad75a3d8ea1bf3a/DB.Temp_Table.jar
>> 11/08/30 22:59:52 DEBUG orm.CompilationManager: Scanning for .class files in
>> directory: /tmp/sqoop-hadoop/compile/cc68f6e38603d6705ad75a3d8ea1bf3a
>> 11/08/30 22:59:52 DEBUG orm.CompilationManager: Got classfile:
>> /tmp/sqoop-hadoop/compile/cc68f6e38603d6705ad75a3d8ea1bf3a/DBTemp_Table.class
>> -> DB_Temp_Table.class
>> 11/08/30 22:59:52 DEBUG orm.CompilationManager: Finished writing jar file
>> /tmp/sqoop-hadrdev/compile/cc68f6e38603d6705ad75a3d8ea1bf3a/DB.Temp_Table.jar
>> 11/08/30 22:59:52 INFO mapreduce.ExportJobBase: Beginning export of
>> DB.Temp_Table
>> 11/08/30 22:59:52 DEBUG mapreduce.JobBase: Using InputFormat: class
>> com.cloudera.sqoop.mapreduce.ExportInputFormat
>> 11/08/30 22:59:52 DEBUG manager.SqlManager: Using fetchSize for next query:
>> 1000
>> 11/08/30 22:59:52 INFO manager.SqlManager: Executing SQL statement: SELECT
>> t.* FROM DB.Temp_Table AS t WHERE 1=0
>> 11/08/30 22:59:53 DEBUG mapreduce.JobBase: Adding to job classpath:
>> file:/usr/lib/sqoop/sqoop-1.2.0-cdh3u0.jar
>> 11/08/30 22:59:53 DEBUG mapreduce.JobBase: Adding to job classpath:
>> file:<somepath>/lib/terajdbc4.jar
>> 11/08/30 22:59:53 DEBUG mapreduce.JobBase: Adding to job classpath:
>> file:/usr/lib/sqoop/sqoop-1.2.0-cdh3u0.jar
>> 11/08/30 22:59:53 DEBUG mapreduce.JobBase: Adding to job classpath:
>> file:/usr/lib/sqoop/sqoop-1.2.0-cdh3u0.jar
>> 11/08/30 22:59:53 DEBUG mapreduce.JobBase: Adding to job classpath:
>> file:/usr/lib/sqoop/lib/ivy-2.0.0-rc2.jar
>> 11/08/30 22:59:53 DEBUG mapreduce.JobBase: Adding to job classpath:
>> file:/usr/lib/sqoop/lib/commons-io-1.4.jar
>> 11/08/30 22:59:53 DEBUG mapreduce.JobBase: Adding to job classpath:
>> file:/usr/lib/sqoop/lib/ant-eclipse-1.0-jvm1.2.jar
>> 11/08/30 22:59:53 DEBUG mapreduce.JobBase: Adding to job classpath:
>> file:/usr/lib/sqoop/lib/sqljdbc4.jar
>> 11/08/30 22:59:53 DEBUG mapreduce.JobBase: Adding to job classpath:
>> file:/usr/lib/sqoop/lib/ant-contrib-1.0b3.jar
>> 11/08/30 22:59:53 DEBUG mapreduce.JobBase: Adding to job classpath:
>> file:/usr/lib/sqoop/lib/hadoop-mrunit-0.20.2-CDH3b2-SNAPSHOT.jar
>> 11/08/30 22:59:53 INFO hdfs.DFSClient: Created HDFS_DELEGATION_TOKEN token
>> 795 for hadoop
>> 11/08/30 22:59:53 INFO security.TokenCache: Got dt for
>> hdfs://<cname>:9000/tmp/hadoop-mapred/mapred/staging/hadoop/.staging/job_201107010928_0398/libjars/hadoop-mrunit-0.20.2-CDH3b2-SNAPSHOT.jar;uri=xx.xxx.xx.xx:9000;t.service=xx.xxx.xx.xx:9000
>> 11/08/30 22:59:53 INFO input.FileInputFormat: Total input paths to process :
>> 1
>> 11/08/30 22:59:53 DEBUG mapreduce.ExportInputFormat: Target numMapTasks=1
>> 11/08/30 22:59:53 DEBUG mapreduce.ExportInputFormat: Total input bytes=18
>> 11/08/30 22:59:53 DEBUG mapreduce.ExportInputFormat: maxSplitSize=18
>> 11/08/30 22:59:53 INFO input.FileInputFormat: Total input paths to process :
>> 1
>> 11/08/30 22:59:53 DEBUG mapreduce.ExportInputFormat: Generated splits:
>> 11/08/30 22:59:53 DEBUG mapreduce.ExportInputFormat:
>> Paths:/user/hadrdev/sqoop_test.txt:0+18 Locations:
>> 11/08/30 22:59:53 INFO mapred.JobClient: Running job: job_201107010928_0398
>> 11/08/30 22:59:54 INFO mapred.JobClient:  map 0% reduce 0%
>> 11/08/30 23:00:01 INFO mapred.JobClient: Task Id :
>> attempt_201107010928_0398_m_000000_0, Status : FAILED
>> java.io.IOException: java.lang.NullPointerException
>>         at
>> com.cloudera.sqoop.mapreduce.ExportOutputFormat.getRecordWriter(ExportOutputFormat.java:80)
>>         at
>> org.apache.hadoop.mapred.MapTask$NewDirectOutputCollector.<init>(MapTask.java:520)
>>         at org.apache.hadoop.mapred.MapTask.runNewMapper(MapTask.java:635)
>>         at org.apache.hadoop.mapred.MapTask.run(MapTask.java:322)
>>         at org.apache.hadoop.mapred.Child$4.run(Child.java:268)
>>         at java.security.AccessController.doPrivileged(Native Method)
>>         at javax.security.auth.Subject.doAs(Subject.java:396)
>>         at
>> org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1115)
>>         at org.apache.hadoop.mapred.Child.main(Child.java:262)
>> Caused by: java.lang.NullPointerException
>>         at
>> com.teradata.tdgss.jtdgss.TdgssConfigApi.GetMechanisms(DashoA1*..)
>>         at com.teradata.tdgss.jtdgss.TdgssManager.<init>(DashoA1*..)
>>         at com.teradata.tdgss.jtdgss.TdgssManager.getInstance(DashoA1*..)
>>         at
>> com.teradata.jdbc.jdbc.GenericTeraEncrypt.getGSSM(GenericTeraEncrypt.java:612)
>>         at
>> com.teradata.jdbc.jdbc.GenericTeraEncrypt.getConfig(GenericTeraEncrypt.java:630)
>>         at
>> com.teradata.jdbc.jdbc.GenericTeraEncrypt.getUserNameForOid(GenericTeraEncrypt.java:723)
>>         at com.teradata.jdbc.AuthMechanism.<init>(AuthMechanism.java:50)
>>         at
>> com.teradata.jdbc.jdbc.GenericInitDBConfigState.action(GenericInitDBConfigState.java:105)
>>         at
>> com.teradata.jdbc.jdbc.GenericLogonController.run(GenericLogonController.java:49)
>>         at com.teradata.jdbc.jdbc_4.TDSession.<init>(TDSession.java:199)
>>         at
>> com.teradata.jdbc.jdbc_3.ifjdbc_4.TeraLocalConnection.<init>(TeraLocalConnection.java:95)
>>         at
>> com.teradata.jdbc.jdbc.ConnectionFactory.createConnection(ConnectionFactory.java:54)
>>         at com.teradata.jdbc.TeraDriver.doConnect(TeraDriver.java:217)
>>         at com.teradata.jdbc.TeraDriver.connect(TeraDriver.java:150)
>>         at java.sql.DriverManager.getConnection(DriverManager.java:582)
>>         at java.sql.DriverManager.getConnection(DriverManager.java:185)
>>         at
>> com.cloudera.sqoop.mapreduce.db.DBConfiguration.getConnection(DBConfiguration.java:184)
>>         at
>> com.cloudera.sqoop.mapreduce.AsyncSqlRecordWriter.<init>(AsyncSqlRecordWriter.java:73)
>>         at
>> com.cloudera.sqoop.mapreduce.ExportOutputFormat$ExportRecordWriter.<init>(ExportOutputFormat.java:96)
>>         at
>> com.cloudera.sqoop.mapreduce.ExportOutputFormat.getRecordWriter(ExportOutputFormat.java:78)
>>         ... 8 more
>> 
>> attempt_201107010928_0398_m_000000_0: GSSException: Failure unspecified at
>> GSS-API level (Mechanism level: UserFile parameter null)
>> attempt_201107010928_0398_m_000000_0:   at
>> com.teradata.tdgss.jtdgss.TdgssParseXml.<init>(DashoA1*..)
>> attempt_201107010928_0398_m_000000_0:   at
>> com.teradata.tdgss.jtdgss.TdgssConfigApi.<init>(DashoA1*..)
>> attempt_201107010928_0398_m_000000_0:   at
>> com.teradata.tdgss.jtdgss.TdgssManager.<init>(DashoA1*..)
>> attempt_201107010928_0398_m_000000_0:   at
>> com.teradata.tdgss.jtdgss.TdgssManager.getInstance(DashoA1*..)
>> attempt_201107010928_0398_m_000000_0:   at
>> com.teradata.jdbc.jdbc.GenericTeraEncrypt.getGSSM(GenericTeraEncrypt.java:612)
>> attempt_201107010928_0398_m_000000_0:   at
>> com.teradata.jdbc.jdbc.GenericTeraEncrypt.getConfig(GenericTeraEncrypt.java:630)
>> attempt_201107010928_0398_m_000000_0:   at
>> com.teradata.jdbc.jdbc.GenericTeraEncrypt.getUserNameForOid(
>> On Mon, Aug 29, 2011 at 2:19 PM, SRINIVAS SURASANI <va...@gmail.com> wrote:
>>> 
>>> Arvind,
>>> 
>>> I have subscribed at sqoop-user@incubator.apache.org and posted the question.
>>> Sorry for the inconvenience from my end; since I'm close to a deadline I'm
>>> taking up your valuable time.
>>> 
>>> sqoop list-tables --driver com.teradata.jdbc.TeraDriver --connect
>>> jdbc:teradata://PKTD/E1_CMS_WORK --username srini -P
>>> I am getting the following error:
>>> 11/08/29 13:08:03 INFO manager.SqlManager: Using default fetchSize of 1000
>>> GSSException: Failure unspecified at GSS-API level (Mechanism level:
>>> UserFile parameter null)
>>>         at com.teradata.tdgss.jtdgss.TdgssParseXml.<init>(DashoA1*..)
>>>         at com.teradata.tdgss.jtdgss.TdgssConfigApi.<init>(DashoA1*..)
>>>         at com.teradata.tdgss.jtdgss.TdgssManager.<init>(DashoA1*..)
>>>         at com.teradata.tdgss.jtdgss.TdgssManager.getInstance(DashoA1*..)
>>>         at
>>> com.teradata.jdbc.jdbc.GenericTeraEncrypt.getGSSM(GenericTeraEncrypt.java:612)
>>>         at
>>> com.teradata.jdbc.jdbc.GenericTeraEncrypt.getConfig(GenericTeraEncrypt.java:630)
>>>         at
>>> com.teradata.jdbc.jdbc.GenericTeraEncrypt.getUserNameForOid(GenericTeraEncrypt.java:723)
>>>         at com.teradata.jdbc.AuthMechanism.<init>(AuthMechanism.java:50)
>>>         at
>>> com.teradata.jdbc.jdbc.GenericInitDBConfigState.action(GenericInitDBConfigState.java:105)
>>>         at
>>> com.teradata.jdbc.jdbc.GenericLogonController.run(GenericLogonController.java:49)
>>>         at com.teradata.jdbc.jdbc_4.TDSession.<init>(TDSession.java:199)
>>>         at
>>> com.teradata.jdbc.jdbc_3.ifjdbc_4.TeraLocalConnection.<init>(TeraLocalConnection.java:95)
>>>         at
>>> com.teradata.jdbc.jdbc.ConnectionFactory.createConnection(ConnectionFactory.java:54)
>>>         at com.teradata.jdbc.TeraDriver.doConnect(TeraDriver.java:217)
>>>         at com.teradata.jdbc.TeraDriver.connect(TeraDriver.java:150)
>>>         at java.sql.DriverManager.getConnection(DriverManager.java:582)
>>>         at java.sql.DriverManager.getConnection(DriverManager.java:185)
>>>         at
>>> com.cloudera.sqoop.manager.SqlManager.makeConnection(SqlManager.java:643)
>>>         at
>>> com.cloudera.sqoop.manager.GenericJdbcManager.getConnection(GenericJdbcManager.java:51)
>>>         at
>>> com.cloudera.sqoop.manager.SqlManager.listTables(SqlManager.java:270)
>>>         at
>>> com.cloudera.sqoop.tool.ListTablesTool.run(ListTablesTool.java:49)
>>>         at com.cloudera.sqoop.Sqoop.run(Sqoop.java:144)
>>>         at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:65)
>>>         at com.cloudera.sqoop.Sqoop.runSqoop(Sqoop.java:180)
>>>         at com.cloudera.sqoop.Sqoop.runTool(Sqoop.java:219)
>>>         at com.cloudera.sqoop.Sqoop.runTool(Sqoop.java:228)
>>>         at com.cloudera.sqoop.Sqoop.main(Sqoop.java:237)
>>> 11/08/29 13:08:04 ERROR sqoop.Sqoop: Got exception running Sqoop:
>>> java.lang.NullPointerException
>>> java.lang.NullPointerException
>>>         at
>>> com.teradata.tdgss.jtdgss.TdgssConfigApi.GetMechanisms(DashoA1*..)
>>>         at com.teradata.tdgss.jtdgss.TdgssManager.<init>(DashoA1*..)
>>>         at com.teradata.tdgss.jtdgss.TdgssManager.getInstance(DashoA1*..)
>>>         at
>>> com.teradata.jdbc.jdbc.GenericTeraEncrypt.getGSSM(GenericTeraEncrypt.java:612)
>>>         at
>>> com.teradata.jdbc.jdbc.GenericTeraEncrypt.getConfig(GenericTeraEncrypt.java:630)
>>>         at
>>> com.teradata.jdbc.jdbc.GenericTeraEncrypt.getUserNameForOid(GenericTeraEncrypt.java:723)
>>>         at com.teradata.jdbc.AuthMechanism.<init>(AuthMechanism.java:50)
>>>         at
>>> com.teradata.jdbc.jdbc.GenericInitDBConfigState.action(GenericInitDBConfigState.java:105)
>>>         at
>>> com.teradata.jdbc.jdbc.GenericLogonController.run(GenericLogonController.java:49)
>>>         at com.teradata.jdbc.jdbc_4.TDSession.<init>(TDSession.java:199)
>>>         at
>>> com.teradata.jdbc.jdbc_3.ifjdbc_4.TeraLocalConnection.<init>(TeraLocalConnection.java:95)
>>>         at
>>> com.teradata.jdbc.jdbc.ConnectionFactory.createConnection(ConnectionFactory.java:54)
>>>         at com.teradata.jdbc.TeraDriver.doConnect(TeraDriver.java:217)
>>>         at com.teradata.jdbc.TeraDriver.connect(TeraDriver.java:150)
>>>         at java.sql.DriverManager.getConnection(DriverManager.java:582)
>>>         at java.sql.DriverManager.getConnection(DriverManager.java:185)
>>>         at
>>> com.cloudera.sqoop.manager.SqlManager.makeConnection(SqlManager.java:643)
>>>         at
>>> com.cloudera.sqoop.manager.GenericJdbcManager.getConnection(GenericJdbcManager.java:51)
>>>         at
>>> com.cloudera.sqoop.manager.SqlManager.listTables(SqlManager.java:270)
>>>         at
>>> com.cloudera.sqoop.tool.ListTablesTool.run(ListTablesTool.java:49)
>>>         at com.cloudera.sqoop.Sqoop.run(Sqoop.java:144)
>>>         at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:65)
>>>         at com.cloudera.sqoop.Sqoop.runSqoop(Sqoop.java:180)
>>>         at com.cloudera.sqoop.Sqoop.runTool(Sqoop.java:219)
>>>         at com.cloudera.sqoop.Sqoop.runTool(Sqoop.java:228)
>>>         at com.cloudera.sqoop.Sqoop.main(Sqoop.java:237)
>>> Thanks,
>>> Srini
>>> 
>>> 
>>> 
>>> On Mon, Aug 29, 2011 at 12:58 PM, arvind@cloudera.com
>>> <ar...@cloudera.com> wrote:
>>>> 
>>>> [Please subscribe and respond to sqoop-user@incubator.apache.org]
>>>> 
>>>> Please use HADOOP_CLASSPATH instead of CLASSPATH. Also, in order to
>>>> use the generic JDBC connector, you will have to specify the driver
>>>> class explicitly via the command line option --driver
>>>> com.teradata.jdbc.TeraDriver.
>>>> 
>>>> Thanks,
>>>> Arvind
>>>> 
>>>> On Mon, Aug 29, 2011 at 9:53 AM, SRINIVAS SURASANI <va...@gmail.com>
>>>> wrote:
>>>>> Arvind,
>>>>> I have set the classpath to terajdbc4.jar [not placed the
>>>>> terajdbc4.jar in
>>>>> the sqoop lib, as I don't have permissions].
>>>>> I'm getting the following error:
>>>>> 
>>>>> sqoop list-tables --connect jdbc:teradata://PKTD/E1_CMS_WORK --username
>>>>> srini -P
>>>>> ERROR: tool.BaseSqoopTool: Got error creating database manager:
>>>>> java.io.IOException: No manager for connect string:
>>>>> jdbc:teradata:///PKTD/E1_CMS_WORK
>>>>>    at com.cloudera.sqoop.ConnFactory.getManager(ConnFactory.java:119)
>>>>>    at
>>>>> com.cloudera.sqoop.tool.BaseSqoopTool.init(BaseSqoopTool.java:186)
>>>>>    ...
>>>>>    ...
>>>>> Thanks,
>>>>> Srini
>>>>> 
>>>>> 
>>>>> 
>>>>> On Mon, Aug 29, 2011 at 8:52 AM, SRINIVAS SURASANI <va...@gmail.com>
>>>>> wrote:
>>>>>> 
>>>>>> Thanks-a-lot Arvind.
>>>>>> 
>>>>>> On Mon, Aug 29, 2011 at 8:45 AM, arvind@cloudera.com
>>>>>> <ar...@cloudera.com>
>>>>>> wrote:
>>>>>>> 
>>>>>>> [Moving the thread to sqoop-user@incubator.apache.org]
>>>>>>> 
>>>>>>> Hi Srini,
>>>>>>> 
>>>>>>> You should be able to use the generic JDBC connector to import/export
>>>>>>> from Teradata. There is also a specialized connector that is
>>>>>>> available
>>>>>>> for use with Teradata if you are interested. This connector is not a
>>>>>>> part of Sqoop and can be obtained from Cloudera by going to:
>>>>>>> 
>>>>>>> http://www.cloudera.com/partners/connectors/
>>>>>>> 
>>>>>>> Thanks,
>>>>>>> Arvind
>>>>>>> 
>>>>>>> On Mon, Aug 29, 2011 at 8:17 AM, SRINIVAS SURASANI <va...@gmail.com>
>>>>>>> wrote:
>>>>>>>> I have a csv file in Hadoop and am looking to load it into Teradata. I was
>>>>>>>> wondering whether Sqoop works with Teradata (with the JDBC jar
>>>>>>>> placed
>>>>>>>> in the sqoop lib dir).
>>>>>>>> 
>>>>>>>> Regards
>>>>>>>> Srini
>>>>>>>> 
>>>>>>>> --
>>>>>>>> NOTE: The mailing list sqoop-user@cloudera.org is deprecated in
>>>>>>>> favor
>>>>>>>> of Apache Sqoop mailing list sqoop-user@incubator.apache.org.
>>>>>>>> Please
>>>>>>>> subscribe to it by sending an email to
>>>>>>>> incubator-sqoop-user-subscribe@apache.org.
>>>>>>>> 
>>>>>>> 
>>>>>> 
>>>>> 
>>>>> 
>>>> 
>>> 
>> 
>> 


Re: [sqoop-user] Sqoop-with Terradata

Posted by Srinivas <va...@gmail.com>.

Sent from my iPhone

On Aug 31, 2011, at 1:44 PM, Arvind Prabhakar <ar...@apache.org> wrote:

> Srini,
> 
> This is happening because the GSS config jar (tdgssconfig.jar) is not
> getting put in the Distributed Cache. Sqoop only puts certain jars in the
> cache, as opposed to every jar that exists in its classpath. To force any
> jar to be put in the Distributed Cache, you must copy it over to Sqoop's
> lib directory.
> 
> Thanks,
> Arvind
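
Concretely, the fix described above amounts to staging the Teradata jars into
Sqoop's lib directory and re-running the export with the driver class named
explicitly. A minimal sketch, assuming the driver jars were unpacked to
/opt/teradata/jdbc (the connection details below are the placeholders used
earlier in the thread):

```shell
# Copy the Teradata JDBC driver and the GSS config jar into Sqoop's lib
# directory so Sqoop ships them to the map tasks via the Distributed Cache.
# /opt/teradata/jdbc is a hypothetical download location; adjust as needed.
cp /opt/teradata/jdbc/terajdbc4.jar /opt/teradata/jdbc/tdgssconfig.jar /usr/lib/sqoop/lib/

# Re-run the export; --driver names the Teradata driver class explicitly,
# as the generic JDBC connector requires. Host, database, user, and table
# below are the placeholders from the original command in this thread.
sqoop export \
  --driver com.teradata.jdbc.TeraDriver \
  --connect jdbc:teradata://TD/DB \
  --username WBD -P \
  --table DB.Temp_Table \
  --export-dir /user/hadoop/sqoop_test.txt \
  --fields-terminated-by , --lines-terminated-by '\n' -m 1
```

Once the jars live in /usr/lib/sqoop/lib, Sqoop should pick them up and ship
them to the task JVMs on its own, so setting HADOOP_CLASSPATH should no longer
be necessary for the map tasks.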
> 
> On Tue, Aug 30, 2011 at 9:36 PM, SRINIVAS SURASANI <va...@gmail.com> wrote:
>> Getting the error while exporting .. And from my observation while compiling
>> .java, sets the classpath for terajdbc4.jar and tdgssconfig.jar ( as I
>> marked in bold letters below) but just before launching map-reduce adding
>> jar classpath of tdgssconfig.jar.
>>  I set the HADOOP_CLASSPTH=<path to> terajdbc4.jar:<path to>tdgssconfig.jar
>> Any Help Appreciated.
>> $ sqoop export --verbose --driver com.teradata.jdbc.TeraDriver --connect
>> jdbc:teradata://TD/DB --username WBD -P --table DB.Temp_Table --export-dir
>> /user/hadoop/sqoop_test.txt --fields-terminated-by , --lines-terminated-by
>> \n -m 1>
>> 11/08/30 22:59:43 DEBUG tool.BaseSqoopTool: Enabled debug logging.
>> Enter password:
>> 11/08/30 22:59:50 DEBUG sqoop.ConnFactory: Loaded manager factory:
>> com.cloudera.sqoop.manager.DefaultManagerFactory
>> 11/08/30 22:59:50 DEBUG sqoop.ConnFactory: Trying ManagerFactory:
>> com.cloudera.sqoop.manager.DefaultManagerFactory
>> 11/08/30 22:59:50 INFO manager.SqlManager: Using default fetchSize of 1000
>> 11/08/30 22:59:50 DEBUG sqoop.ConnFactory: Instantiated ConnManager
>> com.cloudera.sqoop.manager.GenericJdbcManager@2b76e552
>> 11/08/30 22:59:50 INFO tool.CodeGenTool: Beginning code generation
>> 11/08/30 22:59:51 DEBUG manager.SqlManager: Using fetchSize for next query:
>> 1000
>> 11/08/30 22:59:51 INFO manager.SqlManager: Executing SQL statement: SELECT
>> t.* FROM DB.Temp_Table AS t WHERE 1=0
>> 11/08/30 22:59:51 DEBUG manager.SqlManager: Using fetchSize for next query:
>> 1000
>> 11/08/30 22:59:51 INFO manager.SqlManager: Executing SQL statement: SELECT
>> t.* FROM DB.Temp_Table AS t WHERE 1=0
>> 11/08/30 22:59:51 DEBUG orm.ClassWriter: selected columns:
>> 11/08/30 22:59:51 DEBUG orm.ClassWriter:   NAME
>> 11/08/30 22:59:51 DEBUG orm.ClassWriter:   SALARY
>> 11/08/30 22:59:51 DEBUG orm.ClassWriter: Writing source file:
>> /tmp/sqoop-hadoop/compile/cc68f6e38603d6705ad75a3d8ea1bf3a/DB_Temp_Table.java
>> 11/08/30 22:59:51 DEBUG orm.ClassWriter: Table name:DB.Temp_Table
>> 11/08/30 22:59:51 DEBUG orm.ClassWriter: Columns: NAME:12, SALARY:3,
>> 11/08/30 22:59:51 DEBUG orm.ClassWriter: sourceFilename is
>> DB_Temp_Table.java
>> 11/08/30 22:59:51 DEBUG orm.CompilationManager: Found existing
>> /tmp/sqoop-hadoop/compile/cc68f6e38603d6705ad75a3d8ea1bf3a/
>> 11/08/30 22:59:51 INFO orm.CompilationManager: HADOOP_HOME is
>> /usr/lib/hadoop
>> 11/08/30 22:59:51 INFO orm.CompilationManager: Found hadoop core jar at:
>> /usr/lib/hadoop/hadoop-0.20.2-cdh3u0-core.jar
>> 11/08/30 22:59:51 DEBUG orm.CompilationManager: Adding source file:
>> /tmp/sqoop-haoop/compile/cc68f6e38603d6705ad75a3d8ea1bf3a/DB_Temp_Table.java
>> 11/08/30 22:59:51 DEBUG orm.CompilationManager: Invoking javac with args:
>> 11/08/30 22:59:51 DEBUG orm.CompilationManager:   -sourcepath
>> 11/08/30 22:59:51 DEBUG orm.CompilationManager:
>> /tmp/sqoop-hadoop/compile/cc68f6e38603d6705ad75a3d8ea1bf3a/
>> 11/08/30 22:59:51 DEBUG orm.CompilationManager:   -d
>> 11/08/30 22:59:51 DEBUG orm.CompilationManager:
>> /tmp/sqoop-hadoop/compile/cc68f6e38603d6705ad75a3d8ea1bf3a/
>> 11/08/30 22:59:51 DEBUG orm.CompilationManager:   -classpath
>> 11/08/30 22:59:51 DEBUG orm.CompilationManager:
>> /usr/lib/hadoop:/usr/lib/hadoop/hadoop-core-0.20.2-cdh3u0.jar:/usr/lib/hadoop/lib/ant-contrib-1.0b3.jar:/usr/lib/hadoop/lib/aspectjrt-1.6.5.jar:/usr/lib/hadoop/lib/aspectjtools-1.6.5.jar:/usr/lib/hadoop/lib/commons-cli-1.2.jar:/usr/lib/hadoop/lib/commons-codec-1.4.jar:/usr/lib/hadoop/lib/commons-daemon-1.0.1.jar:/usr/lib/hadoop/lib/commons-el-1.0.jar:/usr/lib/hadoop/lib/commons-httpclient-3.0.1.jar:/usr/lib/hadoop/lib/commons-logging-1.0.4.jar:/usr/lib/hadoop/lib/commons-logging-api-1.0.4.jar:/usr/lib/hadoop/lib/commons-net-1.4.1.jar:/usr/lib/hadoop/lib/core-3.1.1.jar:/usr/lib/hadoop/lib/elephant-bird-1.0.jar:/usr/lib/hadoop/lib/hadoop-fairscheduler-0.20.2-cdh3u0.jar:/usr/lib/hadoop/lib/hadoop-lzo-0.4.8.jar:/usr/lib/hadoop/lib/hsqldb-1.8.0.10.jar:/usr/lib/hadoop/lib/jackson-core-asl-1.5.2.jar:/usr/lib/hadoop/lib/jackson-mapper-asl-1.5.2.jar:/usr/lib/hadoop/lib/jasper-compiler-5.5.12.jar:/usr/lib/hadoop/lib/jasper-runtime-5.5.12.jar:/usr/lib/hadoop/lib/jets3t-0.6.1.jar:/usr/lib/hadoop/lib/jetty-6.1.26.jar:/usr/lib/hadoop/lib/jetty-servlet-tester-6.1.26.jar:/usr/lib/hadoop/lib/jetty-util-6.1.26.jar:/usr/lib/hadoop/lib/jsch-0.1.42.jar:/usr/lib/hadoop/lib/junit-4.5.jar:/usr/lib/hadoop/lib/kfs-0.2.2.jar:/usr/lib/hadoop/lib/log4j-1.2.15.jar:/usr/lib/hadoop/lib/mockito-all-1.8.2.jar:/usr/lib/hadoop/lib/oro-2.0.8.jar:/usr/lib/hadoop/lib/protobuf-java-2.3.0.jar:/usr/lib/hadoop/lib/servlet-api-2.5-20081211.jar:/usr/lib/hadoop/lib/servlet-api-2.5-6.1.14.jar:/usr/lib/hadoop/lib/slf4j-api-1.4.3.jar:/usr/lib/hadoop/lib/slf4j-api-1.5.8.jar:/usr/lib/hadoop/lib/slf4j-log4j12-1.4.3.jar:/usr/lib/hadoop/lib/slf4j-log4j12-1.5.10.jar:/usr/lib/hadoop/lib/xmlenc-0.52.jar:/usr/lib/hadoop/lib/yamlbeans-0.9.3.jar:/usr/lib/hadoop/lib/jsp-2.1/jsp-2.1.jar:/usr/lib/hadoop/lib/jsp-2.1/jsp-api-2.1.jar:/usr/lib/sqoop/conf:/usr/lib/hbase/conf::/usr/lib/sqoop/lib/ant-contrib-1.0b3.jar:/usr/lib/sqoop/lib/ant-eclipse-1.0-jvm1.2.jar:/usr/lib/sqoop/lib/commons-io-1.4.jar:/usr/lib/sqoop/lib/hadoop-mruni
t-0.20.2-CDH3b2-SNAPSHOT.jar:/usr/lib/sqoop/lib/ivy-2.0.0-rc2.jar:/usr/lib/sqoop/lib/sqljdbc4.jar:/usr/lib/hbase/hbase-0.90.1-cdh3u0-tests.jar:/usr/lib/hbase/hbase-0.90.1-cdh3u0.jar:/usr/lib/hbase/lib/activation-1.1.jar:/usr/lib/hbase/lib/asm-3.1.jar:/usr/lib/hbase/lib/avro-1.3.3.jar:/usr/lib/hbase/lib/commons-cli-1.2.jar:/usr/lib/hbase/lib/commons-codec-1.4.jar:/usr/lib/hbase/lib/commons-el-1.0.jar:/usr/lib/hbase/lib/commons-httpclient-3.1.jar:/usr/lib/hbase/lib/commons-lang-2.5.jar:/usr/lib/hbase/lib/commons-logging-1.1.1.jar:/usr/lib/hbase/lib/commons-net-1.4.1.jar:/usr/lib/hbase/lib/core-3.1.1.jar:/usr/lib/hbase/lib/guava-r06.jar:/usr/lib/hbase/lib/hadoop-core.jar:/usr/lib/hbase/lib/hbase-0.90.1-cdh3u0.jar:/usr/lib/hbase/lib/jackson-core-asl-1.5.2.jar:/usr/lib/hbase/lib/jackson-jaxrs-1.5.5.jar:/usr/lib/hbase/lib/jackson-mapper-asl-1.5.2.jar:/usr/lib/hbase/lib/jackson-xc-1.5.5.jar:/usr/lib/hbase/lib/jasper-compiler-5.5.23.jar:/usr/lib/hbase/lib/jasper-runtime-5.5.23.jar:/usr/lib/hbase/lib/jaxb-api-2.1.jar:/usr/lib/hbase/lib/jaxb-impl-2.1.12.jar:/usr/lib/hbase/lib/jersey-core-1.4.jar:/usr/lib/hbase/lib/jersey-json-1.4.jar:/usr/lib/hbase/lib/jersey-server-1.4.jar:/usr/lib/hbase/lib/jettison-1.1.jar:/usr/lib/hbase/lib/jetty-6.1.26.jar:/usr/lib/hbase/lib/jetty-util-6.1.26.jar:/usr/lib/hbase/lib/jruby-complete-1.0.3.jar:/usr/lib/hbase/lib/jsp-2.1-6.1.14.jar:/usr/lib/hbase/lib/jsp-api-2.1-6.1.14.jar:/usr/lib/hbase/lib/jsp-api-2.1.jar:/usr/lib/hbase/lib/jsr311-api-1.1.1.jar:/usr/lib/hbase/lib/log4j-1.2.16.jar:/usr/lib/hbase/lib/protobuf-java-2.3.0.jar:/usr/lib/hbase/lib/servlet-api-2.5-6.1.14.jar:/usr/lib/hbase/lib/servlet-api-2.5.jar:/usr/lib/hbase/lib/slf4j-api-1.5.8.jar:/usr/lib/hbase/lib/slf4j-log4j12-1.5.8.jar:/usr/lib/hbase/lib/stax-api-1.0.1.jar:/usr/lib/hbase/lib/thrift-0.2.0.jar:/usr/lib/hbase/lib/xmlenc-0.52.jar:/usr/lib/hbase/lib/zookeeper.jar:/usr/lib/zookeeper/zookeeper-3.3.3-cdh3u0.jar:/usr/lib/zookeeper/zookeeper.jar:/usr/lib/zookeeper/lib/jline-0.9.94.ja
r:/usr/lib/zookeeper/lib/log4j-1.2.15.jar:/usr/lib/sqoop/sqoop-1.2.0-cdh3u0.jar:/usr/lib/sqoop/sqoop-test-1.2.0-cdh3u0.jar:<somepath>/lib/tdgssconfig.jar:<somepath>/lib/terajdbc4.jar:/usr/lib/hadoop/hadoop-0.20.2-cdh3u0-core.jar:/usr/lib/sqoop/sqoop-1.2.0-cdh3u0.jar
>> Note:
>> /tmp/sqoop-hadoop/compile/cc68f6e38603d6705ad75a3d8ea1bf3a/DB_Temp_Table.java
>> uses or overrides a deprecated API.
>> Note: Recompile with -Xlint:deprecation for details.
>> 11/08/30 22:59:52 INFO orm.CompilationManager: Writing jar file:
>> /tmp/sqoop-hadoop/compile/cc68f6e38603d6705ad75a3d8ea1bf3a/DB.Temp_Table.jar
>> 11/08/30 22:59:52 DEBUG orm.CompilationManager: Scanning for .class files in
>> directory: /tmp/sqoop-hadoop/compile/cc68f6e38603d6705ad75a3d8ea1bf3a
>> 11/08/30 22:59:52 DEBUG orm.CompilationManager: Got classfile:
>> /tmp/sqoop-hadoop/compile/cc68f6e38603d6705ad75a3d8ea1bf3a/DBTemp_Table.class
>> -> DB_Temp_Table.class
>> 11/08/30 22:59:52 DEBUG orm.CompilationManager: Finished writing jar file
>> /tmp/sqoop-hadrdev/compile/cc68f6e38603d6705ad75a3d8ea1bf3a/DB.Temp_Table.jar
>> 11/08/30 22:59:52 INFO mapreduce.ExportJobBase: Beginning export of
>> DB.Temp_Table
>> 11/08/30 22:59:52 DEBUG mapreduce.JobBase: Using InputFormat: class
>> com.cloudera.sqoop.mapreduce.ExportInputFormat
>> 11/08/30 22:59:52 DEBUG manager.SqlManager: Using fetchSize for next query:
>> 1000
>> 11/08/30 22:59:52 INFO manager.SqlManager: Executing SQL statement: SELECT
>> t.* FROM DB.Temp_Table AS t WHERE 1=0
>> 11/08/30 22:59:53 DEBUG mapreduce.JobBase: Adding to job classpath:
>> file:/usr/lib/sqoop/sqoop-1.2.0-cdh3u0.jar
>> 11/08/30 22:59:53 DEBUG mapreduce.JobBase: Adding to job classpath:
>> file:<somepath>/lib/terajdbc4.jar
>> 11/08/30 22:59:53 DEBUG mapreduce.JobBase: Adding to job classpath:
>> file:/usr/lib/sqoop/sqoop-1.2.0-cdh3u0.jar
>> 11/08/30 22:59:53 DEBUG mapreduce.JobBase: Adding to job classpath:
>> file:/usr/lib/sqoop/sqoop-1.2.0-cdh3u0.jar
>> 11/08/30 22:59:53 DEBUG mapreduce.JobBase: Adding to job classpath:
>> file:/usr/lib/sqoop/lib/ivy-2.0.0-rc2.jar
>> 11/08/30 22:59:53 DEBUG mapreduce.JobBase: Adding to job classpath:
>> file:/usr/lib/sqoop/lib/commons-io-1.4.jar
>> 11/08/30 22:59:53 DEBUG mapreduce.JobBase: Adding to job classpath:
>> file:/usr/lib/sqoop/lib/ant-eclipse-1.0-jvm1.2.jar
>> 11/08/30 22:59:53 DEBUG mapreduce.JobBase: Adding to job classpath:
>> file:/usr/lib/sqoop/lib/sqljdbc4.jar
>> 11/08/30 22:59:53 DEBUG mapreduce.JobBase: Adding to job classpath:
>> file:/usr/lib/sqoop/lib/ant-contrib-1.0b3.jar
>> 11/08/30 22:59:53 DEBUG mapreduce.JobBase: Adding to job classpath:
>> file:/usr/lib/sqoop/lib/hadoop-mrunit-0.20.2-CDH3b2-SNAPSHOT.jar
>> 11/08/30 22:59:53 INFO hdfs.DFSClient: Created HDFS_DELEGATION_TOKEN token
>> 795 for hadoop
>> 11/08/30 22:59:53 INFO security.TokenCache: Got dt for
>> hdfs://<cname>:9000/tmp/hadoop-mapred/mapred/staging/hadoop/.staging/job_201107010928_0398/libjars/hadoop-mrunit-0.20.2-CDH3b2-SNAPSHOT.jar;uri=xx.xxx.xx.xx:9000;t.service=xx.xxx.xx.xx:9000
>> 11/08/30 22:59:53 INFO input.FileInputFormat: Total input paths to process :
>> 1
>> 11/08/30 22:59:53 DEBUG mapreduce.ExportInputFormat: Target numMapTasks=1
>> 11/08/30 22:59:53 DEBUG mapreduce.ExportInputFormat: Total input bytes=18
>> 11/08/30 22:59:53 DEBUG mapreduce.ExportInputFormat: maxSplitSize=18
>> 11/08/30 22:59:53 INFO input.FileInputFormat: Total input paths to process :
>> 1
>> 11/08/30 22:59:53 DEBUG mapreduce.ExportInputFormat: Generated splits:
>> 11/08/30 22:59:53 DEBUG mapreduce.ExportInputFormat:
>> Paths:/user/hadrdev/sqoop_test.txt:0+18 Locations:
>> 11/08/30 22:59:53 INFO mapred.JobClient: Running job: job_201107010928_0398
>> 11/08/30 22:59:54 INFO mapred.JobClient:  map 0% reduce 0%
>> 11/08/30 23:00:01 INFO mapred.JobClient: Task Id :
>> attempt_201107010928_0398_m_000000_0, Status : FAILED
>> java.io.IOException: java.lang.NullPointerException
>>         at
>> com.cloudera.sqoop.mapreduce.ExportOutputFormat.getRecordWriter(ExportOutputFormat.java:80)
>>         at
>> org.apache.hadoop.mapred.MapTask$NewDirectOutputCollector.<init>(MapTask.java:520)
>>         at org.apache.hadoop.mapred.MapTask.runNewMapper(MapTask.java:635)
>>         at org.apache.hadoop.mapred.MapTask.run(MapTask.java:322)
>>         at org.apache.hadoop.mapred.Child$4.run(Child.java:268)
>>         at java.security.AccessController.doPrivileged(Native Method)
>>         at javax.security.auth.Subject.doAs(Subject.java:396)
>>         at
>> org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1115)
>>         at org.apache.hadoop.mapred.Child.main(Child.java:262)
>> Caused by: java.lang.NullPointerException
>>         at
>> com.teradata.tdgss.jtdgss.TdgssConfigApi.GetMechanisms(DashoA1*..)
>>         at com.teradata.tdgss.jtdgss.TdgssManager.<init>(DashoA1*..)
>>         at com.teradata.tdgss.jtdgss.TdgssManager.getInstance(DashoA1*..)
>>         at
>> com.teradata.jdbc.jdbc.GenericTeraEncrypt.getGSSM(GenericTeraEncrypt.java:612)
>>         at
>> com.teradata.jdbc.jdbc.GenericTeraEncrypt.getConfig(GenericTeraEncrypt.java:630)
>>         at
>> com.teradata.jdbc.jdbc.GenericTeraEncrypt.getUserNameForOid(GenericTeraEncrypt.java:723)
>>         at com.teradata.jdbc.AuthMechanism.<init>(AuthMechanism.java:50)
>>         at
>> com.teradata.jdbc.jdbc.GenericInitDBConfigState.action(GenericInitDBConfigState.java:105)
>>         at
>> com.teradata.jdbc.jdbc.GenericLogonController.run(GenericLogonController.java:49)
>>         at com.teradata.jdbc.jdbc_4.TDSession.<init>(TDSession.java:199)
>>         at
>> com.teradata.jdbc.jdbc_3.ifjdbc_4.TeraLocalConnection.<init>(TeraLocalConnection.java:95)
>>         at
>> com.teradata.jdbc.jdbc.ConnectionFactory.createConnection(ConnectionFactory.java:54)
>>         at com.teradata.jdbc.TeraDriver.doConnect(TeraDriver.java:217)
>>         at com.teradata.jdbc.TeraDriver.connect(TeraDriver.java:150)
>>         at java.sql.DriverManager.getConnection(DriverManager.java:582)
>>         at java.sql.DriverManager.getConnection(DriverManager.java:185)
>>         at
>> com.cloudera.sqoop.mapreduce.db.DBConfiguration.getConnection(DBConfiguration.java:184)
>>         at
>> com.cloudera.sqoop.mapreduce.AsyncSqlRecordWriter.<init>(AsyncSqlRecordWriter.java:73)
>>         at
>> com.cloudera.sqoop.mapreduce.ExportOutputFormat$ExportRecordWriter.<init>(ExportOutputFormat.java:96)
>>         at
>> com.cloudera.sqoop.mapreduce.ExportOutputFormat.getRecordWriter(ExportOutputFormat.java:78)
>>         ... 8 more
>> 
>> attempt_201107010928_0398_m_000000_0: GSSException: Failure unspecified at
>> GSS-API level (Mechanism level: UserFile parameter null)
>> attempt_201107010928_0398_m_000000_0:   at
>> com.teradata.tdgss.jtdgss.TdgssParseXml.<init>(DashoA1*..)
>> attempt_201107010928_0398_m_000000_0:   at
>> com.teradata.tdgss.jtdgss.TdgssConfigApi.<init>(DashoA1*..)
>> attempt_201107010928_0398_m_000000_0:   at
>> com.teradata.tdgss.jtdgss.TdgssManager.<init>(DashoA1*..)
>> attempt_201107010928_0398_m_000000_0:   at
>> com.teradata.tdgss.jtdgss.TdgssManager.getInstance(DashoA1*..)
>> attempt_201107010928_0398_m_000000_0:   at
>> com.teradata.jdbc.jdbc.GenericTeraEncrypt.getGSSM(GenericTeraEncrypt.java:612)
>> attempt_201107010928_0398_m_000000_0:   at
>> com.teradata.jdbc.jdbc.GenericTeraEncrypt.getConfig(GenericTeraEncrypt.java:630)
>> attempt_201107010928_0398_m_000000_0:   at
>> com.teradata.jdbc.jdbc.GenericTeraEncrypt.getUserNameForOid(
>> On Mon, Aug 29, 2011 at 2:19 PM, SRINIVAS SURASANI <va...@gmail.com> wrote:
>>> 
>>> Arvind,
>>> 
>>> I have subscribed at sqoop-user@incubator.apache.org and posted Question .
>>> Sorry for the inconvinence from my end, since I'm close to deadline I'm
>>> taking your valuable time.
>>> 
>>> sqoop list-tables --driver com.teradata.jdbc.TeraDriver --connect
>>> jdbc:teradata://PKTD/E1_CMS_WORK --username srini -P
>>> I am getting the following error:
>>> 11/08/29 13:08:03 INFO manager.SqlManager: Using default fetchSize of 1000
>>> GSSException: Failure unspecified at GSS-API level (Mechanism level:
>>> UserFile parameter null)
>>>         at com.teradata.tdgss.jtdgss.TdgssParseXml.<init>(DashoA1*..)
>>>         at com.teradata.tdgss.jtdgss.TdgssConfigApi.<init>(DashoA1*..)
>>>         at com.teradata.tdgss.jtdgss.TdgssManager.<init>(DashoA1*..)
>>>         at com.teradata.tdgss.jtdgss.TdgssManager.getInstance(DashoA1*..)
>>>         at
>>> com.teradata.jdbc.jdbc.GenericTeraEncrypt.getGSSM(GenericTeraEncrypt.java:612)
>>>         at
>>> com.teradata.jdbc.jdbc.GenericTeraEncrypt.getConfig(GenericTeraEncrypt.java:630)
>>>         at
>>> com.teradata.jdbc.jdbc.GenericTeraEncrypt.getUserNameForOid(GenericTeraEncrypt.java:723)
>>>         at com.teradata.jdbc.AuthMechanism.<init>(AuthMechanism.java:50)
>>>         at
>>> com.teradata.jdbc.jdbc.GenericInitDBConfigState.action(GenericInitDBConfigState.java:105)
>>>         at
>>> com.teradata.jdbc.jdbc.GenericLogonController.run(GenericLogonController.java:49)
>>>         at com.teradata.jdbc.jdbc_4.TDSession.<init>(TDSession.java:199)
>>>         at
>>> com.teradata.jdbc.jdbc_3.ifjdbc_4.TeraLocalConnection.<init>(TeraLocalConnection.java:95)
>>>         at
>>> com.teradata.jdbc.jdbc.ConnectionFactory.createConnection(ConnectionFactory.java:54)
>>>         at com.teradata.jdbc.TeraDriver.doConnect(TeraDriver.java:217)
>>>         at com.teradata.jdbc.TeraDriver.connect(TeraDriver.java:150)
>>>         at java.sql.DriverManager.getConnection(DriverManager.java:582)
>>>         at java.sql.DriverManager.getConnection(DriverManager.java:185)
>>>         at
>>> com.cloudera.sqoop.manager.SqlManager.makeConnection(SqlManager.java:643)
>>>         at
>>> com.cloudera.sqoop.manager.GenericJdbcManager.getConnection(GenericJdbcManager.java:51)
>>>         at
>>> com.cloudera.sqoop.manager.SqlManager.listTables(SqlManager.java:270)
>>>         at
>>> com.cloudera.sqoop.tool.ListTablesTool.run(ListTablesTool.java:49)
>>>         at com.cloudera.sqoop.Sqoop.run(Sqoop.java:144)
>>>         at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:65)
>>>         at com.cloudera.sqoop.Sqoop.runSqoop(Sqoop.java:180)
>>>         at com.cloudera.sqoop.Sqoop.runTool(Sqoop.java:219)
>>>         at com.cloudera.sqoop.Sqoop.runTool(Sqoop.java:228)
>>>         at com.cloudera.sqoop.Sqoop.main(Sqoop.java:237)
>>> 11/08/29 13:08:04 ERROR sqoop.Sqoop: Got exception running Sqoop:
>>> java.lang.NullPointerException
>>> java.lang.NullPointerException
>>>         at
>>> com.teradata.tdgss.jtdgss.TdgssConfigApi.GetMechanisms(DashoA1*..)
>>>         at com.teradata.tdgss.jtdgss.TdgssManager.<init>(DashoA1*..)
>>>         at com.teradata.tdgss.jtdgss.TdgssManager.getInstance(DashoA1*..)
>>>         at
>>> com.teradata.jdbc.jdbc.GenericTeraEncrypt.getGSSM(GenericTeraEncrypt.java:612)
>>>         at
>>> com.teradata.jdbc.jdbc.GenericTeraEncrypt.getConfig(GenericTeraEncrypt.java:630)
>>>         at
>>> com.teradata.jdbc.jdbc.GenericTeraEncrypt.getUserNameForOid(GenericTeraEncrypt.java:723)
>>>         at com.teradata.jdbc.AuthMechanism.<init>(AuthMechanism.java:50)
>>>         at
>>> com.teradata.jdbc.jdbc.GenericInitDBConfigState.action(GenericInitDBConfigState.java:105)
>>>         at
>>> com.teradata.jdbc.jdbc.GenericLogonController.run(GenericLogonController.java:49)
>>>         at com.teradata.jdbc.jdbc_4.TDSession.<init>(TDSession.java:199)
>>>         at
>>> com.teradata.jdbc.jdbc_3.ifjdbc_4.TeraLocalConnection.<init>(TeraLocalConnection.java:95)
>>>         at
>>> com.teradata.jdbc.jdbc.ConnectionFactory.createConnection(ConnectionFactory.java:54)
>>>         at com.teradata.jdbc.TeraDriver.doConnect(TeraDriver.java:217)
>>>         at com.teradata.jdbc.TeraDriver.connect(TeraDriver.java:150)
>>>         at java.sql.DriverManager.getConnection(DriverManager.java:582)
>>>         at java.sql.DriverManager.getConnection(DriverManager.java:185)
>>>         at
>>> com.cloudera.sqoop.manager.SqlManager.makeConnection(SqlManager.java:643)
>>>         at
>>> com.cloudera.sqoop.manager.GenericJdbcManager.getConnection(GenericJdbcManager.java:51)
>>>         at
>>> com.cloudera.sqoop.manager.SqlManager.listTables(SqlManager.java:270)
>>>         at
>>> com.cloudera.sqoop.tool.ListTablesTool.run(ListTablesTool.java:49)
>>>         at com.cloudera.sqoop.Sqoop.run(Sqoop.java:144)
>>>         at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:65)
>>>         at com.cloudera.sqoop.Sqoop.runSqoop(Sqoop.java:180)
>>>         at com.cloudera.sqoop.Sqoop.runTool(Sqoop.java:219)
>>>         at com.cloudera.sqoop.Sqoop.runTool(Sqoop.java:228)
>>>         at com.cloudera.sqoop.Sqoop.main(Sqoop.java:237)
>>> Thanks,
>>> Srini
>>> 
>>> 
>>> 
>>> On Mon, Aug 29, 2011 at 12:58 PM, arvind@cloudera.com
>>> <ar...@cloudera.com> wrote:
>>>> 
>>>> [Please subscribe and respond to sqoop-user@incubator.apache.org]
>>>> 
>>>> Please use HADOOP_CLASSPATH instead of CLASSPATH. Also, in order to
>>>> use the generic JDBC connector, you will have to specify the driver
>>>> class explicitly via the command line option --driver
>>>> com.teradata.jdbc.TeraDriver.
>>>> 
>>>> Thanks,
>>>> Arvind
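
[Editor's note: Arvind's two suggestions above (HADOOP_CLASSPATH plus an explicit
--driver) can be combined into one invocation. This is a sketch only; the jar
paths are placeholders, and the connection details are taken from Srinivas's
later mails in this thread, not a verified setup.]

```shell
# Make the Teradata JDBC driver visible to Sqoop's client-side JVM.
# Both jars ship with the Teradata JDBC driver distribution; paths are hypothetical.
export HADOOP_CLASSPATH=/path/to/terajdbc4.jar:/path/to/tdgssconfig.jar

# The generic JDBC connector cannot guess the driver class from the URL,
# so it must be named explicitly with --driver.
sqoop list-tables \
  --driver com.teradata.jdbc.TeraDriver \
  --connect jdbc:teradata://PKTD/E1_CMS_WORK \
  --username srini -P
```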
>>>> 
>>>> On Mon, Aug 29, 2011 at 9:53 AM, SRINIVAS SURASANI <va...@gmail.com>
>>>> wrote:
>>>>> Arvind,
>>>>> I have set the classpath to terajdbc4.jar [I have not placed
>>>>> terajdbc4.jar in the Sqoop lib directory, as I don't have permissions].
>>>>> I'm getting the following error
>>>>> 
>>>>> sqoop list-tables --connect jdbc:teradata://PKTD/E1_CMS_WORK --username
>>>>> srini -P
>>>>> ERROR: tool.BaseSqoopTool: Got error creating database manager:
>>>>> java.io.IOException: No manager for connect string:
>>>>> jdbc:teradata:///PKTD/E1_CMS_WORK
>>>>>    at com.cloudera.sqoop.ConnFactory.getManager(ConnFactory.java:119)
>>>>>    at
>>>>> com.cloudera.sqoop.tool.BaseSqoopTool.init(BaseSqoopTool.java:186)
>>>>>    ...
>>>>>    ...
>>>>> Thanks,
>>>>> Srini
>>>>> 
>>>>> 
>>>>> 
>>>>> On Mon, Aug 29, 2011 at 8:52 AM, SRINIVAS SURASANI <va...@gmail.com>
>>>>> wrote:
>>>>>> 
>>>>>> Thanks a lot, Arvind.
>>>>>> 
>>>>>> On Mon, Aug 29, 2011 at 8:45 AM, arvind@cloudera.com
>>>>>> <ar...@cloudera.com>
>>>>>> wrote:
>>>>>>> 
>>>>>>> [Moving the thread to sqoop-user@incubator.apache.org]
>>>>>>> 
>>>>>>> Hi Srini,
>>>>>>> 
>>>>>>> You should be able to use the generic JDBC connector to import/export
>>>>>>> from Teradata. There is also a specialized connector that is
>>>>>>> available
>>>>>>> for use with Teradata if you are interested. This connector is not a
>>>>>>> part of Sqoop and can be obtained from Cloudera by going to:
>>>>>>> 
>>>>>>> http://www.cloudera.com/partners/connectors/
>>>>>>> 
>>>>>>> Thanks,
>>>>>>> Arvind
>>>>>>> 
>>>>>>> On Mon, Aug 29, 2011 at 8:17 AM, SRINIVAS SURASANI <va...@gmail.com>
>>>>>>> wrote:
>>>>>>>> I have a CSV file in Hadoop and am looking to load it into Teradata. I
>>>>>>>> was wondering whether Sqoop works with Teradata (with the JDBC jar
>>>>>>>> placed in the Sqoop lib dir).
>>>>>>>> 
>>>>>>>> Regards
>>>>>>>> Srini
>>>>>>>> 
>>>>>>>> --
>>>>>>>> NOTE: The mailing list sqoop-user@cloudera.org is deprecated in
>>>>>>>> favor
>>>>>>>> of Apache Sqoop mailing list sqoop-user@incubator.apache.org.
>>>>>>>> Please
>>>>>>>> subscribe to it by sending an email to
>>>>>>>> incubator-sqoop-user-subscribe@apache.org.
>>>>>>>> 
>>>>>>> 
>>>>>> 
>>>>> 
>>>>> 
>>>> 
>>> 
>> 
>> 

Re: [sqoop-user] Sqoop-with Terradata

Posted by Arvind Prabhakar <ar...@apache.org>.
Srini,

One thing you can try is building and using the latest Sqoop from the
trunk. Also when you do run Sqoop, specify the --verbose option to
produce detailed output that is helpful in finding what could be going
wrong.

Thanks,
Arvind
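
[Editor's note: concretely, re-running the failing export from this thread with
--verbose would look like the sketch below. The connection details are the
poster's; the trunk build step is assumed, and bin/sqoop stands for whatever
launcher the build produces.]

```shell
# Re-run the export from a trunk build with detailed logging enabled.
# --verbose is a Sqoop tool option, so it goes after the "export" tool name.
bin/sqoop export --verbose \
  --driver com.teradata.jdbc.TeraDriver \
  --connect jdbc:teradata://PTD/EW1_CMTS_WORK \
  --username EDWBD_CMTS -P \
  --table EW1_CMS_WORK.TMRB_TEST \
  --export-dir /user/hadrdev/sqoop_msrb_test.txt \
  --fields-terminated-by ',' --lines-terminated-by '\n'
```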

On Fri, Sep 2, 2011 at 2:21 PM, SRINIVAS SURASANI <va...@gmail.com> wrote:
> Arvind,
> Now I'm getting a strange error. I made sure the table has an equal number of
> attributes and the data is not corrupted.
> 1st error:
> sqoop export -libjar/jdbc/13.00.00.07/lib/tdgssconfig.jar --verbose --driver
> com.teradata.jdbc.TeraDriver --connect jdbc:teradata://PTD/EW1_CMTS_WORK
> --username EDWBD_CMTS --password  --table EW1_CMS_WORK.TMRB_TEST
> --export-dir /user/hadrdev/sqoop_msrb_test.txt --fields-terminated-by ','
> --lines-terminated-by '\n'
>
>
> 11/09/02 16:25:14 DEBUG manager.SqlManager: Using fetchSize for next query:
> 1000
> 11/09/02 16:25:14 INFO manager.SqlManager: Executing SQL statement: SELECT
> t.* FROM EW1_CMS_WORK.TMRB_TEST AS t WHERE 1=0
> 11/09/02 16:25:14 DEBUG mapreduce.JobBase: Adding to job classpath:
> file:/usr/lib/sqoop/sqoop-1.2.0-cdh3u0.jar
> 11/09/02 16:25:14 DEBUG mapreduce.JobBase: Adding to job classpath:
> file:/jdbc/13.00.00.07/common/lib/terajdbc4.jar
> 11/09/02 16:25:14 DEBUG mapreduce.JobBase: Adding to job classpath:
> file:/usr/lib/sqoop/sqoop-1.2.0-cdh3u0.jar
> 11/09/02 16:25:14 DEBUG mapreduce.JobBase: Adding to job classpath:
> file:/usr/lib/sqoop/sqoop-1.2.0-cdh3u0.jar
> 11/09/02 16:25:14 DEBUG mapreduce.JobBase: Adding to job classpath:
> file:/usr/lib/sqoop/lib/ivy-2.0.0-rc2.jar
> 11/09/02 16:25:14 DEBUG mapreduce.JobBase: Adding to job classpath:
> file:/usr/lib/sqoop/lib/commons-io-1.4.jar
> 11/09/02 16:25:14 DEBUG mapreduce.JobBase: Adding to job classpath:
> file:/usr/lib/sqoop/lib/ant-eclipse-1.0-jvm1.2.jar
> 11/09/02 16:25:14 DEBUG mapreduce.JobBase: Adding to job classpath:
> file:/usr/lib/sqoop/lib/sqljdbc4.jar
> 11/09/02 16:25:14 DEBUG mapreduce.JobBase: Adding to job classpath:
> file:/usr/lib/sqoop/lib/ant-contrib-1.0b3.jar
> 11/09/02 16:25:14 DEBUG mapreduce.JobBase: Adding to job classpath:
> file:/usr/lib/sqoop/lib/hadoop-mrunit-0.20.2-CDH3b2-SNAPSHOT.jar
> 11/09/02 16:25:14 INFO hdfs.DFSClient: Created HDFS_DELEGATION_TOKEN token
> 922 for hadrdev
> 11/09/02 16:25:14 INFO security.TokenCache: Got dt for
> hdfs://idoop:9000/tmp/hadoop-mapred/mapred/staging/hadrdev/.staging/job_201108311434_0051/libjars/tdgssconfig.jar;uri=10.128.225.1:9000;t.service=10.128.225.1:9000
> 11/09/02 16:25:14 INFO input.FileInputFormat: Total input paths to process :
> 1
> 11/09/02 16:25:14 DEBUG mapreduce.ExportInputFormat: Target numMapTasks=4
> 11/09/02 16:25:14 DEBUG mapreduce.ExportInputFormat: Total input
> bytes=7854508
> 11/09/02 16:25:14 DEBUG mapreduce.ExportInputFormat: maxSplitSize=1963627
> 11/09/02 16:25:14 INFO input.FileInputFormat: Total input paths to process :
> 1
> 11/09/02 16:25:14 DEBUG mapreduce.ExportInputFormat: Generated splits:
> 11/09/02 16:25:14 DEBUG mapreduce.ExportInputFormat:
> Paths:/user/hadrdev/sqoop_msrb_test.txt:0+7854508 Locations:idoop.com:;
> 11/09/02 16:25:15 INFO mapred.JobClient: Running job: job_201108311434_0051
> 11/09/02 16:25:16 INFO mapred.JobClient:  map 0% reduce 0%
> 11/09/02 16:25:24 INFO mapred.JobClient: Task Id :
> attempt_201108311434_0051_m_000000_0, Status : FAILED
> java.io.IOException: com.teradata.jdbc.jdbc_4.util.JDBCException: [Teradata
> Database] [TeraJDBC 13.00.00.07] [Error 3706] [SQLState 42000] Syntax error:
> expected something between ')' and ','.
>         at
> com.cloudera.sqoop.mapreduce.AsyncSqlRecordWriter.write(AsyncSqlRecordWriter.java:217)
>         at
> com.cloudera.sqoop.mapreduce.AsyncSqlRecordWriter.write(AsyncSqlRecordWriter.java:45)
>         at
> org.apache.hadoop.mapred.MapTask$NewDirectOutputCollector.write(MapTask.java:530)
>         at
> org.apache.hadoop.mapreduce.TaskInputOutputContext.write(TaskInputOutputContext.java:80)
>         at
> com.cloudera.sqoop.mapreduce.TextExportMapper.map(TextExportMapper.java:80)
>         at
> com.cloudera.sqoop.mapreduce.TextExportMapper.map(TextExportMapper.java:38)
>         at org.apache.hadoop.mapreduce.Mapper.run(Mapper.java:144)
>         at
> com.cloudera.sqoop.mapreduce.AutoProgressMapper.run(AutoProgressMapper.java:187)
>         at org.apache.hadoop.mapred.MapTask.runNewMapper(MapTask.java:646)
>         at org.apache.hadoop.mapred.MapTask.run(MapTask.java:322)
>         at org.apache.hadoop.mapred.Child$4.run(Child.java:268)
>         at java.security.AccessController.doPrivileged(Native Method)
>         at javax.security.auth.Subject.doAs(Subject.java:396)
>         at
> org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1115)
>         at org.apache.hadoop.mapred.Child.main(Child.java:262)
> Caused by: com.teradata.jdbc.jdbc_4.util.JDBCException: [Teradata Database]
> [TeraJDBC 13.00.00.07] [Error 3706] [SQLState 42000] Syntax error: expected
> something between ')' and ','.
>         at
> com.teradata.jdbc.jdbc_4.util.ErrorFactory.makeDatabaseSQLException(ErrorFactory.java:288)
>         at
> com.teradata.jdbc.jdbc_4.statemachine.ReceiveInitSubState.action(ReceiveInitSubState.java:102)
>         at
> com.teradata.jdbc.jdbc_4.statemachine.StatementReceiveState.subStateMachine(StatementReceiveState.java:285)
>         at
> com.teradata.jdbc.jdbc_4.statemachine.StatementReceiveState.action(StatementReceiveState.java:176)
>         at
> com.teradata.jdbc.jdbc_4.statemachine.StatementController.runBody(StatementController.java:108)
>         at
> com.teradata.jdbc.jdbc_4.statemachine.StatementController.run(StatementController.java:99)
>         at
> com.teradata.jdbc.jdbc_4.Statement.executeStatement(Statement.java:331)
>         at
> com.teradata.jdbc.jdbc_4.Statement.prepareRequest(Statement.java:491)
>         at
> com.teradata.jdbc.jdbc_4.PreparedStatement.<init>(PreparedStatement.java:56)
>         at
> com.teradata.jdbc.jdbc_4.TDSession.createPreparedStatement(TDSession.java:689)
>         at
> com.teradata.jdbc.jdbc_3.ifjdbc_4.TeraLocalPreparedStatement.<init>(TeraLocalPreparedStatement.java:84)
>         at
> com.teradata.jdbc.jdbc_3.ifjdbc_4.TeraLocalConnection.prepareStatement(TeraLocalConnection.java:327)
>         at
> com.teradata.jdbc.jdbc_3.ifjdbc_4.TeraLocalConnection.prepareStatement(TeraLocalConnection.java:148)
>         at
> com.cloudera.sqoop.mapreduce.ExportOutputFormat$ExportRecordWriter.getPreparedStatement(ExportOutputFormat.java:142)
>         at
> com.cloudera.sqoop.mapreduce.AsyncSqlRecordWriter.execUpdate(AsyncSqlRecordWriter.java:146)
>         at
> com.cloudera.sqoop.mapreduce.AsyncSqlRecordWriter.write(AsyncSqlRecordWriter.java:212)
>
> Second error, for another command of the same kind:
> here the mapper attempt is retried three times and the job fails.
> duce.ExportInputFormat: Total input bytes=24
> 11/09/02 16:34:24 DEBUG mapreduce.ExportInputFormat: maxSplitSize=24
> 11/09/02 16:34:24 INFO input.FileInputFormat: Total input paths to process :
> 1
> 11/09/02 16:34:24 DEBUG mapreduce.ExportInputFormat: Generated splits:
> 11/09/02 16:34:24 DEBUG mapreduce.ExportInputFormat:
> Paths:/user/hadrdev/sqoop_test.txt:0+24 Locations:idoop.ms.com:;
> 11/09/02 16:34:25 INFO mapred.JobClient: Running job: job_201108311434_0052
> 11/09/02 16:34:26 INFO mapred.JobClient:  map 0% reduce 0%
> 11/09/02 16:34:37 INFO mapred.JobClient:  map 100% reduce 0%
> 11/09/02 16:44:37 INFO mapred.JobClient:  map 0% reduce 0%
> 11/09/02 16:44:37 INFO mapred.JobClient: Task Id :
> attempt_201108311434_0052_m_000000_0, Status : FAILED
> Task attempt_201108311434_0052_m_000000_0 failed to report status for 600
> seconds. Killing!
> 11/09/02 16:44:46 INFO mapred.JobClient:  map 100% reduce 0%
> 11/09/02 16:54:46 INFO mapred.JobClient:  map 0% reduce 0%
> 11/09/02 16:54:46 INFO mapred.JobClient: Task Id :
> attempt_201108311434_0052_m_000000_1, Status : FAILED
> Task attempt_201108311434_0052_m_000000_1 failed to report status for 600
> seconds. Killing!
> 11/09/02 16:54:56 INFO mapred.JobClient:  map 100% reduce 0%
> 11/09/02 17:04:56 INFO mapred.JobClient:  map 0% reduce 0%
> 11/09/02 17:04:56 INFO mapred.JobClient: Task Id :
> attempt_201108311434_0052_m_000000_2, Status : FAILED
> Task attempt_201108311434_0052_m_000000_2 failed to report status for 600
> seconds. Killing!
> 11/09/02 17:05:07 INFO mapred.JobClient:  map 100% reduce 0%
>
> Thanks as always, Arvind; you are giving your valuable time to help.
>
>
> On Fri, Sep 2, 2011 at 2:12 AM, Arvind Prabhakar <ar...@apache.org> wrote:
>>
>> Please try specifying the extra Jar file using the -libjars argument. This
>> is a generic Hadoop argument that Sqoop passes down to the framework
>> and should allow the inclusion of other jar files in the classpath.
>> Note that this must be specified before any Sqoop specific argument is
>> given. For example:
>>
>> $ bin/sqoop import -libjars /path/to/gssjar --connect "..."
>>
>> Thanks,
>> Arvind
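>>
>> [Editor's note: the ordering constraint above matters because -libjars is a
>> Hadoop generic option, not a Sqoop option. A sketch using the jar path from
>> this thread; the connection details are the poster's, not a verified setup.]

```shell
# Hadoop generic args (-libjars) come immediately after the tool name,
# before any Sqoop-specific option; multiple jars are comma-separated.
sqoop export \
  -libjars /jdbc/13.00.00.07/lib/tdgssconfig.jar \
  --driver com.teradata.jdbc.TeraDriver \
  --connect jdbc:teradata://PTD/EW1_CMTS_WORK \
  --username EDWBD_CMTS -P \
  --table EW1_CMS_WORK.TMRB_TEST \
  --export-dir /user/hadrdev/sqoop_msrb_test.txt
```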
>>
>> On Thu, Sep 1, 2011 at 8:54 PM, SRINIVAS SURASANI <va...@gmail.com>
>> wrote:
>> > Arvind,
>> >
>> > I understand that the GSS config jar needs to be placed in the Sqoop lib
>> > directory. I was wondering whether there is any alternative way to achieve
>> > this [meaning, how is terajdbc4.jar added to the Distributed Cache
>> > automatically before launching MapReduce].
>> >
>> > Thanks,
>> > Srini
>> >
>> >
>> >
>> > On Wed, Aug 31, 2011 at 1:44 PM, Arvind Prabhakar <ar...@apache.org>
>> > wrote:
>> >>
>> >> Srini,
>> >>
>> >> This is happening because the GSS config Jar is not getting put in
>> >> Distributed Cache. Sqoop only puts certain jars in the cache as
>> >> opposed to putting every jar that exists in its classpath. In order to
>> >> force any Jar to be put in the Distributed Cache, you must copy it
>> >> over to Sqoop's lib directory.
>> >>
>> >> Thanks,
>> >> Arvind
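>> >>
>> >> [Editor's note: a minimal sketch of the workaround above. The Sqoop
>> >> install location is taken from the classpath output later in this thread
>> >> (/usr/lib/sqoop) and may differ on other machines.]

```shell
# Jars in Sqoop's lib directory are shipped to the map tasks via the
# Distributed Cache, so the com.teradata.tdgss.* classes become visible
# inside the tasks, not just in the client JVM.
sudo cp /jdbc/13.00.00.07/lib/tdgssconfig.jar /usr/lib/sqoop/lib/
```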
>> >>
>> >> On Tue, Aug 30, 2011 at 9:36 PM, SRINIVAS SURASANI <va...@gmail.com>
>> >> wrote:
>> >> > Getting the error below while exporting. From my observation, while
>> >> > compiling the generated .java file Sqoop sets the classpath for both
>> >> > terajdbc4.jar and tdgssconfig.jar (as I marked in bold letters below),
>> >> > but just before launching MapReduce it adds only terajdbc4.jar to the
>> >> > job classpath, not tdgssconfig.jar.
>> >> > I set HADOOP_CLASSPATH=<path to>terajdbc4.jar:<path to>tdgssconfig.jar
>> >> > Any help appreciated.
>> >> > $ sqoop export --verbose --driver com.teradata.jdbc.TeraDriver
>> >> > --connect
>> >> > jdbc:teradata://TD/DB --username WBD -P --table DB.Temp_Table
>> >> > --export-dir
>> >> > /user/hadoop/sqoop_test.txt --fields-terminated-by ,
>> >> > --lines-terminated-by
>> >> > \n -m 1>
>> >> > 11/08/30 22:59:43 DEBUG tool.BaseSqoopTool: Enabled debug logging.
>> >> > Enter password:
>> >> > 11/08/30 22:59:50 DEBUG sqoop.ConnFactory: Loaded manager factory:
>> >> > com.cloudera.sqoop.manager.DefaultManagerFactory
>> >> > 11/08/30 22:59:50 DEBUG sqoop.ConnFactory: Trying ManagerFactory:
>> >> > com.cloudera.sqoop.manager.DefaultManagerFactory
>> >> > 11/08/30 22:59:50 INFO manager.SqlManager: Using default fetchSize of
>> >> > 1000
>> >> > 11/08/30 22:59:50 DEBUG sqoop.ConnFactory: Instantiated ConnManager
>> >> > com.cloudera.sqoop.manager.GenericJdbcManager@2b76e552
>> >> > 11/08/30 22:59:50 INFO tool.CodeGenTool: Beginning code generation
>> >> > 11/08/30 22:59:51 DEBUG manager.SqlManager: Using fetchSize for next
>> >> > query:
>> >> > 1000
>> >> > 11/08/30 22:59:51 INFO manager.SqlManager: Executing SQL statement:
>> >> > SELECT
>> >> > t.* FROM DB.Temp_Table AS t WHERE 1=0
>> >> > 11/08/30 22:59:51 DEBUG manager.SqlManager: Using fetchSize for next
>> >> > query:
>> >> > 1000
>> >> > 11/08/30 22:59:51 INFO manager.SqlManager: Executing SQL statement:
>> >> > SELECT
>> >> > t.* FROM DB.Temp_Table AS t WHERE 1=0
>> >> > 11/08/30 22:59:51 DEBUG orm.ClassWriter: selected columns:
>> >> > 11/08/30 22:59:51 DEBUG orm.ClassWriter:   NAME
>> >> > 11/08/30 22:59:51 DEBUG orm.ClassWriter:   SALARY
>> >> > 11/08/30 22:59:51 DEBUG orm.ClassWriter: Writing source file:
>> >> >
>> >> >
>> >> > /tmp/sqoop-hadoop/compile/cc68f6e38603d6705ad75a3d8ea1bf3a/DB_Temp_Table.java
>> >> > 11/08/30 22:59:51 DEBUG orm.ClassWriter: Table name:DB.Temp_Table
>> >> > 11/08/30 22:59:51 DEBUG orm.ClassWriter: Columns: NAME:12, SALARY:3,
>> >> > 11/08/30 22:59:51 DEBUG orm.ClassWriter: sourceFilename is
>> >> > DB_Temp_Table.java
>> >> > 11/08/30 22:59:51 DEBUG orm.CompilationManager: Found existing
>> >> > /tmp/sqoop-hadoop/compile/cc68f6e38603d6705ad75a3d8ea1bf3a/
>> >> > 11/08/30 22:59:51 INFO orm.CompilationManager: HADOOP_HOME is
>> >> > /usr/lib/hadoop
>> >> > 11/08/30 22:59:51 INFO orm.CompilationManager: Found hadoop core jar
>> >> > at:
>> >> > /usr/lib/hadoop/hadoop-0.20.2-cdh3u0-core.jar
>> >> > 11/08/30 22:59:51 DEBUG orm.CompilationManager: Adding source file:
>> >> >
>> >> >
>> >> > /tmp/sqoop-haoop/compile/cc68f6e38603d6705ad75a3d8ea1bf3a/DB_Temp_Table.java
>> >> > 11/08/30 22:59:51 DEBUG orm.CompilationManager: Invoking javac with
>> >> > args:
>> >> > 11/08/30 22:59:51 DEBUG orm.CompilationManager:   -sourcepath
>> >> > 11/08/30 22:59:51 DEBUG orm.CompilationManager:
>> >> > /tmp/sqoop-hadoop/compile/cc68f6e38603d6705ad75a3d8ea1bf3a/
>> >> > 11/08/30 22:59:51 DEBUG orm.CompilationManager:   -d
>> >> > 11/08/30 22:59:51 DEBUG orm.CompilationManager:
>> >> > /tmp/sqoop-hadoop/compile/cc68f6e38603d6705ad75a3d8ea1bf3a/
>> >> > 11/08/30 22:59:51 DEBUG orm.CompilationManager:   -classpath
>> >> > 11/08/30 22:59:51 DEBUG orm.CompilationManager:
>> >> >
>> >> >
>> >> > /usr/lib/hadoop:/usr/lib/hadoop/hadoop-core-0.20.2-cdh3u0.jar:/usr/lib/hadoop/lib/ant-contrib-1.0b3.jar:/usr/lib/hadoop/lib/aspectjrt-1.6.5.jar:/usr/lib/hadoop/lib/aspectjtools-1.6.5.jar:/usr/lib/hadoop/lib/commons-cli-1.2.jar:/usr/lib/hadoop/lib/commons-codec-1.4.jar:/usr/lib/hadoop/lib/commons-daemon-1.0.1.jar:/usr/lib/hadoop/lib/commons-el-1.0.jar:/usr/lib/hadoop/lib/commons-httpclient-3.0.1.jar:/usr/lib/hadoop/lib/commons-logging-1.0.4.jar:/usr/lib/hadoop/lib/commons-logging-api-1.0.4.jar:/usr/lib/hadoop/lib/commons-net-1.4.1.jar:/usr/lib/hadoop/lib/core-3.1.1.jar:/usr/lib/hadoop/lib/elephant-bird-1.0.jar:/usr/lib/hadoop/lib/hadoop-fairscheduler-0.20.2-cdh3u0.jar:/usr/lib/hadoop/lib/hadoop-lzo-0.4.8.jar:/usr/lib/hadoop/lib/hsqldb-1.8.0.10.jar:/usr/lib/hadoop/lib/jackson-core-asl-1.5.2.jar:/usr/lib/hadoop/lib/jackson-mapper-asl-1.5.2.jar:/usr/lib/hadoop/lib/jasper-compiler-5.5.12.jar:/usr/lib/hadoop/lib/jasper-runtime-5.5.12.jar:/usr/lib/hadoop/lib/jets3t-0.6.1.jar:/usr/lib/hadoop/lib/jetty-6.1.26.jar:/usr/lib/hadoop/lib/jetty-servlet-tester-6.1.26.jar:/usr/lib/hadoop/lib/jetty-util-6.1.26.jar:/usr/lib/hadoop/lib/jsch-0.1.42.jar:/usr/lib/hadoop/lib/junit-4.5.jar:/usr/lib/hadoop/lib/kfs-0.2.2.jar:/usr/lib/hadoop/lib/log4j-1.2.15.jar:/usr/lib/hadoop/lib/mockito-all-1.8.2.jar:/usr/lib/hadoop/lib/oro-2.0.8.jar:/usr/lib/hadoop/lib/protobuf-java-2.3.0.jar:/usr/lib/hadoop/lib/servlet-api-2.5-20081211.jar:/usr/lib/hadoop/lib/servlet-api-2.5-6.1.14.jar:/usr/lib/hadoop/lib/slf4j-api-1.4.3.jar:/usr/lib/hadoop/lib/slf4j-api-1.5.8.jar:/usr/lib/hadoop/lib/slf4j-log4j12-1.4.3.jar:/usr/lib/hadoop/lib/slf4j-log4j12-1.5.10.jar:/usr/lib/hadoop/lib/xmlenc-0.52.jar:/usr/lib/hadoop/lib/yamlbeans-0.9.3.jar:/usr/lib/hadoop/lib/jsp-2.1/jsp-2.1.jar:/usr/lib/hadoop/lib/jsp-2.1/jsp-api-2.1.jar:/usr/lib/sqoop/conf:/usr/lib/hbase/conf::/usr/lib/sqoop/lib/ant-contrib-1.0b3.jar:/usr/lib/sqoop/lib/ant-eclipse-1.0-jvm1.2.jar:/usr/lib/sqoop/lib/commons-io-1.4.jar:/usr/lib/sqoop/lib/hadoop-
mrunit-0.20.2-CDH3b2-SNAPSHOT.jar:/usr/lib/sqoop/lib/ivy-2.0.0-rc2.jar:/usr/lib/sqoop/lib/sqljdbc4.jar:/usr/lib/hbase/hbase-0.90.1-cdh3u0-tests.jar:/usr/lib/hbase/hbase-0.90.1-cdh3u0.jar:/usr/lib/hbase/lib/activation-1.1.jar:/usr/lib/hbase/lib/asm-3.1.jar:/usr/lib/hbase/lib/avro-1.3.3.jar:/usr/lib/hbase/lib/commons-cli-1.2.jar:/usr/lib/hbase/lib/commons-codec-1.4.jar:/usr/lib/hbase/lib/commons-el-1.0.jar:/usr/lib/hbase/lib/commons-httpclient-3.1.jar:/usr/lib/hbase/lib/commons-lang-2.5.jar:/usr/lib/hbase/lib/commons-logging-1.1.1.jar:/usr/lib/hbase/lib/commons-net-1.4.1.jar:/usr/lib/hbase/lib/core-3.1.1.jar:/usr/lib/hbase/lib/guava-r06.jar:/usr/lib/hbase/lib/hadoop-core.jar:/usr/lib/hbase/lib/hbase-0.90.1-cdh3u0.jar:/usr/lib/hbase/lib/jackson-core-asl-1.5.2.jar:/usr/lib/hbase/lib/jackson-jaxrs-1.5.5.jar:/usr/lib/hbase/lib/jackson-mapper-asl-1.5.2.jar:/usr/lib/hbase/lib/jackson-xc-1.5.5.jar:/usr/lib/hbase/lib/jasper-compiler-5.5.23.jar:/usr/lib/hbase/lib/jasper-runtime-5.5.23.jar:/usr/lib/hbase/lib/jaxb-api-2.1.jar:/usr/lib/hbase/lib/jaxb-impl-2.1.12.jar:/usr/lib/hbase/lib/jersey-core-1.4.jar:/usr/lib/hbase/lib/jersey-json-1.4.jar:/usr/lib/hbase/lib/jersey-server-1.4.jar:/usr/lib/hbase/lib/jettison-1.1.jar:/usr/lib/hbase/lib/jetty-6.1.26.jar:/usr/lib/hbase/lib/jetty-util-6.1.26.jar:/usr/lib/hbase/lib/jruby-complete-1.0.3.jar:/usr/lib/hbase/lib/jsp-2.1-6.1.14.jar:/usr/lib/hbase/lib/jsp-api-2.1-6.1.14.jar:/usr/lib/hbase/lib/jsp-api-2.1.jar:/usr/lib/hbase/lib/jsr311-api-1.1.1.jar:/usr/lib/hbase/lib/log4j-1.2.16.jar:/usr/lib/hbase/lib/protobuf-java-2.3.0.jar:/usr/lib/hbase/lib/servlet-api-2.5-6.1.14.jar:/usr/lib/hbase/lib/servlet-api-2.5.jar:/usr/lib/hbase/lib/slf4j-api-1.5.8.jar:/usr/lib/hbase/lib/slf4j-log4j12-1.5.8.jar:/usr/lib/hbase/lib/stax-api-1.0.1.jar:/usr/lib/hbase/lib/thrift-0.2.0.jar:/usr/lib/hbase/lib/xmlenc-0.52.jar:/usr/lib/hbase/lib/zookeeper.jar:/usr/lib/zookeeper/zookeeper-3.3.3-cdh3u0.jar:/usr/lib/zookeeper/zookeeper.jar:/usr/lib/zookeeper/lib/jline-0.9.
94.jar:/usr/lib/zookeeper/lib/log4j-1.2.15.jar:/usr/lib/sqoop/sqoop-1.2.0-cdh3u0.jar:/usr/lib/sqoop/sqoop-test-1.2.0-cdh3u0.jar:<somepath>/lib/tdgssconfig.jar:<somepath>/lib/terajdbc4.jar:/usr/lib/hadoop/hadoop-0.20.2-cdh3u0-core.jar:/usr/lib/sqoop/sqoop-1.2.0-cdh3u0.jar
>> >> > Note:
>> >> >
>> >> >
>> >> > /tmp/sqoop-hadoop/compile/cc68f6e38603d6705ad75a3d8ea1bf3a/DB_Temp_Table.java
>> >> > uses or overrides a deprecated API.
>> >> > Note: Recompile with -Xlint:deprecation for details.
>> >> > 11/08/30 22:59:52 INFO orm.CompilationManager: Writing jar file:
>> >> >
>> >> >
>> >> > /tmp/sqoop-hadoop/compile/cc68f6e38603d6705ad75a3d8ea1bf3a/DB.Temp_Table.jar
>> >> > 11/08/30 22:59:52 DEBUG orm.CompilationManager: Scanning for .class
>> >> > files in
>> >> > directory: /tmp/sqoop-hadoop/compile/cc68f6e38603d6705ad75a3d8ea1bf3a
>> >> > 11/08/30 22:59:52 DEBUG orm.CompilationManager: Got classfile:
>> >> >
>> >> >
>> >> > /tmp/sqoop-hadoop/compile/cc68f6e38603d6705ad75a3d8ea1bf3a/DBTemp_Table.class
>> >> > -> DB_Temp_Table.class
>> >> > 11/08/30 22:59:52 DEBUG orm.CompilationManager: Finished writing jar
>> >> > file
>> >> >
>> >> >
>> >> > /tmp/sqoop-hadrdev/compile/cc68f6e38603d6705ad75a3d8ea1bf3a/DB.Temp_Table.jar
>> >> > 11/08/30 22:59:52 INFO mapreduce.ExportJobBase: Beginning export of
>> >> > DB.Temp_Table
>> >> > 11/08/30 22:59:52 DEBUG mapreduce.JobBase: Using InputFormat: class
>> >> > com.cloudera.sqoop.mapreduce.ExportInputFormat
>> >> > 11/08/30 22:59:52 DEBUG manager.SqlManager: Using fetchSize for next
>> >> > query:
>> >> > 1000
>> >> > 11/08/30 22:59:52 INFO manager.SqlManager: Executing SQL statement:
>> >> > SELECT
>> >> > t.* FROM DB.Temp_Table AS t WHERE 1=0
>> >> > 11/08/30 22:59:53 DEBUG mapreduce.JobBase: Adding to job classpath:
>> >> > file:/usr/lib/sqoop/sqoop-1.2.0-cdh3u0.jar
>> >> > 11/08/30 22:59:53 DEBUG mapreduce.JobBase: Adding to job classpath:
>> >> > file:<somepath>/lib/terajdbc4.jar
>> >> > 11/08/30 22:59:53 DEBUG mapreduce.JobBase: Adding to job classpath:
>> >> > file:/usr/lib/sqoop/sqoop-1.2.0-cdh3u0.jar
>> >> > 11/08/30 22:59:53 DEBUG mapreduce.JobBase: Adding to job classpath:
>> >> > file:/usr/lib/sqoop/sqoop-1.2.0-cdh3u0.jar
>> >> > 11/08/30 22:59:53 DEBUG mapreduce.JobBase: Adding to job classpath:
>> >> > file:/usr/lib/sqoop/lib/ivy-2.0.0-rc2.jar
>> >> > 11/08/30 22:59:53 DEBUG mapreduce.JobBase: Adding to job classpath:
>> >> > file:/usr/lib/sqoop/lib/commons-io-1.4.jar
>> >> > 11/08/30 22:59:53 DEBUG mapreduce.JobBase: Adding to job classpath:
>> >> > file:/usr/lib/sqoop/lib/ant-eclipse-1.0-jvm1.2.jar
>> >> > 11/08/30 22:59:53 DEBUG mapreduce.JobBase: Adding to job classpath:
>> >> > file:/usr/lib/sqoop/lib/sqljdbc4.jar
>> >> > 11/08/30 22:59:53 DEBUG mapreduce.JobBase: Adding to job classpath:
>> >> > file:/usr/lib/sqoop/lib/ant-contrib-1.0b3.jar
>> >> > 11/08/30 22:59:53 DEBUG mapreduce.JobBase: Adding to job classpath:
>> >> > file:/usr/lib/sqoop/lib/hadoop-mrunit-0.20.2-CDH3b2-SNAPSHOT.jar
>> >> > 11/08/30 22:59:53 INFO hdfs.DFSClient: Created HDFS_DELEGATION_TOKEN
>> >> > token
>> >> > 795 for hadoop
>> >> > 11/08/30 22:59:53 INFO security.TokenCache: Got dt for
>> >> >
>> >> >
>> >> > hdfs://<cname>:9000/tmp/hadoop-mapred/mapred/staging/hadoop/.staging/job_201107010928_0398/libjars/hadoop-mrunit-0.20.2-CDH3b2-SNAPSHOT.jar;uri=xx.xxx.xx.xx:9000;t.service=xx.xxx.xx.xx:9000
>> >> > 11/08/30 22:59:53 INFO input.FileInputFormat: Total input paths to
>> >> > process :
>> >> > 1
>> >> > 11/08/30 22:59:53 DEBUG mapreduce.ExportInputFormat: Target
>> >> > numMapTasks=1
>> >> > 11/08/30 22:59:53 DEBUG mapreduce.ExportInputFormat: Total input
>> >> > bytes=18
>> >> > 11/08/30 22:59:53 DEBUG mapreduce.ExportInputFormat: maxSplitSize=18
>> >> > 11/08/30 22:59:53 INFO input.FileInputFormat: Total input paths to
>> >> > process :
>> >> > 1
>> >> > 11/08/30 22:59:53 DEBUG mapreduce.ExportInputFormat: Generated
>> >> > splits:
>> >> > 11/08/30 22:59:53 DEBUG mapreduce.ExportInputFormat:
>> >> > Paths:/user/hadrdev/sqoop_test.txt:0+18 Locations:
>> >> > 11/08/30 22:59:53 INFO mapred.JobClient: Running job:
>> >> > job_201107010928_0398
>> >> > 11/08/30 22:59:54 INFO mapred.JobClient:  map 0% reduce 0%
>> >> > 11/08/30 23:00:01 INFO mapred.JobClient: Task Id :
>> >> > attempt_201107010928_0398_m_000000_0, Status : FAILED
>> >> > java.io.IOException: java.lang.NullPointerException
>> >> >         at
>> >> >
>> >> >
>> >> > com.cloudera.sqoop.mapreduce.ExportOutputFormat.getRecordWriter(ExportOutputFormat.java:80)
>> >> >         at
>> >> >
>> >> >
>> >> > org.apache.hadoop.mapred.MapTask$NewDirectOutputCollector.<init>(MapTask.java:520)
>> >> >         at
>> >> > org.apache.hadoop.mapred.MapTask.runNewMapper(MapTask.java:635)
>> >> >         at org.apache.hadoop.mapred.MapTask.run(MapTask.java:322)
>> >> >         at org.apache.hadoop.mapred.Child$4.run(Child.java:268)
>> >> >         at java.security.AccessController.doPrivileged(Native Method)
>> >> >         at javax.security.auth.Subject.doAs(Subject.java:396)
>> >> >         at
>> >> >
>> >> >
>> >> > org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1115)
>> >> >         at org.apache.hadoop.mapred.Child.main(Child.java:262)
>> >> > Caused by: java.lang.NullPointerException
>> >> >         at
>> >> > com.teradata.tdgss.jtdgss.TdgssConfigApi.GetMechanisms(DashoA1*..)
>> >> >         at com.teradata.tdgss.jtdgss.TdgssManager.<init>(DashoA1*..)
>> >> >         at
>> >> > com.teradata.tdgss.jtdgss.TdgssManager.getInstance(DashoA1*..)
>> >> >         at
>> >> >
>> >> >
>> >> > com.teradata.jdbc.jdbc.GenericTeraEncrypt.getGSSM(GenericTeraEncrypt.java:612)
>> >> >         at
>> >> >
>> >> >
>> >> > com.teradata.jdbc.jdbc.GenericTeraEncrypt.getConfig(GenericTeraEncrypt.java:630)
>> >> >         at
>> >> >
>> >> >
>> >> > com.teradata.jdbc.jdbc.GenericTeraEncrypt.getUserNameForOid(GenericTeraEncrypt.java:723)
>> >> >         at
>> >> > com.teradata.jdbc.AuthMechanism.<init>(AuthMechanism.java:50)
>> >> >         at
>> >> >
>> >> >
>> >> > com.teradata.jdbc.jdbc.GenericInitDBConfigState.action(GenericInitDBConfigState.java:105)
>> >> >         at
>> >> >
>> >> >
>> >> > com.teradata.jdbc.jdbc.GenericLogonController.run(GenericLogonController.java:49)
>> >> >         at
>> >> > com.teradata.jdbc.jdbc_4.TDSession.<init>(TDSession.java:199)
>> >> >         at
>> >> >
>> >> >
>> >> > com.teradata.jdbc.jdbc_3.ifjdbc_4.TeraLocalConnection.<init>(TeraLocalConnection.java:95)
>> >> >         at
>> >> >
>> >> >
>> >> > com.teradata.jdbc.jdbc.ConnectionFactory.createConnection(ConnectionFactory.java:54)
>> >> >         at
>> >> > com.teradata.jdbc.TeraDriver.doConnect(TeraDriver.java:217)
>> >> >         at com.teradata.jdbc.TeraDriver.connect(TeraDriver.java:150)
>> >> >         at
>> >> > java.sql.DriverManager.getConnection(DriverManager.java:582)
>> >> >         at
>> >> > java.sql.DriverManager.getConnection(DriverManager.java:185)
>> >> >         at
>> >> >
>> >> >
>> >> > com.cloudera.sqoop.mapreduce.db.DBConfiguration.getConnection(DBConfiguration.java:184)
>> >> >         at
>> >> >
>> >> >
>> >> > com.cloudera.sqoop.mapreduce.AsyncSqlRecordWriter.<init>(AsyncSqlRecordWriter.java:73)
>> >> >         at
>> >> >
>> >> >
>> >> > com.cloudera.sqoop.mapreduce.ExportOutputFormat$ExportRecordWriter.<init>(ExportOutputFormat.java:96)
>> >> >         at
>> >> >
>> >> >
>> >> > com.cloudera.sqoop.mapreduce.ExportOutputFormat.getRecordWriter(ExportOutputFormat.java:78)
>> >> >         ... 8 more
>> >> >
>> >> > attempt_201107010928_0398_m_000000_0: GSSException: Failure
>> >> > unspecified
>> >> > at
>> >> > GSS-API level (Mechanism level: UserFile parameter null)
>> >> > attempt_201107010928_0398_m_000000_0:   at
>> >> > com.teradata.tdgss.jtdgss.TdgssParseXml.<init>(DashoA1*..)
>> >> > attempt_201107010928_0398_m_000000_0:   at
>> >> > com.teradata.tdgss.jtdgss.TdgssConfigApi.<init>(DashoA1*..)
>> >> > attempt_201107010928_0398_m_000000_0:   at
>> >> > com.teradata.tdgss.jtdgss.TdgssManager.<init>(DashoA1*..)
>> >> > attempt_201107010928_0398_m_000000_0:   at
>> >> > com.teradata.tdgss.jtdgss.TdgssManager.getInstance(DashoA1*..)
>> >> > attempt_201107010928_0398_m_000000_0:   at
>> >> >
>> >> >
>> >> > com.teradata.jdbc.jdbc.GenericTeraEncrypt.getGSSM(GenericTeraEncrypt.java:612)
>> >> > attempt_201107010928_0398_m_000000_0:   at
>> >> >
>> >> >
>> >> > com.teradata.jdbc.jdbc.GenericTeraEncrypt.getConfig(GenericTeraEncrypt.java:630)
>> >> > attempt_201107010928_0398_m_000000_0:   at
>> >> > com.teradata.jdbc.jdbc.GenericTeraEncrypt.getUserNameForOid(
>> >> > On Mon, Aug 29, 2011 at 2:19 PM, SRINIVAS SURASANI <va...@gmail.com>
>> >> > wrote:
>> >> >>
>> >> >> Arvind,
>> >> >>
>> >> >> I have subscribed to sqoop-user@incubator.apache.org and posted the
>> >> >> question.
>> >> >> Sorry for the inconvenience from my end; since I'm close to a deadline,
>> >> >> I'm taking up your valuable time.
>> >> >>
>> >> >> sqoop list-tables --driver com.teradata.jdbc.TeraDriver --connect
>> >> >> jdbc:teradata://PKTD/E1_CMS_WORK --username srini -P
>> >> >> I am getting the following error:
>> >> >> 11/08/29 13:08:03 INFO manager.SqlManager: Using default fetchSize
>> >> >> of
>> >> >> 1000
>> >> >> GSSException: Failure unspecified at GSS-API level (Mechanism level:
>> >> >> UserFile parameter null)
>> >> >>         at
>> >> >> com.teradata.tdgss.jtdgss.TdgssParseXml.<init>(DashoA1*..)
>> >> >>         at
>> >> >> com.teradata.tdgss.jtdgss.TdgssConfigApi.<init>(DashoA1*..)
>> >> >>         at com.teradata.tdgss.jtdgss.TdgssManager.<init>(DashoA1*..)
>> >> >>         at
>> >> >> com.teradata.tdgss.jtdgss.TdgssManager.getInstance(DashoA1*..)
>> >> >>         at
>> >> >>
>> >> >>
>> >> >> com.teradata.jdbc.jdbc.GenericTeraEncrypt.getGSSM(GenericTeraEncrypt.java:612)
>> >> >>         at
>> >> >>
>> >> >>
>> >> >> com.teradata.jdbc.jdbc.GenericTeraEncrypt.getConfig(GenericTeraEncrypt.java:630)
>> >> >>         at
>> >> >>
>> >> >>
>> >> >> com.teradata.jdbc.jdbc.GenericTeraEncrypt.getUserNameForOid(GenericTeraEncrypt.java:723)
>> >> >>         at
>> >> >> com.teradata.jdbc.AuthMechanism.<init>(AuthMechanism.java:50)
>> >> >>         at
>> >> >>
>> >> >>
>> >> >> com.teradata.jdbc.jdbc.GenericInitDBConfigState.action(GenericInitDBConfigState.java:105)
>> >> >>         at
>> >> >>
>> >> >>
>> >> >> com.teradata.jdbc.jdbc.GenericLogonController.run(GenericLogonController.java:49)
>> >> >>         at
>> >> >> com.teradata.jdbc.jdbc_4.TDSession.<init>(TDSession.java:199)
>> >> >>         at
>> >> >>
>> >> >>
>> >> >> com.teradata.jdbc.jdbc_3.ifjdbc_4.TeraLocalConnection.<init>(TeraLocalConnection.java:95)
>> >> >>         at
>> >> >>
>> >> >>
>> >> >> com.teradata.jdbc.jdbc.ConnectionFactory.createConnection(ConnectionFactory.java:54)
>> >> >>         at
>> >> >> com.teradata.jdbc.TeraDriver.doConnect(TeraDriver.java:217)
>> >> >>         at com.teradata.jdbc.TeraDriver.connect(TeraDriver.java:150)
>> >> >>         at
>> >> >> java.sql.DriverManager.getConnection(DriverManager.java:582)
>> >> >>         at
>> >> >> java.sql.DriverManager.getConnection(DriverManager.java:185)
>> >> >>         at
>> >> >>
>> >> >>
>> >> >> com.cloudera.sqoop.manager.SqlManager.makeConnection(SqlManager.java:643)
>> >> >>         at
>> >> >>
>> >> >>
>> >> >> com.cloudera.sqoop.manager.GenericJdbcManager.getConnection(GenericJdbcManager.java:51)
>> >> >>         at
>> >> >>
>> >> >> com.cloudera.sqoop.manager.SqlManager.listTables(SqlManager.java:270)
>> >> >>         at
>> >> >> com.cloudera.sqoop.tool.ListTablesTool.run(ListTablesTool.java:49)
>> >> >>         at com.cloudera.sqoop.Sqoop.run(Sqoop.java:144)
>> >> >>         at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:65)
>> >> >>         at com.cloudera.sqoop.Sqoop.runSqoop(Sqoop.java:180)
>> >> >>         at com.cloudera.sqoop.Sqoop.runTool(Sqoop.java:219)
>> >> >>         at com.cloudera.sqoop.Sqoop.runTool(Sqoop.java:228)
>> >> >>         at com.cloudera.sqoop.Sqoop.main(Sqoop.java:237)
>> >> >> 11/08/29 13:08:04 ERROR sqoop.Sqoop: Got exception running Sqoop:
>> >> >> java.lang.NullPointerException
>> >> >> java.lang.NullPointerException
>> >> >>         at
>> >> >> com.teradata.tdgss.jtdgss.TdgssConfigApi.GetMechanisms(DashoA1*..)
>> >> >>         at com.teradata.tdgss.jtdgss.TdgssManager.<init>(DashoA1*..)
>> >> >>         at
>> >> >> com.teradata.tdgss.jtdgss.TdgssManager.getInstance(DashoA1*..)
>> >> >>         at
>> >> >>
>> >> >>
>> >> >> com.teradata.jdbc.jdbc.GenericTeraEncrypt.getGSSM(GenericTeraEncrypt.java:612)
>> >> >>         at
>> >> >>
>> >> >>
>> >> >> com.teradata.jdbc.jdbc.GenericTeraEncrypt.getConfig(GenericTeraEncrypt.java:630)
>> >> >>         at
>> >> >>
>> >> >>
>> >> >> com.teradata.jdbc.jdbc.GenericTeraEncrypt.getUserNameForOid(GenericTeraEncrypt.java:723)
>> >> >>         at
>> >> >> com.teradata.jdbc.AuthMechanism.<init>(AuthMechanism.java:50)
>> >> >>         at
>> >> >>
>> >> >>
>> >> >> com.teradata.jdbc.jdbc.GenericInitDBConfigState.action(GenericInitDBConfigState.java:105)
>> >> >>         at
>> >> >>
>> >> >>
>> >> >> com.teradata.jdbc.jdbc.GenericLogonController.run(GenericLogonController.java:49)
>> >> >>         at
>> >> >> com.teradata.jdbc.jdbc_4.TDSession.<init>(TDSession.java:199)
>> >> >>         at
>> >> >>
>> >> >>
>> >> >> com.teradata.jdbc.jdbc_3.ifjdbc_4.TeraLocalConnection.<init>(TeraLocalConnection.java:95)
>> >> >>         at
>> >> >>
>> >> >>
>> >> >> com.teradata.jdbc.jdbc.ConnectionFactory.createConnection(ConnectionFactory.java:54)
>> >> >>         at
>> >> >> com.teradata.jdbc.TeraDriver.doConnect(TeraDriver.java:217)
>> >> >>         at com.teradata.jdbc.TeraDriver.connect(TeraDriver.java:150)
>> >> >>         at
>> >> >> java.sql.DriverManager.getConnection(DriverManager.java:582)
>> >> >>         at
>> >> >> java.sql.DriverManager.getConnection(DriverManager.java:185)
>> >> >>         at
>> >> >>
>> >> >>
>> >> >> com.cloudera.sqoop.manager.SqlManager.makeConnection(SqlManager.java:643)
>> >> >>         at
>> >> >>
>> >> >>
>> >> >> com.cloudera.sqoop.manager.GenericJdbcManager.getConnection(GenericJdbcManager.java:51)
>> >> >>         at
>> >> >>
>> >> >> com.cloudera.sqoop.manager.SqlManager.listTables(SqlManager.java:270)
>> >> >>         at
>> >> >> com.cloudera.sqoop.tool.ListTablesTool.run(ListTablesTool.java:49)
>> >> >>         at com.cloudera.sqoop.Sqoop.run(Sqoop.java:144)
>> >> >>         at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:65)
>> >> >>         at com.cloudera.sqoop.Sqoop.runSqoop(Sqoop.java:180)
>> >> >>         at com.cloudera.sqoop.Sqoop.runTool(Sqoop.java:219)
>> >> >>         at com.cloudera.sqoop.Sqoop.runTool(Sqoop.java:228)
>> >> >>         at com.cloudera.sqoop.Sqoop.main(Sqoop.java:237)
>> >> >> Thanks,
>> >> >> Srini
>> >> >>
>> >> >>
>> >> >>
>> >> >> On Mon, Aug 29, 2011 at 12:58 PM, arvind@cloudera.com
>> >> >> <ar...@cloudera.com> wrote:
>> >> >>>
>> >> >>> [Please subscribe and respond to sqoop-user@incubator.apache.org]
>> >> >>>
>> >> >>> Please use HADOOP_CLASSPATH instead of CLASSPATH. Also, in order to
>> >> >>> use the generic JDBC connector, you will have to specify the driver
>> >> >>> class explicitly via the command line option --driver
>> >> >>> com.teradata.jdbc.TeraDriver.
>> >> >>>
>> >> >>> Thanks,
>> >> >>> Arvind
>> >> >>>
>> >> >>> On Mon, Aug 29, 2011 at 9:53 AM, SRINIVAS SURASANI
>> >> >>> <va...@gmail.com>
>> >> >>> wrote:
>> >> >>> > Arvind,
>> >> >>> > I have set the classpath to teradata4.jar [not placed the
>> >> >>> > teradata4.jar in
>> >> >>> > sqoop lib, as I don't have permissions].
>> >> >>> > I'm getting the following error
>> >> >>> >
>> >> >>> > sqoop list-tables --connect jdbc:teradata://PKTD/E1_CMS_WORK
>> >> >>> > --username
>> >> >>> > srini -P
>> >> >>> > ERROR: tool.BaseSqoopTool: Got error creating database manager:
>> >> >>> > java.io.IOException: No manager for connect string:
>> >> >>> > jdbc:teradata:///PKTD/E1_CMS_WORK
>> >> >>> >    at
>> >> >>> > com.cloudera.sqoop.ConnFactory.getManager(ConnFactory.java:119)
>> >> >>> >    at
>> >> >>> >
>> >> >>> > com.cloudera.sqoop.tool.BaseSqoopTool.init(BaseSqoopTool.java:186)
>> >> >>> >    ...
>> >> >>> >    ...
>> >> >>> > Thanks,
>> >> >>> > Srini
>> >> >>> >
>> >> >>> >
>> >> >>> >
>> >> >>> > On Mon, Aug 29, 2011 at 8:52 AM, SRINIVAS SURASANI
>> >> >>> > <va...@gmail.com>
>> >> >>> > wrote:
>> >> >>> >>
>> >> >>> >> Thanks-a-lot Arvind.
>> >> >>> >>
>> >> >>> >> On Mon, Aug 29, 2011 at 8:45 AM, arvind@cloudera.com
>> >> >>> >> <ar...@cloudera.com>
>> >> >>> >> wrote:
>> >> >>> >>>
>> >> >>> >>> [Moving the thread to sqoop-user@incubator.apache.org]
>> >> >>> >>>
>> >> >>> >>> Hi Srini,
>> >> >>> >>>
>> >> >>> >>> You should be able to use the generic JDBC connector to
>> >> >>> >>> import/export
>> >> >>> >>> from Teradata. There is also a specialized connector that is
>> >> >>> >>> available
>> >> >>> >>> for use with Teradata if you are interested. This connector is
>> >> >>> >>> not
>> >> >>> >>> a
>> >> >>> >>> part of Sqoop and can be obtained from Cloudera by going to:
>> >> >>> >>>
>> >> >>> >>> http://www.cloudera.com/partners/connectors/
>> >> >>> >>>
>> >> >>> >>> Thanks,
>> >> >>> >>> Arvind
>> >> >>> >>>
>> >> >>> >>> On Mon, Aug 29, 2011 at 8:17 AM, SRINIVAS SURASANI
>> >> >>> >>> <va...@gmail.com>
>> >> >>> >>> wrote:
>> >> >>> >>> > I have a csv file in Hadoop and am looking to load it into
>> >> >>> >>> > Teradata. I was wondering, does Sqoop work with Teradata (with
>> >> >>> >>> > the JDBC jar placed in the Sqoop lib dir)?
>> >> >>> >>> >
>> >> >>> >>> > Regards
>> >> >>> >>> > Srini
>> >> >>> >>> >
>> >> >>> >>> > --
>> >> >>> >>> > NOTE: The mailing list sqoop-user@cloudera.org is deprecated
>> >> >>> >>> > in
>> >> >>> >>> > favor
>> >> >>> >>> > of Apache Sqoop mailing list sqoop-user@incubator.apache.org.
>> >> >>> >>> > Please
>> >> >>> >>> > subscribe to it by sending an email to
>> >> >>> >>> > incubator-sqoop-user-subscribe@apache.org.
>> >> >>> >>> >
>> >> >>> >>>
>> >> >>> >>
>> >> >>> >
>> >> >>> >
>> >> >>>
>> >> >>
>> >> >
>> >> >
>> >
>> >
>
>

Re: [sqoop-user] Sqoop-with Terradata

Posted by SRINIVAS SURASANI <va...@gmail.com>.
Arvind,

Now I'm getting a strange error. I made sure the table has the same number of
attributes and that the data is not corrupted.

*1st error*:

sqoop export -libjars /jdbc/13.00.00.07/lib/tdgssconfig.jar --verbose
--driver com.teradata.jdbc.TeraDriver --connect
jdbc:teradata://PTD/EW1_CMTS_WORK --username EDWBD_CMTS --password --table
EW1_CMS_WORK.TMRB_TEST --export-dir /user/hadrdev/sqoop_msrb_test.txt
--fields-terminated-by ',' --lines-terminated-by '\n'
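For reference, a sketch of the same export with both Teradata jars passed to
-libjars (Hadoop's -libjars takes a comma-separated list, so the JDBC driver
and the GSS config jar can be shipped to the tasks together; the jar paths,
host, and table names below are placeholders, not a verified setup):

```shell
# Sketch only: jar paths, host, user, and table names are assumptions.
TDJARS=/jdbc/13.00.00.07/common/lib/terajdbc4.jar,/jdbc/13.00.00.07/lib/tdgssconfig.jar

# Print the assembled command line for review before actually running it
# against the cluster.
echo sqoop export -libjars "$TDJARS" \
  --driver com.teradata.jdbc.TeraDriver \
  --connect jdbc:teradata://PTD/EW1_CMTS_WORK \
  --username EDWBD_CMTS -P \
  --table EW1_CMS_WORK.TMRB_TEST \
  --export-dir /user/hadrdev/sqoop_msrb_test.txt \
  --fields-terminated-by , --lines-terminated-by '\n'
```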


11/09/02 16:25:14 DEBUG manager.SqlManager: Using fetchSize for next query:
1000
11/09/02 16:25:14 INFO manager.SqlManager: Executing SQL statement: SELECT
t.* FROM EW1_CMS_WORK.TMRB_TEST AS t WHERE 1=0
11/09/02 16:25:14 DEBUG mapreduce.JobBase: Adding to job classpath:
file:/usr/lib/sqoop/sqoop-1.2.0-cdh3u0.jar
11/09/02 16:25:14 DEBUG mapreduce.JobBase: Adding to job classpath:
file:/jdbc/13.00.00.07/common/lib/terajdbc4.jar
11/09/02 16:25:14 DEBUG mapreduce.JobBase: Adding to job classpath:
file:/usr/lib/sqoop/sqoop-1.2.0-cdh3u0.jar
11/09/02 16:25:14 DEBUG mapreduce.JobBase: Adding to job classpath:
file:/usr/lib/sqoop/sqoop-1.2.0-cdh3u0.jar
11/09/02 16:25:14 DEBUG mapreduce.JobBase: Adding to job classpath:
file:/usr/lib/sqoop/lib/ivy-2.0.0-rc2.jar
11/09/02 16:25:14 DEBUG mapreduce.JobBase: Adding to job classpath:
file:/usr/lib/sqoop/lib/commons-io-1.4.jar
11/09/02 16:25:14 DEBUG mapreduce.JobBase: Adding to job classpath:
file:/usr/lib/sqoop/lib/ant-eclipse-1.0-jvm1.2.jar
11/09/02 16:25:14 DEBUG mapreduce.JobBase: Adding to job classpath:
file:/usr/lib/sqoop/lib/sqljdbc4.jar
11/09/02 16:25:14 DEBUG mapreduce.JobBase: Adding to job classpath:
file:/usr/lib/sqoop/lib/ant-contrib-1.0b3.jar
11/09/02 16:25:14 DEBUG mapreduce.JobBase: Adding to job classpath:
file:/usr/lib/sqoop/lib/hadoop-mrunit-0.20.2-CDH3b2-SNAPSHOT.jar
11/09/02 16:25:14 INFO hdfs.DFSClient: Created HDFS_DELEGATION_TOKEN token
922 for hadrdev
11/09/02 16:25:14 INFO security.TokenCache: Got dt for hdfs://idoop:9000/tmp/hadoop-mapred/mapred/staging/hadrdev/.staging/job_201108311434_0051/libjars/tdgssconfig.jar;uri=10.128.225.1:9000;t.service=10.128.225.1:9000
11/09/02 16:25:14 INFO input.FileInputFormat: Total input paths to process :
1
11/09/02 16:25:14 DEBUG mapreduce.ExportInputFormat: Target numMapTasks=4
11/09/02 16:25:14 DEBUG mapreduce.ExportInputFormat: Total input
bytes=7854508
11/09/02 16:25:14 DEBUG mapreduce.ExportInputFormat: maxSplitSize=1963627
11/09/02 16:25:14 INFO input.FileInputFormat: Total input paths to process :
1
11/09/02 16:25:14 DEBUG mapreduce.ExportInputFormat: Generated splits:
11/09/02 16:25:14 DEBUG mapreduce.ExportInputFormat:
Paths:/user/hadrdev/sqoop_msrb_test.txt:0+7854508 Locations:idoop.com:;
11/09/02 16:25:15 INFO mapred.JobClient: Running job: job_201108311434_0051
11/09/02 16:25:16 INFO mapred.JobClient:  map 0% reduce 0%
11/09/02 16:25:24 INFO mapred.JobClient: Task Id :
attempt_201108311434_0051_m_000000_0, Status : FAILED
*java.io.IOException: com.teradata.jdbc.jdbc_4.util.JDBCException: [Teradata
Database] [TeraJDBC 13.00.00.07] [Error 3706] [SQLState 42000] Syntax error:
expected something between ')' and ','*.
        at
com.cloudera.sqoop.mapreduce.AsyncSqlRecordWriter.write(AsyncSqlRecordWriter.java:217)
        at
com.cloudera.sqoop.mapreduce.AsyncSqlRecordWriter.write(AsyncSqlRecordWriter.java:45)
        at
org.apache.hadoop.mapred.MapTask$NewDirectOutputCollector.write(MapTask.java:530)
        at
org.apache.hadoop.mapreduce.TaskInputOutputContext.write(TaskInputOutputContext.java:80)
        at
com.cloudera.sqoop.mapreduce.TextExportMapper.map(TextExportMapper.java:80)
        at
com.cloudera.sqoop.mapreduce.TextExportMapper.map(TextExportMapper.java:38)
        at org.apache.hadoop.mapreduce.Mapper.run(Mapper.java:144)
        at
com.cloudera.sqoop.mapreduce.AutoProgressMapper.run(AutoProgressMapper.java:187)
        at org.apache.hadoop.mapred.MapTask.runNewMapper(MapTask.java:646)
        at org.apache.hadoop.mapred.MapTask.run(MapTask.java:322)
        at org.apache.hadoop.mapred.Child$4.run(Child.java:268)
        at java.security.AccessController.doPrivileged(Native Method)
        at javax.security.auth.Subject.doAs(Subject.java:396)
        at
org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1115)
        at org.apache.hadoop.mapred.Child.main(Child.java:262)
*Caused by: com.teradata.jdbc.jdbc_4.util.JDBCException: [Teradata Database]
[TeraJDBC 13.00.00.07] [Error 3706] [SQLState 42000] Syntax error: expected
something between ')' and ','.*
        at
com.teradata.jdbc.jdbc_4.util.ErrorFactory.makeDatabaseSQLException(ErrorFactory.java:288)
        at
com.teradata.jdbc.jdbc_4.statemachine.ReceiveInitSubState.action(ReceiveInitSubState.java:102)
        at
com.teradata.jdbc.jdbc_4.statemachine.StatementReceiveState.subStateMachine(StatementReceiveState.java:285)
        at
com.teradata.jdbc.jdbc_4.statemachine.StatementReceiveState.action(StatementReceiveState.java:176)
        at
com.teradata.jdbc.jdbc_4.statemachine.StatementController.runBody(StatementController.java:108)
        at
com.teradata.jdbc.jdbc_4.statemachine.StatementController.run(StatementController.java:99)
        at
com.teradata.jdbc.jdbc_4.Statement.executeStatement(Statement.java:331)
        at
com.teradata.jdbc.jdbc_4.Statement.prepareRequest(Statement.java:491)
        at
com.teradata.jdbc.jdbc_4.PreparedStatement.<init>(PreparedStatement.java:56)
        at
com.teradata.jdbc.jdbc_4.TDSession.createPreparedStatement(TDSession.java:689)
        at
com.teradata.jdbc.jdbc_3.ifjdbc_4.TeraLocalPreparedStatement.<init>(TeraLocalPreparedStatement.java:84)
        at
com.teradata.jdbc.jdbc_3.ifjdbc_4.TeraLocalConnection.prepareStatement(TeraLocalConnection.java:327)
        at
com.teradata.jdbc.jdbc_3.ifjdbc_4.TeraLocalConnection.prepareStatement(TeraLocalConnection.java:148)
        at
com.cloudera.sqoop.mapreduce.ExportOutputFormat$ExportRecordWriter.getPreparedStatement(ExportOutputFormat.java:142)
        at
com.cloudera.sqoop.mapreduce.AsyncSqlRecordWriter.execUpdate(AsyncSqlRecordWriter.java:146)
        at
com.cloudera.sqoop.mapreduce.AsyncSqlRecordWriter.write(AsyncSqlRecordWriter.java:212)


*2nd error, for another command of the same kind*:

Here the mapper attempt repeats three times and the job ends up failing.

duce.ExportInputFormat: Total input bytes=24
11/09/02 16:34:24 DEBUG mapreduce.ExportInputFormat: maxSplitSize=24
11/09/02 16:34:24 INFO input.FileInputFormat: Total input paths to process :
1
11/09/02 16:34:24 DEBUG mapreduce.ExportInputFormat: Generated splits:
11/09/02 16:34:24 DEBUG mapreduce.ExportInputFormat:
Paths:/user/hadrdev/sqoop_test.txt:0+24 Locations:idoop.ms.com:;
11/09/02 16:34:25 INFO mapred.JobClient: Running job: job_201108311434_0052
11/09/02 16:34:26 INFO mapred.JobClient:  *map 0% reduce 0%
11/09/02 16:34:37 INFO mapred.JobClient:  map 100% reduce 0%
11/09/02 16:44:37 INFO mapred.JobClient:  map 0% reduce 0%
11/09/02 16:44:37 INFO mapred.JobClient: Task Id :
attempt_201108311434_0052_m_000000_0, Status : FAILED
Task attempt_201108311434_0052_m_000000_0 failed to report status for 600
seconds. Killing!
11/09/02 16:44:46 INFO mapred.JobClient:  map 100% reduce 0%
11/09/02 16:54:46 INFO mapred.JobClient:  map 0% reduce 0%
11/09/02 16:54:46 INFO mapred.JobClient: Task Id :
attempt_201108311434_0052_m_000000_1, Status : FAILED*
Task attempt_201108311434_0052_m_000000_1 failed to report status for 600
seconds. Killing!
11/09/02 16:54:56 INFO mapred.JobClient:  map 100% reduce 0%
11/09/02 17:04:56 INFO mapred.JobClient:  map 0% reduce 0%
11/09/02 17:04:56 INFO mapred.JobClient: Task Id :
attempt_201108311434_0052_m_000000_2, Status : FAILED
Task attempt_201108311434_0052_m_000000_2 failed to report status for 600
seconds. Killing!
11/09/02 17:05:07 INFO mapred.JobClient:  map 100% reduce 0%
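
The "failed to report status for 600 seconds" kills above come from Hadoop's
task timeout. As a stopgap (not a fix for whatever is hanging the connection),
the timeout can be raised with a generic -D option, which must come before the
Sqoop-specific arguments; the value here is an assumption:

```shell
# Stopgap sketch, not a root-cause fix: raise mapred.task.timeout (the
# property behind "failed to report status for 600 seconds").
# Generic Hadoop -D options must precede the Sqoop-specific arguments.
TIMEOUT_MS=1800000   # 30 minutes, an assumed value

# Print the assembled command line for review before running it.
echo sqoop export -D mapred.task.timeout="$TIMEOUT_MS" \
  --driver com.teradata.jdbc.TeraDriver \
  --connect jdbc:teradata://PTD/EW1_CMTS_WORK \
  --username EDWBD_CMTS -P \
  --table EW1_CMS_WORK.TMRB_TEST \
  --export-dir /user/hadrdev/sqoop_test.txt
```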


Thanks as always, Arvind, for taking your valuable time to help.



On Fri, Sep 2, 2011 at 2:12 AM, Arvind Prabhakar <ar...@apache.org> wrote:

> Please try specifying the extra Jar file using the -libjars argument. This
> is a generic Hadoop argument that Sqoop passes down to the framework
> and should allow the inclusion of other jar files in the classpath.
> Note that this must be specified before any Sqoop-specific argument is
> given. For example:
>
> $ bin/sqoop import -libjars /path/to/gssjar --connect "..."
>
> Thanks,
> Arvind
>
> On Thu, Sep 1, 2011 at 8:54 PM, SRINIVAS SURASANI <va...@gmail.com>
> wrote:
> > Arvind,
> >
> > I understand that the GSS config jar needs to be placed in the Sqoop lib
> > directory. I was wondering whether there is any alternative way to achieve
> > this [meaning, how terajdbc4.jar is added to the distributed cache
> > automatically before launching Map-Reduce].
> >
> > Thanks,
> > Srini
> >
> >
> >
> > On Wed, Aug 31, 2011 at 1:44 PM, Arvind Prabhakar <ar...@apache.org>
> wrote:
> >>
> >> Srini,
> >>
> >> This is happening because the GSS config Jar is not getting put in
> >> Distributed Cache. Sqoop only puts certain jars in the cache as
> >> opposed to putting every jar that exists in its classpath. In order to
> >> force any Jar to be put in the Distributed Cache, you must copy it
> >> over to Sqoop's lib directory.
> >>
> >> Thanks,
> >> Arvind
> >>
> >> On Tue, Aug 30, 2011 at 9:36 PM, SRINIVAS SURASANI <va...@gmail.com>
> >> wrote:
> >> > Getting the error while exporting. From my observation, compiling the
> >> > .java file sets the classpath with terajdbc4.jar and tdgssconfig.jar (as
> >> > I marked in bold letters below), but just before launching map-reduce
> >> > only terajdbc4.jar is added to the job classpath.
> >> >  I set the HADOOP_CLASSPATH=<path to> terajdbc4.jar:<path
> >> > to>tdgssconfig.jar
> >> > Any help appreciated.
> >> > $ sqoop export --verbose --driver com.teradata.jdbc.TeraDriver
> --connect
> >> > jdbc:teradata://TD/DB --username WBD -P --table DB.Temp_Table
> >> > --export-dir
> >> > /user/hadoop/sqoop_test.txt --fields-terminated-by ,
> >> > --lines-terminated-by
> >> > \n -m 1
> >> > 11/08/30 22:59:43 DEBUG tool.BaseSqoopTool: Enabled debug logging.
> >> > Enter password:
> >> > 11/08/30 22:59:50 DEBUG sqoop.ConnFactory: Loaded manager factory:
> >> > com.cloudera.sqoop.manager.DefaultManagerFactory
> >> > 11/08/30 22:59:50 DEBUG sqoop.ConnFactory: Trying ManagerFactory:
> >> > com.cloudera.sqoop.manager.DefaultManagerFactory
> >> > 11/08/30 22:59:50 INFO manager.SqlManager: Using default fetchSize of
> >> > 1000
> >> > 11/08/30 22:59:50 DEBUG sqoop.ConnFactory: Instantiated ConnManager
> >> > com.cloudera.sqoop.manager.GenericJdbcManager@2b76e552
> >> > 11/08/30 22:59:50 INFO tool.CodeGenTool: Beginning code generation
> >> > 11/08/30 22:59:51 DEBUG manager.SqlManager: Using fetchSize for next
> >> > query:
> >> > 1000
> >> > 11/08/30 22:59:51 INFO manager.SqlManager: Executing SQL statement:
> >> > SELECT
> >> > t.* FROM DB.Temp_Table AS t WHERE 1=0
> >> > 11/08/30 22:59:51 DEBUG manager.SqlManager: Using fetchSize for next
> >> > query:
> >> > 1000
> >> > 11/08/30 22:59:51 INFO manager.SqlManager: Executing SQL statement:
> >> > SELECT
> >> > t.* FROM DB.Temp_Table AS t WHERE 1=0
> >> > 11/08/30 22:59:51 DEBUG orm.ClassWriter: selected columns:
> >> > 11/08/30 22:59:51 DEBUG orm.ClassWriter:   NAME
> >> > 11/08/30 22:59:51 DEBUG orm.ClassWriter:   SALARY
> >> > 11/08/30 22:59:51 DEBUG orm.ClassWriter: Writing source file:
> >> >
> >> >
> /tmp/sqoop-hadoop/compile/cc68f6e38603d6705ad75a3d8ea1bf3a/DB_Temp_Table.java
> >> > 11/08/30 22:59:51 DEBUG orm.ClassWriter: Table name:DB.Temp_Table
> >> > 11/08/30 22:59:51 DEBUG orm.ClassWriter: Columns: NAME:12, SALARY:3,
> >> > 11/08/30 22:59:51 DEBUG orm.ClassWriter: sourceFilename is
> >> > DB_Temp_Table.java
> >> > 11/08/30 22:59:51 DEBUG orm.CompilationManager: Found existing
> >> > /tmp/sqoop-hadoop/compile/cc68f6e38603d6705ad75a3d8ea1bf3a/
> >> > 11/08/30 22:59:51 INFO orm.CompilationManager: HADOOP_HOME is
> >> > /usr/lib/hadoop
> >> > 11/08/30 22:59:51 INFO orm.CompilationManager: Found hadoop core jar
> at:
> >> > /usr/lib/hadoop/hadoop-0.20.2-cdh3u0-core.jar
> >> > 11/08/30 22:59:51 DEBUG orm.CompilationManager: Adding source file:
> >> >
> >> >
> /tmp/sqoop-haoop/compile/cc68f6e38603d6705ad75a3d8ea1bf3a/DB_Temp_Table.java
> >> > 11/08/30 22:59:51 DEBUG orm.CompilationManager: Invoking javac with
> >> > args:
> >> > 11/08/30 22:59:51 DEBUG orm.CompilationManager:   -sourcepath
> >> > 11/08/30 22:59:51 DEBUG orm.CompilationManager:
> >> > /tmp/sqoop-hadoop/compile/cc68f6e38603d6705ad75a3d8ea1bf3a/
> >> > 11/08/30 22:59:51 DEBUG orm.CompilationManager:   -d
> >> > 11/08/30 22:59:51 DEBUG orm.CompilationManager:
> >> > /tmp/sqoop-hadoop/compile/cc68f6e38603d6705ad75a3d8ea1bf3a/
> >> > 11/08/30 22:59:51 DEBUG orm.CompilationManager:   -classpath
> >> > 11/08/30 22:59:51 DEBUG orm.CompilationManager:
> >> >
> >> >
> /usr/lib/hadoop:/usr/lib/hadoop/hadoop-core-0.20.2-cdh3u0.jar:/usr/lib/hadoop/lib/ant-contrib-1.0b3.jar:/usr/lib/hadoop/lib/aspectjrt-1.6.5.jar:/usr/lib/hadoop/lib/aspectjtools-1.6.5.jar:/usr/lib/hadoop/lib/commons-cli-1.2.jar:/usr/lib/hadoop/lib/commons-codec-1.4.jar:/usr/lib/hadoop/lib/commons-daemon-1.0.1.jar:/usr/lib/hadoop/lib/commons-el-1.0.jar:/usr/lib/hadoop/lib/commons-httpclient-3.0.1.jar:/usr/lib/hadoop/lib/commons-logging-1.0.4.jar:/usr/lib/hadoop/lib/commons-logging-api-1.0.4.jar:/usr/lib/hadoop/lib/commons-net-1.4.1.jar:/usr/lib/hadoop/lib/core-3.1.1.jar:/usr/lib/hadoop/lib/elephant-bird-1.0.jar:/usr/lib/hadoop/lib/hadoop-fairscheduler-0.20.2-cdh3u0.jar:/usr/lib/hadoop/lib/hadoop-lzo-0.4.8.jar:/usr/lib/hadoop/lib/hsqldb-1.8.0.10.jar:/usr/lib/hadoop/lib/jackson-core-asl-1.5.2.jar:/usr/lib/hadoop/lib/jackson-mapper-asl-1.5.2.jar:/usr/lib/hadoop/lib/jasper-compiler-5.5.12.jar:/usr/lib/hadoop/lib/jasper-runtime-5.5.12.jar:/usr/lib/hadoop/lib/jets3t-0.6.1.jar:/usr/lib/hadoop/lib/jetty-6.1.26.jar:/usr/lib/hadoop/lib/jetty-servlet-tester-6.1.26.jar:/usr/lib/hadoop/lib/jetty-util-6.1.26.jar:/usr/lib/hadoop/lib/jsch-0.1.42.jar:/usr/lib/hadoop/lib/junit-4.5.jar:/usr/lib/hadoop/lib/kfs-0.2.2.jar:/usr/lib/hadoop/lib/log4j-1.2.15.jar:/usr/lib/hadoop/lib/mockito-all-1.8.2.jar:/usr/lib/hadoop/lib/oro-2.0.8.jar:/usr/lib/hadoop/lib/protobuf-java-2.3.0.jar:/usr/lib/hadoop/lib/servlet-api-2.5-20081211.jar:/usr/lib/hadoop/lib/servlet-api-2.5-6.1.14.jar:/usr/lib/hadoop/lib/slf4j-api-1.4.3.jar:/usr/lib/hadoop/lib/slf4j-api-1.5.8.jar:/usr/lib/hadoop/lib/slf4j-log4j12-1.4.3.jar:/usr/lib/hadoop/lib/slf4j-log4j12-1.5.10.jar:/usr/lib/hadoop/lib/xmlenc-0.52.jar:/usr/lib/hadoop/lib/yamlbeans-0.9.3.jar:/usr/lib/hadoop/lib/jsp-2.1/jsp-2.1.jar:/usr/lib/hadoop/lib/jsp-2.1/jsp-api-2.1.jar:/usr/lib/sqoop/conf:/usr/lib/hbase/conf::/usr/lib/sqoop/lib/ant-contrib-1.0b3.jar:/usr/lib/sqoop/lib/ant-eclipse-1.0-jvm1.2.jar:/usr/lib/sqoop/lib/commons-io-1.4.jar:/usr/lib/sqoop/lib/hadoop-mrunit
-0.20.2-CDH3b2-SNAPSHOT.jar:/usr/lib/sqoop/lib/ivy-2.0.0-rc2.jar:/usr/lib/sqoop/lib/sqljdbc4.jar:/usr/lib/hbase/hbase-0.90.1-cdh3u0-tests.jar:/usr/lib/hbase/hbase-0.90.1-cdh3u0.jar:/usr/lib/hbase/lib/activation-1.1.jar:/usr/lib/hbase/lib/asm-3.1.jar:/usr/lib/hbase/lib/avro-1.3.3.jar:/usr/lib/hbase/lib/commons-cli-1.2.jar:/usr/lib/hbase/lib/commons-codec-1.4.jar:/usr/lib/hbase/lib/commons-el-1.0.jar:/usr/lib/hbase/lib/commons-httpclient-3.1.jar:/usr/lib/hbase/lib/commons-lang-2.5.jar:/usr/lib/hbase/lib/commons-logging-1.1.1.jar:/usr/lib/hbase/lib/commons-net-1.4.1.jar:/usr/lib/hbase/lib/core-3.1.1.jar:/usr/lib/hbase/lib/guava-r06.jar:/usr/lib/hbase/lib/hadoop-core.jar:/usr/lib/hbase/lib/hbase-0.90.1-cdh3u0.jar:/usr/lib/hbase/lib/jackson-core-asl-1.5.2.jar:/usr/lib/hbase/lib/jackson-jaxrs-1.5.5.jar:/usr/lib/hbase/lib/jackson-mapper-asl-1.5.2.jar:/usr/lib/hbase/lib/jackson-xc-1.5.5.jar:/usr/lib/hbase/lib/jasper-compiler-5.5.23.jar:/usr/lib/hbase/lib/jasper-runtime-5.5.23.jar:/usr/lib/hbase/lib/jaxb-api-2.1.jar:/usr/lib/hbase/lib/jaxb-impl-2.1.12.jar:/usr/lib/hbase/lib/jersey-core-1.4.jar:/usr/lib/hbase/lib/jersey-json-1.4.jar:/usr/lib/hbase/lib/jersey-server-1.4.jar:/usr/lib/hbase/lib/jettison-1.1.jar:/usr/lib/hbase/lib/jetty-6.1.26.jar:/usr/lib/hbase/lib/jetty-util-6.1.26.jar:/usr/lib/hbase/lib/jruby-complete-1.0.3.jar:/usr/lib/hbase/lib/jsp-2.1-6.1.14.jar:/usr/lib/hbase/lib/jsp-api-2.1-6.1.14.jar:/usr/lib/hbase/lib/jsp-api-2.1.jar:/usr/lib/hbase/lib/jsr311-api-1.1.1.jar:/usr/lib/hbase/lib/log4j-1.2.16.jar:/usr/lib/hbase/lib/protobuf-java-2.3.0.jar:/usr/lib/hbase/lib/servlet-api-2.5-6.1.14.jar:/usr/lib/hbase/lib/servlet-api-2.5.jar:/usr/lib/hbase/lib/slf4j-api-1.5.8.jar:/usr/lib/hbase/lib/slf4j-log4j12-1.5.8.jar:/usr/lib/hbase/lib/stax-api-1.0.1.jar:/usr/lib/hbase/lib/thrift-0.2.0.jar:/usr/lib/hbase/lib/xmlenc-0.52.jar:/usr/lib/hbase/lib/zookeeper.jar:/usr/lib/zookeeper/zookeeper-3.3.3-cdh3u0.jar:/usr/lib/zookeeper/zookeeper.jar:/usr/lib/zookeeper/lib/jline-0.9.94.jar
:/usr/lib/zookeeper/lib/log4j-1.2.15.jar:/usr/lib/sqoop/sqoop-1.2.0-cdh3u0.jar:/usr/lib/sqoop/sqoop-test-1.2.0-cdh3u0.jar:<somepath>/lib/tdgssconfig.jar:<somepath>/lib/terajdbc4.jar:/usr/lib/hadoop/hadoop-0.20.2-cdh3u0-core.jar:/usr/lib/sqoop/sqoop-1.2.0-cdh3u0.jar
> >> > Note:
> >> >
> >> >
> /tmp/sqoop-hadoop/compile/cc68f6e38603d6705ad75a3d8ea1bf3a/DB_Temp_Table.java
> >> > uses or overrides a deprecated API.
> >> > Note: Recompile with -Xlint:deprecation for details.
> >> > 11/08/30 22:59:52 INFO orm.CompilationManager: Writing jar file:
> >> >
> >> >
> /tmp/sqoop-hadoop/compile/cc68f6e38603d6705ad75a3d8ea1bf3a/DB.Temp_Table.jar
> >> > 11/08/30 22:59:52 DEBUG orm.CompilationManager: Scanning for .class
> >> > files in
> >> > directory: /tmp/sqoop-hadoop/compile/cc68f6e38603d6705ad75a3d8ea1bf3a
> >> > 11/08/30 22:59:52 DEBUG orm.CompilationManager: Got classfile:
> >> >
> >> >
> /tmp/sqoop-hadoop/compile/cc68f6e38603d6705ad75a3d8ea1bf3a/DBTemp_Table.class
> >> > -> DB_Temp_Table.class
> >> > 11/08/30 22:59:52 DEBUG orm.CompilationManager: Finished writing jar
> >> > file
> >> >
> >> >
> /tmp/sqoop-hadrdev/compile/cc68f6e38603d6705ad75a3d8ea1bf3a/DB.Temp_Table.jar
> >> > 11/08/30 22:59:52 INFO mapreduce.ExportJobBase: Beginning export of
> >> > DB.Temp_Table
> >> > 11/08/30 22:59:52 DEBUG mapreduce.JobBase: Using InputFormat: class
> >> > com.cloudera.sqoop.mapreduce.ExportInputFormat
> >> > 11/08/30 22:59:52 DEBUG manager.SqlManager: Using fetchSize for next
> >> > query:
> >> > 1000
> >> > 11/08/30 22:59:52 INFO manager.SqlManager: Executing SQL statement:
> >> > SELECT
> >> > t.* FROM DB.Temp_Table AS t WHERE 1=0
> >> > 11/08/30 22:59:53 DEBUG mapreduce.JobBase: Adding to job classpath:
> >> > file:/usr/lib/sqoop/sqoop-1.2.0-cdh3u0.jar
> >> > 11/08/30 22:59:53 DEBUG mapreduce.JobBase: Adding to job classpath:
> >> > file:<somepath>/lib/terajdbc4.jar
> >> > 11/08/30 22:59:53 DEBUG mapreduce.JobBase: Adding to job classpath:
> >> > file:/usr/lib/sqoop/sqoop-1.2.0-cdh3u0.jar
> >> > 11/08/30 22:59:53 DEBUG mapreduce.JobBase: Adding to job classpath:
> >> > file:/usr/lib/sqoop/sqoop-1.2.0-cdh3u0.jar
> >> > 11/08/30 22:59:53 DEBUG mapreduce.JobBase: Adding to job classpath:
> >> > file:/usr/lib/sqoop/lib/ivy-2.0.0-rc2.jar
> >> > 11/08/30 22:59:53 DEBUG mapreduce.JobBase: Adding to job classpath:
> >> > file:/usr/lib/sqoop/lib/commons-io-1.4.jar
> >> > 11/08/30 22:59:53 DEBUG mapreduce.JobBase: Adding to job classpath:
> >> > file:/usr/lib/sqoop/lib/ant-eclipse-1.0-jvm1.2.jar
> >> > 11/08/30 22:59:53 DEBUG mapreduce.JobBase: Adding to job classpath:
> >> > file:/usr/lib/sqoop/lib/sqljdbc4.jar
> >> > 11/08/30 22:59:53 DEBUG mapreduce.JobBase: Adding to job classpath:
> >> > file:/usr/lib/sqoop/lib/ant-contrib-1.0b3.jar
> >> > 11/08/30 22:59:53 DEBUG mapreduce.JobBase: Adding to job classpath:
> >> > file:/usr/lib/sqoop/lib/hadoop-mrunit-0.20.2-CDH3b2-SNAPSHOT.jar
> >> > 11/08/30 22:59:53 INFO hdfs.DFSClient: Created HDFS_DELEGATION_TOKEN
> >> > token
> >> > 795 for hadoop
> >> > 11/08/30 22:59:53 INFO security.TokenCache: Got dt for
> >> >
> >> >
> hdfs://<cname>:9000/tmp/hadoop-mapred/mapred/staging/hadoop/.staging/job_201107010928_0398/libjars/hadoop-mrunit-0.20.2-CDH3b2-SNAPSHOT.jar;uri=xx.xxx.xx.xx:9000;t.service=xx.xxx.xx.xx:9000
> >> > 11/08/30 22:59:53 INFO input.FileInputFormat: Total input paths to
> >> > process :
> >> > 1
> >> > 11/08/30 22:59:53 DEBUG mapreduce.ExportInputFormat: Target
> >> > numMapTasks=1
> >> > 11/08/30 22:59:53 DEBUG mapreduce.ExportInputFormat: Total input
> >> > bytes=18
> >> > 11/08/30 22:59:53 DEBUG mapreduce.ExportInputFormat: maxSplitSize=18
> >> > 11/08/30 22:59:53 INFO input.FileInputFormat: Total input paths to
> >> > process :
> >> > 1
> >> > 11/08/30 22:59:53 DEBUG mapreduce.ExportInputFormat: Generated splits:
> >> > 11/08/30 22:59:53 DEBUG mapreduce.ExportInputFormat:
> >> > Paths:/user/hadrdev/sqoop_test.txt:0+18 Locations:
> >> > 11/08/30 22:59:53 INFO mapred.JobClient: Running job:
> >> > job_201107010928_0398
> >> > 11/08/30 22:59:54 INFO mapred.JobClient:  map 0% reduce 0%
> >> > 11/08/30 23:00:01 INFO mapred.JobClient: Task Id :
> >> > attempt_201107010928_0398_m_000000_0, Status : FAILED
> >> > java.io.IOException: java.lang.NullPointerException
> >> >         at
> >> >
> >> >
> com.cloudera.sqoop.mapreduce.ExportOutputFormat.getRecordWriter(ExportOutputFormat.java:80)
> >> >         at
> >> >
> >> >
> org.apache.hadoop.mapred.MapTask$NewDirectOutputCollector.<init>(MapTask.java:520)
> >> >         at
> >> > org.apache.hadoop.mapred.MapTask.runNewMapper(MapTask.java:635)
> >> >         at org.apache.hadoop.mapred.MapTask.run(MapTask.java:322)
> >> >         at org.apache.hadoop.mapred.Child$4.run(Child.java:268)
> >> >         at java.security.AccessController.doPrivileged(Native Method)
> >> >         at javax.security.auth.Subject.doAs(Subject.java:396)
> >> >         at
> >> >
> >> >
> org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1115)
> >> >         at org.apache.hadoop.mapred.Child.main(Child.java:262)
> >> > Caused by: java.lang.NullPointerException
> >> >         at
> >> > com.teradata.tdgss.jtdgss.TdgssConfigApi.GetMechanisms(DashoA1*..)
> >> >         at com.teradata.tdgss.jtdgss.TdgssManager.<init>(DashoA1*..)
> >> >         at
> >> > com.teradata.tdgss.jtdgss.TdgssManager.getInstance(DashoA1*..)
> >> >         at
> >> >
> >> >
> com.teradata.jdbc.jdbc.GenericTeraEncrypt.getGSSM(GenericTeraEncrypt.java:612)
> >> >         at
> >> >
> >> >
> com.teradata.jdbc.jdbc.GenericTeraEncrypt.getConfig(GenericTeraEncrypt.java:630)
> >> >         at
> >> >
> >> >
> com.teradata.jdbc.jdbc.GenericTeraEncrypt.getUserNameForOid(GenericTeraEncrypt.java:723)
> >> >         at
> com.teradata.jdbc.AuthMechanism.<init>(AuthMechanism.java:50)
> >> >         at
> >> >
> >> >
> com.teradata.jdbc.jdbc.GenericInitDBConfigState.action(GenericInitDBConfigState.java:105)
> >> >         at
> >> >
> >> >
> com.teradata.jdbc.jdbc.GenericLogonController.run(GenericLogonController.java:49)
> >> >         at
> com.teradata.jdbc.jdbc_4.TDSession.<init>(TDSession.java:199)
> >> >         at
> >> >
> >> >
> com.teradata.jdbc.jdbc_3.ifjdbc_4.TeraLocalConnection.<init>(TeraLocalConnection.java:95)
> >> >         at
> >> >
> >> >
> com.teradata.jdbc.jdbc.ConnectionFactory.createConnection(ConnectionFactory.java:54)
> >> >         at com.teradata.jdbc.TeraDriver.doConnect(TeraDriver.java:217)
> >> >         at com.teradata.jdbc.TeraDriver.connect(TeraDriver.java:150)
> >> >         at
> java.sql.DriverManager.getConnection(DriverManager.java:582)
> >> >         at
> java.sql.DriverManager.getConnection(DriverManager.java:185)
> >> >         at
> >> >
> >> >
> com.cloudera.sqoop.mapreduce.db.DBConfiguration.getConnection(DBConfiguration.java:184)
> >> >         at
> >> >
> >> >
> com.cloudera.sqoop.mapreduce.AsyncSqlRecordWriter.<init>(AsyncSqlRecordWriter.java:73)
> >> >         at
> >> >
> >> >
> com.cloudera.sqoop.mapreduce.ExportOutputFormat$ExportRecordWriter.<init>(ExportOutputFormat.java:96)
> >> >         at
> >> >
> >> >
> com.cloudera.sqoop.mapreduce.ExportOutputFormat.getRecordWriter(ExportOutputFormat.java:78)
> >> >         ... 8 more
> >> >
> >> > attempt_201107010928_0398_m_000000_0: GSSException: Failure
> unspecified
> >> > at
> >> > GSS-API level (Mechanism level: UserFile parameter null)
> >> > attempt_201107010928_0398_m_000000_0:   at
> >> > com.teradata.tdgss.jtdgss.TdgssParseXml.<init>(DashoA1*..)
> >> > attempt_201107010928_0398_m_000000_0:   at
> >> > com.teradata.tdgss.jtdgss.TdgssConfigApi.<init>(DashoA1*..)
> >> > attempt_201107010928_0398_m_000000_0:   at
> >> > com.teradata.tdgss.jtdgss.TdgssManager.<init>(DashoA1*..)
> >> > attempt_201107010928_0398_m_000000_0:   at
> >> > com.teradata.tdgss.jtdgss.TdgssManager.getInstance(DashoA1*..)
> >> > attempt_201107010928_0398_m_000000_0:   at
> >> >
> >> >
> com.teradata.jdbc.jdbc.GenericTeraEncrypt.getGSSM(GenericTeraEncrypt.java:612)
> >> > attempt_201107010928_0398_m_000000_0:   at
> >> >
> >> >
> com.teradata.jdbc.jdbc.GenericTeraEncrypt.getConfig(GenericTeraEncrypt.java:630)
> >> > attempt_201107010928_0398_m_000000_0:   at
> >> > com.teradata.jdbc.jdbc.GenericTeraEncrypt.getUserNameForOid(
> >> > On Mon, Aug 29, 2011 at 2:19 PM, SRINIVAS SURASANI <va...@gmail.com>
> >> > wrote:
> >> >>
> >> >> Arvind,
> >> >>
> >> >> I have subscribed at sqoop-user@incubator.apache.org and posted
> >> >> Question .
> >> >> Sorry for the inconvenience from my end, since I'm close to deadline
> I'm
> >> >> taking your valuable time.
> >> >>
> >> >> sqoop list-tables --driver com.teradata.jdbc.TeraDriver --connect
> >> >> jdbc:teradata://PKTD/E1_CMS_WORK --username srini -P
> >> >> I am getting the following error:
> >> >> 11/08/29 13:08:03 INFO manager.SqlManager: Using default fetchSize of
> >> >> 1000
> >> >> GSSException: Failure unspecified at GSS-API level (Mechanism level:
> >> >> UserFile parameter null)
> >> >>         at com.teradata.tdgss.jtdgss.TdgssParseXml.<init>(DashoA1*..)
> >> >>         at
> com.teradata.tdgss.jtdgss.TdgssConfigApi.<init>(DashoA1*..)
> >> >>         at com.teradata.tdgss.jtdgss.TdgssManager.<init>(DashoA1*..)
> >> >>         at
> >> >> com.teradata.tdgss.jtdgss.TdgssManager.getInstance(DashoA1*..)
> >> >>         at
> >> >>
> >> >>
> com.teradata.jdbc.jdbc.GenericTeraEncrypt.getGSSM(GenericTeraEncrypt.java:612)
> >> >>         at
> >> >>
> >> >>
> com.teradata.jdbc.jdbc.GenericTeraEncrypt.getConfig(GenericTeraEncrypt.java:630)
> >> >>         at
> >> >>
> >> >>
> com.teradata.jdbc.jdbc.GenericTeraEncrypt.getUserNameForOid(GenericTeraEncrypt.java:723)
> >> >>         at
> >> >> com.teradata.jdbc.AuthMechanism.<init>(AuthMechanism.java:50)
> >> >>         at
> >> >>
> >> >>
> com.teradata.jdbc.jdbc.GenericInitDBConfigState.action(GenericInitDBConfigState.java:105)
> >> >>         at
> >> >>
> >> >>
> com.teradata.jdbc.jdbc.GenericLogonController.run(GenericLogonController.java:49)
> >> >>         at
> >> >> com.teradata.jdbc.jdbc_4.TDSession.<init>(TDSession.java:199)
> >> >>         at
> >> >>
> >> >>
> com.teradata.jdbc.jdbc_3.ifjdbc_4.TeraLocalConnection.<init>(TeraLocalConnection.java:95)
> >> >>         at
> >> >>
> >> >>
> com.teradata.jdbc.jdbc.ConnectionFactory.createConnection(ConnectionFactory.java:54)
> >> >>         at
> com.teradata.jdbc.TeraDriver.doConnect(TeraDriver.java:217)
> >> >>         at com.teradata.jdbc.TeraDriver.connect(TeraDriver.java:150)
> >> >>         at
> java.sql.DriverManager.getConnection(DriverManager.java:582)
> >> >>         at
> java.sql.DriverManager.getConnection(DriverManager.java:185)
> >> >>         at
> >> >>
> >> >>
> com.cloudera.sqoop.manager.SqlManager.makeConnection(SqlManager.java:643)
> >> >>         at
> >> >>
> >> >>
> com.cloudera.sqoop.manager.GenericJdbcManager.getConnection(GenericJdbcManager.java:51)
> >> >>         at
> >> >> com.cloudera.sqoop.manager.SqlManager.listTables(SqlManager.java:270)
> >> >>         at
> >> >> com.cloudera.sqoop.tool.ListTablesTool.run(ListTablesTool.java:49)
> >> >>         at com.cloudera.sqoop.Sqoop.run(Sqoop.java:144)
> >> >>         at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:65)
> >> >>         at com.cloudera.sqoop.Sqoop.runSqoop(Sqoop.java:180)
> >> >>         at com.cloudera.sqoop.Sqoop.runTool(Sqoop.java:219)
> >> >>         at com.cloudera.sqoop.Sqoop.runTool(Sqoop.java:228)
> >> >>         at com.cloudera.sqoop.Sqoop.main(Sqoop.java:237)
> >> >> 11/08/29 13:08:04 ERROR sqoop.Sqoop: Got exception running Sqoop:
> >> >> java.lang.NullPointerException
> >> >> java.lang.NullPointerException
> >> >>         at
> >> >> com.teradata.tdgss.jtdgss.TdgssConfigApi.GetMechanisms(DashoA1*..)
> >> >>         at com.teradata.tdgss.jtdgss.TdgssManager.<init>(DashoA1*..)
> >> >>         at
> >> >> com.teradata.tdgss.jtdgss.TdgssManager.getInstance(DashoA1*..)
> >> >>         at
> >> >>
> >> >>
> com.teradata.jdbc.jdbc.GenericTeraEncrypt.getGSSM(GenericTeraEncrypt.java:612)
> >> >>         at
> >> >>
> >> >>
> com.teradata.jdbc.jdbc.GenericTeraEncrypt.getConfig(GenericTeraEncrypt.java:630)
> >> >>         at
> >> >>
> >> >>
> com.teradata.jdbc.jdbc.GenericTeraEncrypt.getUserNameForOid(GenericTeraEncrypt.java:723)
> >> >>         at
> >> >> com.teradata.jdbc.AuthMechanism.<init>(AuthMechanism.java:50)
> >> >>         at
> >> >>
> >> >>
> com.teradata.jdbc.jdbc.GenericInitDBConfigState.action(GenericInitDBConfigState.java:105)
> >> >>         at
> >> >>
> >> >>
> com.teradata.jdbc.jdbc.GenericLogonController.run(GenericLogonController.java:49)
> >> >>         at
> >> >> com.teradata.jdbc.jdbc_4.TDSession.<init>(TDSession.java:199)
> >> >>         at
> >> >>
> >> >>
> com.teradata.jdbc.jdbc_3.ifjdbc_4.TeraLocalConnection.<init>(TeraLocalConnection.java:95)
> >> >>         at
> >> >>
> >> >>
> com.teradata.jdbc.jdbc.ConnectionFactory.createConnection(ConnectionFactory.java:54)
> >> >>         at
> com.teradata.jdbc.TeraDriver.doConnect(TeraDriver.java:217)
> >> >>         at com.teradata.jdbc.TeraDriver.connect(TeraDriver.java:150)
> >> >>         at
> java.sql.DriverManager.getConnection(DriverManager.java:582)
> >> >>         at
> java.sql.DriverManager.getConnection(DriverManager.java:185)
> >> >>         at
> >> >>
> >> >>
> com.cloudera.sqoop.manager.SqlManager.makeConnection(SqlManager.java:643)
> >> >>         at
> >> >>
> >> >>
> com.cloudera.sqoop.manager.GenericJdbcManager.getConnection(GenericJdbcManager.java:51)
> >> >>         at
> >> >> com.cloudera.sqoop.manager.SqlManager.listTables(SqlManager.java:270)
> >> >>         at
> >> >> com.cloudera.sqoop.tool.ListTablesTool.run(ListTablesTool.java:49)
> >> >>         at com.cloudera.sqoop.Sqoop.run(Sqoop.java:144)
> >> >>         at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:65)
> >> >>         at com.cloudera.sqoop.Sqoop.runSqoop(Sqoop.java:180)
> >> >>         at com.cloudera.sqoop.Sqoop.runTool(Sqoop.java:219)
> >> >>         at com.cloudera.sqoop.Sqoop.runTool(Sqoop.java:228)
> >> >>         at com.cloudera.sqoop.Sqoop.main(Sqoop.java:237)
> >> >> Thanks,
> >> >> Srini
> >> >>
> >> >>
> >> >>
> >> >> On Mon, Aug 29, 2011 at 12:58 PM, arvind@cloudera.com
> >> >> <ar...@cloudera.com> wrote:
> >> >>>
> >> >>> [Please subscribe and respond to sqoop-user@incubator.apache.org]
> >> >>>
> >> >>> Please use HADOOP_CLASSPATH instead of CLASSPATH. Also, in order to
> >> >>> use the generic JDBC connector, you will have to specify the driver
> >> >>> class explicitly via the command line option --driver
> >> >>> com.teradata.jdbc.TeraDriver.
> >> >>>
> >> >>> Thanks,
> >> >>> Arvind
> >> >>>
> >> >>> On Mon, Aug 29, 2011 at 9:53 AM, SRINIVAS SURASANI <
> vasajb@gmail.com>
> >> >>> wrote:
> >> >>> > Arvind,
> >> >>> > I have set the classpath to terajdbc4.jar [not placed the
> >> >>> > terajdbc4.jar in
> >> >>> > sqoop lib, as I don't have permissions].
> >> >>> > I'm getting the following error
> >> >>> >
> >> >>> > sqoop list-tables --connect jdbc:teradata://PKTD/E1_CMS_WORK
> >> >>> > --username
> >> >>> > srini -P
> >> >>> > ERROR: tool.BaseSqoopTool: Got error creating database manager:
> >> >>> > java.io.IOexception: No manager for connect string:
> >> >>> > jdbc:teradata:///PKTD/E1_CMS_WORK
> >> >>> >    at
> >> >>> > com.cloudera.sqoop.ConnFactory.getManager(ConnFactory.java:119)
> >> >>> >    at
> >> >>> > com.cloudera.sqoop.tool.BaseSqoopTool.init(BaseSqoopTool.java:186)
> >> >>> >    ...
> >> >>> >    ...
> >> >>> > Thanks,
> >> >>> > Srini
> >> >>> >
> >> >>> >
> >> >>> >
> >> >>> > On Mon, Aug 29, 2011 at 8:52 AM, SRINIVAS SURASANI
> >> >>> > <va...@gmail.com>
> >> >>> > wrote:
> >> >>> >>
> >> >>> >> Thanks-a-lot Arvind.
> >> >>> >>
> >> >>> >> On Mon, Aug 29, 2011 at 8:45 AM, arvind@cloudera.com
> >> >>> >> <ar...@cloudera.com>
> >> >>> >> wrote:
> >> >>> >>>
> >> >>> >>> [Moving the thread to sqoop-user@incubator.apache.org]
> >> >>> >>>
> >> >>> >>> Hi Srini,
> >> >>> >>>
> >> >>> >>> You should be able to use the generic JDBC connector to
> >> >>> >>> import/export
> >> >>> >>> from Teradata. There is also a specialized connector that is
> >> >>> >>> available
> >> >>> >>> for use with Teradata if you are interested. This connector is
> not
> >> >>> >>> a
> >> >>> >>> part of Sqoop and can be obtained from Cloudera by going to:
> >> >>> >>>
> >> >>> >>> http://www.cloudera.com/partners/connectors/
> >> >>> >>>
> >> >>> >>> Thanks,
> >> >>> >>> Arvind
> >> >>> >>>
> >> >>> >>> On Mon, Aug 29, 2011 at 8:17 AM, SRINIVAS SURASANI
> >> >>> >>> <va...@gmail.com>
> >> >>> >>> wrote:
> >> >>> >>> > I have csv file in hadoop and looking to load into Teradata. I
> >> >>> >>> > was
> >> >>> >>> > wondering does the sqoop works with Terradata.(with JDBC jar
> >> >>> >>> > placing
> >> >>> >>> > in sqoop lib dir).
> >> >>> >>> >
> >> >>> >>> > Regards
> >> >>> >>> > Srini
> >> >>> >>> >
> >> >>> >>> > --
> >> >>> >>> > NOTE: The mailing list sqoop-user@cloudera.org is deprecated
> in
> >> >>> >>> > favor
> >> >>> >>> > of Apache Sqoop mailing list sqoop-user@incubator.apache.org.
> >> >>> >>> > Please
> >> >>> >>> > subscribe to it by sending an email to
> >> >>> >>> > incubator-sqoop-user-subscribe@apache.org.
> >> >>> >>> >
> >> >>> >>>
> >> >>> >>
> >> >>> >
> >> >>> >
> >> >>>
> >> >>
> >> >
> >> >
> >
> >
>

Re: [sqoop-user] Sqoop-with Terradata

Posted by Arvind Prabhakar <ar...@apache.org>.
Please try specifying the extra Jar file using the -libjars argument. This
is a generic Hadoop argument that Sqoop passes down to the framework
and should allow the inclusion of other jar files in the classpath.
Note that this must be specified before any Sqoop specific argument is
given. For example:

$ bin/sqoop import -libjars /path/to/gssjar --connect "..."
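For the Teradata case in this thread that means shipping both driver jars. A minimal sketch of assembling the two forms (the /opt/teradata paths are illustrative stand-ins, not the poster's actual locations): note that -libjars takes a comma-separated list, while HADOOP_CLASSPATH on the client side is colon-separated.

```shell
# Both Teradata jars must reach the task nodes; paths here are assumptions.
JARS="/opt/teradata/terajdbc4.jar /opt/teradata/tdgssconfig.jar"

# -libjars wants commas; HADOOP_CLASSPATH wants colons.
LIBJARS=$(echo $JARS | tr ' ' ',')
export HADOOP_CLASSPATH=$(echo $JARS | tr ' ' ':')

echo "$LIBJARS"
# The actual invocation would then be (not runnable without a cluster):
# sqoop export -libjars "$LIBJARS" --driver com.teradata.jdbc.TeraDriver \
#   --connect jdbc:teradata://TD/DB --table DB.Temp_Table \
#   --export-dir /user/hadoop/sqoop_test.txt
```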

Thanks,
Arvind

On Thu, Sep 1, 2011 at 8:54 PM, SRINIVAS SURASANI <va...@gmail.com> wrote:
> Arvind,
>
> I understand the GSS config jar should be placed in the Sqoop lib
> directory. I was wondering whether there is any alternative way to achieve
> this [meaning, how terajdbc4.jar is added to the distributed cache
> automatically before launching Map-Reduce].
>
> Thanks,
> Srini
>
>
>
> On Wed, Aug 31, 2011 at 1:44 PM, Arvind Prabhakar <ar...@apache.org> wrote:
>>
>> Srini,
>>
>> This is happening because the GSS config Jar is not getting put in
>> Distributed Cache. Sqoop only puts certain jars in the cache as
>> opposed to putting every jar that exists in its classpath. In order to
>> force any Jar to be put in the Distributed Cache, you must copy it
>> over to Sqoop's lib directory.
>>
>> Thanks,
>> Arvind
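The workaround Arvind describes above amounts to staging the jar where Sqoop auto-ships its lib contents to the Distributed Cache. A safe-to-run sketch, using temp stand-ins for the real paths (normally the target is /usr/lib/sqoop/lib and the jar is the actual tdgssconfig.jar):

```shell
# Jars in Sqoop's lib directory are added to the Distributed Cache on every
# job submission. Temp stand-ins below keep the sketch runnable anywhere.
SQOOP_LIB=$(mktemp -d)            # stands in for /usr/lib/sqoop/lib
JAR=$(mktemp -d)/tdgssconfig.jar  # stands in for the real driver jar
touch "$JAR"

cp "$JAR" "$SQOOP_LIB/"
ls "$SQOOP_LIB"                   # prints: tdgssconfig.jar
```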
>>
>> On Tue, Aug 30, 2011 at 9:36 PM, SRINIVAS SURASANI <va...@gmail.com>
>> wrote:
>> > Getting the error below while exporting. From my observation, while
>> > compiling the generated .java file Sqoop sets the classpath for both
>> > terajdbc4.jar and tdgssconfig.jar (as I marked in bold letters below),
>> > but just before launching map-reduce it is not adding tdgssconfig.jar
>> > to the job classpath.
>> >  I set the HADOOP_CLASSPATH=<path to> terajdbc4.jar:<path
>> > to>tdgssconfig.jar
>> > Any Help Appreciated.
>> > $ sqoop export --verbose --driver com.teradata.jdbc.TeraDriver --connect
>> > jdbc:teradata://TD/DB --username WBD -P --table DB.Temp_Table
>> > --export-dir
>> > /user/hadoop/sqoop_test.txt --fields-terminated-by ,
>> > --lines-terminated-by
>> > \n -m 1>
>> > 11/08/30 22:59:43 DEBUG tool.BaseSqoopTool: Enabled debug logging.
>> > Enter password:
>> > 11/08/30 22:59:50 DEBUG sqoop.ConnFactory: Loaded manager factory:
>> > com.cloudera.sqoop.manager.DefaultManagerFactory
>> > 11/08/30 22:59:50 DEBUG sqoop.ConnFactory: Trying ManagerFactory:
>> > com.cloudera.sqoop.manager.DefaultManagerFactory
>> > 11/08/30 22:59:50 INFO manager.SqlManager: Using default fetchSize of
>> > 1000
>> > 11/08/30 22:59:50 DEBUG sqoop.ConnFactory: Instantiated ConnManager
>> > com.cloudera.sqoop.manager.GenericJdbcManager@2b76e552
>> > 11/08/30 22:59:50 INFO tool.CodeGenTool: Beginning code generation
>> > 11/08/30 22:59:51 DEBUG manager.SqlManager: Using fetchSize for next
>> > query:
>> > 1000
>> > 11/08/30 22:59:51 INFO manager.SqlManager: Executing SQL statement:
>> > SELECT
>> > t.* FROM DB.Temp_Table AS t WHERE 1=0
>> > 11/08/30 22:59:51 DEBUG manager.SqlManager: Using fetchSize for next
>> > query:
>> > 1000
>> > 11/08/30 22:59:51 INFO manager.SqlManager: Executing SQL statement:
>> > SELECT
>> > t.* FROM DB.Temp_Table AS t WHERE 1=0
>> > 11/08/30 22:59:51 DEBUG orm.ClassWriter: selected columns:
>> > 11/08/30 22:59:51 DEBUG orm.ClassWriter:   NAME
>> > 11/08/30 22:59:51 DEBUG orm.ClassWriter:   SALARY
>> > 11/08/30 22:59:51 DEBUG orm.ClassWriter: Writing source file:
>> >
>> > /tmp/sqoop-hadoop/compile/cc68f6e38603d6705ad75a3d8ea1bf3a/DB_Temp_Table.java
>> > 11/08/30 22:59:51 DEBUG orm.ClassWriter: Table name:DB.Temp_Table
>> > 11/08/30 22:59:51 DEBUG orm.ClassWriter: Columns: NAME:12, SALARY:3,
>> > 11/08/30 22:59:51 DEBUG orm.ClassWriter: sourceFilename is
>> > DB_Temp_Table.java
>> > 11/08/30 22:59:51 DEBUG orm.CompilationManager: Found existing
>> > /tmp/sqoop-hadoop/compile/cc68f6e38603d6705ad75a3d8ea1bf3a/
>> > 11/08/30 22:59:51 INFO orm.CompilationManager: HADOOP_HOME is
>> > /usr/lib/hadoop
>> > 11/08/30 22:59:51 INFO orm.CompilationManager: Found hadoop core jar at:
>> > /usr/lib/hadoop/hadoop-0.20.2-cdh3u0-core.jar
>> > 11/08/30 22:59:51 DEBUG orm.CompilationManager: Adding source file:
>> >
>> > /tmp/sqoop-haoop/compile/cc68f6e38603d6705ad75a3d8ea1bf3a/DB_Temp_Table.java
>> > 11/08/30 22:59:51 DEBUG orm.CompilationManager: Invoking javac with
>> > args:
>> > 11/08/30 22:59:51 DEBUG orm.CompilationManager:   -sourcepath
>> > 11/08/30 22:59:51 DEBUG orm.CompilationManager:
>> > /tmp/sqoop-hadoop/compile/cc68f6e38603d6705ad75a3d8ea1bf3a/
>> > 11/08/30 22:59:51 DEBUG orm.CompilationManager:   -d
>> > 11/08/30 22:59:51 DEBUG orm.CompilationManager:
>> > /tmp/sqoop-hadoop/compile/cc68f6e38603d6705ad75a3d8ea1bf3a/
>> > 11/08/30 22:59:51 DEBUG orm.CompilationManager:   -classpath
>> > 11/08/30 22:59:51 DEBUG orm.CompilationManager:
>> >
>> > /usr/lib/hadoop:/usr/lib/hadoop/hadoop-core-0.20.2-cdh3u0.jar:/usr/lib/hadoop/lib/ant-contrib-1.0b3.jar:/usr/lib/hadoop/lib/aspectjrt-1.6.5.jar:/usr/lib/hadoop/lib/aspectjtools-1.6.5.jar:/usr/lib/hadoop/lib/commons-cli-1.2.jar:/usr/lib/hadoop/lib/commons-codec-1.4.jar:/usr/lib/hadoop/lib/commons-daemon-1.0.1.jar:/usr/lib/hadoop/lib/commons-el-1.0.jar:/usr/lib/hadoop/lib/commons-httpclient-3.0.1.jar:/usr/lib/hadoop/lib/commons-logging-1.0.4.jar:/usr/lib/hadoop/lib/commons-logging-api-1.0.4.jar:/usr/lib/hadoop/lib/commons-net-1.4.1.jar:/usr/lib/hadoop/lib/core-3.1.1.jar:/usr/lib/hadoop/lib/elephant-bird-1.0.jar:/usr/lib/hadoop/lib/hadoop-fairscheduler-0.20.2-cdh3u0.jar:/usr/lib/hadoop/lib/hadoop-lzo-0.4.8.jar:/usr/lib/hadoop/lib/hsqldb-1.8.0.10.jar:/usr/lib/hadoop/lib/jackson-core-asl-1.5.2.jar:/usr/lib/hadoop/lib/jackson-mapper-asl-1.5.2.jar:/usr/lib/hadoop/lib/jasper-compiler-5.5.12.jar:/usr/lib/hadoop/lib/jasper-runtime-5.5.12.jar:/usr/lib/hadoop/lib/jets3t-0.6.1.jar:/usr/lib/hadoop/lib/jetty-6.1.26.jar:/usr/lib/hadoop/lib/jetty-servlet-tester-6.1.26.jar:/usr/lib/hadoop/lib/jetty-util-6.1.26.jar:/usr/lib/hadoop/lib/jsch-0.1.42.jar:/usr/lib/hadoop/lib/junit-4.5.jar:/usr/lib/hadoop/lib/kfs-0.2.2.jar:/usr/lib/hadoop/lib/log4j-1.2.15.jar:/usr/lib/hadoop/lib/mockito-all-1.8.2.jar:/usr/lib/hadoop/lib/oro-2.0.8.jar:/usr/lib/hadoop/lib/protobuf-java-2.3.0.jar:/usr/lib/hadoop/lib/servlet-api-2.5-20081211.jar:/usr/lib/hadoop/lib/servlet-api-2.5-6.1.14.jar:/usr/lib/hadoop/lib/slf4j-api-1.4.3.jar:/usr/lib/hadoop/lib/slf4j-api-1.5.8.jar:/usr/lib/hadoop/lib/slf4j-log4j12-1.4.3.jar:/usr/lib/hadoop/lib/slf4j-log4j12-1.5.10.jar:/usr/lib/hadoop/lib/xmlenc-0.52.jar:/usr/lib/hadoop/lib/yamlbeans-0.9.3.jar:/usr/lib/hadoop/lib/jsp-2.1/jsp-2.1.jar:/usr/lib/hadoop/lib/jsp-2.1/jsp-api-2.1.jar:/usr/lib/sqoop/conf:/usr/lib/hbase/conf::/usr/lib/sqoop/lib/ant-contrib-1.0b3.jar:/usr/lib/sqoop/lib/ant-eclipse-1.0-jvm1.2.jar:/usr/lib/sqoop/lib/commons-io-1.4.jar:/usr/lib/sqoop/lib/hadoop-mru
nit-0.20.2-CDH3b2-SNAPSHOT.jar:/usr/lib/sqoop/lib/ivy-2.0.0-rc2.jar:/usr/lib/sqoop/lib/sqljdbc4.jar:/usr/lib/hbase/hbase-0.90.1-cdh3u0-tests.jar:/usr/lib/hbase/hbase-0.90.1-cdh3u0.jar:/usr/lib/hbase/lib/activation-1.1.jar:/usr/lib/hbase/lib/asm-3.1.jar:/usr/lib/hbase/lib/avro-1.3.3.jar:/usr/lib/hbase/lib/commons-cli-1.2.jar:/usr/lib/hbase/lib/commons-codec-1.4.jar:/usr/lib/hbase/lib/commons-el-1.0.jar:/usr/lib/hbase/lib/commons-httpclient-3.1.jar:/usr/lib/hbase/lib/commons-lang-2.5.jar:/usr/lib/hbase/lib/commons-logging-1.1.1.jar:/usr/lib/hbase/lib/commons-net-1.4.1.jar:/usr/lib/hbase/lib/core-3.1.1.jar:/usr/lib/hbase/lib/guava-r06.jar:/usr/lib/hbase/lib/hadoop-core.jar:/usr/lib/hbase/lib/hbase-0.90.1-cdh3u0.jar:/usr/lib/hbase/lib/jackson-core-asl-1.5.2.jar:/usr/lib/hbase/lib/jackson-jaxrs-1.5.5.jar:/usr/lib/hbase/lib/jackson-mapper-asl-1.5.2.jar:/usr/lib/hbase/lib/jackson-xc-1.5.5.jar:/usr/lib/hbase/lib/jasper-compiler-5.5.23.jar:/usr/lib/hbase/lib/jasper-runtime-5.5.23.jar:/usr/lib/hbase/lib/jaxb-api-2.1.jar:/usr/lib/hbase/lib/jaxb-impl-2.1.12.jar:/usr/lib/hbase/lib/jersey-core-1.4.jar:/usr/lib/hbase/lib/jersey-json-1.4.jar:/usr/lib/hbase/lib/jersey-server-1.4.jar:/usr/lib/hbase/lib/jettison-1.1.jar:/usr/lib/hbase/lib/jetty-6.1.26.jar:/usr/lib/hbase/lib/jetty-util-6.1.26.jar:/usr/lib/hbase/lib/jruby-complete-1.0.3.jar:/usr/lib/hbase/lib/jsp-2.1-6.1.14.jar:/usr/lib/hbase/lib/jsp-api-2.1-6.1.14.jar:/usr/lib/hbase/lib/jsp-api-2.1.jar:/usr/lib/hbase/lib/jsr311-api-1.1.1.jar:/usr/lib/hbase/lib/log4j-1.2.16.jar:/usr/lib/hbase/lib/protobuf-java-2.3.0.jar:/usr/lib/hbase/lib/servlet-api-2.5-6.1.14.jar:/usr/lib/hbase/lib/servlet-api-2.5.jar:/usr/lib/hbase/lib/slf4j-api-1.5.8.jar:/usr/lib/hbase/lib/slf4j-log4j12-1.5.8.jar:/usr/lib/hbase/lib/stax-api-1.0.1.jar:/usr/lib/hbase/lib/thrift-0.2.0.jar:/usr/lib/hbase/lib/xmlenc-0.52.jar:/usr/lib/hbase/lib/zookeeper.jar:/usr/lib/zookeeper/zookeeper-3.3.3-cdh3u0.jar:/usr/lib/zookeeper/zookeeper.jar:/usr/lib/zookeeper/lib/jline-0.9.94.
jar:/usr/lib/zookeeper/lib/log4j-1.2.15.jar:/usr/lib/sqoop/sqoop-1.2.0-cdh3u0.jar:/usr/lib/sqoop/sqoop-test-1.2.0-cdh3u0.jar:<somepath>/lib/tdgssconfig.jar:<somepath>/lib/terajdbc4.jar:/usr/lib/hadoop/hadoop-0.20.2-cdh3u0-core.jar:/usr/lib/sqoop/sqoop-1.2.0-cdh3u0.jar
>> > Note:
>> >
>> > /tmp/sqoop-hadoop/compile/cc68f6e38603d6705ad75a3d8ea1bf3a/DB_Temp_Table.java
>> > uses or overrides a deprecated API.
>> > Note: Recompile with -Xlint:deprecation for details.
>> > 11/08/30 22:59:52 INFO orm.CompilationManager: Writing jar file:
>> >
>> > /tmp/sqoop-hadoop/compile/cc68f6e38603d6705ad75a3d8ea1bf3a/DB.Temp_Table.jar
>> > 11/08/30 22:59:52 DEBUG orm.CompilationManager: Scanning for .class
>> > files in
>> > directory: /tmp/sqoop-hadoop/compile/cc68f6e38603d6705ad75a3d8ea1bf3a
>> > 11/08/30 22:59:52 DEBUG orm.CompilationManager: Got classfile:
>> >
>> > /tmp/sqoop-hadoop/compile/cc68f6e38603d6705ad75a3d8ea1bf3a/DBTemp_Table.class
>> > -> DB_Temp_Table.class
>> > 11/08/30 22:59:52 DEBUG orm.CompilationManager: Finished writing jar
>> > file
>> >
>> > /tmp/sqoop-hadrdev/compile/cc68f6e38603d6705ad75a3d8ea1bf3a/DB.Temp_Table.jar
>> > 11/08/30 22:59:52 INFO mapreduce.ExportJobBase: Beginning export of
>> > DB.Temp_Table
>> > 11/08/30 22:59:52 DEBUG mapreduce.JobBase: Using InputFormat: class
>> > com.cloudera.sqoop.mapreduce.ExportInputFormat
>> > 11/08/30 22:59:52 DEBUG manager.SqlManager: Using fetchSize for next
>> > query:
>> > 1000
>> > 11/08/30 22:59:52 INFO manager.SqlManager: Executing SQL statement:
>> > SELECT
>> > t.* FROM DB.Temp_Table AS t WHERE 1=0
>> > 11/08/30 22:59:53 DEBUG mapreduce.JobBase: Adding to job classpath:
>> > file:/usr/lib/sqoop/sqoop-1.2.0-cdh3u0.jar
>> > 11/08/30 22:59:53 DEBUG mapreduce.JobBase: Adding to job classpath:
>> > file:<somepath>/lib/terajdbc4.jar
>> > 11/08/30 22:59:53 DEBUG mapreduce.JobBase: Adding to job classpath:
>> > file:/usr/lib/sqoop/sqoop-1.2.0-cdh3u0.jar
>> > 11/08/30 22:59:53 DEBUG mapreduce.JobBase: Adding to job classpath:
>> > file:/usr/lib/sqoop/sqoop-1.2.0-cdh3u0.jar
>> > 11/08/30 22:59:53 DEBUG mapreduce.JobBase: Adding to job classpath:
>> > file:/usr/lib/sqoop/lib/ivy-2.0.0-rc2.jar
>> > 11/08/30 22:59:53 DEBUG mapreduce.JobBase: Adding to job classpath:
>> > file:/usr/lib/sqoop/lib/commons-io-1.4.jar
>> > 11/08/30 22:59:53 DEBUG mapreduce.JobBase: Adding to job classpath:
>> > file:/usr/lib/sqoop/lib/ant-eclipse-1.0-jvm1.2.jar
>> > 11/08/30 22:59:53 DEBUG mapreduce.JobBase: Adding to job classpath:
>> > file:/usr/lib/sqoop/lib/sqljdbc4.jar
>> > 11/08/30 22:59:53 DEBUG mapreduce.JobBase: Adding to job classpath:
>> > file:/usr/lib/sqoop/lib/ant-contrib-1.0b3.jar
>> > 11/08/30 22:59:53 DEBUG mapreduce.JobBase: Adding to job classpath:
>> > file:/usr/lib/sqoop/lib/hadoop-mrunit-0.20.2-CDH3b2-SNAPSHOT.jar
>> > 11/08/30 22:59:53 INFO hdfs.DFSClient: Created HDFS_DELEGATION_TOKEN
>> > token
>> > 795 for hadoop
>> > 11/08/30 22:59:53 INFO security.TokenCache: Got dt for
>> >
>> > hdfs://<cname>:9000/tmp/hadoop-mapred/mapred/staging/hadoop/.staging/job_201107010928_0398/libjars/hadoop-mrunit-0.20.2-CDH3b2-SNAPSHOT.jar;uri=xx.xxx.xx.xx:9000;t.service=xx.xxx.xx.xx:9000
>> > 11/08/30 22:59:53 INFO input.FileInputFormat: Total input paths to
>> > process :
>> > 1
>> > 11/08/30 22:59:53 DEBUG mapreduce.ExportInputFormat: Target
>> > numMapTasks=1
>> > 11/08/30 22:59:53 DEBUG mapreduce.ExportInputFormat: Total input
>> > bytes=18
>> > 11/08/30 22:59:53 DEBUG mapreduce.ExportInputFormat: maxSplitSize=18
>> > 11/08/30 22:59:53 INFO input.FileInputFormat: Total input paths to
>> > process :
>> > 1
>> > 11/08/30 22:59:53 DEBUG mapreduce.ExportInputFormat: Generated splits:
>> > 11/08/30 22:59:53 DEBUG mapreduce.ExportInputFormat:
>> > Paths:/user/hadrdev/sqoop_test.txt:0+18 Locations:
>> > 11/08/30 22:59:53 INFO mapred.JobClient: Running job:
>> > job_201107010928_0398
>> > 11/08/30 22:59:54 INFO mapred.JobClient:  map 0% reduce 0%
>> > 11/08/30 23:00:01 INFO mapred.JobClient: Task Id :
>> > attempt_201107010928_0398_m_000000_0, Status : FAILED
>> > java.io.IOException: java.lang.NullPointerException
>> >         at
>> >
>> > com.cloudera.sqoop.mapreduce.ExportOutputFormat.getRecordWriter(ExportOutputFormat.java:80)
>> >         at
>> >
>> > org.apache.hadoop.mapred.MapTask$NewDirectOutputCollector.<init>(MapTask.java:520)
>> >         at
>> > org.apache.hadoop.mapred.MapTask.runNewMapper(MapTask.java:635)
>> >         at org.apache.hadoop.mapred.MapTask.run(MapTask.java:322)
>> >         at org.apache.hadoop.mapred.Child$4.run(Child.java:268)
>> >         at java.security.AccessController.doPrivileged(Native Method)
>> >         at javax.security.auth.Subject.doAs(Subject.java:396)
>> >         at
>> >
>> > org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1115)
>> >         at org.apache.hadoop.mapred.Child.main(Child.java:262)
>> > Caused by: java.lang.NullPointerException
>> >         at
>> > com.teradata.tdgss.jtdgss.TdgssConfigApi.GetMechanisms(DashoA1*..)
>> >         at com.teradata.tdgss.jtdgss.TdgssManager.<init>(DashoA1*..)
>> >         at
>> > com.teradata.tdgss.jtdgss.TdgssManager.getInstance(DashoA1*..)
>> >         at
>> >
>> > com.teradata.jdbc.jdbc.GenericTeraEncrypt.getGSSM(GenericTeraEncrypt.java:612)
>> >         at
>> >
>> > com.teradata.jdbc.jdbc.GenericTeraEncrypt.getConfig(GenericTeraEncrypt.java:630)
>> >         at
>> >
>> > com.teradata.jdbc.jdbc.GenericTeraEncrypt.getUserNameForOid(GenericTeraEncrypt.java:723)
>> >         at com.teradata.jdbc.AuthMechanism.<init>(AuthMechanism.java:50)
>> >         at
>> >
>> > com.teradata.jdbc.jdbc.GenericInitDBConfigState.action(GenericInitDBConfigState.java:105)
>> >         at
>> >
>> > com.teradata.jdbc.jdbc.GenericLogonController.run(GenericLogonController.java:49)
>> >         at com.teradata.jdbc.jdbc_4.TDSession.<init>(TDSession.java:199)
>> >         at
>> >
>> > com.teradata.jdbc.jdbc_3.ifjdbc_4.TeraLocalConnection.<init>(TeraLocalConnection.java:95)
>> >         at
>> >
>> > com.teradata.jdbc.jdbc.ConnectionFactory.createConnection(ConnectionFactory.java:54)
>> >         at com.teradata.jdbc.TeraDriver.doConnect(TeraDriver.java:217)
>> >         at com.teradata.jdbc.TeraDriver.connect(TeraDriver.java:150)
>> >         at java.sql.DriverManager.getConnection(DriverManager.java:582)
>> >         at java.sql.DriverManager.getConnection(DriverManager.java:185)
>> >         at
>> >
>> > com.cloudera.sqoop.mapreduce.db.DBConfiguration.getConnection(DBConfiguration.java:184)
>> >         at
>> >
>> > com.cloudera.sqoop.mapreduce.AsyncSqlRecordWriter.<init>(AsyncSqlRecordWriter.java:73)
>> >         at
>> >
>> > com.cloudera.sqoop.mapreduce.ExportOutputFormat$ExportRecordWriter.<init>(ExportOutputFormat.java:96)
>> >         at
>> >
>> > com.cloudera.sqoop.mapreduce.ExportOutputFormat.getRecordWriter(ExportOutputFormat.java:78)
>> >         ... 8 more
>> >
>> > attempt_201107010928_0398_m_000000_0: GSSException: Failure unspecified
>> > at
>> > GSS-API level (Mechanism level: UserFile parameter null)
>> > attempt_201107010928_0398_m_000000_0:   at
>> > com.teradata.tdgss.jtdgss.TdgssParseXml.<init>(DashoA1*..)
>> > attempt_201107010928_0398_m_000000_0:   at
>> > com.teradata.tdgss.jtdgss.TdgssConfigApi.<init>(DashoA1*..)
>> > attempt_201107010928_0398_m_000000_0:   at
>> > com.teradata.tdgss.jtdgss.TdgssManager.<init>(DashoA1*..)
>> > attempt_201107010928_0398_m_000000_0:   at
>> > com.teradata.tdgss.jtdgss.TdgssManager.getInstance(DashoA1*..)
>> > attempt_201107010928_0398_m_000000_0:   at
>> >
>> > com.teradata.jdbc.jdbc.GenericTeraEncrypt.getGSSM(GenericTeraEncrypt.java:612)
>> > attempt_201107010928_0398_m_000000_0:   at
>> >
>> > com.teradata.jdbc.jdbc.GenericTeraEncrypt.getConfig(GenericTeraEncrypt.java:630)
>> > attempt_201107010928_0398_m_000000_0:   at
>> > com.teradata.jdbc.jdbc.GenericTeraEncrypt.getUserNameForOid(
>> > On Mon, Aug 29, 2011 at 2:19 PM, SRINIVAS SURASANI <va...@gmail.com>
>> > wrote:
>> >>
>> >> Arvind,
>> >>
>> >> I have subscribed at sqoop-user@incubator.apache.org and posted the
>> >> question. Sorry for the inconvenience from my end; since I'm close to
>> >> a deadline, I'm taking up your valuable time.
>> >>
>> >> sqoop list-tables --driver com.teradata.jdbc.TeraDriver --connect
>> >> jdbc:teradata://PKTD/E1_CMS_WORK --username srini -P
>> >> I am getting the following error:
>> >> 11/08/29 13:08:03 INFO manager.SqlManager: Using default fetchSize of
>> >> 1000
>> >> GSSException: Failure unspecified at GSS-API level (Mechanism level:
>> >> UserFile parameter null)
>> >>         at com.teradata.tdgss.jtdgss.TdgssParseXml.<init>(DashoA1*..)
>> >>         at com.teradata.tdgss.jtdgss.TdgssConfigApi.<init>(DashoA1*..)
>> >>         at com.teradata.tdgss.jtdgss.TdgssManager.<init>(DashoA1*..)
>> >>         at
>> >> com.teradata.tdgss.jtdgss.TdgssManager.getInstance(DashoA1*..)
>> >>         at
>> >>
>> >> com.teradata.jdbc.jdbc.GenericTeraEncrypt.getGSSM(GenericTeraEncrypt.java:612)
>> >>         at
>> >>
>> >> com.teradata.jdbc.jdbc.GenericTeraEncrypt.getConfig(GenericTeraEncrypt.java:630)
>> >>         at
>> >>
>> >> com.teradata.jdbc.jdbc.GenericTeraEncrypt.getUserNameForOid(GenericTeraEncrypt.java:723)
>> >>         at
>> >> com.teradata.jdbc.AuthMechanism.<init>(AuthMechanism.java:50)
>> >>         at
>> >>
>> >> com.teradata.jdbc.jdbc.GenericInitDBConfigState.action(GenericInitDBConfigState.java:105)
>> >>         at
>> >>
>> >> com.teradata.jdbc.jdbc.GenericLogonController.run(GenericLogonController.java:49)
>> >>         at
>> >> com.teradata.jdbc.jdbc_4.TDSession.<init>(TDSession.java:199)
>> >>         at
>> >>
>> >> com.teradata.jdbc.jdbc_3.ifjdbc_4.TeraLocalConnection.<init>(TeraLocalConnection.java:95)
>> >>         at
>> >>
>> >> com.teradata.jdbc.jdbc.ConnectionFactory.createConnection(ConnectionFactory.java:54)
>> >>         at com.teradata.jdbc.TeraDriver.doConnect(TeraDriver.java:217)
>> >>         at com.teradata.jdbc.TeraDriver.connect(TeraDriver.java:150)
>> >>         at java.sql.DriverManager.getConnection(DriverManager.java:582)
>> >>         at java.sql.DriverManager.getConnection(DriverManager.java:185)
>> >>         at
>> >>
>> >> com.cloudera.sqoop.manager.SqlManager.makeConnection(SqlManager.java:643)
>> >>         at
>> >>
>> >> com.cloudera.sqoop.manager.GenericJdbcManager.getConnection(GenericJdbcManager.java:51)
>> >>         at
>> >> com.cloudera.sqoop.manager.SqlManager.listTables(SqlManager.java:270)
>> >>         at
>> >> com.cloudera.sqoop.tool.ListTablesTool.run(ListTablesTool.java:49)
>> >>         at com.cloudera.sqoop.Sqoop.run(Sqoop.java:144)
>> >>         at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:65)
>> >>         at com.cloudera.sqoop.Sqoop.runSqoop(Sqoop.java:180)
>> >>         at com.cloudera.sqoop.Sqoop.runTool(Sqoop.java:219)
>> >>         at com.cloudera.sqoop.Sqoop.runTool(Sqoop.java:228)
>> >>         at com.cloudera.sqoop.Sqoop.main(Sqoop.java:237)
>> >> 11/08/29 13:08:04 ERROR sqoop.Sqoop: Got exception running Sqoop:
>> >> java.lang.NullPointerException
>> >> java.lang.NullPointerException
>> >>         at
>> >> com.teradata.tdgss.jtdgss.TdgssConfigApi.GetMechanisms(DashoA1*..)
>> >>         at com.teradata.tdgss.jtdgss.TdgssManager.<init>(DashoA1*..)
>> >>         at
>> >> com.teradata.tdgss.jtdgss.TdgssManager.getInstance(DashoA1*..)
>> >>         at
>> >>
>> >> com.teradata.jdbc.jdbc.GenericTeraEncrypt.getGSSM(GenericTeraEncrypt.java:612)
>> >>         at
>> >>
>> >> com.teradata.jdbc.jdbc.GenericTeraEncrypt.getConfig(GenericTeraEncrypt.java:630)
>> >>         at
>> >>
>> >> com.teradata.jdbc.jdbc.GenericTeraEncrypt.getUserNameForOid(GenericTeraEncrypt.java:723)
>> >>         at
>> >> com.teradata.jdbc.AuthMechanism.<init>(AuthMechanism.java:50)
>> >>         at
>> >>
>> >> com.teradata.jdbc.jdbc.GenericInitDBConfigState.action(GenericInitDBConfigState.java:105)
>> >>         at
>> >>
>> >> com.teradata.jdbc.jdbc.GenericLogonController.run(GenericLogonController.java:49)
>> >>         at
>> >> com.teradata.jdbc.jdbc_4.TDSession.<init>(TDSession.java:199)
>> >>         at
>> >>
>> >> com.teradata.jdbc.jdbc_3.ifjdbc_4.TeraLocalConnection.<init>(TeraLocalConnection.java:95)
>> >>         at
>> >>
>> >> com.teradata.jdbc.jdbc.ConnectionFactory.createConnection(ConnectionFactory.java:54)
>> >>         at com.teradata.jdbc.TeraDriver.doConnect(TeraDriver.java:217)
>> >>         at com.teradata.jdbc.TeraDriver.connect(TeraDriver.java:150)
>> >>         at java.sql.DriverManager.getConnection(DriverManager.java:582)
>> >>         at java.sql.DriverManager.getConnection(DriverManager.java:185)
>> >>         at
>> >>
>> >> com.cloudera.sqoop.manager.SqlManager.makeConnection(SqlManager.java:643)
>> >>         at
>> >>
>> >> com.cloudera.sqoop.manager.GenericJdbcManager.getConnection(GenericJdbcManager.java:51)
>> >>         at
>> >> com.cloudera.sqoop.manager.SqlManager.listTables(SqlManager.java:270)
>> >>         at
>> >> com.cloudera.sqoop.tool.ListTablesTool.run(ListTablesTool.java:49)
>> >>         at com.cloudera.sqoop.Sqoop.run(Sqoop.java:144)
>> >>         at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:65)
>> >>         at com.cloudera.sqoop.Sqoop.runSqoop(Sqoop.java:180)
>> >>         at com.cloudera.sqoop.Sqoop.runTool(Sqoop.java:219)
>> >>         at com.cloudera.sqoop.Sqoop.runTool(Sqoop.java:228)
>> >>         at com.cloudera.sqoop.Sqoop.main(Sqoop.java:237)
>> >> Thanks,
>> >> Srini
>> >>
>> >>
>> >>
>> >> On Mon, Aug 29, 2011 at 12:58 PM, arvind@cloudera.com
>> >> <ar...@cloudera.com> wrote:
>> >>>
>> >>> [Please subscribe and respond to sqoop-user@incubator.apache.org]
>> >>>
>> >>> Please use HADOOP_CLASSPATH instead of CLASSPATH. Also, in order to
>> >>> use the generic JDBC connector, you will have to specify the driver
>> >>> class explicitly via the command line option --driver
>> >>> com.teradata.jdbc.TeraDriver.
>> >>>
>> >>> Thanks,
>> >>> Arvind
>> >>>
>> >>> On Mon, Aug 29, 2011 at 9:53 AM, SRINIVAS SURASANI <va...@gmail.com>
>> >>> wrote:
>> >>> > Arvind,
>> >>> > I have set the classpath to terajdbc4.jar [I have not placed
>> >>> > terajdbc4.jar in the sqoop lib directory, as I don't have
>> >>> > permissions].
>> >>> > I'm getting the following error
>> >>> >
>> >>> > sqoop list-tables --connect jdbc:teradata://PKTD/E1_CMS_WORK
>> >>> > --username
>> >>> > srini -P
>> >>> > ERROR: tool.BaseSqoopTool: Got error creating database manager:
>> >>> > java.io.IOException: No manager for connect string:
>> >>> > jdbc:teradata:///PKTD/E1_CMS_WORK
>> >>> >    at
>> >>> > com.cloudera.sqoop.ConnFactory.getManager(ConnFactory.java:119)
>> >>> >    at
>> >>> > com.cloudera.sqoop.tool.BaseSqoopTool.init(BaseSqoopTool.java:186)
>> >>> >    ...
>> >>> >    ...
>> >>> > Thanks,
>> >>> > Srini
>> >>> >
>> >>> >
>> >>> >
>> >>> > On Mon, Aug 29, 2011 at 8:52 AM, SRINIVAS SURASANI
>> >>> > <va...@gmail.com>
>> >>> > wrote:
>> >>> >>
>> >>> >> Thanks-a-lot Arvind.
>> >>> >>
>> >>> >> On Mon, Aug 29, 2011 at 8:45 AM, arvind@cloudera.com
>> >>> >> <ar...@cloudera.com>
>> >>> >> wrote:
>> >>> >>>
>> >>> >>> [Moving the thread to sqoop-user@incubator.apache.org]
>> >>> >>>
>> >>> >>> Hi Srini,
>> >>> >>>
>> >>> >>> You should be able to use the generic JDBC connector to
>> >>> >>> import/export
>> >>> >>> from Teradata. There is also a specialized connector that is
>> >>> >>> available
>> >>> >>> for use with Teradata if you are interested. This connector is not
>> >>> >>> a
>> >>> >>> part of Sqoop and can be obtained from Cloudera by going to:
>> >>> >>>
>> >>> >>> http://www.cloudera.com/partners/connectors/
>> >>> >>>
>> >>> >>> Thanks,
>> >>> >>> Arvind
>> >>> >>>
>> >>> >>> On Mon, Aug 29, 2011 at 8:17 AM, SRINIVAS SURASANI
>> >>> >>> <va...@gmail.com>
>> >>> >>> wrote:
>> >>> >>> > I have a csv file in hadoop and am looking to load it into
>> >>> >>> > Teradata. I was wondering whether sqoop works with Teradata
>> >>> >>> > (with the JDBC jar placed in the sqoop lib dir).
>> >>> >>> >
>> >>> >>> > Regards
>> >>> >>> > Srini
>> >>> >>> >
>> >>> >>> > --
>> >>> >>> > NOTE: The mailing list sqoop-user@cloudera.org is deprecated in
>> >>> >>> > favor
>> >>> >>> > of Apache Sqoop mailing list sqoop-user@incubator.apache.org.
>> >>> >>> > Please
>> >>> >>> > subscribe to it by sending an email to
>> >>> >>> > incubator-sqoop-user-subscribe@apache.org.
>> >>> >>> >
>> >>> >>>
>> >>> >>
>> >>> >
>> >>> >
>> >>>
>> >>
>> >
>> >
>
>

Re: [sqoop-user] Sqoop-with Terradata

Posted by SRINIVAS SURASANI <va...@gmail.com>.
Arvind,

I understand that the GSS config jar needs to be placed in Sqoop's lib
directory. I was wondering whether there is an alternative way to achieve
this [meaning: how does terajdbc4.jar get added to the distributed cache
automatically before the Map-Reduce job launches?].

Thanks,
Srini
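For reference, a minimal sketch of both approaches — copying the jar into Sqoop's lib directory, and (only if your Sqoop build forwards Hadoop's generic options) staging it per job. The paths below are illustrative assumptions, not the actual install locations from this thread:

```shell
# Copy the GSS config jar into Sqoop's lib directory so Sqoop ships it
# to the distributed cache with the job (requires write access there):
cp /opt/teradata/lib/tdgssconfig.jar /usr/lib/sqoop/lib/

# Hypothetical alternative, if your Sqoop version accepts Hadoop's
# generic -libjars option: stage the jar per job instead of installing it.
sqoop export -libjars /opt/teradata/lib/tdgssconfig.jar \
  --driver com.teradata.jdbc.TeraDriver \
  --connect jdbc:teradata://TD/DB --username WBD -P \
  --table DB.Temp_Table --export-dir /user/hadoop/sqoop_test.txt
```

The lib-directory copy is the route Arvind describes; the -libjars form depends on your Sqoop version passing generic options through to Hadoop, so verify it against your installation before relying on it.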




On Wed, Aug 31, 2011 at 1:44 PM, Arvind Prabhakar <ar...@apache.org> wrote:

> Srini,
>
> This is happening because the GSS config Jar is not getting put in
> Distributed Cache. Sqoop only puts certain jars in the cache as
> opposed to putting every jar that exists in its classpath. In order to
> force any Jar to be put in the Distributed Cache, you must copy it
> over to Sqoop's lib directory.
>
> Thanks,
> Arvind
>
> On Tue, Aug 30, 2011 at 9:36 PM, SRINIVAS SURASANI <va...@gmail.com>
> wrote:
> > I am getting the error below while exporting. From my observation, when
> > compiling the generated .java file the classpath includes both
> > terajdbc4.jar and tdgssconfig.jar (as I marked in bold letters below),
> > but just before launching map-reduce only terajdbc4.jar is added to the
> > job classpath, not tdgssconfig.jar.
> > I set HADOOP_CLASSPATH=<path to>terajdbc4.jar:<path to>tdgssconfig.jar
> > Any help appreciated.
> > $ sqoop export --verbose --driver com.teradata.jdbc.TeraDriver --connect
> > jdbc:teradata://TD/DB --username WBD -P --table DB.Temp_Table
> --export-dir
> > /user/hadoop/sqoop_test.txt --fields-terminated-by ,
> --lines-terminated-by
> > \n -m 1>
> > 11/08/30 22:59:43 DEBUG tool.BaseSqoopTool: Enabled debug logging.
> > Enter password:
> > 11/08/30 22:59:50 DEBUG sqoop.ConnFactory: Loaded manager factory:
> > com.cloudera.sqoop.manager.DefaultManagerFactory
> > 11/08/30 22:59:50 DEBUG sqoop.ConnFactory: Trying ManagerFactory:
> > com.cloudera.sqoop.manager.DefaultManagerFactory
> > 11/08/30 22:59:50 INFO manager.SqlManager: Using default fetchSize of
> 1000
> > 11/08/30 22:59:50 DEBUG sqoop.ConnFactory: Instantiated ConnManager
> > com.cloudera.sqoop.manager.GenericJdbcManager@2b76e552
> > 11/08/30 22:59:50 INFO tool.CodeGenTool: Beginning code generation
> > 11/08/30 22:59:51 DEBUG manager.SqlManager: Using fetchSize for next
> query:
> > 1000
> > 11/08/30 22:59:51 INFO manager.SqlManager: Executing SQL statement:
> SELECT
> > t.* FROM DB.Temp_Table AS t WHERE 1=0
> > 11/08/30 22:59:51 DEBUG manager.SqlManager: Using fetchSize for next
> query:
> > 1000
> > 11/08/30 22:59:51 INFO manager.SqlManager: Executing SQL statement:
> SELECT
> > t.* FROM DB.Temp_Table AS t WHERE 1=0
> > 11/08/30 22:59:51 DEBUG orm.ClassWriter: selected columns:
> > 11/08/30 22:59:51 DEBUG orm.ClassWriter:   NAME
> > 11/08/30 22:59:51 DEBUG orm.ClassWriter:   SALARY
> > 11/08/30 22:59:51 DEBUG orm.ClassWriter: Writing source file:
> >
> /tmp/sqoop-hadoop/compile/cc68f6e38603d6705ad75a3d8ea1bf3a/DB_Temp_Table.java
> > 11/08/30 22:59:51 DEBUG orm.ClassWriter: Table name:DB.Temp_Table
> > 11/08/30 22:59:51 DEBUG orm.ClassWriter: Columns: NAME:12, SALARY:3,
> > 11/08/30 22:59:51 DEBUG orm.ClassWriter: sourceFilename is
> > DB_Temp_Table.java
> > 11/08/30 22:59:51 DEBUG orm.CompilationManager: Found existing
> > /tmp/sqoop-hadoop/compile/cc68f6e38603d6705ad75a3d8ea1bf3a/
> > 11/08/30 22:59:51 INFO orm.CompilationManager: HADOOP_HOME is
> > /usr/lib/hadoop
> > 11/08/30 22:59:51 INFO orm.CompilationManager: Found hadoop core jar at:
> > /usr/lib/hadoop/hadoop-0.20.2-cdh3u0-core.jar
> > 11/08/30 22:59:51 DEBUG orm.CompilationManager: Adding source file:
> >
> /tmp/sqoop-haoop/compile/cc68f6e38603d6705ad75a3d8ea1bf3a/DB_Temp_Table.java
> > 11/08/30 22:59:51 DEBUG orm.CompilationManager: Invoking javac with args:
> > 11/08/30 22:59:51 DEBUG orm.CompilationManager:   -sourcepath
> > 11/08/30 22:59:51 DEBUG orm.CompilationManager:
> > /tmp/sqoop-hadoop/compile/cc68f6e38603d6705ad75a3d8ea1bf3a/
> > 11/08/30 22:59:51 DEBUG orm.CompilationManager:   -d
> > 11/08/30 22:59:51 DEBUG orm.CompilationManager:
> > /tmp/sqoop-hadoop/compile/cc68f6e38603d6705ad75a3d8ea1bf3a/
> > 11/08/30 22:59:51 DEBUG orm.CompilationManager:   -classpath
> > 11/08/30 22:59:51 DEBUG orm.CompilationManager:
> >
> /usr/lib/hadoop:/usr/lib/hadoop/hadoop-core-0.20.2-cdh3u0.jar:/usr/lib/hadoop/lib/ant-contrib-1.0b3.jar:/usr/lib/hadoop/lib/aspectjrt-1.6.5.jar:/usr/lib/hadoop/lib/aspectjtools-1.6.5.jar:/usr/lib/hadoop/lib/commons-cli-1.2.jar:/usr/lib/hadoop/lib/commons-codec-1.4.jar:/usr/lib/hadoop/lib/commons-daemon-1.0.1.jar:/usr/lib/hadoop/lib/commons-el-1.0.jar:/usr/lib/hadoop/lib/commons-httpclient-3.0.1.jar:/usr/lib/hadoop/lib/commons-logging-1.0.4.jar:/usr/lib/hadoop/lib/commons-logging-api-1.0.4.jar:/usr/lib/hadoop/lib/commons-net-1.4.1.jar:/usr/lib/hadoop/lib/core-3.1.1.jar:/usr/lib/hadoop/lib/elephant-bird-1.0.jar:/usr/lib/hadoop/lib/hadoop-fairscheduler-0.20.2-cdh3u0.jar:/usr/lib/hadoop/lib/hadoop-lzo-0.4.8.jar:/usr/lib/hadoop/lib/hsqldb-1.8.0.10.jar:/usr/lib/hadoop/lib/jackson-core-asl-1.5.2.jar:/usr/lib/hadoop/lib/jackson-mapper-asl-1.5.2.jar:/usr/lib/hadoop/lib/jasper-compiler-5.5.12.jar:/usr/lib/hadoop/lib/jasper-runtime-5.5.12.jar:/usr/lib/hadoop/lib/jets3t-0.6.1.jar:/usr/lib/hadoop/lib/jetty-6.1.26.jar:/usr/lib/hadoop/lib/jetty-servlet-tester-6.1.26.jar:/usr/lib/hadoop/lib/jetty-util-6.1.26.jar:/usr/lib/hadoop/lib/jsch-0.1.42.jar:/usr/lib/hadoop/lib/junit-4.5.jar:/usr/lib/hadoop/lib/kfs-0.2.2.jar:/usr/lib/hadoop/lib/log4j-1.2.15.jar:/usr/lib/hadoop/lib/mockito-all-1.8.2.jar:/usr/lib/hadoop/lib/oro-2.0.8.jar:/usr/lib/hadoop/lib/protobuf-java-2.3.0.jar:/usr/lib/hadoop/lib/servlet-api-2.5-20081211.jar:/usr/lib/hadoop/lib/servlet-api-2.5-6.1.14.jar:/usr/lib/hadoop/lib/slf4j-api-1.4.3.jar:/usr/lib/hadoop/lib/slf4j-api-1.5.8.jar:/usr/lib/hadoop/lib/slf4j-log4j12-1.4.3.jar:/usr/lib/hadoop/lib/slf4j-log4j12-1.5.10.jar:/usr/lib/hadoop/lib/xmlenc-0.52.jar:/usr/lib/hadoop/lib/yamlbeans-0.9.3.jar:/usr/lib/hadoop/lib/jsp-2.1/jsp-2.1.jar:/usr/lib/hadoop/lib/jsp-2.1/jsp-api-2.1.jar:/usr/lib/sqoop/conf:/usr/lib/hbase/conf::/usr/lib/sqoop/lib/ant-contrib-1.0b3.jar:/usr/lib/sqoop/lib/ant-eclipse-1.0-jvm1.2.jar:/usr/lib/sqoop/lib/commons-io-1.4.jar:/usr/lib/sqoop/lib/hadoop-mrunit
-0.20.2-CDH3b2-SNAPSHOT.jar:/usr/lib/sqoop/lib/ivy-2.0.0-rc2.jar:/usr/lib/sqoop/lib/sqljdbc4.jar:/usr/lib/hbase/hbase-0.90.1-cdh3u0-tests.jar:/usr/lib/hbase/hbase-0.90.1-cdh3u0.jar:/usr/lib/hbase/lib/activation-1.1.jar:/usr/lib/hbase/lib/asm-3.1.jar:/usr/lib/hbase/lib/avro-1.3.3.jar:/usr/lib/hbase/lib/commons-cli-1.2.jar:/usr/lib/hbase/lib/commons-codec-1.4.jar:/usr/lib/hbase/lib/commons-el-1.0.jar:/usr/lib/hbase/lib/commons-httpclient-3.1.jar:/usr/lib/hbase/lib/commons-lang-2.5.jar:/usr/lib/hbase/lib/commons-logging-1.1.1.jar:/usr/lib/hbase/lib/commons-net-1.4.1.jar:/usr/lib/hbase/lib/core-3.1.1.jar:/usr/lib/hbase/lib/guava-r06.jar:/usr/lib/hbase/lib/hadoop-core.jar:/usr/lib/hbase/lib/hbase-0.90.1-cdh3u0.jar:/usr/lib/hbase/lib/jackson-core-asl-1.5.2.jar:/usr/lib/hbase/lib/jackson-jaxrs-1.5.5.jar:/usr/lib/hbase/lib/jackson-mapper-asl-1.5.2.jar:/usr/lib/hbase/lib/jackson-xc-1.5.5.jar:/usr/lib/hbase/lib/jasper-compiler-5.5.23.jar:/usr/lib/hbase/lib/jasper-runtime-5.5.23.jar:/usr/lib/hbase/lib/jaxb-api-2.1.jar:/usr/lib/hbase/lib/jaxb-impl-2.1.12.jar:/usr/lib/hbase/lib/jersey-core-1.4.jar:/usr/lib/hbase/lib/jersey-json-1.4.jar:/usr/lib/hbase/lib/jersey-server-1.4.jar:/usr/lib/hbase/lib/jettison-1.1.jar:/usr/lib/hbase/lib/jetty-6.1.26.jar:/usr/lib/hbase/lib/jetty-util-6.1.26.jar:/usr/lib/hbase/lib/jruby-complete-1.0.3.jar:/usr/lib/hbase/lib/jsp-2.1-6.1.14.jar:/usr/lib/hbase/lib/jsp-api-2.1-6.1.14.jar:/usr/lib/hbase/lib/jsp-api-2.1.jar:/usr/lib/hbase/lib/jsr311-api-1.1.1.jar:/usr/lib/hbase/lib/log4j-1.2.16.jar:/usr/lib/hbase/lib/protobuf-java-2.3.0.jar:/usr/lib/hbase/lib/servlet-api-2.5-6.1.14.jar:/usr/lib/hbase/lib/servlet-api-2.5.jar:/usr/lib/hbase/lib/slf4j-api-1.5.8.jar:/usr/lib/hbase/lib/slf4j-log4j12-1.5.8.jar:/usr/lib/hbase/lib/stax-api-1.0.1.jar:/usr/lib/hbase/lib/thrift-0.2.0.jar:/usr/lib/hbase/lib/xmlenc-0.52.jar:/usr/lib/hbase/lib/zookeeper.jar:/usr/lib/zookeeper/zookeeper-3.3.3-cdh3u0.jar:/usr/lib/zookeeper/zookeeper.jar:/usr/lib/zookeeper/lib/jline-0.9.94.jar
:/usr/lib/zookeeper/lib/log4j-1.2.15.jar:/usr/lib/sqoop/sqoop-1.2.0-cdh3u0.jar:/usr/lib/sqoop/sqoop-test-1.2.0-cdh3u0.jar:<somepath>/lib/tdgssconfig.jar:<somepath>/lib/terajdbc4.jar:/usr/lib/hadoop/hadoop-0.20.2-cdh3u0-core.jar:/usr/lib/sqoop/sqoop-1.2.0-cdh3u0.jar
> > Note:
> >
> /tmp/sqoop-hadoop/compile/cc68f6e38603d6705ad75a3d8ea1bf3a/DB_Temp_Table.java
> > uses or overrides a deprecated API.
> > Note: Recompile with -Xlint:deprecation for details.
> > 11/08/30 22:59:52 INFO orm.CompilationManager: Writing jar file:
> >
> /tmp/sqoop-hadoop/compile/cc68f6e38603d6705ad75a3d8ea1bf3a/DB.Temp_Table.jar
> > 11/08/30 22:59:52 DEBUG orm.CompilationManager: Scanning for .class files
> in
> > directory: /tmp/sqoop-hadoop/compile/cc68f6e38603d6705ad75a3d8ea1bf3a
> > 11/08/30 22:59:52 DEBUG orm.CompilationManager: Got classfile:
> >
> /tmp/sqoop-hadoop/compile/cc68f6e38603d6705ad75a3d8ea1bf3a/DBTemp_Table.class
> > -> DB_Temp_Table.class
> > 11/08/30 22:59:52 DEBUG orm.CompilationManager: Finished writing jar file
> >
> /tmp/sqoop-hadrdev/compile/cc68f6e38603d6705ad75a3d8ea1bf3a/DB.Temp_Table.jar
> > 11/08/30 22:59:52 INFO mapreduce.ExportJobBase: Beginning export of
> > DB.Temp_Table
> > 11/08/30 22:59:52 DEBUG mapreduce.JobBase: Using InputFormat: class
> > com.cloudera.sqoop.mapreduce.ExportInputFormat
> > 11/08/30 22:59:52 DEBUG manager.SqlManager: Using fetchSize for next
> query:
> > 1000
> > 11/08/30 22:59:52 INFO manager.SqlManager: Executing SQL statement:
> SELECT
> > t.* FROM DB.Temp_Table AS t WHERE 1=0
> > 11/08/30 22:59:53 DEBUG mapreduce.JobBase: Adding to job classpath:
> > file:/usr/lib/sqoop/sqoop-1.2.0-cdh3u0.jar
> > 11/08/30 22:59:53 DEBUG mapreduce.JobBase: Adding to job classpath:
> > file:<somepath>/lib/terajdbc4.jar
> > 11/08/30 22:59:53 DEBUG mapreduce.JobBase: Adding to job classpath:
> > file:/usr/lib/sqoop/sqoop-1.2.0-cdh3u0.jar
> > 11/08/30 22:59:53 DEBUG mapreduce.JobBase: Adding to job classpath:
> > file:/usr/lib/sqoop/sqoop-1.2.0-cdh3u0.jar
> > 11/08/30 22:59:53 DEBUG mapreduce.JobBase: Adding to job classpath:
> > file:/usr/lib/sqoop/lib/ivy-2.0.0-rc2.jar
> > 11/08/30 22:59:53 DEBUG mapreduce.JobBase: Adding to job classpath:
> > file:/usr/lib/sqoop/lib/commons-io-1.4.jar
> > 11/08/30 22:59:53 DEBUG mapreduce.JobBase: Adding to job classpath:
> > file:/usr/lib/sqoop/lib/ant-eclipse-1.0-jvm1.2.jar
> > 11/08/30 22:59:53 DEBUG mapreduce.JobBase: Adding to job classpath:
> > file:/usr/lib/sqoop/lib/sqljdbc4.jar
> > 11/08/30 22:59:53 DEBUG mapreduce.JobBase: Adding to job classpath:
> > file:/usr/lib/sqoop/lib/ant-contrib-1.0b3.jar
> > 11/08/30 22:59:53 DEBUG mapreduce.JobBase: Adding to job classpath:
> > file:/usr/lib/sqoop/lib/hadoop-mrunit-0.20.2-CDH3b2-SNAPSHOT.jar
> > 11/08/30 22:59:53 INFO hdfs.DFSClient: Created HDFS_DELEGATION_TOKEN
> token
> > 795 for hadoop
> > 11/08/30 22:59:53 INFO security.TokenCache: Got dt for
> >
> hdfs://<cname>:9000/tmp/hadoop-mapred/mapred/staging/hadoop/.staging/job_201107010928_0398/libjars/hadoop-mrunit-0.20.2-CDH3b2-SNAPSHOT.jar;uri=xx.xxx.xx.xx:9000;t.service=xx.xxx.xx.xx:9000
> > 11/08/30 22:59:53 INFO input.FileInputFormat: Total input paths to
> process :
> > 1
> > 11/08/30 22:59:53 DEBUG mapreduce.ExportInputFormat: Target numMapTasks=1
> > 11/08/30 22:59:53 DEBUG mapreduce.ExportInputFormat: Total input bytes=18
> > 11/08/30 22:59:53 DEBUG mapreduce.ExportInputFormat: maxSplitSize=18
> > 11/08/30 22:59:53 INFO input.FileInputFormat: Total input paths to
> process :
> > 1
> > 11/08/30 22:59:53 DEBUG mapreduce.ExportInputFormat: Generated splits:
> > 11/08/30 22:59:53 DEBUG mapreduce.ExportInputFormat:
> > Paths:/user/hadrdev/sqoop_test.txt:0+18 Locations:
> > 11/08/30 22:59:53 INFO mapred.JobClient: Running job:
> job_201107010928_0398
> > 11/08/30 22:59:54 INFO mapred.JobClient:  map 0% reduce 0%
> > 11/08/30 23:00:01 INFO mapred.JobClient: Task Id :
> > attempt_201107010928_0398_m_000000_0, Status : FAILED
> > java.io.IOException: java.lang.NullPointerException
> >         at
> >
> com.cloudera.sqoop.mapreduce.ExportOutputFormat.getRecordWriter(ExportOutputFormat.java:80)
> >         at
> >
> org.apache.hadoop.mapred.MapTask$NewDirectOutputCollector.<init>(MapTask.java:520)
> >         at
> org.apache.hadoop.mapred.MapTask.runNewMapper(MapTask.java:635)
> >         at org.apache.hadoop.mapred.MapTask.run(MapTask.java:322)
> >         at org.apache.hadoop.mapred.Child$4.run(Child.java:268)
> >         at java.security.AccessController.doPrivileged(Native Method)
> >         at javax.security.auth.Subject.doAs(Subject.java:396)
> >         at
> >
> org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1115)
> >         at org.apache.hadoop.mapred.Child.main(Child.java:262)
> > Caused by: java.lang.NullPointerException
> >         at
> > com.teradata.tdgss.jtdgss.TdgssConfigApi.GetMechanisms(DashoA1*..)
> >         at com.teradata.tdgss.jtdgss.TdgssManager.<init>(DashoA1*..)
> >         at com.teradata.tdgss.jtdgss.TdgssManager.getInstance(DashoA1*..)
> >         at
> >
> com.teradata.jdbc.jdbc.GenericTeraEncrypt.getGSSM(GenericTeraEncrypt.java:612)
> >         at
> >
> com.teradata.jdbc.jdbc.GenericTeraEncrypt.getConfig(GenericTeraEncrypt.java:630)
> >         at
> >
> com.teradata.jdbc.jdbc.GenericTeraEncrypt.getUserNameForOid(GenericTeraEncrypt.java:723)
> >         at com.teradata.jdbc.AuthMechanism.<init>(AuthMechanism.java:50)
> >         at
> >
> com.teradata.jdbc.jdbc.GenericInitDBConfigState.action(GenericInitDBConfigState.java:105)
> >         at
> >
> com.teradata.jdbc.jdbc.GenericLogonController.run(GenericLogonController.java:49)
> >         at com.teradata.jdbc.jdbc_4.TDSession.<init>(TDSession.java:199)
> >         at
> >
> com.teradata.jdbc.jdbc_3.ifjdbc_4.TeraLocalConnection.<init>(TeraLocalConnection.java:95)
> >         at
> >
> com.teradata.jdbc.jdbc.ConnectionFactory.createConnection(ConnectionFactory.java:54)
> >         at com.teradata.jdbc.TeraDriver.doConnect(TeraDriver.java:217)
> >         at com.teradata.jdbc.TeraDriver.connect(TeraDriver.java:150)
> >         at java.sql.DriverManager.getConnection(DriverManager.java:582)
> >         at java.sql.DriverManager.getConnection(DriverManager.java:185)
> >         at
> >
> com.cloudera.sqoop.mapreduce.db.DBConfiguration.getConnection(DBConfiguration.java:184)
> >         at
> >
> com.cloudera.sqoop.mapreduce.AsyncSqlRecordWriter.<init>(AsyncSqlRecordWriter.java:73)
> >         at
> >
> com.cloudera.sqoop.mapreduce.ExportOutputFormat$ExportRecordWriter.<init>(ExportOutputFormat.java:96)
> >         at
> >
> com.cloudera.sqoop.mapreduce.ExportOutputFormat.getRecordWriter(ExportOutputFormat.java:78)
> >         ... 8 more
> >
> > attempt_201107010928_0398_m_000000_0: GSSException: Failure unspecified
> at
> > GSS-API level (Mechanism level: UserFile parameter null)
> > attempt_201107010928_0398_m_000000_0:   at
> > com.teradata.tdgss.jtdgss.TdgssParseXml.<init>(DashoA1*..)
> > attempt_201107010928_0398_m_000000_0:   at
> > com.teradata.tdgss.jtdgss.TdgssConfigApi.<init>(DashoA1*..)
> > attempt_201107010928_0398_m_000000_0:   at
> > com.teradata.tdgss.jtdgss.TdgssManager.<init>(DashoA1*..)
> > attempt_201107010928_0398_m_000000_0:   at
> > com.teradata.tdgss.jtdgss.TdgssManager.getInstance(DashoA1*..)
> > attempt_201107010928_0398_m_000000_0:   at
> >
> com.teradata.jdbc.jdbc.GenericTeraEncrypt.getGSSM(GenericTeraEncrypt.java:612)
> > attempt_201107010928_0398_m_000000_0:   at
> >
> com.teradata.jdbc.jdbc.GenericTeraEncrypt.getConfig(GenericTeraEncrypt.java:630)
> > attempt_201107010928_0398_m_000000_0:   at
> > com.teradata.jdbc.jdbc.GenericTeraEncrypt.getUserNameForOid(
> > On Mon, Aug 29, 2011 at 2:19 PM, SRINIVAS SURASANI <va...@gmail.com>
> wrote:
> >>
> >> Arvind,
> >>
> >> I have subscribed at sqoop-user@incubator.apache.org and posted the
> >> question. Sorry for the inconvenience from my end; since I'm close to
> >> a deadline, I'm taking up your valuable time.
> >>
> >> sqoop list-tables --driver com.teradata.jdbc.TeraDriver --connect
> >> jdbc:teradata://PKTD/E1_CMS_WORK --username srini -P
> >> I am getting the following error:
> >> 11/08/29 13:08:03 INFO manager.SqlManager: Using default fetchSize of
> 1000
> >> GSSException: Failure unspecified at GSS-API level (Mechanism level:
> >> UserFile parameter null)
> >>         at com.teradata.tdgss.jtdgss.TdgssParseXml.<init>(DashoA1*..)
> >>         at com.teradata.tdgss.jtdgss.TdgssConfigApi.<init>(DashoA1*..)
> >>         at com.teradata.tdgss.jtdgss.TdgssManager.<init>(DashoA1*..)
> >>         at
> com.teradata.tdgss.jtdgss.TdgssManager.getInstance(DashoA1*..)
> >>         at
> >>
> com.teradata.jdbc.jdbc.GenericTeraEncrypt.getGSSM(GenericTeraEncrypt.java:612)
> >>         at
> >>
> com.teradata.jdbc.jdbc.GenericTeraEncrypt.getConfig(GenericTeraEncrypt.java:630)
> >>         at
> >>
> com.teradata.jdbc.jdbc.GenericTeraEncrypt.getUserNameForOid(GenericTeraEncrypt.java:723)
> >>         at com.teradata.jdbc.AuthMechanism.<init>(AuthMechanism.java:50)
> >>         at
> >>
> com.teradata.jdbc.jdbc.GenericInitDBConfigState.action(GenericInitDBConfigState.java:105)
> >>         at
> >>
> com.teradata.jdbc.jdbc.GenericLogonController.run(GenericLogonController.java:49)
> >>         at com.teradata.jdbc.jdbc_4.TDSession.<init>(TDSession.java:199)
> >>         at
> >>
> com.teradata.jdbc.jdbc_3.ifjdbc_4.TeraLocalConnection.<init>(TeraLocalConnection.java:95)
> >>         at
> >>
> com.teradata.jdbc.jdbc.ConnectionFactory.createConnection(ConnectionFactory.java:54)
> >>         at com.teradata.jdbc.TeraDriver.doConnect(TeraDriver.java:217)
> >>         at com.teradata.jdbc.TeraDriver.connect(TeraDriver.java:150)
> >>         at java.sql.DriverManager.getConnection(DriverManager.java:582)
> >>         at java.sql.DriverManager.getConnection(DriverManager.java:185)
> >>         at
> >>
> com.cloudera.sqoop.manager.SqlManager.makeConnection(SqlManager.java:643)
> >>         at
> >>
> com.cloudera.sqoop.manager.GenericJdbcManager.getConnection(GenericJdbcManager.java:51)
> >>         at
> >> com.cloudera.sqoop.manager.SqlManager.listTables(SqlManager.java:270)
> >>         at
> >> com.cloudera.sqoop.tool.ListTablesTool.run(ListTablesTool.java:49)
> >>         at com.cloudera.sqoop.Sqoop.run(Sqoop.java:144)
> >>         at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:65)
> >>         at com.cloudera.sqoop.Sqoop.runSqoop(Sqoop.java:180)
> >>         at com.cloudera.sqoop.Sqoop.runTool(Sqoop.java:219)
> >>         at com.cloudera.sqoop.Sqoop.runTool(Sqoop.java:228)
> >>         at com.cloudera.sqoop.Sqoop.main(Sqoop.java:237)
> >> 11/08/29 13:08:04 ERROR sqoop.Sqoop: Got exception running Sqoop:
> >> java.lang.NullPointerException
> >> java.lang.NullPointerException
> >>         at
> >> com.teradata.tdgss.jtdgss.TdgssConfigApi.GetMechanisms(DashoA1*..)
> >>         at com.teradata.tdgss.jtdgss.TdgssManager.<init>(DashoA1*..)
> >>         at
> com.teradata.tdgss.jtdgss.TdgssManager.getInstance(DashoA1*..)
> >>         at
> >>
> com.teradata.jdbc.jdbc.GenericTeraEncrypt.getGSSM(GenericTeraEncrypt.java:612)
> >>         at
> >>
> com.teradata.jdbc.jdbc.GenericTeraEncrypt.getConfig(GenericTeraEncrypt.java:630)
> >>         at
> >>
> com.teradata.jdbc.jdbc.GenericTeraEncrypt.getUserNameForOid(GenericTeraEncrypt.java:723)
> >>         at com.teradata.jdbc.AuthMechanism.<init>(AuthMechanism.java:50)
> >>         at
> >>
> com.teradata.jdbc.jdbc.GenericInitDBConfigState.action(GenericInitDBConfigState.java:105)
> >>         at
> >>
> com.teradata.jdbc.jdbc.GenericLogonController.run(GenericLogonController.java:49)
> >>         at com.teradata.jdbc.jdbc_4.TDSession.<init>(TDSession.java:199)
> >>         at
> >>
> com.teradata.jdbc.jdbc_3.ifjdbc_4.TeraLocalConnection.<init>(TeraLocalConnection.java:95)
> >>         at
> >>
> com.teradata.jdbc.jdbc.ConnectionFactory.createConnection(ConnectionFactory.java:54)
> >>         at com.teradata.jdbc.TeraDriver.doConnect(TeraDriver.java:217)
> >>         at com.teradata.jdbc.TeraDriver.connect(TeraDriver.java:150)
> >>         at java.sql.DriverManager.getConnection(DriverManager.java:582)
> >>         at java.sql.DriverManager.getConnection(DriverManager.java:185)
> >>         at
> >>
> com.cloudera.sqoop.manager.SqlManager.makeConnection(SqlManager.java:643)
> >>         at
> >>
> com.cloudera.sqoop.manager.GenericJdbcManager.getConnection(GenericJdbcManager.java:51)
> >>         at
> >> com.cloudera.sqoop.manager.SqlManager.listTables(SqlManager.java:270)
> >>         at
> >> com.cloudera.sqoop.tool.ListTablesTool.run(ListTablesTool.java:49)
> >>         at com.cloudera.sqoop.Sqoop.run(Sqoop.java:144)
> >>         at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:65)
> >>         at com.cloudera.sqoop.Sqoop.runSqoop(Sqoop.java:180)
> >>         at com.cloudera.sqoop.Sqoop.runTool(Sqoop.java:219)
> >>         at com.cloudera.sqoop.Sqoop.runTool(Sqoop.java:228)
> >>         at com.cloudera.sqoop.Sqoop.main(Sqoop.java:237)
> >> Thanks,
> >> Srini
> >>
> >>
> >>
> >> On Mon, Aug 29, 2011 at 12:58 PM, arvind@cloudera.com
> >> <ar...@cloudera.com> wrote:
> >>>
> >>> [Please subscribe and respond to sqoop-user@incubator.apache.org]
> >>>
> >>> Please use HADOOP_CLASSPATH instead of CLASSPATH. Also, in order to
> >>> use the generic JDBC connector, you will have to specify the driver
> >>> class explicitly via the command line option --driver
> >>> com.teradata.jdbc.TeraDriver.
> >>>
> >>> Thanks,
> >>> Arvind
> >>>
> >>> On Mon, Aug 29, 2011 at 9:53 AM, SRINIVAS SURASANI <va...@gmail.com>
> >>> wrote:
> >>> > Arvind,
> >>> > I have set the classpath to terajdbc4.jar [not placed terajdbc4.jar in
> >>> > sqoop lib, as I don't have permissions].
> >>> > I'm getting the following error:
> >>> >
> >>> > sqoop list-tables --connect jdbc:teradata://PKTD/E1_CMS_WORK --username
> >>> > srini -P
> >>> > ERROR: tool.BaseSqoopTool: Got error creating database manager:
> >>> > java.io.IOexception: No manager for connect string:
> >>> > jdbc:teradata:///PKTD/E1_CMS_WORK
> >>> >    at com.cloudera.sqoop.ConnFactory.getManager(ConnFactory.java:119)
> >>> >    at
> >>> > com.cloudera.sqoop.tool.BaseSqoopTool.init(BaseSqoopTool.java:186)
> >>> >    ...
> >>> >    ...
> >>> > Thanks,
> >>> > Srini
> >>> >
> >>> >
> >>> >
> >>> > On Mon, Aug 29, 2011 at 8:52 AM, SRINIVAS SURASANI <vasajb@gmail.com>
> >>> > wrote:
> >>> >>
> >>> >> Thanks-a-lot Arvind.
> >>> >>
> >>> >> On Mon, Aug 29, 2011 at 8:45 AM, arvind@cloudera.com
> >>> >> <ar...@cloudera.com>
> >>> >> wrote:
> >>> >>>
> >>> >>> [Moving the thread to sqoop-user@incubator.apache.org]
> >>> >>>
> >>> >>> Hi Srini,
> >>> >>>
> >>> >>> You should be able to use the generic JDBC connector to import/export
> >>> >>> from Teradata. There is also a specialized connector that is
> >>> >>> available
> >>> >>> for use with Teradata if you are interested. This connector is not a
> >>> >>> part of Sqoop and can be obtained from Cloudera by going to:
> >>> >>>
> >>> >>> http://www.cloudera.com/partners/connectors/
> >>> >>>
> >>> >>> Thanks,
> >>> >>> Arvind
> >>> >>>
> >>> >>> On Mon, Aug 29, 2011 at 8:17 AM, SRINIVAS SURASANI <vasajb@gmail.com>
> >>> >>> wrote:
> >>> >>> > I have a CSV file in Hadoop and am looking to load it into Teradata.
> >>> >>> > I was wondering whether Sqoop works with Teradata (with the JDBC jar
> >>> >>> > placed in Sqoop's lib dir).
> >>> >>> >
> >>> >>> > Regards
> >>> >>> > Srini
> >>> >>> >
> >>> >>>
> >>> >>
> >>> >
> >>>
> >>
> >
> >
>

Re: [sqoop-user] Sqoop-with Terradata

Posted by Arvind Prabhakar <ar...@apache.org>.
Srini,

This is happening because the GSS config jar is not getting put in the
Distributed Cache. Sqoop only puts certain jars in the cache, as opposed
to every jar that exists in its classpath. In order to force any jar to
be put in the Distributed Cache, you must copy it over to Sqoop's lib
directory.
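Concretely, the fix looks like this. The snippet below is a self-contained
sketch: every path in it is a placeholder standing in for a real install
(by default Sqoop's lib dir is /usr/lib/sqoop/lib on CDH), and it uses a
temporary directory so it can be run anywhere.

```shell
# Sketch only -- all paths are placeholders for a real installation.
# Jars in Sqoop's lib directory are shipped via the Distributed Cache
# with every Sqoop job, which is exactly what the GSS config jar needs.
demo=$(mktemp -d)                      # stand-in filesystem for this demo
mkdir -p "$demo/teradata" "$demo/sqoop/lib"
touch "$demo/teradata/terajdbc4.jar" "$demo/teradata/tdgssconfig.jar"

# The actual fix: copy both Teradata jars into Sqoop's lib directory.
cp "$demo/teradata/terajdbc4.jar" "$demo/teradata/tdgssconfig.jar" \
   "$demo/sqoop/lib/"
ls "$demo/sqoop/lib"
```

On a real cluster the two `cp` lines against your Sqoop install directory
are all that is needed; the next Sqoop job will then stage both jars for
its map tasks.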

Thanks,
Arvind

On Tue, Aug 30, 2011 at 9:36 PM, SRINIVAS SURASANI <va...@gmail.com> wrote:
> Getting the error below while exporting. From my observation, compiling the
> .java file sets the classpath for both terajdbc4.jar and tdgssconfig.jar (as
> I marked in bold letters below), but just before launching map-reduce only
> terajdbc4.jar is added to the job classpath, not tdgssconfig.jar.
> I set HADOOP_CLASSPATH=<path to>terajdbc4.jar:<path to>tdgssconfig.jar
> Any help appreciated.
> $ sqoop export --verbose --driver com.teradata.jdbc.TeraDriver --connect
> jdbc:teradata://TD/DB --username WBD -P --table DB.Temp_Table --export-dir
> /user/hadoop/sqoop_test.txt --fields-terminated-by , --lines-terminated-by
> \n -m 1>
> 11/08/30 22:59:43 DEBUG tool.BaseSqoopTool: Enabled debug logging.
> Enter password:
> 11/08/30 22:59:50 DEBUG sqoop.ConnFactory: Loaded manager factory:
> com.cloudera.sqoop.manager.DefaultManagerFactory
> 11/08/30 22:59:50 DEBUG sqoop.ConnFactory: Trying ManagerFactory:
> com.cloudera.sqoop.manager.DefaultManagerFactory
> 11/08/30 22:59:50 INFO manager.SqlManager: Using default fetchSize of 1000
> 11/08/30 22:59:50 DEBUG sqoop.ConnFactory: Instantiated ConnManager
> com.cloudera.sqoop.manager.GenericJdbcManager@2b76e552
> 11/08/30 22:59:50 INFO tool.CodeGenTool: Beginning code generation
> 11/08/30 22:59:51 DEBUG manager.SqlManager: Using fetchSize for next query:
> 1000
> 11/08/30 22:59:51 INFO manager.SqlManager: Executing SQL statement: SELECT
> t.* FROM DB.Temp_Table AS t WHERE 1=0
> 11/08/30 22:59:51 DEBUG manager.SqlManager: Using fetchSize for next query:
> 1000
> 11/08/30 22:59:51 INFO manager.SqlManager: Executing SQL statement: SELECT
> t.* FROM DB.Temp_Table AS t WHERE 1=0
> 11/08/30 22:59:51 DEBUG orm.ClassWriter: selected columns:
> 11/08/30 22:59:51 DEBUG orm.ClassWriter:   NAME
> 11/08/30 22:59:51 DEBUG orm.ClassWriter:   SALARY
> 11/08/30 22:59:51 DEBUG orm.ClassWriter: Writing source file:
> /tmp/sqoop-hadoop/compile/cc68f6e38603d6705ad75a3d8ea1bf3a/DB_Temp_Table.java
> 11/08/30 22:59:51 DEBUG orm.ClassWriter: Table name:DB.Temp_Table
> 11/08/30 22:59:51 DEBUG orm.ClassWriter: Columns: NAME:12, SALARY:3,
> 11/08/30 22:59:51 DEBUG orm.ClassWriter: sourceFilename is
> DB_Temp_Table.java
> 11/08/30 22:59:51 DEBUG orm.CompilationManager: Found existing
> /tmp/sqoop-hadoop/compile/cc68f6e38603d6705ad75a3d8ea1bf3a/
> 11/08/30 22:59:51 INFO orm.CompilationManager: HADOOP_HOME is
> /usr/lib/hadoop
> 11/08/30 22:59:51 INFO orm.CompilationManager: Found hadoop core jar at:
> /usr/lib/hadoop/hadoop-0.20.2-cdh3u0-core.jar
> 11/08/30 22:59:51 DEBUG orm.CompilationManager: Adding source file:
> /tmp/sqoop-haoop/compile/cc68f6e38603d6705ad75a3d8ea1bf3a/DB_Temp_Table.java
> 11/08/30 22:59:51 DEBUG orm.CompilationManager: Invoking javac with args:
> 11/08/30 22:59:51 DEBUG orm.CompilationManager:   -sourcepath
> 11/08/30 22:59:51 DEBUG orm.CompilationManager:
> /tmp/sqoop-hadoop/compile/cc68f6e38603d6705ad75a3d8ea1bf3a/
> 11/08/30 22:59:51 DEBUG orm.CompilationManager:   -d
> 11/08/30 22:59:51 DEBUG orm.CompilationManager:
> /tmp/sqoop-hadoop/compile/cc68f6e38603d6705ad75a3d8ea1bf3a/
> 11/08/30 22:59:51 DEBUG orm.CompilationManager:   -classpath
> 11/08/30 22:59:51 DEBUG orm.CompilationManager:
> /usr/lib/hadoop:/usr/lib/hadoop/hadoop-core-0.20.2-cdh3u0.jar:/usr/lib/hadoop/lib/ant-contrib-1.0b3.jar:/usr/lib/hadoop/lib/aspectjrt-1.6.5.jar:/usr/lib/hadoop/lib/aspectjtools-1.6.5.jar:/usr/lib/hadoop/lib/commons-cli-1.2.jar:/usr/lib/hadoop/lib/commons-codec-1.4.jar:/usr/lib/hadoop/lib/commons-daemon-1.0.1.jar:/usr/lib/hadoop/lib/commons-el-1.0.jar:/usr/lib/hadoop/lib/commons-httpclient-3.0.1.jar:/usr/lib/hadoop/lib/commons-logging-1.0.4.jar:/usr/lib/hadoop/lib/commons-logging-api-1.0.4.jar:/usr/lib/hadoop/lib/commons-net-1.4.1.jar:/usr/lib/hadoop/lib/core-3.1.1.jar:/usr/lib/hadoop/lib/elephant-bird-1.0.jar:/usr/lib/hadoop/lib/hadoop-fairscheduler-0.20.2-cdh3u0.jar:/usr/lib/hadoop/lib/hadoop-lzo-0.4.8.jar:/usr/lib/hadoop/lib/hsqldb-1.8.0.10.jar:/usr/lib/hadoop/lib/jackson-core-asl-1.5.2.jar:/usr/lib/hadoop/lib/jackson-mapper-asl-1.5.2.jar:/usr/lib/hadoop/lib/jasper-compiler-5.5.12.jar:/usr/lib/hadoop/lib/jasper-runtime-5.5.12.jar:/usr/lib/hadoop/lib/jets3t-0.6.1.jar:/usr/lib/hadoop/lib/jetty-6.1.26.jar:/usr/lib/hadoop/lib/jetty-servlet-tester-6.1.26.jar:/usr/lib/hadoop/lib/jetty-util-6.1.26.jar:/usr/lib/hadoop/lib/jsch-0.1.42.jar:/usr/lib/hadoop/lib/junit-4.5.jar:/usr/lib/hadoop/lib/kfs-0.2.2.jar:/usr/lib/hadoop/lib/log4j-1.2.15.jar:/usr/lib/hadoop/lib/mockito-all-1.8.2.jar:/usr/lib/hadoop/lib/oro-2.0.8.jar:/usr/lib/hadoop/lib/protobuf-java-2.3.0.jar:/usr/lib/hadoop/lib/servlet-api-2.5-20081211.jar:/usr/lib/hadoop/lib/servlet-api-2.5-6.1.14.jar:/usr/lib/hadoop/lib/slf4j-api-1.4.3.jar:/usr/lib/hadoop/lib/slf4j-api-1.5.8.jar:/usr/lib/hadoop/lib/slf4j-log4j12-1.4.3.jar:/usr/lib/hadoop/lib/slf4j-log4j12-1.5.10.jar:/usr/lib/hadoop/lib/xmlenc-0.52.jar:/usr/lib/hadoop/lib/yamlbeans-0.9.3.jar:/usr/lib/hadoop/lib/jsp-2.1/jsp-2.1.jar:/usr/lib/hadoop/lib/jsp-2.1/jsp-api-2.1.jar:/usr/lib/sqoop/conf:/usr/lib/hbase/conf::/usr/lib/sqoop/lib/ant-contrib-1.0b3.jar:/usr/lib/sqoop/lib/ant-eclipse-1.0-jvm1.2.jar:/usr/lib/sqoop/lib/commons-io-1.4.jar:/usr/lib/sqoop/lib/hadoop-mrunit
-0.20.2-CDH3b2-SNAPSHOT.jar:/usr/lib/sqoop/lib/ivy-2.0.0-rc2.jar:/usr/lib/sqoop/lib/sqljdbc4.jar:/usr/lib/hbase/hbase-0.90.1-cdh3u0-tests.jar:/usr/lib/hbase/hbase-0.90.1-cdh3u0.jar:/usr/lib/hbase/lib/activation-1.1.jar:/usr/lib/hbase/lib/asm-3.1.jar:/usr/lib/hbase/lib/avro-1.3.3.jar:/usr/lib/hbase/lib/commons-cli-1.2.jar:/usr/lib/hbase/lib/commons-codec-1.4.jar:/usr/lib/hbase/lib/commons-el-1.0.jar:/usr/lib/hbase/lib/commons-httpclient-3.1.jar:/usr/lib/hbase/lib/commons-lang-2.5.jar:/usr/lib/hbase/lib/commons-logging-1.1.1.jar:/usr/lib/hbase/lib/commons-net-1.4.1.jar:/usr/lib/hbase/lib/core-3.1.1.jar:/usr/lib/hbase/lib/guava-r06.jar:/usr/lib/hbase/lib/hadoop-core.jar:/usr/lib/hbase/lib/hbase-0.90.1-cdh3u0.jar:/usr/lib/hbase/lib/jackson-core-asl-1.5.2.jar:/usr/lib/hbase/lib/jackson-jaxrs-1.5.5.jar:/usr/lib/hbase/lib/jackson-mapper-asl-1.5.2.jar:/usr/lib/hbase/lib/jackson-xc-1.5.5.jar:/usr/lib/hbase/lib/jasper-compiler-5.5.23.jar:/usr/lib/hbase/lib/jasper-runtime-5.5.23.jar:/usr/lib/hbase/lib/jaxb-api-2.1.jar:/usr/lib/hbase/lib/jaxb-impl-2.1.12.jar:/usr/lib/hbase/lib/jersey-core-1.4.jar:/usr/lib/hbase/lib/jersey-json-1.4.jar:/usr/lib/hbase/lib/jersey-server-1.4.jar:/usr/lib/hbase/lib/jettison-1.1.jar:/usr/lib/hbase/lib/jetty-6.1.26.jar:/usr/lib/hbase/lib/jetty-util-6.1.26.jar:/usr/lib/hbase/lib/jruby-complete-1.0.3.jar:/usr/lib/hbase/lib/jsp-2.1-6.1.14.jar:/usr/lib/hbase/lib/jsp-api-2.1-6.1.14.jar:/usr/lib/hbase/lib/jsp-api-2.1.jar:/usr/lib/hbase/lib/jsr311-api-1.1.1.jar:/usr/lib/hbase/lib/log4j-1.2.16.jar:/usr/lib/hbase/lib/protobuf-java-2.3.0.jar:/usr/lib/hbase/lib/servlet-api-2.5-6.1.14.jar:/usr/lib/hbase/lib/servlet-api-2.5.jar:/usr/lib/hbase/lib/slf4j-api-1.5.8.jar:/usr/lib/hbase/lib/slf4j-log4j12-1.5.8.jar:/usr/lib/hbase/lib/stax-api-1.0.1.jar:/usr/lib/hbase/lib/thrift-0.2.0.jar:/usr/lib/hbase/lib/xmlenc-0.52.jar:/usr/lib/hbase/lib/zookeeper.jar:/usr/lib/zookeeper/zookeeper-3.3.3-cdh3u0.jar:/usr/lib/zookeeper/zookeeper.jar:/usr/lib/zookeeper/lib/jline-0.9.94.jar
:/usr/lib/zookeeper/lib/log4j-1.2.15.jar:/usr/lib/sqoop/sqoop-1.2.0-cdh3u0.jar:/usr/lib/sqoop/sqoop-test-1.2.0-cdh3u0.jar:<somepath>/lib/tdgssconfig.jar:<somepath>/lib/terajdbc4.jar:/usr/lib/hadoop/hadoop-0.20.2-cdh3u0-core.jar:/usr/lib/sqoop/sqoop-1.2.0-cdh3u0.jar
> Note:
> /tmp/sqoop-hadoop/compile/cc68f6e38603d6705ad75a3d8ea1bf3a/DB_Temp_Table.java
> uses or overrides a deprecated API.
> Note: Recompile with -Xlint:deprecation for details.
> 11/08/30 22:59:52 INFO orm.CompilationManager: Writing jar file:
> /tmp/sqoop-hadoop/compile/cc68f6e38603d6705ad75a3d8ea1bf3a/DB.Temp_Table.jar
> 11/08/30 22:59:52 DEBUG orm.CompilationManager: Scanning for .class files in
> directory: /tmp/sqoop-hadoop/compile/cc68f6e38603d6705ad75a3d8ea1bf3a
> 11/08/30 22:59:52 DEBUG orm.CompilationManager: Got classfile:
> /tmp/sqoop-hadoop/compile/cc68f6e38603d6705ad75a3d8ea1bf3a/DBTemp_Table.class
> -> DB_Temp_Table.class
> 11/08/30 22:59:52 DEBUG orm.CompilationManager: Finished writing jar file
> /tmp/sqoop-hadrdev/compile/cc68f6e38603d6705ad75a3d8ea1bf3a/DB.Temp_Table.jar
> 11/08/30 22:59:52 INFO mapreduce.ExportJobBase: Beginning export of
> DB.Temp_Table
> 11/08/30 22:59:52 DEBUG mapreduce.JobBase: Using InputFormat: class
> com.cloudera.sqoop.mapreduce.ExportInputFormat
> 11/08/30 22:59:52 DEBUG manager.SqlManager: Using fetchSize for next query:
> 1000
> 11/08/30 22:59:52 INFO manager.SqlManager: Executing SQL statement: SELECT
> t.* FROM DB.Temp_Table AS t WHERE 1=0
> 11/08/30 22:59:53 DEBUG mapreduce.JobBase: Adding to job classpath:
> file:/usr/lib/sqoop/sqoop-1.2.0-cdh3u0.jar
> 11/08/30 22:59:53 DEBUG mapreduce.JobBase: Adding to job classpath:
> file:<somepath>/lib/terajdbc4.jar
> 11/08/30 22:59:53 DEBUG mapreduce.JobBase: Adding to job classpath:
> file:/usr/lib/sqoop/sqoop-1.2.0-cdh3u0.jar
> 11/08/30 22:59:53 DEBUG mapreduce.JobBase: Adding to job classpath:
> file:/usr/lib/sqoop/sqoop-1.2.0-cdh3u0.jar
> 11/08/30 22:59:53 DEBUG mapreduce.JobBase: Adding to job classpath:
> file:/usr/lib/sqoop/lib/ivy-2.0.0-rc2.jar
> 11/08/30 22:59:53 DEBUG mapreduce.JobBase: Adding to job classpath:
> file:/usr/lib/sqoop/lib/commons-io-1.4.jar
> 11/08/30 22:59:53 DEBUG mapreduce.JobBase: Adding to job classpath:
> file:/usr/lib/sqoop/lib/ant-eclipse-1.0-jvm1.2.jar
> 11/08/30 22:59:53 DEBUG mapreduce.JobBase: Adding to job classpath:
> file:/usr/lib/sqoop/lib/sqljdbc4.jar
> 11/08/30 22:59:53 DEBUG mapreduce.JobBase: Adding to job classpath:
> file:/usr/lib/sqoop/lib/ant-contrib-1.0b3.jar
> 11/08/30 22:59:53 DEBUG mapreduce.JobBase: Adding to job classpath:
> file:/usr/lib/sqoop/lib/hadoop-mrunit-0.20.2-CDH3b2-SNAPSHOT.jar
> 11/08/30 22:59:53 INFO hdfs.DFSClient: Created HDFS_DELEGATION_TOKEN token
> 795 for hadoop
> 11/08/30 22:59:53 INFO security.TokenCache: Got dt for
> hdfs://<cname>:9000/tmp/hadoop-mapred/mapred/staging/hadoop/.staging/job_201107010928_0398/libjars/hadoop-mrunit-0.20.2-CDH3b2-SNAPSHOT.jar;uri=xx.xxx.xx.xx:9000;t.service=xx.xxx.xx.xx:9000
> 11/08/30 22:59:53 INFO input.FileInputFormat: Total input paths to process :
> 1
> 11/08/30 22:59:53 DEBUG mapreduce.ExportInputFormat: Target numMapTasks=1
> 11/08/30 22:59:53 DEBUG mapreduce.ExportInputFormat: Total input bytes=18
> 11/08/30 22:59:53 DEBUG mapreduce.ExportInputFormat: maxSplitSize=18
> 11/08/30 22:59:53 INFO input.FileInputFormat: Total input paths to process :
> 1
> 11/08/30 22:59:53 DEBUG mapreduce.ExportInputFormat: Generated splits:
> 11/08/30 22:59:53 DEBUG mapreduce.ExportInputFormat:
> Paths:/user/hadrdev/sqoop_test.txt:0+18 Locations:
> 11/08/30 22:59:53 INFO mapred.JobClient: Running job: job_201107010928_0398
> 11/08/30 22:59:54 INFO mapred.JobClient:  map 0% reduce 0%
> 11/08/30 23:00:01 INFO mapred.JobClient: Task Id :
> attempt_201107010928_0398_m_000000_0, Status : FAILED
> java.io.IOException: java.lang.NullPointerException
>         at
> com.cloudera.sqoop.mapreduce.ExportOutputFormat.getRecordWriter(ExportOutputFormat.java:80)
>         at
> org.apache.hadoop.mapred.MapTask$NewDirectOutputCollector.<init>(MapTask.java:520)
>         at org.apache.hadoop.mapred.MapTask.runNewMapper(MapTask.java:635)
>         at org.apache.hadoop.mapred.MapTask.run(MapTask.java:322)
>         at org.apache.hadoop.mapred.Child$4.run(Child.java:268)
>         at java.security.AccessController.doPrivileged(Native Method)
>         at javax.security.auth.Subject.doAs(Subject.java:396)
>         at
> org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1115)
>         at org.apache.hadoop.mapred.Child.main(Child.java:262)
> Caused by: java.lang.NullPointerException
>         at
> com.teradata.tdgss.jtdgss.TdgssConfigApi.GetMechanisms(DashoA1*..)
>         at com.teradata.tdgss.jtdgss.TdgssManager.<init>(DashoA1*..)
>         at com.teradata.tdgss.jtdgss.TdgssManager.getInstance(DashoA1*..)
>         at
> com.teradata.jdbc.jdbc.GenericTeraEncrypt.getGSSM(GenericTeraEncrypt.java:612)
>         at
> com.teradata.jdbc.jdbc.GenericTeraEncrypt.getConfig(GenericTeraEncrypt.java:630)
>         at
> com.teradata.jdbc.jdbc.GenericTeraEncrypt.getUserNameForOid(GenericTeraEncrypt.java:723)
>         at com.teradata.jdbc.AuthMechanism.<init>(AuthMechanism.java:50)
>         at
> com.teradata.jdbc.jdbc.GenericInitDBConfigState.action(GenericInitDBConfigState.java:105)
>         at
> com.teradata.jdbc.jdbc.GenericLogonController.run(GenericLogonController.java:49)
>         at com.teradata.jdbc.jdbc_4.TDSession.<init>(TDSession.java:199)
>         at
> com.teradata.jdbc.jdbc_3.ifjdbc_4.TeraLocalConnection.<init>(TeraLocalConnection.java:95)
>         at
> com.teradata.jdbc.jdbc.ConnectionFactory.createConnection(ConnectionFactory.java:54)
>         at com.teradata.jdbc.TeraDriver.doConnect(TeraDriver.java:217)
>         at com.teradata.jdbc.TeraDriver.connect(TeraDriver.java:150)
>         at java.sql.DriverManager.getConnection(DriverManager.java:582)
>         at java.sql.DriverManager.getConnection(DriverManager.java:185)
>         at
> com.cloudera.sqoop.mapreduce.db.DBConfiguration.getConnection(DBConfiguration.java:184)
>         at
> com.cloudera.sqoop.mapreduce.AsyncSqlRecordWriter.<init>(AsyncSqlRecordWriter.java:73)
>         at
> com.cloudera.sqoop.mapreduce.ExportOutputFormat$ExportRecordWriter.<init>(ExportOutputFormat.java:96)
>         at
> com.cloudera.sqoop.mapreduce.ExportOutputFormat.getRecordWriter(ExportOutputFormat.java:78)
>         ... 8 more
>
> attempt_201107010928_0398_m_000000_0: GSSException: Failure unspecified at
> GSS-API level (Mechanism level: UserFile parameter null)
> attempt_201107010928_0398_m_000000_0:   at
> com.teradata.tdgss.jtdgss.TdgssParseXml.<init>(DashoA1*..)
> attempt_201107010928_0398_m_000000_0:   at
> com.teradata.tdgss.jtdgss.TdgssConfigApi.<init>(DashoA1*..)
> attempt_201107010928_0398_m_000000_0:   at
> com.teradata.tdgss.jtdgss.TdgssManager.<init>(DashoA1*..)
> attempt_201107010928_0398_m_000000_0:   at
> com.teradata.tdgss.jtdgss.TdgssManager.getInstance(DashoA1*..)
> attempt_201107010928_0398_m_000000_0:   at
> com.teradata.jdbc.jdbc.GenericTeraEncrypt.getGSSM(GenericTeraEncrypt.java:612)
> attempt_201107010928_0398_m_000000_0:   at
> com.teradata.jdbc.jdbc.GenericTeraEncrypt.getConfig(GenericTeraEncrypt.java:630)
> attempt_201107010928_0398_m_000000_0:   at
> com.teradata.jdbc.jdbc.GenericTeraEncrypt.getUserNameForOid(
> On Mon, Aug 29, 2011 at 2:19 PM, SRINIVAS SURASANI <va...@gmail.com> wrote:
>>
>> Arvind,
>>
>> I have subscribed at sqoop-user@incubator.apache.org and posted the question.
>> Sorry for the inconvenience from my end; I'm close to a deadline, so I
>> appreciate you taking the time.
>>
>> sqoop list-tables --driver com.teradata.jdbc.TeraDriver --connect
>> jdbc:teradata://PKTD/E1_CMS_WORK --username srini -P
>> I am getting the following error:
>> 11/08/29 13:08:03 INFO manager.SqlManager: Using default fetchSize of 1000
>> GSSException: Failure unspecified at GSS-API level (Mechanism level:
>> UserFile parameter null)
>>         at com.teradata.tdgss.jtdgss.TdgssParseXml.<init>(DashoA1*..)
>>         at com.teradata.tdgss.jtdgss.TdgssConfigApi.<init>(DashoA1*..)
>>         at com.teradata.tdgss.jtdgss.TdgssManager.<init>(DashoA1*..)
>>         at com.teradata.tdgss.jtdgss.TdgssManager.getInstance(DashoA1*..)
>>         at
>> com.teradata.jdbc.jdbc.GenericTeraEncrypt.getGSSM(GenericTeraEncrypt.java:612)
>>         at
>> com.teradata.jdbc.jdbc.GenericTeraEncrypt.getConfig(GenericTeraEncrypt.java:630)
>>         at
>> com.teradata.jdbc.jdbc.GenericTeraEncrypt.getUserNameForOid(GenericTeraEncrypt.java:723)
>>         at com.teradata.jdbc.AuthMechanism.<init>(AuthMechanism.java:50)
>>         at
>> com.teradata.jdbc.jdbc.GenericInitDBConfigState.action(GenericInitDBConfigState.java:105)
>>         at
>> com.teradata.jdbc.jdbc.GenericLogonController.run(GenericLogonController.java:49)
>>         at com.teradata.jdbc.jdbc_4.TDSession.<init>(TDSession.java:199)
>>         at
>> com.teradata.jdbc.jdbc_3.ifjdbc_4.TeraLocalConnection.<init>(TeraLocalConnection.java:95)
>>         at
>> com.teradata.jdbc.jdbc.ConnectionFactory.createConnection(ConnectionFactory.java:54)
>>         at com.teradata.jdbc.TeraDriver.doConnect(TeraDriver.java:217)
>>         at com.teradata.jdbc.TeraDriver.connect(TeraDriver.java:150)
>>         at java.sql.DriverManager.getConnection(DriverManager.java:582)
>>         at java.sql.DriverManager.getConnection(DriverManager.java:185)
>>         at
>> com.cloudera.sqoop.manager.SqlManager.makeConnection(SqlManager.java:643)
>>         at
>> com.cloudera.sqoop.manager.GenericJdbcManager.getConnection(GenericJdbcManager.java:51)
>>         at
>> com.cloudera.sqoop.manager.SqlManager.listTables(SqlManager.java:270)
>>         at
>> com.cloudera.sqoop.tool.ListTablesTool.run(ListTablesTool.java:49)
>>         at com.cloudera.sqoop.Sqoop.run(Sqoop.java:144)
>>         at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:65)
>>         at com.cloudera.sqoop.Sqoop.runSqoop(Sqoop.java:180)
>>         at com.cloudera.sqoop.Sqoop.runTool(Sqoop.java:219)
>>         at com.cloudera.sqoop.Sqoop.runTool(Sqoop.java:228)
>>         at com.cloudera.sqoop.Sqoop.main(Sqoop.java:237)
>> 11/08/29 13:08:04 ERROR sqoop.Sqoop: Got exception running Sqoop:
>> java.lang.NullPointerException
>> java.lang.NullPointerException
>>         at
>> com.teradata.tdgss.jtdgss.TdgssConfigApi.GetMechanisms(DashoA1*..)
>>         at com.teradata.tdgss.jtdgss.TdgssManager.<init>(DashoA1*..)
>>         at com.teradata.tdgss.jtdgss.TdgssManager.getInstance(DashoA1*..)
>>         at
>> com.teradata.jdbc.jdbc.GenericTeraEncrypt.getGSSM(GenericTeraEncrypt.java:612)
>>         at
>> com.teradata.jdbc.jdbc.GenericTeraEncrypt.getConfig(GenericTeraEncrypt.java:630)
>>         at
>> com.teradata.jdbc.jdbc.GenericTeraEncrypt.getUserNameForOid(GenericTeraEncrypt.java:723)
>>         at com.teradata.jdbc.AuthMechanism.<init>(AuthMechanism.java:50)
>>         at
>> com.teradata.jdbc.jdbc.GenericInitDBConfigState.action(GenericInitDBConfigState.java:105)
>>         at
>> com.teradata.jdbc.jdbc.GenericLogonController.run(GenericLogonController.java:49)
>>         at com.teradata.jdbc.jdbc_4.TDSession.<init>(TDSession.java:199)
>>         at
>> com.teradata.jdbc.jdbc_3.ifjdbc_4.TeraLocalConnection.<init>(TeraLocalConnection.java:95)
>>         at
>> com.teradata.jdbc.jdbc.ConnectionFactory.createConnection(ConnectionFactory.java:54)
>>         at com.teradata.jdbc.TeraDriver.doConnect(TeraDriver.java:217)
>>         at com.teradata.jdbc.TeraDriver.connect(TeraDriver.java:150)
>>         at java.sql.DriverManager.getConnection(DriverManager.java:582)
>>         at java.sql.DriverManager.getConnection(DriverManager.java:185)
>>         at
>> com.cloudera.sqoop.manager.SqlManager.makeConnection(SqlManager.java:643)
>>         at
>> com.cloudera.sqoop.manager.GenericJdbcManager.getConnection(GenericJdbcManager.java:51)
>>         at
>> com.cloudera.sqoop.manager.SqlManager.listTables(SqlManager.java:270)
>>         at
>> com.cloudera.sqoop.tool.ListTablesTool.run(ListTablesTool.java:49)
>>         at com.cloudera.sqoop.Sqoop.run(Sqoop.java:144)
>>         at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:65)
>>         at com.cloudera.sqoop.Sqoop.runSqoop(Sqoop.java:180)
>>         at com.cloudera.sqoop.Sqoop.runTool(Sqoop.java:219)
>>         at com.cloudera.sqoop.Sqoop.runTool(Sqoop.java:228)
>>         at com.cloudera.sqoop.Sqoop.main(Sqoop.java:237)
>> Thanks,
>> Srini
>>
>>
>>
>> On Mon, Aug 29, 2011 at 12:58 PM, arvind@cloudera.com
>> <ar...@cloudera.com> wrote:
>>>
>>> [Please subscribe and respond to sqoop-user@incubator.apache.org]
>>>
>>> Please use HADOOP_CLASSPATH instead of CLASSPATH. Also, in order to
>>> use the generic JDBC connector, you will have to specify the driver
>>> class explicitly via the command line option --driver
>>> com.teradata.jdbc.TeraDriver.
>>>
>>> Thanks,
>>> Arvind
>>>
>>> On Mon, Aug 29, 2011 at 9:53 AM, SRINIVAS SURASANI <va...@gmail.com>
>>> wrote:
>>> > Arvind,
>>> > I have set the classpath to terajdbc4.jar [not placed terajdbc4.jar in
>>> > sqoop lib, as I don't have permissions].
>>> > I'm getting the following error:
>>> >
>>> > sqoop list-tables --connect jdbc:teradata://PKTD/E1_CMS_WORK --username
>>> > srini -P
>>> > ERROR: tool.BaseSqoopTool: Got error creating database manager:
>>> > java.io.IOexception: No manager for connect string:
>>> > jdbc:teradata:///PKTD/E1_CMS_WORK
>>> >    at com.cloudera.sqoop.ConnFactory.getManager(ConnFactory.java:119)
>>> >    at
>>> > com.cloudera.sqoop.tool.BaseSqoopTool.init(BaseSqoopTool.java:186)
>>> >    ...
>>> >    ...
>>> > Thanks,
>>> > Srini
>>> >
>>> >
>>> >
>>> > On Mon, Aug 29, 2011 at 8:52 AM, SRINIVAS SURASANI <va...@gmail.com>
>>> > wrote:
>>> >>
>>> >> Thanks-a-lot Arvind.
>>> >>
>>> >> On Mon, Aug 29, 2011 at 8:45 AM, arvind@cloudera.com
>>> >> <ar...@cloudera.com>
>>> >> wrote:
>>> >>>
>>> >>> [Moving the thread to sqoop-user@incubator.apache.org]
>>> >>>
>>> >>> Hi Srini,
>>> >>>
>>> >>> You should be able to use the generic JDBC connector to import/export
>>> >>> from Teradata. There is also a specialized connector that is
>>> >>> available
>>> >>> for use with Teradata if you are interested. This connector is not a
>>> >>> part of Sqoop and can be obtained from Cloudera by going to:
>>> >>>
>>> >>> http://www.cloudera.com/partners/connectors/
>>> >>>
>>> >>> Thanks,
>>> >>> Arvind
>>> >>>
>>> >>> On Mon, Aug 29, 2011 at 8:17 AM, SRINIVAS SURASANI <va...@gmail.com>
>>> >>> wrote:
>>> >>> > I have a CSV file in Hadoop and am looking to load it into Teradata.
>>> >>> > I was wondering whether Sqoop works with Teradata (with the JDBC jar
>>> >>> > placed in Sqoop's lib dir).
>>> >>> >
>>> >>> > Regards
>>> >>> > Srini
>>> >>> >
>>> >>>
>>> >>
>>> >
>>> >
>>>
>>
>
>

Re: [sqoop-user] Sqoop-with Terradata

Posted by SRINIVAS SURASANI <va...@gmail.com>.
Getting the error below while exporting. From my observation, compiling the
.java file sets the classpath for both terajdbc4.jar and tdgssconfig.jar (as I
marked in bold letters below), but just before launching map-reduce only
terajdbc4.jar is added to the job classpath, not tdgssconfig.jar.
I set HADOOP_CLASSPATH=<path to>terajdbc4.jar:<path to>tdgssconfig.jar

Any help appreciated.
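For reference, a minimal sketch of the client-side environment (the two
paths below are placeholders for wherever the Teradata jars actually live).
Note that HADOOP_CLASSPATH only affects the local launch JVM, not the map
tasks:

```shell
# Placeholder paths -- adjust to your install. HADOOP_CLASSPATH is seen by
# the client-side compile/launch JVM only; it does NOT put these jars into
# the Distributed Cache, so map tasks will not see them this way.
export HADOOP_CLASSPATH=/path/to/terajdbc4.jar:/path/to/tdgssconfig.jar
echo "$HADOOP_CLASSPATH"
```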

$ sqoop export --verbose --driver com.teradata.jdbc.TeraDriver --connect
jdbc:teradata://TD/DB --username WBD -P --table DB.Temp_Table --export-dir
/user/hadoop/sqoop_test.txt --fields-terminated-by , --lines-terminated-by
\n -m 1>
11/08/30 22:59:43 DEBUG tool.BaseSqoopTool: Enabled debug logging.
Enter password:
11/08/30 22:59:50 DEBUG sqoop.ConnFactory: Loaded manager factory:
com.cloudera.sqoop.manager.DefaultManagerFactory
11/08/30 22:59:50 DEBUG sqoop.ConnFactory: Trying ManagerFactory:
com.cloudera.sqoop.manager.DefaultManagerFactory
11/08/30 22:59:50 INFO manager.SqlManager: Using default fetchSize of 1000
11/08/30 22:59:50 DEBUG sqoop.ConnFactory: Instantiated ConnManager
com.cloudera.sqoop.manager.GenericJdbcManager@2b76e552
11/08/30 22:59:50 INFO tool.CodeGenTool: Beginning code generation
11/08/30 22:59:51 DEBUG manager.SqlManager: Using fetchSize for next query:
1000
11/08/30 22:59:51 INFO manager.SqlManager: Executing SQL statement: SELECT
t.* FROM DB.Temp_Table AS t WHERE 1=0
11/08/30 22:59:51 DEBUG manager.SqlManager: Using fetchSize for next query:
1000
11/08/30 22:59:51 INFO manager.SqlManager: Executing SQL statement: SELECT
t.* FROM DB.Temp_Table AS t WHERE 1=0
11/08/30 22:59:51 DEBUG orm.ClassWriter: selected columns:
11/08/30 22:59:51 DEBUG orm.ClassWriter:   NAME
11/08/30 22:59:51 DEBUG orm.ClassWriter:   SALARY
11/08/30 22:59:51 DEBUG orm.ClassWriter: Writing source file:
/tmp/sqoop-hadoop/compile/cc68f6e38603d6705ad75a3d8ea1bf3a/DB_Temp_Table.java
11/08/30 22:59:51 DEBUG orm.ClassWriter: Table name:DB.Temp_Table
11/08/30 22:59:51 DEBUG orm.ClassWriter: Columns: NAME:12, SALARY:3,
11/08/30 22:59:51 DEBUG orm.ClassWriter: sourceFilename is
DB_Temp_Table.java
11/08/30 22:59:51 DEBUG orm.CompilationManager: Found existing
/tmp/sqoop-hadoop/compile/cc68f6e38603d6705ad75a3d8ea1bf3a/
11/08/30 22:59:51 INFO orm.CompilationManager: HADOOP_HOME is
/usr/lib/hadoop
11/08/30 22:59:51 INFO orm.CompilationManager: Found hadoop core jar at:
/usr/lib/hadoop/hadoop-0.20.2-cdh3u0-core.jar
11/08/30 22:59:51 DEBUG orm.CompilationManager: Adding source file:
/tmp/sqoop-haoop/compile/cc68f6e38603d6705ad75a3d8ea1bf3a/DB_Temp_Table.java
11/08/30 22:59:51 DEBUG orm.CompilationManager: Invoking javac with args:
11/08/30 22:59:51 DEBUG orm.CompilationManager:   -sourcepath
11/08/30 22:59:51 DEBUG orm.CompilationManager:
/tmp/sqoop-hadoop/compile/cc68f6e38603d6705ad75a3d8ea1bf3a/
11/08/30 22:59:51 DEBUG orm.CompilationManager:   -d
11/08/30 22:59:51 DEBUG orm.CompilationManager:
/tmp/sqoop-hadoop/compile/cc68f6e38603d6705ad75a3d8ea1bf3a/
11/08/30 22:59:51 DEBUG orm.CompilationManager:   -classpath
11/08/30 22:59:51 DEBUG orm.CompilationManager:
/usr/lib/hadoop:/usr/lib/hadoop/hadoop-core-0.20.2-cdh3u0.jar:/usr/lib/hadoop/lib/ant-contrib-1.0b3.jar:/usr/lib/hadoop/lib/aspectjrt-1.6.5.jar:/usr/lib/hadoop/lib/aspectjtools-1.6.5.jar:/usr/lib/hadoop/lib/commons-cli-1.2.jar:/usr/lib/hadoop/lib/commons-codec-1.4.jar:/usr/lib/hadoop/lib/commons-daemon-1.0.1.jar:/usr/lib/hadoop/lib/commons-el-1.0.jar:/usr/lib/hadoop/lib/commons-httpclient-3.0.1.jar:/usr/lib/hadoop/lib/commons-logging-1.0.4.jar:/usr/lib/hadoop/lib/commons-logging-api-1.0.4.jar:/usr/lib/hadoop/lib/commons-net-1.4.1.jar:/usr/lib/hadoop/lib/core-3.1.1.jar:/usr/lib/hadoop/lib/elephant-bird-1.0.jar:/usr/lib/hadoop/lib/hadoop-fairscheduler-0.20.2-cdh3u0.jar:/usr/lib/hadoop/lib/hadoop-lzo-0.4.8.jar:/usr/lib/hadoop/lib/hsqldb-1.8.0.10.jar:/usr/lib/hadoop/lib/jackson-core-asl-1.5.2.jar:/usr/lib/hadoop/lib/jackson-mapper-asl-1.5.2.jar:/usr/lib/hadoop/lib/jasper-compiler-5.5.12.jar:/usr/lib/hadoop/lib/jasper-runtime-5.5.12.jar:/usr/lib/hadoop/lib/jets3t-0.6.1.jar:/usr/lib/hadoop/lib/jetty-6.1.26.jar:/usr/lib/hadoop/lib/jetty-servlet-tester-6.1.26.jar:/usr/lib/hadoop/lib/jetty-util-6.1.26.jar:/usr/lib/hadoop/lib/jsch-0.1.42.jar:/usr/lib/hadoop/lib/junit-4.5.jar:/usr/lib/hadoop/lib/kfs-0.2.2.jar:/usr/lib/hadoop/lib/log4j-1.2.15.jar:/usr/lib/hadoop/lib/mockito-all-1.8.2.jar:/usr/lib/hadoop/lib/oro-2.0.8.jar:/usr/lib/hadoop/lib/protobuf-java-2.3.0.jar:/usr/lib/hadoop/lib/servlet-api-2.5-20081211.jar:/usr/lib/hadoop/lib/servlet-api-2.5-6.1.14.jar:/usr/lib/hadoop/lib/slf4j-api-1.4.3.jar:/usr/lib/hadoop/lib/slf4j-api-1.5.8.jar:/usr/lib/hadoop/lib/slf4j-log4j12-1.4.3.jar:/usr/lib/hadoop/lib/slf4j-log4j12-1.5.10.jar:/usr/lib/hadoop/lib/xmlenc-0.52.jar:/usr/lib/hadoop/lib/yamlbeans-0.9.3.jar:/usr/lib/hadoop/lib/jsp-2.1/jsp-2.1.jar:/usr/lib/hadoop/lib/jsp-2.1/jsp-api-2.1.jar:/usr/lib/sqoop/conf:/usr/lib/hbase/conf::/usr/lib/sqoop/lib/ant-contrib-1.0b3.jar:/usr/lib/sqoop/lib/ant-eclipse-1.0-jvm1.2.jar:/usr/lib/sqoop/lib/commons-io-1.4.jar:/usr/lib/sqoop/lib/hadoop-mrunit-0
.20.2-CDH3b2-SNAPSHOT.jar:/usr/lib/sqoop/lib/ivy-2.0.0-rc2.jar:/usr/lib/sqoop/lib/sqljdbc4.jar:/usr/lib/hbase/hbase-0.90.1-cdh3u0-tests.jar:/usr/lib/hbase/hbase-0.90.1-cdh3u0.jar:/usr/lib/hbase/lib/activation-1.1.jar:/usr/lib/hbase/lib/asm-3.1.jar:/usr/lib/hbase/lib/avro-1.3.3.jar:/usr/lib/hbase/lib/commons-cli-1.2.jar:/usr/lib/hbase/lib/commons-codec-1.4.jar:/usr/lib/hbase/lib/commons-el-1.0.jar:/usr/lib/hbase/lib/commons-httpclient-3.1.jar:/usr/lib/hbase/lib/commons-lang-2.5.jar:/usr/lib/hbase/lib/commons-logging-1.1.1.jar:/usr/lib/hbase/lib/commons-net-1.4.1.jar:/usr/lib/hbase/lib/core-3.1.1.jar:/usr/lib/hbase/lib/guava-r06.jar:/usr/lib/hbase/lib/hadoop-core.jar:/usr/lib/hbase/lib/hbase-0.90.1-cdh3u0.jar:/usr/lib/hbase/lib/jackson-core-asl-1.5.2.jar:/usr/lib/hbase/lib/jackson-jaxrs-1.5.5.jar:/usr/lib/hbase/lib/jackson-mapper-asl-1.5.2.jar:/usr/lib/hbase/lib/jackson-xc-1.5.5.jar:/usr/lib/hbase/lib/jasper-compiler-5.5.23.jar:/usr/lib/hbase/lib/jasper-runtime-5.5.23.jar:/usr/lib/hbase/lib/jaxb-api-2.1.jar:/usr/lib/hbase/lib/jaxb-impl-2.1.12.jar:/usr/lib/hbase/lib/jersey-core-1.4.jar:/usr/lib/hbase/lib/jersey-json-1.4.jar:/usr/lib/hbase/lib/jersey-server-1.4.jar:/usr/lib/hbase/lib/jettison-1.1.jar:/usr/lib/hbase/lib/jetty-6.1.26.jar:/usr/lib/hbase/lib/jetty-util-6.1.26.jar:/usr/lib/hbase/lib/jruby-complete-1.0.3.jar:/usr/lib/hbase/lib/jsp-2.1-6.1.14.jar:/usr/lib/hbase/lib/jsp-api-2.1-6.1.14.jar:/usr/lib/hbase/lib/jsp-api-2.1.jar:/usr/lib/hbase/lib/jsr311-api-1.1.1.jar:/usr/lib/hbase/lib/log4j-1.2.16.jar:/usr/lib/hbase/lib/protobuf-java-2.3.0.jar:/usr/lib/hbase/lib/servlet-api-2.5-6.1.14.jar:/usr/lib/hbase/lib/servlet-api-2.5.jar:/usr/lib/hbase/lib/slf4j-api-1.5.8.jar:/usr/lib/hbase/lib/slf4j-log4j12-1.5.8.jar:/usr/lib/hbase/lib/stax-api-1.0.1.jar:/usr/lib/hbase/lib/thrift-0.2.0.jar:/usr/lib/hbase/lib/xmlenc-0.52.jar:/usr/lib/hbase/lib/zookeeper.jar:/usr/lib/zookeeper/zookeeper-3.3.3-cdh3u0.jar:/usr/lib/zookeeper/zookeeper.jar:/usr/lib/zookeeper/lib/jline-0.9.94.jar:/
usr/lib/zookeeper/lib/log4j-1.2.15.jar:/usr/lib/sqoop/sqoop-1.2.0-cdh3u0.jar:/usr/lib/sqoop/sqoop-test-1.2.0-cdh3u0.jar
:<somepath>/lib/tdgssconfig.jar:<somepath>/lib/terajdbc4.ja
r:/usr/lib/hadoop/hadoop-0.20.2-cdh3u0-core.jar:/usr/lib/sqoop/sqoop-1.2.0-cdh3u0.jar
Note:
/tmp/sqoop-hadoop/compile/cc68f6e38603d6705ad75a3d8ea1bf3a/DB_Temp_Table.java
uses or overrides a deprecated API.
Note: Recompile with -Xlint:deprecation for details.
11/08/30 22:59:52 INFO orm.CompilationManager: Writing jar file:
/tmp/sqoop-hadoop/compile/cc68f6e38603d6705ad75a3d8ea1bf3a/DB.Temp_Table.jar
11/08/30 22:59:52 DEBUG orm.CompilationManager: Scanning for .class files in
directory: /tmp/sqoop-hadoop/compile/cc68f6e38603d6705ad75a3d8ea1bf3a
11/08/30 22:59:52 DEBUG orm.CompilationManager: Got classfile:
/tmp/sqoop-hadoop/compile/cc68f6e38603d6705ad75a3d8ea1bf3a/DBTemp_Table.class
-> DB_Temp_Table.class
11/08/30 22:59:52 DEBUG orm.CompilationManager: Finished writing jar file
/tmp/sqoop-hadrdev/compile/cc68f6e38603d6705ad75a3d8ea1bf3a/DB.Temp_Table.jar
11/08/30 22:59:52 INFO mapreduce.ExportJobBase: Beginning export of
DB.Temp_Table
11/08/30 22:59:52 DEBUG mapreduce.JobBase: Using InputFormat: class
com.cloudera.sqoop.mapreduce.ExportInputFormat
11/08/30 22:59:52 DEBUG manager.SqlManager: Using fetchSize for next query:
1000
11/08/30 22:59:52 INFO manager.SqlManager: Executing SQL statement: SELECT
t.* FROM DB.Temp_Table AS t WHERE 1=0
11/08/30 22:59:53 DEBUG mapreduce.JobBase: Adding to job classpath:
file:/usr/lib/sqoop/sqoop-1.2.0-cdh3u0.jar
11/08/30 22:59:53 DEBUG mapreduce.JobBase: Adding to job classpath:
file:<somepath>/lib/terajdbc4.jar
11/08/30 22:59:53 DEBUG mapreduce.JobBase: Adding to job classpath:
file:/usr/lib/sqoop/sqoop-1.2.0-cdh3u0.jar
11/08/30 22:59:53 DEBUG mapreduce.JobBase: Adding to job classpath:
file:/usr/lib/sqoop/sqoop-1.2.0-cdh3u0.jar
11/08/30 22:59:53 DEBUG mapreduce.JobBase: Adding to job classpath:
file:/usr/lib/sqoop/lib/ivy-2.0.0-rc2.jar
11/08/30 22:59:53 DEBUG mapreduce.JobBase: Adding to job classpath:
file:/usr/lib/sqoop/lib/commons-io-1.4.jar
11/08/30 22:59:53 DEBUG mapreduce.JobBase: Adding to job classpath:
file:/usr/lib/sqoop/lib/ant-eclipse-1.0-jvm1.2.jar
11/08/30 22:59:53 DEBUG mapreduce.JobBase: Adding to job classpath:
file:/usr/lib/sqoop/lib/sqljdbc4.jar
11/08/30 22:59:53 DEBUG mapreduce.JobBase: Adding to job classpath:
file:/usr/lib/sqoop/lib/ant-contrib-1.0b3.jar
11/08/30 22:59:53 DEBUG mapreduce.JobBase: Adding to job classpath:
file:/usr/lib/sqoop/lib/hadoop-mrunit-0.20.2-CDH3b2-SNAPSHOT.jar
11/08/30 22:59:53 INFO hdfs.DFSClient: Created HDFS_DELEGATION_TOKEN token
795 for hadoop
11/08/30 22:59:53 INFO security.TokenCache: Got dt for
hdfs://<cname>:9000/tmp/hadoop-mapred/mapred/staging/hadoop/.staging/job_201107010928_0398/libjars/hadoop-mrunit-0.20.2-CDH3b2-SNAPSHOT.jar;uri=xx.xxx.xx.xx:9000;t.service=xx.xxx.xx.xx:9000
11/08/30 22:59:53 INFO input.FileInputFormat: Total input paths to process :
1
11/08/30 22:59:53 DEBUG mapreduce.ExportInputFormat: Target numMapTasks=1
11/08/30 22:59:53 DEBUG mapreduce.ExportInputFormat: Total input bytes=18
11/08/30 22:59:53 DEBUG mapreduce.ExportInputFormat: maxSplitSize=18
11/08/30 22:59:53 INFO input.FileInputFormat: Total input paths to process :
1
11/08/30 22:59:53 DEBUG mapreduce.ExportInputFormat: Generated splits:
11/08/30 22:59:53 DEBUG mapreduce.ExportInputFormat:
Paths:/user/hadrdev/sqoop_test.txt:0+18 Locations:
11/08/30 22:59:53 INFO mapred.JobClient: Running job: job_201107010928_0398
11/08/30 22:59:54 INFO mapred.JobClient:  map 0% reduce 0%
11/08/30 23:00:01 INFO mapred.JobClient: Task Id :
attempt_201107010928_0398_m_000000_0, Status : FAILED
java.io.IOException: java.lang.NullPointerException
        at
com.cloudera.sqoop.mapreduce.ExportOutputFormat.getRecordWriter(ExportOutputFormat.java:80)
        at
org.apache.hadoop.mapred.MapTask$NewDirectOutputCollector.<init>(MapTask.java:520)
        at org.apache.hadoop.mapred.MapTask.runNewMapper(MapTask.java:635)
        at org.apache.hadoop.mapred.MapTask.run(MapTask.java:322)
        at org.apache.hadoop.mapred.Child$4.run(Child.java:268)
        at java.security.AccessController.doPrivileged(Native Method)
        at javax.security.auth.Subject.doAs(Subject.java:396)
        at
org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1115)
        at org.apache.hadoop.mapred.Child.main(Child.java:262)
Caused by: java.lang.NullPointerException
        at
com.teradata.tdgss.jtdgss.TdgssConfigApi.GetMechanisms(DashoA1*..)
        at com.teradata.tdgss.jtdgss.TdgssManager.<init>(DashoA1*..)
        at com.teradata.tdgss.jtdgss.TdgssManager.getInstance(DashoA1*..)
        at
com.teradata.jdbc.jdbc.GenericTeraEncrypt.getGSSM(GenericTeraEncrypt.java:612)
        at
com.teradata.jdbc.jdbc.GenericTeraEncrypt.getConfig(GenericTeraEncrypt.java:630)
        at
com.teradata.jdbc.jdbc.GenericTeraEncrypt.getUserNameForOid(GenericTeraEncrypt.java:723)
        at com.teradata.jdbc.AuthMechanism.<init>(AuthMechanism.java:50)
        at
com.teradata.jdbc.jdbc.GenericInitDBConfigState.action(GenericInitDBConfigState.java:105)
        at
com.teradata.jdbc.jdbc.GenericLogonController.run(GenericLogonController.java:49)
        at com.teradata.jdbc.jdbc_4.TDSession.<init>(TDSession.java:199)
        at
com.teradata.jdbc.jdbc_3.ifjdbc_4.TeraLocalConnection.<init>(TeraLocalConnection.java:95)
        at
com.teradata.jdbc.jdbc.ConnectionFactory.createConnection(ConnectionFactory.java:54)
        at com.teradata.jdbc.TeraDriver.doConnect(TeraDriver.java:217)
        at com.teradata.jdbc.TeraDriver.connect(TeraDriver.java:150)
        at java.sql.DriverManager.getConnection(DriverManager.java:582)
        at java.sql.DriverManager.getConnection(DriverManager.java:185)
        at
com.cloudera.sqoop.mapreduce.db.DBConfiguration.getConnection(DBConfiguration.java:184)
        at
com.cloudera.sqoop.mapreduce.AsyncSqlRecordWriter.<init>(AsyncSqlRecordWriter.java:73)
        at
com.cloudera.sqoop.mapreduce.ExportOutputFormat$ExportRecordWriter.<init>(ExportOutputFormat.java:96)
        at
com.cloudera.sqoop.mapreduce.ExportOutputFormat.getRecordWriter(ExportOutputFormat.java:78)
        ... 8 more

attempt_201107010928_0398_m_000000_0: GSSException: Failure unspecified at
GSS-API level (Mechanism level: UserFile parameter null)
attempt_201107010928_0398_m_000000_0:   at
com.teradata.tdgss.jtdgss.TdgssParseXml.<init>(DashoA1*..)
attempt_201107010928_0398_m_000000_0:   at
com.teradata.tdgss.jtdgss.TdgssConfigApi.<init>(DashoA1*..)
attempt_201107010928_0398_m_000000_0:   at
com.teradata.tdgss.jtdgss.TdgssManager.<init>(DashoA1*..)
attempt_201107010928_0398_m_000000_0:   at
com.teradata.tdgss.jtdgss.TdgssManager.getInstance(DashoA1*..)
attempt_201107010928_0398_m_000000_0:   at
com.teradata.jdbc.jdbc.GenericTeraEncrypt.getGSSM(GenericTeraEncrypt.java:612)
attempt_201107010928_0398_m_000000_0:   at
com.teradata.jdbc.jdbc.GenericTeraEncrypt.getConfig(GenericTeraEncrypt.java:630)
attempt_201107010928_0398_m_000000_0:   at
com.teradata.jdbc.jdbc.GenericTeraEncrypt.getUserNameForOid(

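[Editor's note: the NullPointerException above is thrown from TdgssConfigApi, which reads its security configuration out of tdgssconfig.jar; the "UserFile parameter null" message usually indicates that jar never made it onto the classpath seen by the JDBC driver, so both Teradata jars must be visible to Hadoop. A minimal sketch follows; the /opt/teradata paths are assumptions, adjust them to your installation.]

```shell
# Hypothetical jar locations -- substitute the real paths on your system.
TD_JDBC=/opt/teradata/terajdbc4.jar
TD_GSS=/opt/teradata/tdgssconfig.jar   # required alongside the driver jar

# Prepend both jars, preserving any classpath that is already set.
export HADOOP_CLASSPATH="$TD_JDBC:$TD_GSS${HADOOP_CLASSPATH:+:$HADOOP_CLASSPATH}"
echo "$HADOOP_CLASSPATH"
```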
On Mon, Aug 29, 2011 at 2:19 PM, SRINIVAS SURASANI <va...@gmail.com> wrote:

> Arvind,
>
> I have subscribed at sqoop-user@incubator.apache.org and posted the
> question there. Sorry for the inconvenience; since I'm close to a
> deadline, I'm taking up your valuable time.
>
>
> sqoop list-tables --driver com.teradata.jdbc.TeraDriver --connect
> jdbc:teradata://PKTD/E1_CMS_WORK --username srini -P
> I am getting the following error:
>
> 11/08/29 13:08:03 INFO manager.SqlManager: Using default fetchSize of 1000
> GSSException: Failure unspecified at GSS-API level (Mechanism level:
> UserFile parameter null)
>         at com.teradata.tdgss.jtdgss.TdgssParseXml.<init>(DashoA1*..)
>         at com.teradata.tdgss.jtdgss.TdgssConfigApi.<init>(DashoA1*..)
>         at com.teradata.tdgss.jtdgss.TdgssManager.<init>(DashoA1*..)
>         at com.teradata.tdgss.jtdgss.TdgssManager.getInstance(DashoA1*..)
>         at
> com.teradata.jdbc.jdbc.GenericTeraEncrypt.getGSSM(GenericTeraEncrypt.java:612)
>         at
> com.teradata.jdbc.jdbc.GenericTeraEncrypt.getConfig(GenericTeraEncrypt.java:630)
>         at
> com.teradata.jdbc.jdbc.GenericTeraEncrypt.getUserNameForOid(GenericTeraEncrypt.java:723)
>         at com.teradata.jdbc.AuthMechanism.<init>(AuthMechanism.java:50)
>         at
> com.teradata.jdbc.jdbc.GenericInitDBConfigState.action(GenericInitDBConfigState.java:105)
>         at
> com.teradata.jdbc.jdbc.GenericLogonController.run(GenericLogonController.java:49)
>         at com.teradata.jdbc.jdbc_4.TDSession.<init>(TDSession.java:199)
>         at
> com.teradata.jdbc.jdbc_3.ifjdbc_4.TeraLocalConnection.<init>(TeraLocalConnection.java:95)
>         at
> com.teradata.jdbc.jdbc.ConnectionFactory.createConnection(ConnectionFactory.java:54)
>         at com.teradata.jdbc.TeraDriver.doConnect(TeraDriver.java:217)
>         at com.teradata.jdbc.TeraDriver.connect(TeraDriver.java:150)
>         at java.sql.DriverManager.getConnection(DriverManager.java:582)
>         at java.sql.DriverManager.getConnection(DriverManager.java:185)
>         at
> com.cloudera.sqoop.manager.SqlManager.makeConnection(SqlManager.java:643)
>         at
> com.cloudera.sqoop.manager.GenericJdbcManager.getConnection(GenericJdbcManager.java:51)
>         at
> com.cloudera.sqoop.manager.SqlManager.listTables(SqlManager.java:270)
>         at
> com.cloudera.sqoop.tool.ListTablesTool.run(ListTablesTool.java:49)
>         at com.cloudera.sqoop.Sqoop.run(Sqoop.java:144)
>         at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:65)
>         at com.cloudera.sqoop.Sqoop.runSqoop(Sqoop.java:180)
>         at com.cloudera.sqoop.Sqoop.runTool(Sqoop.java:219)
>         at com.cloudera.sqoop.Sqoop.runTool(Sqoop.java:228)
>         at com.cloudera.sqoop.Sqoop.main(Sqoop.java:237)
> 11/08/29 13:08:04 ERROR sqoop.Sqoop: Got exception running Sqoop:
> java.lang.NullPointerException
> java.lang.NullPointerException
>         at
> com.teradata.tdgss.jtdgss.TdgssConfigApi.GetMechanisms(DashoA1*..)
>         at com.teradata.tdgss.jtdgss.TdgssManager.<init>(DashoA1*..)
>         at com.teradata.tdgss.jtdgss.TdgssManager.getInstance(DashoA1*..)
>         at
> com.teradata.jdbc.jdbc.GenericTeraEncrypt.getGSSM(GenericTeraEncrypt.java:612)
>         at
> com.teradata.jdbc.jdbc.GenericTeraEncrypt.getConfig(GenericTeraEncrypt.java:630)
>         at
> com.teradata.jdbc.jdbc.GenericTeraEncrypt.getUserNameForOid(GenericTeraEncrypt.java:723)
>         at com.teradata.jdbc.AuthMechanism.<init>(AuthMechanism.java:50)
>         at
> com.teradata.jdbc.jdbc.GenericInitDBConfigState.action(GenericInitDBConfigState.java:105)
>         at
> com.teradata.jdbc.jdbc.GenericLogonController.run(GenericLogonController.java:49)
>         at com.teradata.jdbc.jdbc_4.TDSession.<init>(TDSession.java:199)
>         at
> com.teradata.jdbc.jdbc_3.ifjdbc_4.TeraLocalConnection.<init>(TeraLocalConnection.java:95)
>         at
> com.teradata.jdbc.jdbc.ConnectionFactory.createConnection(ConnectionFactory.java:54)
>         at com.teradata.jdbc.TeraDriver.doConnect(TeraDriver.java:217)
>         at com.teradata.jdbc.TeraDriver.connect(TeraDriver.java:150)
>         at java.sql.DriverManager.getConnection(DriverManager.java:582)
>         at java.sql.DriverManager.getConnection(DriverManager.java:185)
>         at
> com.cloudera.sqoop.manager.SqlManager.makeConnection(SqlManager.java:643)
>         at
> com.cloudera.sqoop.manager.GenericJdbcManager.getConnection(GenericJdbcManager.java:51)
>         at
> com.cloudera.sqoop.manager.SqlManager.listTables(SqlManager.java:270)
>         at
> com.cloudera.sqoop.tool.ListTablesTool.run(ListTablesTool.java:49)
>         at com.cloudera.sqoop.Sqoop.run(Sqoop.java:144)
>         at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:65)
>         at com.cloudera.sqoop.Sqoop.runSqoop(Sqoop.java:180)
>         at com.cloudera.sqoop.Sqoop.runTool(Sqoop.java:219)
>         at com.cloudera.sqoop.Sqoop.runTool(Sqoop.java:228)
>         at com.cloudera.sqoop.Sqoop.main(Sqoop.java:237)
>
> Thanks,
> Srini
>
>
>
>
> On Mon, Aug 29, 2011 at 12:58 PM, arvind@cloudera.com <arvind@cloudera.com
> > wrote:
>
>> [Please subscribe and respond to sqoop-user@incubator.apache.org]
>>
>> Please use HADOOP_CLASSPATH instead of CLASSPATH. Also, in order to
>> use the generic JDBC connector, you will have to specify the driver
>> class explicitly via the command line option --driver
>> com.teradata.jdbc.TeraDriver.
>>
>> Thanks,
>> Arvind
>>
>> On Mon, Aug 29, 2011 at 9:53 AM, SRINIVAS SURASANI <va...@gmail.com>
>> wrote:
>> > Arvind,
>> > I have set the classpath to teradata4.jar [not placed the
>> > teradata4.jar in the sqoop lib dir, as I don't have permissions].
>> > I'm getting the following error:
>> >
>> > sqoop list-tables --connect jdbc:teradata://PKTD/E1_CMS_WORK --username
>> > srini -P
>> > ERROR: tool.BaseSqoopTool: Got error creating database manager:
>> > java.io.IOexception: No manager for connect string:
>> > jdbc:teradata:///PKTD/E1_CMS_WORK
>> >    at com.cloudera.sqoop.ConnFactory.getManager(ConnFactory.java:119)
>> >    at com.cloudera.sqoop.tool.BaseSqoopTool.init(BaseSqoopTool.java:186)
>> >    ...
>> >    ...
>> > Thanks,
>> > Srini
>> >
>> >
>> >
>> > On Mon, Aug 29, 2011 at 8:52 AM, SRINIVAS SURASANI <va...@gmail.com>
>> wrote:
>> >>
>> >> Thanks-a-lot Arvind.
>> >>
>> >> On Mon, Aug 29, 2011 at 8:45 AM, arvind@cloudera.com <
>> arvind@cloudera.com>
>> >> wrote:
>> >>>
>> >>> [Moving the thread to sqoop-user@incubator.apache.org]
>> >>>
>> >>> Hi Srini,
>> >>>
>> >>> You should be able to use the generic JDBC connector to import/export
>> >>> from Teradata. There is also a specialized connector that is available
>> >>> for use with Teradata if you are interested. This connector is not a
>> >>> part of Sqoop and can be obtained from Cloudera by going to:
>> >>>
>> >>> http://www.cloudera.com/partners/connectors/
>> >>>
>> >>> Thanks,
>> >>> Arvind
>> >>>
>> >>> On Mon, Aug 29, 2011 at 8:17 AM, SRINIVAS SURASANI <va...@gmail.com>
>> >>> wrote:
>> >>> > I have a csv file in Hadoop and am looking to load it into
>> >>> > Teradata. I was wondering whether Sqoop works with Teradata
>> >>> > (with the JDBC jar placed in the sqoop lib dir).
>> >>> >
>> >>> > Regards
>> >>> > Srini
>> >>> >
>> >>> > --
>> >>> > NOTE: The mailing list sqoop-user@cloudera.org is deprecated in
>> favor
>> >>> > of Apache Sqoop mailing list sqoop-user@incubator.apache.org.
>> Please
>> >>> > subscribe to it by sending an email to
>> >>> > incubator-sqoop-user-subscribe@apache.org.
>> >>> >
>> >>>
>> >>
>> >
>> >
>>
>>
>
>
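[Editor's note: putting the advice from this thread together (set HADOOP_CLASSPATH rather than CLASSPATH, pass --driver explicitly, and run the export with --batch per SQOOP-314), an end-to-end invocation might look like the sketch below. The host, database, username, table, and HDFS path are placeholders taken from the thread, not verified values; the command is echoed rather than executed so the sketch stands alone.]

```shell
# Sketch only -- all names below are placeholders from the thread.
SQOOP_CMD="sqoop export \
  --driver com.teradata.jdbc.TeraDriver \
  --connect jdbc:teradata://PKTD/E1_CMS_WORK \
  --username srini -P \
  --table Temp_Table \
  --export-dir /user/hadoop/sqoop_test \
  --batch"
echo "$SQOOP_CMD"
```

Run the echoed command only after both terajdbc4.jar and tdgssconfig.jar are on HADOOP_CLASSPATH, as discussed earlier in the thread.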
