Posted to user@hadoop.apache.org by Raj Hadoop <ha...@yahoo.com> on 2013/08/28 18:36:07 UTC

Sqoop issue related to Hadoop

Hello all,
 
I am getting an error while using sqoop export (loading an HDFS file into Oracle). I am not sure whether the issue is Sqoop- or Hadoop-related, so I am sending it to both distribution lists.
 
I am using -
 
sqoop export --connect jdbc:oracle:thin:@//dbserv:9876/OKI --table RAJ.CUSTOMERS --export-dir /user/hive/warehouse/web_cust --input-null-string '\\N' --input-null-non-string '\\N'  --username <> --password <> -m 1 --input-fields-terminated-by '\t'

I am getting the following error -
 
Warning: /usr/lib/hbase does not exist! HBase imports will fail.
Please set $HBASE_HOME to the root of your HBase installation.
Warning: $HADOOP_HOME is deprecated.
13/08/28 09:42:36 WARN tool.BaseSqoopTool: Setting your password on the command-line is insecure. Consider using -P instead.
13/08/28 09:42:36 INFO manager.SqlManager: Using default fetchSize of 1000
13/08/28 09:42:36 INFO tool.CodeGenTool: Beginning code generation
13/08/28 09:42:38 INFO manager.OracleManager: Time zone has been set to GMT
13/08/28 09:42:38 INFO manager.SqlManager: Executing SQL statement: SELECT t.* FROM RAJ.CUSTOMERS t WHERE 1=0
13/08/28 09:42:38 INFO orm.CompilationManager: HADOOP_MAPRED_HOME is /software/hadoop/hadoop/hadoop-1.1.2
Note: /tmp/sqoop-hadoop/compile/c1376f66d2151b48024c54305377c981/RAJ_CUSTOMERS.java uses or overrides a deprecated API.
Note: Recompile with -Xlint:deprecation for details.
13/08/28 09:42:40 INFO orm.CompilationManager: Writing jar file: /tmp/sqoop-hadoop/compile/c1376f66d2151b48024c54305377c981/RAJ.CUSTOMERS.jar
13/08/28 09:42:40 INFO mapreduce.ExportJobBase: Beginning export of RAJ.CUSTOMERS
13/08/28 09:42:41 INFO manager.OracleManager: Time zone has been set to GMT
13/08/28 09:42:43 INFO input.FileInputFormat: Total input paths to process : 1
13/08/28 09:42:43 INFO input.FileInputFormat: Total input paths to process : 1
13/08/28 09:42:43 INFO util.NativeCodeLoader: Loaded the native-hadoop library
13/08/28 09:42:43 WARN snappy.LoadSnappy: Snappy native library not loaded
13/08/28 09:42:43 INFO mapred.JobClient: Running job: job_201307041900_0463
13/08/28 09:42:44 INFO mapred.JobClient:  map 0% reduce 0%
13/08/28 09:42:56 INFO mapred.JobClient:  map 1% reduce 0%
13/08/28 09:43:00 INFO mapred.JobClient:  map 2% reduce 0%
13/08/28 09:43:03 INFO mapred.JobClient:  map 4% reduce 0%
13/08/28 09:43:10 INFO mapred.JobClient:  map 5% reduce 0%
13/08/28 09:43:13 INFO mapred.JobClient:  map 6% reduce 0%
13/08/28 09:43:17 INFO mapred.JobClient: Task Id : attempt_201307041900_0463_m_000000_0, Status : FAILED
java.io.IOException: Can't export data, please check task tracker logs
        at org.apache.sqoop.mapreduce.TextExportMapper.map(TextExportMapper.java:112)
        at org.apache.sqoop.mapreduce.TextExportMapper.map(TextExportMapper.java:39)
        at org.apache.hadoop.mapreduce.Mapper.run(Mapper.java:144)
        at org.apache.sqoop.mapreduce.AutoProgressMapper.run(AutoProgressMapper.java:64)
        at org.apache.hadoop.mapred.MapTask.runNewMapper(MapTask.java:764)
        at org.apache.hadoop.mapred.MapTask.run(MapTask.java:370)
        at org.apache.hadoop.mapred.Child$4.run(Child.java:255)
        at java.security.AccessController.doPrivileged(Native Method)
        at javax.security.auth.Subject.doAs(Subject.java:415)
        at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1149)
        at org.apache.hadoop.mapred.Child.main(Child.java:249)
Caused by: java.util.NoSuchElementException
        at java.util.ArrayList$Itr.next(ArrayList.java:794)
        at RAJ_CUSTOMERS.__loadFromFields(RAJ_CUSTOMERS.java:1057)
        at RAJ_CUSTOMERS.parse(RAJ_CUSTOMERS.java:876)
        at org.apache.sqoop.mapreduce.TextExportMapper.map(TextExportMapper.java:83)
        ... 10 more
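
[Editor's note: the "Caused by: java.util.NoSuchElementException" inside the generated __loadFromFields is the usual signature of a line carrying fewer delimited fields than the table has columns, so the per-column token iterator runs dry. A rough way to check for ragged rows, using a made-up sample file and the tab delimiter from the command above:]

```shell
# Illustrative only: the file name and 3-column layout are invented.
# Write a sample export file where the second row is missing a field.
printf 'id1\tname1\tcity1\nid2\tname2\n' > /tmp/web_cust_sample.txt

# Print the distinct per-line field counts; more than one value means
# some rows are ragged and the Sqoop export mapper will fail on them.
awk -F'\t' '{ print NF }' /tmp/web_cust_sample.txt | sort -u
```

Running the same awk check against the real files under /user/hive/warehouse/web_cust (e.g. via hadoop fs -cat) would show whether the data matches the RAJ.CUSTOMERS column count.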
 
Thanks,
Raj

Re: Sqoop issue related to Hadoop

Posted by be...@gmail.com.
Hi Raj

The easiest way to pull up the task log is through the JobTracker (JT) web UI.

Go to the JT web UI and drill down into the Sqoop job. You'll get a list of failed/killed tasks; your failed task will be in there. Clicking on that task gives you its logs.
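
[Editor's note: a sketch of the URLs involved, assuming the Hadoop 1.x default ports (JT UI on 50030, TaskTracker log servlet on 50060); the host names are placeholders:]

```shell
# Job and attempt ids taken from the console output earlier in the thread.
JOB="job_201307041900_0463"
ATTEMPT="attempt_201307041900_0463_m_000000_0"

# Job detail page on the JobTracker (lists failed/killed task attempts):
echo "http://<jobtracker-host>:50030/jobdetails.jsp?jobid=$JOB"

# Per-attempt log served by the TaskTracker that ran the attempt:
echo "http://<tasktracker-host>:50060/tasklog?attemptid=$ATTEMPT&all=true"
```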

Regards 
Bejoy KS

Sent from remote device, Please excuse typos

-----Original Message-----
From: Hadoop Raj <ha...@yahoo.com>
Date: Thu, 29 Aug 2013 00:43:59 
To: <us...@hadoop.apache.org>
Reply-To: user@hadoop.apache.org
Subject: Re: Sqoop issue related to Hadoop

Hi Kate,

Where can I find the task attempt log? Can you specify the location please?


Thanks,
Raj

On Aug 28, 2013, at 7:13 PM, Kathleen Ting <ka...@apache.org> wrote:

> Raj, in addition to what Abe said, please also send the failed task attempt log
> attempt_201307041900_0463_m_000000_0 as well.
> 
> Thanks,
> Kate
> 
> On Wed, Aug 28, 2013 at 2:25 PM, Abraham Elmahrek <ab...@cloudera.com> wrote:
>> Hey Raj,
>> 
>> It seems like the number of fields you have in your data doesn't match the
>> number of fields in your RAJ.CUSTOMERS table.
>> 
>> Could you please add "--verbose" to the beginning of your argument list and
>> provide the entire contents here?
>> 
>> -Abe
>> 
>> 
>> On Wed, Aug 28, 2013 at 9:36 AM, Raj Hadoop <ha...@yahoo.com> wrote:
>>> 
>>> Hello all,
>>> 
>>> I am getting an error while using sqoop export ( Load HDFS file to Oracle
>>> ). I am not sure the issue might be a Sqoop or Hadoop related one. So I am
>>> sending it to both the dist lists.
>>> 
>>> I am using -
>>> 
>>> sqoop export --connect jdbc:oracle:thin:@//dbserv:9876/OKI --table
>>> RAJ.CUSTOMERS --export-dir /user/hive/warehouse/web_cust --input-null-string
>>> '\\N' --input-null-non-string '\\N'  --username <> --password <> -m 1
>>> --input-fields-terminated-by '\t'
>>> I am getting the following error -
>>> 
>>> Warning: /usr/lib/hbase does not exist! HBase imports will fail.
>>> Please set $HBASE_HOME to the root of your HBase installation.
>>> Warning: $HADOOP_HOME is deprecated.
>>> 13/08/28 09:42:36 WARN tool.BaseSqoopTool: Setting your password on the
>>> command-line is insecure. Consider using -P instead.
>>> 13/08/28 09:42:36 INFO manager.SqlManager: Using default fetchSize of 1000
>>> 13/08/28 09:42:36 INFO tool.CodeGenTool: Beginning code generation
>>> 13/08/28 09:42:38 INFO manager.OracleManager: Time zone has been set to
>>> GMT
>>> 13/08/28 09:42:38 INFO manager.SqlManager: Executing SQL statement: SELECT
>>> t.* FROM RAJ.CUSTOMERS t WHERE 1=0
>>> 13/08/28 09:42:38 INFO orm.CompilationManager: HADOOP_MAPRED_HOME is
>>> /software/hadoop/hadoop/hadoop-1.1.2
>>> Note:
>>> /tmp/sqoop-hadoop/compile/c1376f66d2151b48024c54305377c981/RAJ_CUSTOMERS.java
>>> uses or overrides a deprecated API.
>>> Note: Recompile with -Xlint:deprecation for details.
>>> 13/08/28 09:42:40 INFO orm.CompilationManager: Writing jar file:
>>> /tmp/sqoop-hadoop/compile/c1376f66d2151b48024c54305377c981/RAJ.CUSTOMERS.jar
>>> 13/08/28 09:42:40 INFO mapreduce.ExportJobBase: Beginning export of
>>> RAJ.CUSTOMERS
>>> 13/08/28 09:42:41 INFO manager.OracleManager: Time zone has been set to
>>> GMT
>>> 13/08/28 09:42:43 INFO input.FileInputFormat: Total input paths to process
>>> : 1
>>> 13/08/28 09:42:43 INFO input.FileInputFormat: Total input paths to process
>>> : 1
>>> 13/08/28 09:42:43 INFO util.NativeCodeLoader: Loaded the native-hadoop
>>> library
>>> 13/08/28 09:42:43 WARN snappy.LoadSnappy: Snappy native library not loaded
>>> 13/08/28 09:42:43 INFO mapred.JobClient: Running job:
>>> job_201307041900_0463
>>> 13/08/28 09:42:44 INFO mapred.JobClient:  map 0% reduce 0%
>>> 13/08/28 09:42:56 INFO mapred.JobClient:  map 1% reduce 0%
>>> 13/08/28 09:43:00 INFO mapred.JobClient:  map 2% reduce 0%
>>> 13/08/28 09:43:03 INFO mapred.JobClient:  map 4% reduce 0%
>>> 13/08/28 09:43:10 INFO mapred.JobClient:  map 5% reduce 0%
>>> 13/08/28 09:43:13 INFO mapred.JobClient:  map 6% reduce 0%
>>> 13/08/28 09:43:17 INFO mapred.JobClient: Task Id :
>>> attempt_201307041900_0463_m_000000_0, Status : FAILED
>>> java.io.IOException: Can't export data, please check task tracker logs
>>>        at
>>> org.apache.sqoop.mapreduce.TextExportMapper.map(TextExportMapper.java:112)
>>>        at
>>> org.apache.sqoop.mapreduce.TextExportMapper.map(TextExportMapper.java:39)
>>>        at org.apache.hadoop.mapreduce.Mapper.run(Mapper.java:144)
>>>        at
>>> org.apache.sqoop.mapreduce.AutoProgressMapper.run(AutoProgressMapper.java:64)
>>>        at org.apache.hadoop.mapred.MapTask.runNewMapper(MapTask.java:764)
>>>        at org.apache.hadoop.mapred.MapTask.run(MapTask.java:370)
>>>        at org.apache.hadoop.mapred.Child$4.run(Child.java:255)
>>>        at java.security.AccessController.doPrivileged(Native Method)
>>>        at javax.security.auth.Subject.doAs(Subject.java:415)
>>>        at
>>> org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1149)
>>>        at org.apache.hadoop.mapred.Child.main(Child.java:249)
>>> Caused by: java.util.NoSuchElementException
>>>        at java.util.ArrayList$Itr.next(ArrayList.java:794)
>>>        at RAJ_CUSTOMERS.__loadFromFields(RAJ_CUSTOMERS.java:1057)
>>>        at RAJ_CUSTOMERS.parse(RAJ_CUSTOMERS.java:876)
>>>        at
>>> org.apache.sqoop.mapreduce.TextExportMapper.map(TextExportMapper.java:83)
>>>        ... 10 more
>>> 
>>> Thanks,
>>> Raj
>>> 
>>> 
>> 
>> 


Re: Sqoop issue related to Hadoop

Posted by Shekhar Sharma <sh...@gmail.com>.
Go inside the $HADOOP_HOME/log/user/history...
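
[Editor's note: on a typical Hadoop 1.x install the per-attempt stdout/stderr/syslog also sit under the TaskTracker's userlogs directory; a sketch, with the install path as an assumption:]

```shell
# Hypothetical path: adjust HADOOP_HOME and the log root for your cluster.
LOGDIR="${HADOOP_HOME:-/usr/lib/hadoop}/logs/userlogs/job_201307041900_0463/attempt_201307041900_0463_m_000000_0"

# List the attempt's log files (stdout, stderr, syslog) if present.
ls "$LOGDIR" 2>/dev/null || echo "no such directory here; adjust for your cluster"
```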
Regards,
Som Shekhar Sharma
+91-8197243810


On Thu, Aug 29, 2013 at 10:13 AM, Hadoop Raj <ha...@yahoo.com> wrote:
> Hi Kate,
>
> Where can I find the task attempt log? Can you specify the location please?
>
>
> Thanks,
> Raj
>
> On Aug 28, 2013, at 7:13 PM, Kathleen Ting <ka...@apache.org> wrote:
>
>> Raj, in addition to what Abe said, please also send the failed task attempt log
>> attempt_201307041900_0463_m_000000_0 as well.
>>
>> Thanks,
>> Kate
>>
>> On Wed, Aug 28, 2013 at 2:25 PM, Abraham Elmahrek <ab...@cloudera.com> wrote:
>>> Hey Raj,
>>>
>>> It seems like the number of fields you have in your data doesn't match the
>>> number of fields in your RAJ.CUSTOMERS table.
>>>
>>> Could you please add "--verbose" to the beginning of your argument list and
>>> provide the entire contents here?
>>>
>>> -Abe
>>>
>>>
>>> On Wed, Aug 28, 2013 at 9:36 AM, Raj Hadoop <ha...@yahoo.com> wrote:
>>>>
>>>> Hello all,
>>>>
>>>> I am getting an error while using sqoop export ( Load HDFS file to Oracle
>>>> ). I am not sure the issue might be a Sqoop or Hadoop related one. So I am
>>>> sending it to both the dist lists.
>>>>
>>>> I am using -
>>>>
>>>> sqoop export --connect jdbc:oracle:thin:@//dbserv:9876/OKI --table
>>>> RAJ.CUSTOMERS --export-dir /user/hive/warehouse/web_cust --input-null-string
>>>> '\\N' --input-null-non-string '\\N'  --username <> --password <> -m 1
>>>> --input-fields-terminated-by '\t'
>>>> I am getting the following error -
>>>>
>>>> Warning: /usr/lib/hbase does not exist! HBase imports will fail.
>>>> Please set $HBASE_HOME to the root of your HBase installation.
>>>> Warning: $HADOOP_HOME is deprecated.
>>>> 13/08/28 09:42:36 WARN tool.BaseSqoopTool: Setting your password on the
>>>> command-line is insecure. Consider using -P instead.
>>>> 13/08/28 09:42:36 INFO manager.SqlManager: Using default fetchSize of 1000
>>>> 13/08/28 09:42:36 INFO tool.CodeGenTool: Beginning code generation
>>>> 13/08/28 09:42:38 INFO manager.OracleManager: Time zone has been set to
>>>> GMT
>>>> 13/08/28 09:42:38 INFO manager.SqlManager: Executing SQL statement: SELECT
>>>> t.* FROM RAJ.CUSTOMERS t WHERE 1=0
>>>> 13/08/28 09:42:38 INFO orm.CompilationManager: HADOOP_MAPRED_HOME is
>>>> /software/hadoop/hadoop/hadoop-1.1.2
>>>> Note:
>>>> /tmp/sqoop-hadoop/compile/c1376f66d2151b48024c54305377c981/RAJ_CUSTOMERS.java
>>>> uses or overrides a deprecated API.
>>>> Note: Recompile with -Xlint:deprecation for details.
>>>> 13/08/28 09:42:40 INFO orm.CompilationManager: Writing jar file:
>>>> /tmp/sqoop-hadoop/compile/c1376f66d2151b48024c54305377c981/RAJ.CUSTOMERS.jar
>>>> 13/08/28 09:42:40 INFO mapreduce.ExportJobBase: Beginning export of
>>>> RAJ.CUSTOMERS
>>>> 13/08/28 09:42:41 INFO manager.OracleManager: Time zone has been set to
>>>> GMT
>>>> 13/08/28 09:42:43 INFO input.FileInputFormat: Total input paths to process
>>>> : 1
>>>> 13/08/28 09:42:43 INFO input.FileInputFormat: Total input paths to process
>>>> : 1
>>>> 13/08/28 09:42:43 INFO util.NativeCodeLoader: Loaded the native-hadoop
>>>> library
>>>> 13/08/28 09:42:43 WARN snappy.LoadSnappy: Snappy native library not loaded
>>>> 13/08/28 09:42:43 INFO mapred.JobClient: Running job:
>>>> job_201307041900_0463
>>>> 13/08/28 09:42:44 INFO mapred.JobClient:  map 0% reduce 0%
>>>> 13/08/28 09:42:56 INFO mapred.JobClient:  map 1% reduce 0%
>>>> 13/08/28 09:43:00 INFO mapred.JobClient:  map 2% reduce 0%
>>>> 13/08/28 09:43:03 INFO mapred.JobClient:  map 4% reduce 0%
>>>> 13/08/28 09:43:10 INFO mapred.JobClient:  map 5% reduce 0%
>>>> 13/08/28 09:43:13 INFO mapred.JobClient:  map 6% reduce 0%
>>>> 13/08/28 09:43:17 INFO mapred.JobClient: Task Id :
>>>> attempt_201307041900_0463_m_000000_0, Status : FAILED
>>>> java.io.IOException: Can't export data, please check task tracker logs
>>>>        at
>>>> org.apache.sqoop.mapreduce.TextExportMapper.map(TextExportMapper.java:112)
>>>>        at
>>>> org.apache.sqoop.mapreduce.TextExportMapper.map(TextExportMapper.java:39)
>>>>        at org.apache.hadoop.mapreduce.Mapper.run(Mapper.java:144)
>>>>        at
>>>> org.apache.sqoop.mapreduce.AutoProgressMapper.run(AutoProgressMapper.java:64)
>>>>        at org.apache.hadoop.mapred.MapTask.runNewMapper(MapTask.java:764)
>>>>        at org.apache.hadoop.mapred.MapTask.run(MapTask.java:370)
>>>>        at org.apache.hadoop.mapred.Child$4.run(Child.java:255)
>>>>        at java.security.AccessController.doPrivileged(Native Method)
>>>>        at javax.security.auth.Subject.doAs(Subject.java:415)
>>>>        at
>>>> org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1149)
>>>>        at org.apache.hadoop.mapred.Child.main(Child.java:249)
>>>> Caused by: java.util.NoSuchElementException
>>>>        at java.util.ArrayList$Itr.next(ArrayList.java:794)
>>>>        at RAJ_CUSTOMERS.__loadFromFields(RAJ_CUSTOMERS.java:1057)
>>>>        at RAJ_CUSTOMERS.parse(RAJ_CUSTOMERS.java:876)
>>>>        at
>>>> org.apache.sqoop.mapreduce.TextExportMapper.map(TextExportMapper.java:83)
>>>>        ... 10 more
>>>>
>>>> Thanks,
>>>> Raj
>>>>
>>>>
>>>
>>>
>

>>> : 1
>>> 13/08/28 09:42:43 INFO input.FileInputFormat: Total input paths to process
>>> : 1
>>> 13/08/28 09:42:43 INFO util.NativeCodeLoader: Loaded the native-hadoop
>>> library
>>> 13/08/28 09:42:43 WARN snappy.LoadSnappy: Snappy native library not loaded
>>> 13/08/28 09:42:43 INFO mapred.JobClient: Running job:
>>> job_201307041900_0463
>>> 13/08/28 09:42:44 INFO mapred.JobClient:  map 0% reduce 0%
>>> 13/08/28 09:42:56 INFO mapred.JobClient:  map 1% reduce 0%
>>> 13/08/28 09:43:00 INFO mapred.JobClient:  map 2% reduce 0%
>>> 13/08/28 09:43:03 INFO mapred.JobClient:  map 4% reduce 0%
>>> 13/08/28 09:43:10 INFO mapred.JobClient:  map 5% reduce 0%
>>> 13/08/28 09:43:13 INFO mapred.JobClient:  map 6% reduce 0%
>>> 13/08/28 09:43:17 INFO mapred.JobClient: Task Id :
>>> attempt_201307041900_0463_m_000000_0, Status : FAILED
>>> java.io.IOException: Can't export data, please check task tracker logs
>>>        at
>>> org.apache.sqoop.mapreduce.TextExportMapper.map(TextExportMapper.java:112)
>>>        at
>>> org.apache.sqoop.mapreduce.TextExportMapper.map(TextExportMapper.java:39)
>>>        at org.apache.hadoop.mapreduce.Mapper.run(Mapper.java:144)
>>>        at
>>> org.apache.sqoop.mapreduce.AutoProgressMapper.run(AutoProgressMapper.java:64)
>>>        at org.apache.hadoop.mapred.MapTask.runNewMapper(MapTask.java:764)
>>>        at org.apache.hadoop.mapred.MapTask.run(MapTask.java:370)
>>>        at org.apache.hadoop.mapred.Child$4.run(Child.java:255)
>>>        at java.security.AccessController.doPrivileged(Native Method)
>>>        at javax.security.auth.Subject.doAs(Subject.java:415)
>>>        at
>>> org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1149)
>>>        at org.apache.hadoop.mapred.Child.main(Child.java:249)
>>> Caused by: java.util.NoSuchElementException
>>>        at java.util.ArrayList$Itr.next(ArrayList.java:794)
>>>        at RAJ_CUSTOMERS.__loadFromFields(RAJ_CUSTOMERS.java:1057)
>>>        at RAJ_CUSTOMERS.parse(RAJ_CUSTOMERS.java:876)
>>>        at
>>> org.apache.sqoop.mapreduce.TextExportMapper.map(TextExportMapper.java:83)
>>>        ... 10 more
>>> 
>>> Thanks,
>>> Raj
>>> 
>>> 
>> 
>> 
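
As Abe notes in the quoted thread, a `NoSuchElementException` in `__loadFromFields` usually means a row has fewer fields than the target table has columns. A quick local check — a sketch only: the expected count of 3 and the sample rows are made-up placeholders, so substitute your real column count and pipe in your actual export file:

```shell
# Sketch with made-up sample data: report lines whose tab-separated field
# count differs from the expected column count (3 here is an assumption).
printf '1\talice\tx\n2\tbob\n3\tcarol\ty\n' |
awk -F'\t' -v expected=3 'NF != expected { print NR ": " NF " fields" }'
# prints: 2: 2 fields
```

Against the real data you could run something like `hadoop fs -cat /user/hive/warehouse/web_cust/* | awk -F'\t' -v expected=<your column count> 'NF != expected { print NR ": " NF " fields" }'` to find the offending rows.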


Re: Sqoop issue related to Hadoop

Posted by Shekhar Sharma <sh...@gmail.com>.
Go inside the $HADOOP_HOME/log/user/history...
Regards,
Som Shekhar Sharma
+91-8197243810


On Thu, Aug 29, 2013 at 10:13 AM, Hadoop Raj <ha...@yahoo.com> wrote:
> Hi Kate,
>
> Where can I find the task attempt log? Can you specify the location please?
>
>
> Thanks,
> Raj
>
> On Aug 28, 2013, at 7:13 PM, Kathleen Ting <ka...@apache.org> wrote:
>
>> Raj, in addition to what Abe said, please also send the failed task attempt log
>> attempt_201307041900_0463_m_000000_0 as well.
>>
>> Thanks,
>> Kate
>>
>> On Wed, Aug 28, 2013 at 2:25 PM, Abraham Elmahrek <ab...@cloudera.com> wrote:
>>> Hey Raj,
>>>
>>> It seems like the number of fields you have in your data doesn't match the
>>> number of fields in your RAJ.CUSTOMERS table.
>>>
>>> Could you please add "--verbose" to the beginning of your argument list and
>>> provide the entire contents here?
>>>
>>> -Abe
>>>
>>>
>>> On Wed, Aug 28, 2013 at 9:36 AM, Raj Hadoop <ha...@yahoo.com> wrote:
>>>>
>>>> Hello all,
>>>>
>>>> I am getting an error while using sqoop export ( Load HDFS file to Oracle
>>>> ). I am not sure the issue might be a Sqoop or Hadoop related one. So I am
>>>> sending it to both the dist lists.
>>>>
>>>> I am using -
>>>>
>>>> sqoop export --connect jdbc:oracle:thin:@//dbserv:9876/OKI --table
>>>> RAJ.CUSTOMERS --export-dir /user/hive/warehouse/web_cust --input-null-string
>>>> '\\N' --input-null-non-string '\\N'  --username <> --password <> -m 1
>>>> --input-fields-terminated-by '\t'
>>>> I am getting the following error -
>>>>
>>>> Warning: /usr/lib/hbase does not exist! HBase imports will fail.
>>>> Please set $HBASE_HOME to the root of your HBase installation.
>>>> Warning: $HADOOP_HOME is deprecated.
>>>> 13/08/28 09:42:36 WARN tool.BaseSqoopTool: Setting your password on the
>>>> command-line is insecure. Consider using -P instead.
>>>> 13/08/28 09:42:36 INFO manager.SqlManager: Using default fetchSize of 1000
>>>> 13/08/28 09:42:36 INFO tool.CodeGenTool: Beginning code generation
>>>> 13/08/28 09:42:38 INFO manager.OracleManager: Time zone has been set to
>>>> GMT
>>>> 13/08/28 09:42:38 INFO manager.SqlManager: Executing SQL statement: SELECT
>>>> t.* FROM RAJ.CUSTOMERS t WHERE 1=0
>>>> 13/08/28 09:42:38 INFO orm.CompilationManager: HADOOP_MAPRED_HOME is
>>>> /software/hadoop/hadoop/hadoop-1.1.2
>>>> Note:
>>>> /tmp/sqoop-hadoop/compile/c1376f66d2151b48024c54305377c981/RAJ_CUSTOMERS.java
>>>> uses or overrides a deprecated API.
>>>> Note: Recompile with -Xlint:deprecation for details.
>>>> 13/08/28 09:42:40 INFO orm.CompilationManager: Writing jar file:
>>>> /tmp/sqoop-hadoop/compile/c1376f66d2151b48024c54305377c981/RAJ.CUSTOMERS.jar
>>>> 13/08/28 09:42:40 INFO mapreduce.ExportJobBase: Beginning export of
>>>> RAJ.CUSTOMERS
>>>> 13/08/28 09:42:41 INFO manager.OracleManager: Time zone has been set to
>>>> GMT
>>>> 13/08/28 09:42:43 INFO input.FileInputFormat: Total input paths to process
>>>> : 1
>>>> 13/08/28 09:42:43 INFO input.FileInputFormat: Total input paths to process
>>>> : 1
>>>> 13/08/28 09:42:43 INFO util.NativeCodeLoader: Loaded the native-hadoop
>>>> library
>>>> 13/08/28 09:42:43 WARN snappy.LoadSnappy: Snappy native library not loaded
>>>> 13/08/28 09:42:43 INFO mapred.JobClient: Running job:
>>>> job_201307041900_0463
>>>> 13/08/28 09:42:44 INFO mapred.JobClient:  map 0% reduce 0%
>>>> 13/08/28 09:42:56 INFO mapred.JobClient:  map 1% reduce 0%
>>>> 13/08/28 09:43:00 INFO mapred.JobClient:  map 2% reduce 0%
>>>> 13/08/28 09:43:03 INFO mapred.JobClient:  map 4% reduce 0%
>>>> 13/08/28 09:43:10 INFO mapred.JobClient:  map 5% reduce 0%
>>>> 13/08/28 09:43:13 INFO mapred.JobClient:  map 6% reduce 0%
>>>> 13/08/28 09:43:17 INFO mapred.JobClient: Task Id :
>>>> attempt_201307041900_0463_m_000000_0, Status : FAILED
>>>> java.io.IOException: Can't export data, please check task tracker logs
>>>>        at
>>>> org.apache.sqoop.mapreduce.TextExportMapper.map(TextExportMapper.java:112)
>>>>        at
>>>> org.apache.sqoop.mapreduce.TextExportMapper.map(TextExportMapper.java:39)
>>>>        at org.apache.hadoop.mapreduce.Mapper.run(Mapper.java:144)
>>>>        at
>>>> org.apache.sqoop.mapreduce.AutoProgressMapper.run(AutoProgressMapper.java:64)
>>>>        at org.apache.hadoop.mapred.MapTask.runNewMapper(MapTask.java:764)
>>>>        at org.apache.hadoop.mapred.MapTask.run(MapTask.java:370)
>>>>        at org.apache.hadoop.mapred.Child$4.run(Child.java:255)
>>>>        at java.security.AccessController.doPrivileged(Native Method)
>>>>        at javax.security.auth.Subject.doAs(Subject.java:415)
>>>>        at
>>>> org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1149)
>>>>        at org.apache.hadoop.mapred.Child.main(Child.java:249)
>>>> Caused by: java.util.NoSuchElementException
>>>>        at java.util.ArrayList$Itr.next(ArrayList.java:794)
>>>>        at RAJ_CUSTOMERS.__loadFromFields(RAJ_CUSTOMERS.java:1057)
>>>>        at RAJ_CUSTOMERS.parse(RAJ_CUSTOMERS.java:876)
>>>>        at
>>>> org.apache.sqoop.mapreduce.TextExportMapper.map(TextExportMapper.java:83)
>>>>        ... 10 more
>>>>
>>>> Thanks,
>>>> Raj
>>>>
>>>>
>>>
>>>
>

Re: Sqoop issue related to Hadoop

Posted by Hadoop Raj <ha...@yahoo.com>.
Hi Kate,

Where can I find the task attempt log? Can you specify the location please?


Thanks,
Raj

On Aug 28, 2013, at 7:13 PM, Kathleen Ting <ka...@apache.org> wrote:

> Raj, in addition to what Abe said, please also send the failed task attempt log
> attempt_201307041900_0463_m_000000_0 as well.
> 
> Thanks,
> Kate
> 
> On Wed, Aug 28, 2013 at 2:25 PM, Abraham Elmahrek <ab...@cloudera.com> wrote:
>> Hey Raj,
>> 
>> It seems like the number of fields you have in your data doesn't match the
>> number of fields in your RAJ.CUSTOMERS table.
>> 
>> Could you please add "--verbose" to the beginning of your argument list and
>> provide the entire contents here?
>> 
>> -Abe
>> 
>> 
>> On Wed, Aug 28, 2013 at 9:36 AM, Raj Hadoop <ha...@yahoo.com> wrote:
>>> 
>>> Hello all,
>>> 
>>> I am getting an error while using sqoop export ( Load HDFS file to Oracle
>>> ). I am not sure the issue might be a Sqoop or Hadoop related one. So I am
>>> sending it to both the dist lists.
>>> 
>>> I am using -
>>> 
>>> sqoop export --connect jdbc:oracle:thin:@//dbserv:9876/OKI --table
>>> RAJ.CUSTOMERS --export-dir /user/hive/warehouse/web_cust --input-null-string
>>> '\\N' --input-null-non-string '\\N'  --username <> --password <> -m 1
>>> --input-fields-terminated-by '\t'
>>> I am getting the following error -
>>> 
>>> Warning: /usr/lib/hbase does not exist! HBase imports will fail.
>>> Please set $HBASE_HOME to the root of your HBase installation.
>>> Warning: $HADOOP_HOME is deprecated.
>>> 13/08/28 09:42:36 WARN tool.BaseSqoopTool: Setting your password on the
>>> command-line is insecure. Consider using -P instead.
>>> 13/08/28 09:42:36 INFO manager.SqlManager: Using default fetchSize of 1000
>>> 13/08/28 09:42:36 INFO tool.CodeGenTool: Beginning code generation
>>> 13/08/28 09:42:38 INFO manager.OracleManager: Time zone has been set to
>>> GMT
>>> 13/08/28 09:42:38 INFO manager.SqlManager: Executing SQL statement: SELECT
>>> t.* FROM RAJ.CUSTOMERS t WHERE 1=0
>>> 13/08/28 09:42:38 INFO orm.CompilationManager: HADOOP_MAPRED_HOME is
>>> /software/hadoop/hadoop/hadoop-1.1.2
>>> Note:
>>> /tmp/sqoop-hadoop/compile/c1376f66d2151b48024c54305377c981/RAJ_CUSTOMERS.java
>>> uses or overrides a deprecated API.
>>> Note: Recompile with -Xlint:deprecation for details.
>>> 13/08/28 09:42:40 INFO orm.CompilationManager: Writing jar file:
>>> /tmp/sqoop-hadoop/compile/c1376f66d2151b48024c54305377c981/RAJ.CUSTOMERS.jar
>>> 13/08/28 09:42:40 INFO mapreduce.ExportJobBase: Beginning export of
>>> RAJ.CUSTOMERS
>>> 13/08/28 09:42:41 INFO manager.OracleManager: Time zone has been set to
>>> GMT
>>> 13/08/28 09:42:43 INFO input.FileInputFormat: Total input paths to process
>>> : 1
>>> 13/08/28 09:42:43 INFO input.FileInputFormat: Total input paths to process
>>> : 1
>>> 13/08/28 09:42:43 INFO util.NativeCodeLoader: Loaded the native-hadoop
>>> library
>>> 13/08/28 09:42:43 WARN snappy.LoadSnappy: Snappy native library not loaded
>>> 13/08/28 09:42:43 INFO mapred.JobClient: Running job:
>>> job_201307041900_0463
>>> 13/08/28 09:42:44 INFO mapred.JobClient:  map 0% reduce 0%
>>> 13/08/28 09:42:56 INFO mapred.JobClient:  map 1% reduce 0%
>>> 13/08/28 09:43:00 INFO mapred.JobClient:  map 2% reduce 0%
>>> 13/08/28 09:43:03 INFO mapred.JobClient:  map 4% reduce 0%
>>> 13/08/28 09:43:10 INFO mapred.JobClient:  map 5% reduce 0%
>>> 13/08/28 09:43:13 INFO mapred.JobClient:  map 6% reduce 0%
>>> 13/08/28 09:43:17 INFO mapred.JobClient: Task Id :
>>> attempt_201307041900_0463_m_000000_0, Status : FAILED
>>> java.io.IOException: Can't export data, please check task tracker logs
>>>        at
>>> org.apache.sqoop.mapreduce.TextExportMapper.map(TextExportMapper.java:112)
>>>        at
>>> org.apache.sqoop.mapreduce.TextExportMapper.map(TextExportMapper.java:39)
>>>        at org.apache.hadoop.mapreduce.Mapper.run(Mapper.java:144)
>>>        at
>>> org.apache.sqoop.mapreduce.AutoProgressMapper.run(AutoProgressMapper.java:64)
>>>        at org.apache.hadoop.mapred.MapTask.runNewMapper(MapTask.java:764)
>>>        at org.apache.hadoop.mapred.MapTask.run(MapTask.java:370)
>>>        at org.apache.hadoop.mapred.Child$4.run(Child.java:255)
>>>        at java.security.AccessController.doPrivileged(Native Method)
>>>        at javax.security.auth.Subject.doAs(Subject.java:415)
>>>        at
>>> org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1149)
>>>        at org.apache.hadoop.mapred.Child.main(Child.java:249)
>>> Caused by: java.util.NoSuchElementException
>>>        at java.util.ArrayList$Itr.next(ArrayList.java:794)
>>>        at RAJ_CUSTOMERS.__loadFromFields(RAJ_CUSTOMERS.java:1057)
>>>        at RAJ_CUSTOMERS.parse(RAJ_CUSTOMERS.java:876)
>>>        at
>>> org.apache.sqoop.mapreduce.TextExportMapper.map(TextExportMapper.java:83)
>>>        ... 10 more
>>> 
>>> Thanks,
>>> Raj
>>> 
>>> 
>> 
>> 


>>>        at java.util.ArrayList$Itr.next(ArrayList.java:794)
>>>        at RAJ_CUSTOMERS.__loadFromFields(RAJ_CUSTOMERS.java:1057)
>>>        at RAJ_CUSTOMERS.parse(RAJ_CUSTOMERS.java:876)
>>>        at
>>> org.apache.sqoop.mapreduce.TextExportMapper.map(TextExportMapper.java:83)
>>>        ... 10 more
>>> 
>>> Thanks,
>>> Raj
>>> 
>>> 
>> 
>> 


Re: Sqoop issue related to Hadoop

Posted by Kathleen Ting <ka...@apache.org>.
Raj, in addition to what Abe said, please also send the failed task attempt
log for attempt_201307041900_0463_m_000000_0.

Thanks,
Kate

On Wed, Aug 28, 2013 at 2:25 PM, Abraham Elmahrek <ab...@cloudera.com> wrote:
> Hey Raj,
>
> It seems like the number of fields you have in your data doesn't match the
> number of fields in your RAJ.CUSTOMERS table.
>
> Could you please add "--verbose" to the beginning of your argument list and
> provide the entire contents here?
>
> -Abe
>
>
> On Wed, Aug 28, 2013 at 9:36 AM, Raj Hadoop <ha...@yahoo.com> wrote:
>>
>> Hello all,
>>
>> I am getting an error while using sqoop export (loading an HDFS file into
>> Oracle). I am not sure whether the issue is Sqoop- or Hadoop-related, so I
>> am sending it to both distribution lists.
>>
>> I am using -
>>
>> sqoop export --connect jdbc:oracle:thin:@//dbserv:9876/OKI --table
>> RAJ.CUSTOMERS --export-dir /user/hive/warehouse/web_cust --input-null-string
>> '\\N' --input-null-non-string '\\N'  --username <> --password <> -m 1
>> --input-fields-terminated-by '\t'
>> I am getting the following error -
>>
>> Warning: /usr/lib/hbase does not exist! HBase imports will fail.
>> Please set $HBASE_HOME to the root of your HBase installation.
>> Warning: $HADOOP_HOME is deprecated.
>> 13/08/28 09:42:36 WARN tool.BaseSqoopTool: Setting your password on the
>> command-line is insecure. Consider using -P instead.
>> 13/08/28 09:42:36 INFO manager.SqlManager: Using default fetchSize of 1000
>> 13/08/28 09:42:36 INFO tool.CodeGenTool: Beginning code generation
>> 13/08/28 09:42:38 INFO manager.OracleManager: Time zone has been set to
>> GMT
>> 13/08/28 09:42:38 INFO manager.SqlManager: Executing SQL statement: SELECT
>> t.* FROM RAJ.CUSTOMERS t WHERE 1=0
>> 13/08/28 09:42:38 INFO orm.CompilationManager: HADOOP_MAPRED_HOME is
>> /software/hadoop/hadoop/hadoop-1.1.2
>> Note:
>> /tmp/sqoop-hadoop/compile/c1376f66d2151b48024c54305377c981/RAJ_CUSTOMERS.java
>> uses or overrides a deprecated API.
>> Note: Recompile with -Xlint:deprecation for details.
>> 13/08/28 09:42:40 INFO orm.CompilationManager: Writing jar file:
>> /tmp/sqoop-hadoop/compile/c1376f66d2151b48024c54305377c981/RAJ.CUSTOMERS.jar
>> 13/08/28 09:42:40 INFO mapreduce.ExportJobBase: Beginning export of
>> RAJ.CUSTOMERS
>> 13/08/28 09:42:41 INFO manager.OracleManager: Time zone has been set to
>> GMT
>> 13/08/28 09:42:43 INFO input.FileInputFormat: Total input paths to process
>> : 1
>> 13/08/28 09:42:43 INFO input.FileInputFormat: Total input paths to process
>> : 1
>> 13/08/28 09:42:43 INFO util.NativeCodeLoader: Loaded the native-hadoop
>> library
>> 13/08/28 09:42:43 WARN snappy.LoadSnappy: Snappy native library not loaded
>> 13/08/28 09:42:43 INFO mapred.JobClient: Running job:
>> job_201307041900_0463
>> 13/08/28 09:42:44 INFO mapred.JobClient:  map 0% reduce 0%
>> 13/08/28 09:42:56 INFO mapred.JobClient:  map 1% reduce 0%
>> 13/08/28 09:43:00 INFO mapred.JobClient:  map 2% reduce 0%
>> 13/08/28 09:43:03 INFO mapred.JobClient:  map 4% reduce 0%
>> 13/08/28 09:43:10 INFO mapred.JobClient:  map 5% reduce 0%
>> 13/08/28 09:43:13 INFO mapred.JobClient:  map 6% reduce 0%
>> 13/08/28 09:43:17 INFO mapred.JobClient: Task Id :
>> attempt_201307041900_0463_m_000000_0, Status : FAILED
>> java.io.IOException: Can't export data, please check task tracker logs
>>         at
>> org.apache.sqoop.mapreduce.TextExportMapper.map(TextExportMapper.java:112)
>>         at
>> org.apache.sqoop.mapreduce.TextExportMapper.map(TextExportMapper.java:39)
>>         at org.apache.hadoop.mapreduce.Mapper.run(Mapper.java:144)
>>         at
>> org.apache.sqoop.mapreduce.AutoProgressMapper.run(AutoProgressMapper.java:64)
>>         at org.apache.hadoop.mapred.MapTask.runNewMapper(MapTask.java:764)
>>         at org.apache.hadoop.mapred.MapTask.run(MapTask.java:370)
>>         at org.apache.hadoop.mapred.Child$4.run(Child.java:255)
>>         at java.security.AccessController.doPrivileged(Native Method)
>>         at javax.security.auth.Subject.doAs(Subject.java:415)
>>         at
>> org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1149)
>>         at org.apache.hadoop.mapred.Child.main(Child.java:249)
>> Caused by: java.util.NoSuchElementException
>>         at java.util.ArrayList$Itr.next(ArrayList.java:794)
>>         at RAJ_CUSTOMERS.__loadFromFields(RAJ_CUSTOMERS.java:1057)
>>         at RAJ_CUSTOMERS.parse(RAJ_CUSTOMERS.java:876)
>>         at
>> org.apache.sqoop.mapreduce.TextExportMapper.map(TextExportMapper.java:83)
>>         ... 10 more
>>
>> Thanks,
>> Raj
>>
>>
>
>


Re: Sqoop issue related to Hadoop

Posted by Abraham Elmahrek <ab...@cloudera.com>.
Hey Raj,

It seems like the number of fields you have in your data doesn't match the
number of fields in your RAJ.CUSTOMERS table.

Could you please add "--verbose" to the beginning of your argument list and
provide the entire contents here?

-Abe


On Wed, Aug 28, 2013 at 9:36 AM, Raj Hadoop <ha...@yahoo.com> wrote:

> Hello all,
>
> I am getting an error while using sqoop export (loading an HDFS file into
> Oracle). I am not sure whether the issue is Sqoop- or Hadoop-related, so I
> am sending it to both distribution lists.
>
> *I am using -*
>
> sqoop export --connect jdbc:oracle:thin:@//dbserv:9876/OKI
> --table RAJ.CUSTOMERS --export-dir /user/hive/warehouse/web_cust
> --input-null-string '\\N' --input-null-non-string '\\N'  --username <>
> --password <> -m 1 --input-fields-terminated-by '\t'
>  *I am getting the following error -*
>
> Warning: /usr/lib/hbase does not exist! HBase imports will fail.
> Please set $HBASE_HOME to the root of your HBase installation.
> Warning: $HADOOP_HOME is deprecated.
> 13/08/28 09:42:36 WARN tool.BaseSqoopTool: Setting your password on the
> command-line is insecure. Consider using -P instead.
> 13/08/28 09:42:36 INFO manager.SqlManager: Using default fetchSize of 1000
> 13/08/28 09:42:36 INFO tool.CodeGenTool: Beginning code generation
> 13/08/28 09:42:38 INFO manager.OracleManager: Time zone has been set to GMT
> 13/08/28 09:42:38 INFO manager.SqlManager: Executing SQL statement: SELECT
> t.* FROM RAJ.CUSTOMERS t WHERE 1=0
> 13/08/28 09:42:38 INFO orm.CompilationManager: HADOOP_MAPRED_HOME is
> /software/hadoop/hadoop/hadoop-1.1.2
> Note:
> /tmp/sqoop-hadoop/compile/c1376f66d2151b48024c54305377c981/RAJ_CUSTOMERS.java
> uses or overrides a deprecated API.
> Note: Recompile with -Xlint:deprecation for details.
> 13/08/28 09:42:40 INFO orm.CompilationManager: Writing jar file:
> /tmp/sqoop-hadoop/compile/c1376f66d2151b48024c54305377c981/RAJ.CUSTOMERS.jar
> 13/08/28 09:42:40 INFO mapreduce.ExportJobBase: Beginning export of
> RAJ.CUSTOMERS
> 13/08/28 09:42:41 INFO manager.OracleManager: Time zone has been set to GMT
> 13/08/28 09:42:43 INFO input.FileInputFormat: Total input paths to process
> : 1
> 13/08/28 09:42:43 INFO input.FileInputFormat: Total input paths to process
> : 1
> 13/08/28 09:42:43 INFO util.NativeCodeLoader: Loaded the native-hadoop
> library
> 13/08/28 09:42:43 WARN snappy.LoadSnappy: Snappy native library not loaded
> 13/08/28 09:42:43 INFO mapred.JobClient: Running job: job_201307041900_0463
> 13/08/28 09:42:44 INFO mapred.JobClient:  map 0% reduce 0%
> 13/08/28 09:42:56 INFO mapred.JobClient:  map 1% reduce 0%
> 13/08/28 09:43:00 INFO mapred.JobClient:  map 2% reduce 0%
> 13/08/28 09:43:03 INFO mapred.JobClient:  map 4% reduce 0%
> 13/08/28 09:43:10 INFO mapred.JobClient:  map 5% reduce 0%
> 13/08/28 09:43:13 INFO mapred.JobClient:  map 6% reduce 0%
> 13/08/28 09:43:17 INFO mapred.JobClient: Task Id :
> attempt_201307041900_0463_m_000000_0, Status : FAILED
> java.io.IOException: Can't export data, please check task tracker logs
>         at
> org.apache.sqoop.mapreduce.TextExportMapper.map(TextExportMapper.java:112)
>         at
> org.apache.sqoop.mapreduce.TextExportMapper.map(TextExportMapper.java:39)
>         at org.apache.hadoop.mapreduce.Mapper.run(Mapper.java:144)
>         at
> org.apache.sqoop.mapreduce.AutoProgressMapper.run(AutoProgressMapper.java:64)
>         at org.apache.hadoop.mapred.MapTask.runNewMapper(MapTask.java:764)
>         at org.apache.hadoop.mapred.MapTask.run(MapTask.java:370)
>         at org.apache.hadoop.mapred.Child$4.run(Child.java:255)
>         at java.security.AccessController.doPrivileged(Native Method)
>         at javax.security.auth.Subject.doAs(Subject.java:415)
>         at
> org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1149)
>         at org.apache.hadoop.mapred.Child.main(Child.java:249)
> Caused by: java.util.NoSuchElementException
>         at java.util.ArrayList$Itr.next(ArrayList.java:794)
>         at RAJ_CUSTOMERS.__loadFromFields(RAJ_CUSTOMERS.java:1057)
>         at RAJ_CUSTOMERS.parse(RAJ_CUSTOMERS.java:876)
>         at
> org.apache.sqoop.mapreduce.TextExportMapper.map(TextExportMapper.java:83)
>         ... 10 more
>
> Thanks,
> Raj
>
>
>
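[Editor's note: the field-count mismatch Abe describes can be checked directly against the export data before re-running the job. A minimal sketch, assuming tab-delimited rows; `sample.txt` is a hypothetical stand-in for the real data, which you would obtain with something like `hadoop fs -cat /user/hive/warehouse/web_cust/* > sample.txt`:]

```shell
# Hypothetical sample standing in for the HDFS export data:
# row 1 has 3 tab-separated fields, row 2 has only 2.
printf 'a\tb\tc\nx\ty\n' > sample.txt

# Count fields per line. More than one distinct count means some rows
# will not match the column count of RAJ.CUSTOMERS and the export will fail.
awk -F'\t' '{print NF}' sample.txt | sort -n | uniq -c
```

[If every row is well formed, this prints a single field count; any extra counts point at the malformed rows, which can then be located with `awk -F'\t' 'NF != <expected>'`.]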

>         at java.util.ArrayList$Itr.next(ArrayList.java:794)
>         at RAJ_CUSTOMERS.__loadFromFields(RAJ_CUSTOMERS.java:1057)
>         at RAJ_CUSTOMERS.parse(RAJ_CUSTOMERS.java:876)
>         at
> org.apache.sqoop.mapreduce.TextExportMapper.map(TextExportMapper.java:83)
>         ... 10 more
>
> Thanks,
> Raj
>
>
>
