Posted to mapreduce-user@hadoop.apache.org by samir das mohapatra <sa...@gmail.com> on 2013/02/21 07:28:46 UTC

ISSUE: Hadoop with HANA using Sqoop

Hi All
    Can you please tell me why I am getting an error while loading data
from SAP HANA to Hadoop HDFS using Sqoop (4.1.2)?

Error Log:

java.io.IOException: SQLException in nextKeyValue
	at org.apache.sqoop.mapreduce.db.DBRecordReader.nextKeyValue(DBRecordReader.java:265)
	at org.apache.hadoop.mapred.MapTask$NewTrackingRecordReader.nextKeyValue(MapTask.java:458)
	at org.apache.hadoop.mapreduce.task.MapContextImpl.nextKeyValue(MapContextImpl.java:76)
	at org.apache.hadoop.mapreduce.lib.map.WrappedMapper$Context.nextKeyValue(WrappedMapper.java:85)
	at org.apache.hadoop.mapreduce.Mapper.run(Mapper.java:139)
	at org.apache.sqoop.mapreduce.AutoProgressMapper.run(AutoProgressMapper.java:182)
	at org.apache.hadoop.mapred.MapTask.runNewMapper(MapTask.java:645)
	at org.apache.hadoop.mapred.MapTask.run(MapTask.java:325)
	at org.apache.hadoop.mapred.Child$4.run(Child.java:268)
	at java.security.AccessController.doPrivileged(Native Method)
	at javax.security.auth.Subject.doAs(Subject.java:416)
	at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1332)
	at org.apache.hadoop.mapred.Child.main(Child.java:262)
Caused by: com.sap

Regards,
samir.

Re: Newbie Debugging Question

Posted by be...@gmail.com.
Hi Sai

The location you are seeing should be mapred.local.dir.

From my understanding, the files in the distributed cache are available in that location while the job is running and are cleaned up at the end of it.
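As a rough illustration of that lifecycle outside Hadoop (the function and directory names below are invented for the sketch, not Hadoop APIs), the pattern is: localize the file into a job-scoped directory, let the task read the local copy, and remove the directory when the job finishes:

```python
import os
import shutil
import tempfile

def run_task_with_local_cache(cache_file, task):
    """Simulate DistributedCache localization: copy the file into a
    job-scoped local directory (standing in for mapred.local.dir),
    run the task against the local copy, then clean up afterwards."""
    local_dir = tempfile.mkdtemp(prefix="mapred-local-")
    try:
        local_copy = os.path.join(local_dir, os.path.basename(cache_file))
        shutil.copy(cache_file, local_copy)
        # The local copy exists only for the duration of the "job"...
        return task(local_copy)
    finally:
        # ...and is cleaned up at the end of it, which is why the
        # localized path is gone when you look for it afterwards.
        shutil.rmtree(local_dir)
```

That cleanup step is why the path Sai inspected under /tmp/.../mapred/local/... is no longer there once the job has completed.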

Regards 
Bejoy KS

Sent from remote device, Please excuse typos

-----Original Message-----
From: Sai Sai <sa...@yahoo.in>
Date: Thu, 21 Feb 2013 20:38:06 
To: user@hadoop.apache.org<us...@hadoop.apache.org>; cdh-user@cloudera.org<cd...@cloudera.org>
Reply-To: user@hadoop.apache.org
Subject: Re: Newbie Debugging Question

This may be a basic beginner debugging question; I would appreciate it if anyone could shed some light on it:

Here is the method I have in Eclipse:


*******************************

@Override
    protected void setup(Context context) throws java.io.IOException,
            InterruptedException {
        // Resolve the local (task-side) paths of the files in the DistributedCache
        Path[] cacheFiles = DistributedCache.getLocalCacheFiles(context
                .getConfiguration());
        lookUp = cacheFiles[0];
    }
*******************************

I have put a breakpoint on the second line and inspected cacheFiles[0]; here is what I see:

[/tmp/hadoop-sai/mapred/local/archive/3401759285981873176_334405473_2022582449/fileinput/lookup.txt]

I went back to my local folders looking for these directories to see if they are there, but I do not see them.

Just wondering where it is getting this file from.

Any help will be really appreciated.
Thanks
Sai

Re: Newbie Debugging Question

Posted by Sai Sai <sa...@yahoo.in>.
This may be a basic beginner debugging question; I would appreciate it if anyone could shed some light on it:

Here is the method I have in Eclipse:


*******************************

@Override
    protected void setup(Context context) throws java.io.IOException,
            InterruptedException {
        // Resolve the local (task-side) paths of the files in the DistributedCache
        Path[] cacheFiles = DistributedCache.getLocalCacheFiles(context
                .getConfiguration());
        lookUp = cacheFiles[0];
    }
*******************************

I have put a breakpoint on the second line and inspected cacheFiles[0]; here is what I see:

[/tmp/hadoop-sai/mapred/local/archive/3401759285981873176_334405473_2022582449/fileinput/lookup.txt]

I went back to my local folders looking for these directories to see if they are there, but I do not see them.

Just wondering where it is getting this file from.

Any help will be really appreciated.
Thanks
Sai

Re: ISSUE: Hadoop with HANA using Sqoop

Posted by be...@gmail.com.
Hi Samir

The query

"SELECT t.* FROM hgopalan.hana_training AS t WHERE 1=0"

is first executed by Sqoop to fetch the metadata: the WHERE 1=0 predicate matches no rows, so only the result-set schema comes back.

The actual data fetch happens as part of individual queries issued by each task, each of which is a sub-query of the whole input query.
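For illustration, the same probe works against any SQL engine. A minimal sketch with SQLite (table and column names invented here, not HANA's) shows that a WHERE 1=0 query returns the column metadata but zero rows, even when the table contains data:

```python
import sqlite3

def fetch_metadata(conn, table):
    """Run a Sqoop-style probe query: WHERE 1=0 matches no rows,
    so the database returns only the result-set schema."""
    cur = conn.execute("SELECT t.* FROM %s AS t WHERE 1=0" % table)
    columns = [desc[0] for desc in cur.description]  # schema comes back
    rows = cur.fetchall()                            # ...but no data
    return columns, rows

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE hana_training (id INTEGER, name TEXT)")
conn.execute("INSERT INTO hana_training VALUES (1, 'a')")
columns, rows = fetch_metadata(conn, "hana_training")
# columns lists the table's fields; rows is empty despite the INSERT above
```

This is why seeing the WHERE 1=0 statement in the import log does not mean Sqoop thinks the table is empty; it is only asking for the schema.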

Regards 
Bejoy KS

Sent from remote device, Please excuse typos

-----Original Message-----
From: Alexander Alten-Lorenz <al...@cloudera.com>
Date: Thu, 21 Feb 2013 07:58:13 
To: cdh-user@cloudera.org<cd...@cloudera.org>
Reply-To: user@hadoop.apache.org
Subject: Re: ISSUE: Hadoop with HANA using Sqoop

Hey Samir,

Since you've already posted this to the CDH users list, please continue the discussion there.

Cheers
 Alex

On Feb 21, 2013, at 7:49 AM, samir das mohapatra <sa...@gmail.com> wrote:

> Harsh,
>     I copied the whole log and pasted it here; it looks like it only shows "Caused by: com.sap".
> One thing I did not get is why it is running "SELECT t.* FROM hgopalan.hana_training AS t WHERE 1=0". The WHERE 1=0 clause suggests no values, but the database does have records.
> 
> 
> Error:
> 
> hadoop@hadoophost2:~/Desktop$ sqoop import  --connect  jdbc:sap://sj1svm010.corp.adobe.com:30015/hd2  --driver com.sap.db.jdbc.Driver   --table  hgopalan.hana_training  -m  1 --username hgopalan     --password Adobe_23  --target-dir  /input/training
> 13/02/20 22:37:27 WARN tool.BaseSqoopTool: Setting your password on the command-line is insecure. Consider using -P instead.
> 13/02/20 22:37:27 INFO manager.SqlManager: Using default fetchSize of 1000
> 13/02/20 22:37:27 INFO tool.CodeGenTool: Beginning code generation
> 13/02/20 22:37:28 INFO manager.SqlManager: Executing SQL statement: SELECT t.* FROM hgopalan.hana_training AS t WHERE 1=0
> 13/02/20 22:37:29 INFO manager.SqlManager: Executing SQL statement: SELECT t.* FROM hgopalan.hana_training AS t WHERE 1=0
> 13/02/20 22:37:30 INFO orm.CompilationManager: HADOOP_HOME is /usr/local/hadoop/hadoop-2.0.0-mr1-cdh4.1.2
> Note: /tmp/sqoop-hadoop/compile/d66ed0385ac93eb37215515e4e0c2caf/hgopalan_hana_training.java uses or overrides a deprecated API.
> Note: Recompile with -Xlint:deprecation for details.
> 13/02/20 22:37:32 INFO orm.CompilationManager: Writing jar file: /tmp/sqoop-hadoop/compile/d66ed0385ac93eb37215515e4e0c2caf/hgopalan.hana_training.jar
> 13/02/20 22:37:33 INFO mapreduce.ImportJobBase: Beginning import of hgopalan.hana_training
> 13/02/20 22:37:34 INFO manager.SqlManager: Executing SQL statement: SELECT t.* FROM hgopalan.hana_training AS t WHERE 1=0
> 13/02/20 22:37:36 WARN mapred.JobClient: Use GenericOptionsParser for parsing the arguments. Applications should implement Tool for the same.
> 13/02/20 22:37:39 INFO mapred.JobClient: Running job: job_201302202127_0014
> 13/02/20 22:37:40 INFO mapred.JobClient:  map 0% reduce 0%
> 13/02/20 22:38:06 INFO mapred.JobClient: Task Id : attempt_201302202127_0014_m_000000_0, Status : FAILED
> java.io.IOException: SQLException in nextKeyValue
>     at org.apache.sqoop.mapreduce.db.DBRecordReader.nextKeyValue(DBRecordReader.java:265)
>     at org.apache.hadoop.mapred.MapTask$NewTrackingRecordReader.nextKeyValue(MapTask.java:458)
>     at org.apache.hadoop.mapreduce.task.MapContextImpl.nextKeyValue(MapContextImpl.java:76)
>     at org.apache.hadoop.mapreduce.lib.map.WrappedMapper$Context.nextKeyValue(WrappedMapper.java:85)
>     at org.apache.hadoop.mapreduce.Mapper.run(Mapper.java:139)
>     at org.apache.sqoop.mapreduce.AutoProgressMapper.run(AutoProgressMapper.java:182)
>     at org.apache.hadoop.mapred.MapTask.runNewMapper(MapTask.java:645)
>     at org.apache.hadoop.mapred.MapTask.run(MapTask.java:325)
>     at org.apache.hadoop.mapred.Child$4.run(Child.java:268)
>     at java.security.AccessController.doPrivileged(Native Method)
>     at javax.security.auth.Subject.doAs(Subject.java:416)
>     at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1332)
>     at org.apache.hadoop.mapred.Child.main(Child.java:262)
> Caused by: com.sap
> attempt_201302202127_0014_m_000000_0: log4j:WARN No appenders could be found for logger (org.apache.hadoop.hdfs.DFSClient).
> attempt_201302202127_0014_m_000000_0: log4j:WARN Please initialize the log4j system properly.
> attempt_201302202127_0014_m_000000_0: log4j:WARN See http://logging.apache.org/log4j/1.2/faq.html#noconfig for more info.
> 13/02/20 22:38:22 INFO mapred.JobClient: Task Id : attempt_201302202127_0014_m_000000_1, Status : FAILED
> java.io.IOException: SQLException in nextKeyValue
>     at org.apache.sqoop.mapreduce.db.DBRecordReader.nextKeyValue(DBRecordReader.java:265)
>     at org.apache.hadoop.mapred.MapTask$NewTrackingRecordReader.nextKeyValue(MapTask.java:458)
>     at org.apache.hadoop.mapreduce.task.MapContextImpl.nextKeyValue(MapContextImpl.java:76)
>     at org.apache.hadoop.mapreduce.lib.map.WrappedMapper$Context.nextKeyValue(WrappedMapper.java:85)
>     at org.apache.hadoop.mapreduce.Mapper.run(Mapper.java:139)
>     at org.apache.sqoop.mapreduce.AutoProgressMapper.run(AutoProgressMapper.java:182)
>     at org.apache.hadoop.mapred.MapTask.runNewMapper(MapTask.java:645)
>     at org.apache.hadoop.mapred.MapTask.run(MapTask.java:325)
>     at org.apache.hadoop.mapred.Child$4.run(Child.java:268)
>     at java.security.AccessController.doPrivileged(Native Method)
>     at javax.security.auth.Subject.doAs(Subject.java:416)
>     at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1332)
>     at org.apache.hadoop.mapred.Child.main(Child.java:262)
> Caused by: com.sap
> attempt_201302202127_0014_m_000000_1: log4j:WARN No appenders could be found for logger (org.apache.hadoop.hdfs.DFSClient).
> attempt_201302202127_0014_m_000000_1: log4j:WARN Please initialize the log4j system properly.
> attempt_201302202127_0014_m_000000_1: log4j:WARN See http://logging.apache.org/log4j/1.2/faq.html#noconfig for more info.
> 13/02/20 22:38:32 INFO mapred.JobClient: Task Id : attempt_201302202127_0014_m_000000_2, Status : FAILED
> java.io.IOException: SQLException in nextKeyValue
>     at org.apache.sqoop.mapreduce.db.DBRecordReader.nextKeyValue(DBRecordReader.java:265)
>     at org.apache.hadoop.mapred.MapTask$NewTrackingRecordReader.nextKeyValue(MapTask.java:458)
>     at org.apache.hadoop.mapreduce.task.MapContextImpl.nextKeyValue(MapContextImpl.java:76)
>     at org.apache.hadoop.mapreduce.lib.map.WrappedMapper$Context.nextKeyValue(WrappedMapper.java:85)
>     at org.apache.hadoop.mapreduce.Mapper.run(Mapper.java:139)
>     at org.apache.sqoop.mapreduce.AutoProgressMapper.run(AutoProgressMapper.java:182)
>     at org.apache.hadoop.mapred.MapTask.runNewMapper(MapTask.java:645)
>     at org.apache.hadoop.mapred.MapTask.run(MapTask.java:325)
>     at org.apache.hadoop.mapred.Child$4.run(Child.java:268)
>     at java.security.AccessController.doPrivileged(Native Method)
>     at javax.security.auth.Subject.doAs(Subject.java:416)
>     at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1332)
>     at org.apache.hadoop.mapred.Child.main(Child.java:262)
> Caused by: com.sap
> attempt_201302202127_0014_m_000000_2: log4j:WARN No appenders could be found for logger (org.apache.hadoop.hdfs.DFSClient).
> attempt_201302202127_0014_m_000000_2: log4j:WARN Please initialize the log4j system properly.
> attempt_201302202127_0014_m_000000_2: log4j:WARN See http://logging.apache.org/log4j/1.2/faq.html#noconfig for more info.
> 13/02/20 22:38:46 INFO mapred.JobClient: Job complete: job_201302202127_0014
> 13/02/20 22:38:46 INFO mapred.JobClient: Counters: 6
> 13/02/20 22:38:46 INFO mapred.JobClient:   Job Counters 
> 13/02/20 22:38:46 INFO mapred.JobClient:     Failed map tasks=1
> 13/02/20 22:38:46 INFO mapred.JobClient:     Launched map tasks=4
> 13/02/20 22:38:46 INFO mapred.JobClient:     Total time spent by all maps in occupied slots (ms)=56775
> 13/02/20 22:38:46 INFO mapred.JobClient:     Total time spent by all reduces in occupied slots (ms)=0
> 13/02/20 22:38:46 INFO mapred.JobClient:     Total time spent by all maps waiting after reserving slots (ms)=0
> 13/02/20 22:38:46 INFO mapred.JobClient:     Total time spent by all reduces waiting after reserving slots (ms)=0
> 13/02/20 22:38:46 INFO mapreduce.ImportJobBase: Transferred 0 bytes in 70.5203 seconds (0 bytes/sec)
> 13/02/20 22:38:46 WARN mapreduce.Counters: Group org.apache.hadoop.mapred.Task$Counter is deprecated. Use org.apache.hadoop.mapreduce.TaskCounter instead
> 13/02/20 22:38:46 INFO mapreduce.ImportJobBase: Retrieved 0 records.
> 13/02/20 22:38:46 ERROR tool.ImportTool: Error during import: Import job failed!
> 
> 
> On Thu, Feb 21, 2013 at 12:03 PM, Harsh J <ha...@cloudera.com> wrote:
> The error is truncated; check the actual failed task's logs for the complete info:
> 
> Caused by: com.sap… what?
> 
> Seems more like a SAP-side fault than a Hadoop-side one, and you should
> ask on their forums with the stack trace posted.
> 
> On Thu, Feb 21, 2013 at 11:58 AM, samir das mohapatra
> <sa...@gmail.com> wrote:
> > Hi All
> >     Can you please tell me why I am getting an error while loading data from
> > SAP HANA to Hadoop HDFS using Sqoop (4.1.2).
> >
> > Error Log:
> >
> > java.io.IOException: SQLException in nextKeyValue
> >       at
> > org.apache.sqoop.mapreduce.db.DBRecordReader.nextKeyValue(DBRecordReader.java:265)
> >       at
> > org.apache.hadoop.mapred.MapTask$NewTrackingRecordReader.nextKeyValue(MapTask.java:458)
> >       at
> > org.apache.hadoop.mapreduce.task.MapContextImpl.nextKeyValue(MapContextImpl.java:76)
> >       at
> > org.apache.hadoop.mapreduce.lib.map.WrappedMapper$Context.nextKeyValue(WrappedMapper.java:85)
> >       at org.apache.hadoop.mapreduce.Mapper.run(Mapper.java:139)
> >       at
> > org.apache.sqoop.mapreduce.AutoProgressMapper.run(AutoProgressMapper.java:182)
> >       at org.apache.hadoop.mapred.MapTask.runNewMapper(MapTask.java:645)
> >       at org.apache.hadoop.mapred.MapTask.run(MapTask.java:325)
> >       at org.apache.hadoop.mapred.Child$4.run(Child.java:268)
> >       at java.security.AccessController.doPrivileged(Native Method)
> >       at javax.security.auth.Subject.doAs(Subject.java:416)
> >       at
> > org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1332)
> >       at org.apache.hadoop.mapred.Child.main(Child.java:262)
> > Caused by: com.sap
> >
> > Regards,
> > samir.
> >
> >
> >
> > --
> >
> >
> >
> 
> 
> 
> --
> Harsh J
> 



Re: Newbie Debuggin Question

Posted by Sai Sai <sa...@yahoo.in>.
This may be a basic beginner debug question will appreciate if anyone can pour some light:

Here is the method i have in Eclipse:


*******************************

@Override
    protected void setup(Context context) throws java.io.IOException,
            InterruptedException {
        Path[] cacheFiles = DistributedCache.getLocalCacheFiles(context
                .getConfiguration());
        lookUp = cacheFiles[0];
    };
*******************************

I have put a breakpoint at the second line and inspected cacheFiles[0] and here is what i see:

[/tmp/hadoop-sai/mapred/local/archive/3401759285981873176_334405473_2022582449/fileinput/lookup.txt]

I went back to my local folders looking for these folders to see if there r in here but do not see them.

Just wondering where it is getting this file from.

Any help will be really appreciated.
Thanks
Sai

Re: ISSUE :Hadoop with HANA using sqoop

Posted by be...@gmail.com.
Hi Sameer

The query

"SELECT t.* FROM hgopalan.hana_training AS t WHERE 1=0"

Is first executed by SQOOP  to fetch the metadata. 

The actual data fetch happens as part of individual queries from each task which would be a sub query of the whole input query.

Regards 
Bejoy KS

Sent from remote device, Please excuse typos

-----Original Message-----
From: Alexander Alten-Lorenz <al...@cloudera.com>
Date: Thu, 21 Feb 2013 07:58:13 
To: cdh-user@cloudera.org<cd...@cloudera.org>
Reply-To: user@hadoop.apache.org
Subject: Re: ISSUE :Hadoop with HANA using sqoop

Hey Samir,

Since you've posted this already @CDH users, please go ahead there. 

Cheers
 Alex

On Feb 21, 2013, at 7:49 AM, samir das mohapatra <sa...@gmail.com> wrote:

> Harsh,
>     I copied the whole log and pasted it here; it looks like it only shows "Caused by: com.sap".
> One thing I did not get is why it runs "SELECT t.* FROM hgopalan.hana_training AS t WHERE 1=0". The WHERE 1=0 clause selects no rows, yet the database does have records.
> 
> 
> Error:
> 
> hadoop@hadoophost2:~/Desktop$ sqoop import  --connect  jdbc:sap://sj1svm010.corp.adobe.com:30015/hd2  --driver com.sap.db.jdbc.Driver   --table  hgopalan.hana_training  -m  1 --username hgopalan     --password Adobe_23  --target-dir  /input/training
> 13/02/20 22:37:27 WARN tool.BaseSqoopTool: Setting your password on the command-line is insecure. Consider using -P instead.
> 13/02/20 22:37:27 INFO manager.SqlManager: Using default fetchSize of 1000
> 13/02/20 22:37:27 INFO tool.CodeGenTool: Beginning code generation
> 13/02/20 22:37:28 INFO manager.SqlManager: Executing SQL statement: SELECT t.* FROM hgopalan.hana_training AS t WHERE 1=0
> 13/02/20 22:37:29 INFO manager.SqlManager: Executing SQL statement: SELECT t.* FROM hgopalan.hana_training AS t WHERE 1=0
> 13/02/20 22:37:30 INFO orm.CompilationManager: HADOOP_HOME is /usr/local/hadoop/hadoop-2.0.0-mr1-cdh4.1.2
> Note: /tmp/sqoop-hadoop/compile/d66ed0385ac93eb37215515e4e0c2caf/hgopalan_hana_training.java uses or overrides a deprecated API.
> Note: Recompile with -Xlint:deprecation for details.
> 13/02/20 22:37:32 INFO orm.CompilationManager: Writing jar file: /tmp/sqoop-hadoop/compile/d66ed0385ac93eb37215515e4e0c2caf/hgopalan.hana_training.jar
> 13/02/20 22:37:33 INFO mapreduce.ImportJobBase: Beginning import of hgopalan.hana_training
> 13/02/20 22:37:34 INFO manager.SqlManager: Executing SQL statement: SELECT t.* FROM hgopalan.hana_training AS t WHERE 1=0
> 13/02/20 22:37:36 WARN mapred.JobClient: Use GenericOptionsParser for parsing the arguments. Applications should implement Tool for the same.
> 13/02/20 22:37:39 INFO mapred.JobClient: Running job: job_201302202127_0014
> 13/02/20 22:37:40 INFO mapred.JobClient:  map 0% reduce 0%
> 13/02/20 22:38:06 INFO mapred.JobClient: Task Id : attempt_201302202127_0014_m_000000_0, Status : FAILED
> java.io.IOException: SQLException in nextKeyValue
>     at org.apache.sqoop.mapreduce.db.DBRecordReader.nextKeyValue(DBRecordReader.java:265)
>     at org.apache.hadoop.mapred.MapTask$NewTrackingRecordReader.nextKeyValue(MapTask.java:458)
>     at org.apache.hadoop.mapreduce.task.MapContextImpl.nextKeyValue(MapContextImpl.java:76)
>     at org.apache.hadoop.mapreduce.lib.map.WrappedMapper$Context.nextKeyValue(WrappedMapper.java:85)
>     at org.apache.hadoop.mapreduce.Mapper.run(Mapper.java:139)
>     at org.apache.sqoop.mapreduce.AutoProgressMapper.run(AutoProgressMapper.java:182)
>     at org.apache.hadoop.mapred.MapTask.runNewMapper(MapTask.java:645)
>     at org.apache.hadoop.mapred.MapTask.run(MapTask.java:325)
>     at org.apache.hadoop.mapred.Child$4.run(Child.java:268)
>     at java.security.AccessController.doPrivileged(Native Method)
>     at javax.security.auth.Subject.doAs(Subject.java:416)
>     at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1332)
>     at org.apache.hadoop.mapred.Child.main(Child.java:262)
> Caused by: com.sap
> attempt_201302202127_0014_m_000000_0: log4j:WARN No appenders could be found for logger (org.apache.hadoop.hdfs.DFSClient).
> attempt_201302202127_0014_m_000000_0: log4j:WARN Please initialize the log4j system properly.
> attempt_201302202127_0014_m_000000_0: log4j:WARN See http://logging.apache.org/log4j/1.2/faq.html#noconfig for more info.
> 13/02/20 22:38:22 INFO mapred.JobClient: Task Id : attempt_201302202127_0014_m_000000_1, Status : FAILED
> java.io.IOException: SQLException in nextKeyValue
>     at org.apache.sqoop.mapreduce.db.DBRecordReader.nextKeyValue(DBRecordReader.java:265)
>     at org.apache.hadoop.mapred.MapTask$NewTrackingRecordReader.nextKeyValue(MapTask.java:458)
>     at org.apache.hadoop.mapreduce.task.MapContextImpl.nextKeyValue(MapContextImpl.java:76)
>     at org.apache.hadoop.mapreduce.lib.map.WrappedMapper$Context.nextKeyValue(WrappedMapper.java:85)
>     at org.apache.hadoop.mapreduce.Mapper.run(Mapper.java:139)
>     at org.apache.sqoop.mapreduce.AutoProgressMapper.run(AutoProgressMapper.java:182)
>     at org.apache.hadoop.mapred.MapTask.runNewMapper(MapTask.java:645)
>     at org.apache.hadoop.mapred.MapTask.run(MapTask.java:325)
>     at org.apache.hadoop.mapred.Child$4.run(Child.java:268)
>     at java.security.AccessController.doPrivileged(Native Method)
>     at javax.security.auth.Subject.doAs(Subject.java:416)
>     at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1332)
>     at org.apache.hadoop.mapred.Child.main(Child.java:262)
> Caused by: com.sap
> attempt_201302202127_0014_m_000000_1: log4j:WARN No appenders could be found for logger (org.apache.hadoop.hdfs.DFSClient).
> attempt_201302202127_0014_m_000000_1: log4j:WARN Please initialize the log4j system properly.
> attempt_201302202127_0014_m_000000_1: log4j:WARN See http://logging.apache.org/log4j/1.2/faq.html#noconfig for more info.
> 13/02/20 22:38:32 INFO mapred.JobClient: Task Id : attempt_201302202127_0014_m_000000_2, Status : FAILED
> java.io.IOException: SQLException in nextKeyValue
>     at org.apache.sqoop.mapreduce.db.DBRecordReader.nextKeyValue(DBRecordReader.java:265)
>     at org.apache.hadoop.mapred.MapTask$NewTrackingRecordReader.nextKeyValue(MapTask.java:458)
>     at org.apache.hadoop.mapreduce.task.MapContextImpl.nextKeyValue(MapContextImpl.java:76)
>     at org.apache.hadoop.mapreduce.lib.map.WrappedMapper$Context.nextKeyValue(WrappedMapper.java:85)
>     at org.apache.hadoop.mapreduce.Mapper.run(Mapper.java:139)
>     at org.apache.sqoop.mapreduce.AutoProgressMapper.run(AutoProgressMapper.java:182)
>     at org.apache.hadoop.mapred.MapTask.runNewMapper(MapTask.java:645)
>     at org.apache.hadoop.mapred.MapTask.run(MapTask.java:325)
>     at org.apache.hadoop.mapred.Child$4.run(Child.java:268)
>     at java.security.AccessController.doPrivileged(Native Method)
>     at javax.security.auth.Subject.doAs(Subject.java:416)
>     at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1332)
>     at org.apache.hadoop.mapred.Child.main(Child.java:262)
> Caused by: com.sap
> attempt_201302202127_0014_m_000000_2: log4j:WARN No appenders could be found for logger (org.apache.hadoop.hdfs.DFSClient).
> attempt_201302202127_0014_m_000000_2: log4j:WARN Please initialize the log4j system properly.
> attempt_201302202127_0014_m_000000_2: log4j:WARN See http://logging.apache.org/log4j/1.2/faq.html#noconfig for more info.
> 13/02/20 22:38:46 INFO mapred.JobClient: Job complete: job_201302202127_0014
> 13/02/20 22:38:46 INFO mapred.JobClient: Counters: 6
> 13/02/20 22:38:46 INFO mapred.JobClient:   Job Counters 
> 13/02/20 22:38:46 INFO mapred.JobClient:     Failed map tasks=1
> 13/02/20 22:38:46 INFO mapred.JobClient:     Launched map tasks=4
> 13/02/20 22:38:46 INFO mapred.JobClient:     Total time spent by all maps in occupied slots (ms)=56775
> 13/02/20 22:38:46 INFO mapred.JobClient:     Total time spent by all reduces in occupied slots (ms)=0
> 13/02/20 22:38:46 INFO mapred.JobClient:     Total time spent by all maps waiting after reserving slots (ms)=0
> 13/02/20 22:38:46 INFO mapred.JobClient:     Total time spent by all reduces waiting after reserving slots (ms)=0
> 13/02/20 22:38:46 INFO mapreduce.ImportJobBase: Transferred 0 bytes in 70.5203 seconds (0 bytes/sec)
> 13/02/20 22:38:46 WARN mapreduce.Counters: Group org.apache.hadoop.mapred.Task$Counter is deprecated. Use org.apache.hadoop.mapreduce.TaskCounter instead
> 13/02/20 22:38:46 INFO mapreduce.ImportJobBase: Retrieved 0 records.
> 13/02/20 22:38:46 ERROR tool.ImportTool: Error during import: Import job failed!
> 
> 
> On Thu, Feb 21, 2013 at 12:03 PM, Harsh J <ha...@cloudera.com> wrote:
> The error is truncated, check the actual failed task's logs for complete info:
> 
> Caused by: com.sap… what?
> 
> Seems more like a SAP side fault than a Hadoop side one and you should
> ask on their forums with the stacktrace posted.
> 
> On Thu, Feb 21, 2013 at 11:58 AM, samir das mohapatra
> <sa...@gmail.com> wrote:
> > Hi All
> >     Can you please tell me why I am getting an error while loading data from
> > SAP HANA to Hadoop HDFS using sqoop (4.1.2).
> >
> > Error Log:
> >
> > java.io.IOException: SQLException in nextKeyValue
> >       at
> > org.apache.sqoop.mapreduce.db.DBRecordReader.nextKeyValue(DBRecordReader.java:265)
> >       at
> > org.apache.hadoop.mapred.MapTask$NewTrackingRecordReader.nextKeyValue(MapTask.java:458)
> >       at
> > org.apache.hadoop.mapreduce.task.MapContextImpl.nextKeyValue(MapContextImpl.java:76)
> >       at
> > org.apache.hadoop.mapreduce.lib.map.WrappedMapper$Context.nextKeyValue(WrappedMapper.java:85)
> >       at org.apache.hadoop.mapreduce.Mapper.run(Mapper.java:139)
> >       at
> > org.apache.sqoop.mapreduce.AutoProgressMapper.run(AutoProgressMapper.java:182)
> >       at org.apache.hadoop.mapred.MapTask.runNewMapper(MapTask.java:645)
> >       at org.apache.hadoop.mapred.MapTask.run(MapTask.java:325)
> >       at org.apache.hadoop.mapred.Child$4.run(Child.java:268)
> >       at java.security.AccessController.doPrivileged(Native Method)
> >       at javax.security.auth.Subject.doAs(Subject.java:416)
> >       at
> > org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1332)
> >       at org.apache.hadoop.mapred.Child.main(Child.java:262)
> > Caused by: com.sap
> >
> > Regards,
> > samir.
> >
> >
> >
> > --
> >
> >
> >
> 
> 
> 
> --
> Harsh J
> 


Re: ISSUE :Hadoop with HANA using sqoop

Posted by Alexander Alten-Lorenz <al...@cloudera.com>.
Hey Samir,

Since you've already posted this to the CDH users list, please continue the discussion there. 

Cheers
 Alex

On Feb 21, 2013, at 7:49 AM, samir das mohapatra <sa...@gmail.com> wrote:

> Harsh,
>     I copied the whole log and pasted it here; it looks like it only shows "Caused by: com.sap".
> One thing I did not get is why it runs "SELECT t.* FROM hgopalan.hana_training AS t WHERE 1=0". The WHERE 1=0 clause selects no rows, yet the database does have records.
> 
> 
> Error:
> 
> hadoop@hadoophost2:~/Desktop$ sqoop import  --connect  jdbc:sap://sj1svm010.corp.adobe.com:30015/hd2  --driver com.sap.db.jdbc.Driver   --table  hgopalan.hana_training  -m  1 --username hgopalan     --password Adobe_23  --target-dir  /input/training
> 13/02/20 22:37:27 WARN tool.BaseSqoopTool: Setting your password on the command-line is insecure. Consider using -P instead.
> 13/02/20 22:37:27 INFO manager.SqlManager: Using default fetchSize of 1000
> 13/02/20 22:37:27 INFO tool.CodeGenTool: Beginning code generation
> 13/02/20 22:37:28 INFO manager.SqlManager: Executing SQL statement: SELECT t.* FROM hgopalan.hana_training AS t WHERE 1=0
> 13/02/20 22:37:29 INFO manager.SqlManager: Executing SQL statement: SELECT t.* FROM hgopalan.hana_training AS t WHERE 1=0
> 13/02/20 22:37:30 INFO orm.CompilationManager: HADOOP_HOME is /usr/local/hadoop/hadoop-2.0.0-mr1-cdh4.1.2
> Note: /tmp/sqoop-hadoop/compile/d66ed0385ac93eb37215515e4e0c2caf/hgopalan_hana_training.java uses or overrides a deprecated API.
> Note: Recompile with -Xlint:deprecation for details.
> 13/02/20 22:37:32 INFO orm.CompilationManager: Writing jar file: /tmp/sqoop-hadoop/compile/d66ed0385ac93eb37215515e4e0c2caf/hgopalan.hana_training.jar
> 13/02/20 22:37:33 INFO mapreduce.ImportJobBase: Beginning import of hgopalan.hana_training
> 13/02/20 22:37:34 INFO manager.SqlManager: Executing SQL statement: SELECT t.* FROM hgopalan.hana_training AS t WHERE 1=0
> 13/02/20 22:37:36 WARN mapred.JobClient: Use GenericOptionsParser for parsing the arguments. Applications should implement Tool for the same.
> 13/02/20 22:37:39 INFO mapred.JobClient: Running job: job_201302202127_0014
> 13/02/20 22:37:40 INFO mapred.JobClient:  map 0% reduce 0%
> 13/02/20 22:38:06 INFO mapred.JobClient: Task Id : attempt_201302202127_0014_m_000000_0, Status : FAILED
> java.io.IOException: SQLException in nextKeyValue
>     at org.apache.sqoop.mapreduce.db.DBRecordReader.nextKeyValue(DBRecordReader.java:265)
>     at org.apache.hadoop.mapred.MapTask$NewTrackingRecordReader.nextKeyValue(MapTask.java:458)
>     at org.apache.hadoop.mapreduce.task.MapContextImpl.nextKeyValue(MapContextImpl.java:76)
>     at org.apache.hadoop.mapreduce.lib.map.WrappedMapper$Context.nextKeyValue(WrappedMapper.java:85)
>     at org.apache.hadoop.mapreduce.Mapper.run(Mapper.java:139)
>     at org.apache.sqoop.mapreduce.AutoProgressMapper.run(AutoProgressMapper.java:182)
>     at org.apache.hadoop.mapred.MapTask.runNewMapper(MapTask.java:645)
>     at org.apache.hadoop.mapred.MapTask.run(MapTask.java:325)
>     at org.apache.hadoop.mapred.Child$4.run(Child.java:268)
>     at java.security.AccessController.doPrivileged(Native Method)
>     at javax.security.auth.Subject.doAs(Subject.java:416)
>     at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1332)
>     at org.apache.hadoop.mapred.Child.main(Child.java:262)
> Caused by: com.sap
> attempt_201302202127_0014_m_000000_0: log4j:WARN No appenders could be found for logger (org.apache.hadoop.hdfs.DFSClient).
> attempt_201302202127_0014_m_000000_0: log4j:WARN Please initialize the log4j system properly.
> attempt_201302202127_0014_m_000000_0: log4j:WARN See http://logging.apache.org/log4j/1.2/faq.html#noconfig for more info.
> 13/02/20 22:38:22 INFO mapred.JobClient: Task Id : attempt_201302202127_0014_m_000000_1, Status : FAILED
> java.io.IOException: SQLException in nextKeyValue
>     at org.apache.sqoop.mapreduce.db.DBRecordReader.nextKeyValue(DBRecordReader.java:265)
>     at org.apache.hadoop.mapred.MapTask$NewTrackingRecordReader.nextKeyValue(MapTask.java:458)
>     at org.apache.hadoop.mapreduce.task.MapContextImpl.nextKeyValue(MapContextImpl.java:76)
>     at org.apache.hadoop.mapreduce.lib.map.WrappedMapper$Context.nextKeyValue(WrappedMapper.java:85)
>     at org.apache.hadoop.mapreduce.Mapper.run(Mapper.java:139)
>     at org.apache.sqoop.mapreduce.AutoProgressMapper.run(AutoProgressMapper.java:182)
>     at org.apache.hadoop.mapred.MapTask.runNewMapper(MapTask.java:645)
>     at org.apache.hadoop.mapred.MapTask.run(MapTask.java:325)
>     at org.apache.hadoop.mapred.Child$4.run(Child.java:268)
>     at java.security.AccessController.doPrivileged(Native Method)
>     at javax.security.auth.Subject.doAs(Subject.java:416)
>     at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1332)
>     at org.apache.hadoop.mapred.Child.main(Child.java:262)
> Caused by: com.sap
> attempt_201302202127_0014_m_000000_1: log4j:WARN No appenders could be found for logger (org.apache.hadoop.hdfs.DFSClient).
> attempt_201302202127_0014_m_000000_1: log4j:WARN Please initialize the log4j system properly.
> attempt_201302202127_0014_m_000000_1: log4j:WARN See http://logging.apache.org/log4j/1.2/faq.html#noconfig for more info.
> 13/02/20 22:38:32 INFO mapred.JobClient: Task Id : attempt_201302202127_0014_m_000000_2, Status : FAILED
> java.io.IOException: SQLException in nextKeyValue
>     at org.apache.sqoop.mapreduce.db.DBRecordReader.nextKeyValue(DBRecordReader.java:265)
>     at org.apache.hadoop.mapred.MapTask$NewTrackingRecordReader.nextKeyValue(MapTask.java:458)
>     at org.apache.hadoop.mapreduce.task.MapContextImpl.nextKeyValue(MapContextImpl.java:76)
>     at org.apache.hadoop.mapreduce.lib.map.WrappedMapper$Context.nextKeyValue(WrappedMapper.java:85)
>     at org.apache.hadoop.mapreduce.Mapper.run(Mapper.java:139)
>     at org.apache.sqoop.mapreduce.AutoProgressMapper.run(AutoProgressMapper.java:182)
>     at org.apache.hadoop.mapred.MapTask.runNewMapper(MapTask.java:645)
>     at org.apache.hadoop.mapred.MapTask.run(MapTask.java:325)
>     at org.apache.hadoop.mapred.Child$4.run(Child.java:268)
>     at java.security.AccessController.doPrivileged(Native Method)
>     at javax.security.auth.Subject.doAs(Subject.java:416)
>     at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1332)
>     at org.apache.hadoop.mapred.Child.main(Child.java:262)
> Caused by: com.sap
> attempt_201302202127_0014_m_000000_2: log4j:WARN No appenders could be found for logger (org.apache.hadoop.hdfs.DFSClient).
> attempt_201302202127_0014_m_000000_2: log4j:WARN Please initialize the log4j system properly.
> attempt_201302202127_0014_m_000000_2: log4j:WARN See http://logging.apache.org/log4j/1.2/faq.html#noconfig for more info.
> 13/02/20 22:38:46 INFO mapred.JobClient: Job complete: job_201302202127_0014
> 13/02/20 22:38:46 INFO mapred.JobClient: Counters: 6
> 13/02/20 22:38:46 INFO mapred.JobClient:   Job Counters 
> 13/02/20 22:38:46 INFO mapred.JobClient:     Failed map tasks=1
> 13/02/20 22:38:46 INFO mapred.JobClient:     Launched map tasks=4
> 13/02/20 22:38:46 INFO mapred.JobClient:     Total time spent by all maps in occupied slots (ms)=56775
> 13/02/20 22:38:46 INFO mapred.JobClient:     Total time spent by all reduces in occupied slots (ms)=0
> 13/02/20 22:38:46 INFO mapred.JobClient:     Total time spent by all maps waiting after reserving slots (ms)=0
> 13/02/20 22:38:46 INFO mapred.JobClient:     Total time spent by all reduces waiting after reserving slots (ms)=0
> 13/02/20 22:38:46 INFO mapreduce.ImportJobBase: Transferred 0 bytes in 70.5203 seconds (0 bytes/sec)
> 13/02/20 22:38:46 WARN mapreduce.Counters: Group org.apache.hadoop.mapred.Task$Counter is deprecated. Use org.apache.hadoop.mapreduce.TaskCounter instead
> 13/02/20 22:38:46 INFO mapreduce.ImportJobBase: Retrieved 0 records.
> 13/02/20 22:38:46 ERROR tool.ImportTool: Error during import: Import job failed!
> 
> 
> On Thu, Feb 21, 2013 at 12:03 PM, Harsh J <ha...@cloudera.com> wrote:
> The error is truncated, check the actual failed task's logs for complete info:
> 
> Caused by: com.sap… what?
> 
> Seems more like a SAP side fault than a Hadoop side one and you should
> ask on their forums with the stacktrace posted.
> 
> On Thu, Feb 21, 2013 at 11:58 AM, samir das mohapatra
> <sa...@gmail.com> wrote:
> > Hi All
> >     Can you please tell me why I am getting an error while loading data from
> > SAP HANA to Hadoop HDFS using sqoop (4.1.2).
> >
> > Error Log:
> >
> > java.io.IOException: SQLException in nextKeyValue
> >       at
> > org.apache.sqoop.mapreduce.db.DBRecordReader.nextKeyValue(DBRecordReader.java:265)
> >       at
> > org.apache.hadoop.mapred.MapTask$NewTrackingRecordReader.nextKeyValue(MapTask.java:458)
> >       at
> > org.apache.hadoop.mapreduce.task.MapContextImpl.nextKeyValue(MapContextImpl.java:76)
> >       at
> > org.apache.hadoop.mapreduce.lib.map.WrappedMapper$Context.nextKeyValue(WrappedMapper.java:85)
> >       at org.apache.hadoop.mapreduce.Mapper.run(Mapper.java:139)
> >       at
> > org.apache.sqoop.mapreduce.AutoProgressMapper.run(AutoProgressMapper.java:182)
> >       at org.apache.hadoop.mapred.MapTask.runNewMapper(MapTask.java:645)
> >       at org.apache.hadoop.mapred.MapTask.run(MapTask.java:325)
> >       at org.apache.hadoop.mapred.Child$4.run(Child.java:268)
> >       at java.security.AccessController.doPrivileged(Native Method)
> >       at javax.security.auth.Subject.doAs(Subject.java:416)
> >       at
> > org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1332)
> >       at org.apache.hadoop.mapred.Child.main(Child.java:262)
> > Caused by: com.sap
> >
> > Regards,
> > samir.
> >
> >
> >
> > --
> >
> >
> >
> 
> 
> 
> --
> Harsh J
> 


>     at org.apache.hadoop.mapred.Child$4.run(Child.java:268)
>     at java.security.AccessController.doPrivileged(Native Method)
>     at javax.security.auth.Subject.doAs(Subject.java:416)
>     at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1332)
>     at org.apache.hadoop.mapred.Child.main(Child.java:262)
> Caused by: com.sap
> attempt_201302202127_0014_m_000000_0: log4j:WARN No appenders could be found for logger (org.apache.hadoop.hdfs.DFSClient).
> attempt_201302202127_0014_m_000000_0: log4j:WARN Please initialize the log4j system properly.
> attempt_201302202127_0014_m_000000_0: log4j:WARN See http://logging.apache.org/log4j/1.2/faq.html#noconfig for more info.
> 13/02/20 22:38:22 INFO mapred.JobClient: Task Id : attempt_201302202127_0014_m_000000_1, Status : FAILED
> java.io.IOException: SQLException in nextKeyValue
>     at org.apache.sqoop.mapreduce.db.DBRecordReader.nextKeyValue(DBRecordReader.java:265)
>     at org.apache.hadoop.mapred.MapTask$NewTrackingRecordReader.nextKeyValue(MapTask.java:458)
>     at org.apache.hadoop.mapreduce.task.MapContextImpl.nextKeyValue(MapContextImpl.java:76)
>     at org.apache.hadoop.mapreduce.lib.map.WrappedMapper$Context.nextKeyValue(WrappedMapper.java:85)
>     at org.apache.hadoop.mapreduce.Mapper.run(Mapper.java:139)
>     at org.apache.sqoop.mapreduce.AutoProgressMapper.run(AutoProgressMapper.java:182)
>     at org.apache.hadoop.mapred.MapTask.runNewMapper(MapTask.java:645)
>     at org.apache.hadoop.mapred.MapTask.run(MapTask.java:325)
>     at org.apache.hadoop.mapred.Child$4.run(Child.java:268)
>     at java.security.AccessController.doPrivileged(Native Method)
>     at javax.security.auth.Subject.doAs(Subject.java:416)
>     at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1332)
>     at org.apache.hadoop.mapred.Child.main(Child.java:262)
> Caused by: com.sap
> attempt_201302202127_0014_m_000000_1: log4j:WARN No appenders could be found for logger (org.apache.hadoop.hdfs.DFSClient).
> attempt_201302202127_0014_m_000000_1: log4j:WARN Please initialize the log4j system properly.
> attempt_201302202127_0014_m_000000_1: log4j:WARN See http://logging.apache.org/log4j/1.2/faq.html#noconfig for more info.
> 13/02/20 22:38:32 INFO mapred.JobClient: Task Id : attempt_201302202127_0014_m_000000_2, Status : FAILED
> java.io.IOException: SQLException in nextKeyValue
>     at org.apache.sqoop.mapreduce.db.DBRecordReader.nextKeyValue(DBRecordReader.java:265)
>     at org.apache.hadoop.mapred.MapTask$NewTrackingRecordReader.nextKeyValue(MapTask.java:458)
>     at org.apache.hadoop.mapreduce.task.MapContextImpl.nextKeyValue(MapContextImpl.java:76)
>     at org.apache.hadoop.mapreduce.lib.map.WrappedMapper$Context.nextKeyValue(WrappedMapper.java:85)
>     at org.apache.hadoop.mapreduce.Mapper.run(Mapper.java:139)
>     at org.apache.sqoop.mapreduce.AutoProgressMapper.run(AutoProgressMapper.java:182)
>     at org.apache.hadoop.mapred.MapTask.runNewMapper(MapTask.java:645)
>     at org.apache.hadoop.mapred.MapTask.run(MapTask.java:325)
>     at org.apache.hadoop.mapred.Child$4.run(Child.java:268)
>     at java.security.AccessController.doPrivileged(Native Method)
>     at javax.security.auth.Subject.doAs(Subject.java:416)
>     at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1332)
>     at org.apache.hadoop.mapred.Child.main(Child.java:262)
> Caused by: com.sap
> attempt_201302202127_0014_m_000000_2: log4j:WARN No appenders could be found for logger (org.apache.hadoop.hdfs.DFSClient).
> attempt_201302202127_0014_m_000000_2: log4j:WARN Please initialize the log4j system properly.
> attempt_201302202127_0014_m_000000_2: log4j:WARN See http://logging.apache.org/log4j/1.2/faq.html#noconfig for more info.
> 13/02/20 22:38:46 INFO mapred.JobClient: Job complete: job_201302202127_0014
> 13/02/20 22:38:46 INFO mapred.JobClient: Counters: 6
> 13/02/20 22:38:46 INFO mapred.JobClient:   Job Counters 
> 13/02/20 22:38:46 INFO mapred.JobClient:     Failed map tasks=1
> 13/02/20 22:38:46 INFO mapred.JobClient:     Launched map tasks=4
> 13/02/20 22:38:46 INFO mapred.JobClient:     Total time spent by all maps in occupied slots (ms)=56775
> 13/02/20 22:38:46 INFO mapred.JobClient:     Total time spent by all reduces in occupied slots (ms)=0
> 13/02/20 22:38:46 INFO mapred.JobClient:     Total time spent by all maps waiting after reserving slots (ms)=0
> 13/02/20 22:38:46 INFO mapred.JobClient:     Total time spent by all reduces waiting after reserving slots (ms)=0
> 13/02/20 22:38:46 INFO mapreduce.ImportJobBase: Transferred 0 bytes in 70.5203 seconds (0 bytes/sec)
> 13/02/20 22:38:46 WARN mapreduce.Counters: Group org.apache.hadoop.mapred.Task$Counter is deprecated. Use org.apache.hadoop.mapreduce.TaskCounter instead
> 13/02/20 22:38:46 INFO mapreduce.ImportJobBase: Retrieved 0 records.
> 13/02/20 22:38:46 ERROR tool.ImportTool: Error during import: Import job failed!
> 
> 
> On Thu, Feb 21, 2013 at 12:03 PM, Harsh J <ha...@cloudera.com> wrote:
> The error is truncated, check the actual failed task's logs for complete info:
> 
> Caused by: com.sap… what?
> 
> Seems more like a SAP side fault than a Hadoop side one and you should
> ask on their forums with the stacktrace posted.
> 
> On Thu, Feb 21, 2013 at 11:58 AM, samir das mohapatra
> <sa...@gmail.com> wrote:
> > Hi All
> >     Can you please tell me why I am getting an error while loading data from
> > SAP HANA to Hadoop HDFS using Sqoop (4.1.2)?
> >
> > Error Log:
> >
> > java.io.IOException: SQLException in nextKeyValue
> >       at
> > org.apache.sqoop.mapreduce.db.DBRecordReader.nextKeyValue(DBRecordReader.java:265)
> >       at
> > org.apache.hadoop.mapred.MapTask$NewTrackingRecordReader.nextKeyValue(MapTask.java:458)
> >       at
> > org.apache.hadoop.mapreduce.task.MapContextImpl.nextKeyValue(MapContextImpl.java:76)
> >       at
> > org.apache.hadoop.mapreduce.lib.map.WrappedMapper$Context.nextKeyValue(WrappedMapper.java:85)
> >       at org.apache.hadoop.mapreduce.Mapper.run(Mapper.java:139)
> >       at
> > org.apache.sqoop.mapreduce.AutoProgressMapper.run(AutoProgressMapper.java:182)
> >       at org.apache.hadoop.mapred.MapTask.runNewMapper(MapTask.java:645)
> >       at org.apache.hadoop.mapred.MapTask.run(MapTask.java:325)
> >       at org.apache.hadoop.mapred.Child$4.run(Child.java:268)
> >       at java.security.AccessController.doPrivileged(Native Method)
> >       at javax.security.auth.Subject.doAs(Subject.java:416)
> >       at
> > org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1332)
> >       at org.apache.hadoop.mapred.Child.main(Child.java:262)
> > Caused by: com.sap
> >
> > Regards,
> > samir.
> >
> >
> >
> > --
> >
> >
> >
> 
> 
> 
> --
> Harsh J
> 
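Harsh's advice above is the key first step: the JobClient console truncates the cause to "Caused by: com.sap", and the full exception chain only appears in the failed attempt's own log on the TaskTracker. Below is a runnable sketch of that lookup; the `userlogs` layout is the MR1 default, the helper name is made up, and the log contents are simulated stand-ins, so treat the paths as assumptions and adjust them for your install:

```python
import glob
import os

def find_causes(log_root, job_id):
    """Hypothetical helper: scan each map attempt's syslog under the
    MR1-style userlogs directory and yield the 'Caused by:' lines that
    the JobClient console truncated."""
    pattern = os.path.join(log_root, "userlogs", job_id, "attempt_*", "syslog")
    for syslog in glob.glob(pattern):
        with open(syslog) as f:
            for line in f:
                if line.startswith("Caused by:"):
                    yield syslog, line.rstrip()

# Simulated attempt log so the sketch runs end to end; on a real
# TaskTracker, point log_root at the configured Hadoop log directory.
job = "job_201302202127_0014"
attempt_dir = os.path.join("logs", "userlogs", job,
                           "attempt_201302202127_0014_m_000000_0")
os.makedirs(attempt_dir, exist_ok=True)
with open(os.path.join(attempt_dir, "syslog"), "w") as f:
    f.write("java.io.IOException: SQLException in nextKeyValue\n"
            "Caused by: com.sap.<full driver exception goes here>\n")

for path, cause in find_causes("logs", job):
    print(path, "->", cause)
```

Whatever follows "com.sap" in the real attempt log names the driver-side exception, which is what the SAP forums will need.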



Re: ISSUE :Hadoop with HANA using sqoop

Posted by samir das mohapatra <sa...@gmail.com>.
Harsh,
    I copied the whole log and pasted it here; it still shows only
"Caused by: com.sap".
One thing I don't understand is why it runs "SELECT t.* FROM
hgopalan.hana_training AS t WHERE 1=0". WHERE 1=0 matches no rows, yet
the database does contain records.
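For what it's worth, the WHERE 1=0 query itself is expected: Sqoop issues it during code generation because a predicate that can never match returns an empty result set whose column metadata (names and types) Sqoop uses to generate the record class; it is not an attempt to read data. A minimal sketch of the principle, using Python's sqlite3 purely as a stand-in for the HANA JDBC driver:

```python
import sqlite3

# Stand-in table; in the real job this is hgopalan.hana_training in HANA.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE hana_training (id INTEGER, name TEXT)")
conn.execute("INSERT INTO hana_training VALUES (1, 'samir')")

# The Sqoop-style metadata probe: WHERE 1=0 matches no rows, but the
# cursor still carries the column descriptions.
cur = conn.execute("SELECT t.* FROM hana_training AS t WHERE 1=0")
columns = [d[0] for d in cur.description]
rows = cur.fetchall()

print(columns)    # column metadata is available...
print(len(rows))  # ...even though zero rows come back
```

So seeing those statements succeed with no rows is normal; the import's actual failure lies elsewhere.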


Error:

hadoop@hadoophost2:~/Desktop$ sqoop import  --connect  jdbc:sap://
sj1svm010.corp.adobe.com:30015/hd2  --driver com.sap.db.jdbc.Driver
--table  hgopalan.hana_training  -m  1 --username hgopalan     --password
Adobe_23  --target-dir  /input/training
13/02/20 22:37:27 WARN tool.BaseSqoopTool: Setting your password on the
command-line is insecure. Consider using -P instead.
13/02/20 22:37:27 INFO manager.SqlManager: Using default fetchSize of 1000
13/02/20 22:37:27 INFO tool.CodeGenTool: Beginning code generation
13/02/20 22:37:28 INFO manager.SqlManager: Executing SQL statement: SELECT
t.* FROM hgopalan.hana_training AS t WHERE 1=0
13/02/20 22:37:29 INFO manager.SqlManager: Executing SQL statement: SELECT
t.* FROM hgopalan.hana_training AS t WHERE 1=0
13/02/20 22:37:30 INFO orm.CompilationManager: HADOOP_HOME is
/usr/local/hadoop/hadoop-2.0.0-mr1-cdh4.1.2
Note:
/tmp/sqoop-hadoop/compile/d66ed0385ac93eb37215515e4e0c2caf/hgopalan_hana_training.java
uses or overrides a deprecated API.
Note: Recompile with -Xlint:deprecation for details.
13/02/20 22:37:32 INFO orm.CompilationManager: Writing jar file:
/tmp/sqoop-hadoop/compile/d66ed0385ac93eb37215515e4e0c2caf/hgopalan.hana_training.jar
13/02/20 22:37:33 INFO mapreduce.ImportJobBase: Beginning import of
hgopalan.hana_training
13/02/20 22:37:34 INFO manager.SqlManager: Executing SQL statement: SELECT
t.* FROM hgopalan.hana_training AS t WHERE 1=0
13/02/20 22:37:36 WARN mapred.JobClient: Use GenericOptionsParser for
parsing the arguments. Applications should implement Tool for the same.
13/02/20 22:37:39 INFO mapred.JobClient: Running job: job_201302202127_0014
13/02/20 22:37:40 INFO mapred.JobClient:  map 0% reduce 0%
13/02/20 22:38:06 INFO mapred.JobClient: Task Id :
attempt_201302202127_0014_m_000000_0, Status : FAILED
java.io.IOException: SQLException in nextKeyValue
    at
org.apache.sqoop.mapreduce.db.DBRecordReader.nextKeyValue(DBRecordReader.java:265)
    at
org.apache.hadoop.mapred.MapTask$NewTrackingRecordReader.nextKeyValue(MapTask.java:458)
    at
org.apache.hadoop.mapreduce.task.MapContextImpl.nextKeyValue(MapContextImpl.java:76)
    at
org.apache.hadoop.mapreduce.lib.map.WrappedMapper$Context.nextKeyValue(WrappedMapper.java:85)
    at org.apache.hadoop.mapreduce.Mapper.run(Mapper.java:139)
    at
org.apache.sqoop.mapreduce.AutoProgressMapper.run(AutoProgressMapper.java:182)
    at org.apache.hadoop.mapred.MapTask.runNewMapper(MapTask.java:645)
    at org.apache.hadoop.mapred.MapTask.run(MapTask.java:325)
    at org.apache.hadoop.mapred.Child$4.run(Child.java:268)
    at java.security.AccessController.doPrivileged(Native Method)
    at javax.security.auth.Subject.doAs(Subject.java:416)
    at
org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1332)
    at org.apache.hadoop.mapred.Child.main(Child.java:262)
Caused by: com.sap
attempt_201302202127_0014_m_000000_0: log4j:WARN No appenders could be
found for logger (org.apache.hadoop.hdfs.DFSClient).
attempt_201302202127_0014_m_000000_0: log4j:WARN Please initialize the
log4j system properly.
attempt_201302202127_0014_m_000000_0: log4j:WARN See
http://logging.apache.org/log4j/1.2/faq.html#noconfig for more info.
13/02/20 22:38:22 INFO mapred.JobClient: Task Id :
attempt_201302202127_0014_m_000000_1, Status : FAILED
java.io.IOException: SQLException in nextKeyValue
    at
org.apache.sqoop.mapreduce.db.DBRecordReader.nextKeyValue(DBRecordReader.java:265)
    at
org.apache.hadoop.mapred.MapTask$NewTrackingRecordReader.nextKeyValue(MapTask.java:458)
    at
org.apache.hadoop.mapreduce.task.MapContextImpl.nextKeyValue(MapContextImpl.java:76)
    at
org.apache.hadoop.mapreduce.lib.map.WrappedMapper$Context.nextKeyValue(WrappedMapper.java:85)
    at org.apache.hadoop.mapreduce.Mapper.run(Mapper.java:139)
    at
org.apache.sqoop.mapreduce.AutoProgressMapper.run(AutoProgressMapper.java:182)
    at org.apache.hadoop.mapred.MapTask.runNewMapper(MapTask.java:645)
    at org.apache.hadoop.mapred.MapTask.run(MapTask.java:325)
    at org.apache.hadoop.mapred.Child$4.run(Child.java:268)
    at java.security.AccessController.doPrivileged(Native Method)
    at javax.security.auth.Subject.doAs(Subject.java:416)
    at
org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1332)
    at org.apache.hadoop.mapred.Child.main(Child.java:262)
Caused by: com.sap
attempt_201302202127_0014_m_000000_1: log4j:WARN No appenders could be
found for logger (org.apache.hadoop.hdfs.DFSClient).
attempt_201302202127_0014_m_000000_1: log4j:WARN Please initialize the
log4j system properly.
attempt_201302202127_0014_m_000000_1: log4j:WARN See
http://logging.apache.org/log4j/1.2/faq.html#noconfig for more info.
13/02/20 22:38:32 INFO mapred.JobClient: Task Id :
attempt_201302202127_0014_m_000000_2, Status : FAILED
java.io.IOException: SQLException in nextKeyValue
    at
org.apache.sqoop.mapreduce.db.DBRecordReader.nextKeyValue(DBRecordReader.java:265)
    at
org.apache.hadoop.mapred.MapTask$NewTrackingRecordReader.nextKeyValue(MapTask.java:458)
    at
org.apache.hadoop.mapreduce.task.MapContextImpl.nextKeyValue(MapContextImpl.java:76)
    at
org.apache.hadoop.mapreduce.lib.map.WrappedMapper$Context.nextKeyValue(WrappedMapper.java:85)
    at org.apache.hadoop.mapreduce.Mapper.run(Mapper.java:139)
    at
org.apache.sqoop.mapreduce.AutoProgressMapper.run(AutoProgressMapper.java:182)
    at org.apache.hadoop.mapred.MapTask.runNewMapper(MapTask.java:645)
    at org.apache.hadoop.mapred.MapTask.run(MapTask.java:325)
    at org.apache.hadoop.mapred.Child$4.run(Child.java:268)
    at java.security.AccessController.doPrivileged(Native Method)
    at javax.security.auth.Subject.doAs(Subject.java:416)
    at
org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1332)
    at org.apache.hadoop.mapred.Child.main(Child.java:262)
Caused by: com.sap
attempt_201302202127_0014_m_000000_2: log4j:WARN No appenders could be
found for logger (org.apache.hadoop.hdfs.DFSClient).
attempt_201302202127_0014_m_000000_2: log4j:WARN Please initialize the
log4j system properly.
attempt_201302202127_0014_m_000000_2: log4j:WARN See
http://logging.apache.org/log4j/1.2/faq.html#noconfig for more info.
13/02/20 22:38:46 INFO mapred.JobClient: Job complete: job_201302202127_0014
13/02/20 22:38:46 INFO mapred.JobClient: Counters: 6
13/02/20 22:38:46 INFO mapred.JobClient:   Job Counters
13/02/20 22:38:46 INFO mapred.JobClient:     Failed map tasks=1
13/02/20 22:38:46 INFO mapred.JobClient:     Launched map tasks=4
13/02/20 22:38:46 INFO mapred.JobClient:     Total time spent by all maps
in occupied slots (ms)=56775
13/02/20 22:38:46 INFO mapred.JobClient:     Total time spent by all
reduces in occupied slots (ms)=0
13/02/20 22:38:46 INFO mapred.JobClient:     Total time spent by all maps
waiting after reserving slots (ms)=0
13/02/20 22:38:46 INFO mapred.JobClient:     Total time spent by all
reduces waiting after reserving slots (ms)=0
13/02/20 22:38:46 INFO mapreduce.ImportJobBase: Transferred 0 bytes in
70.5203 seconds (0 bytes/sec)
13/02/20 22:38:46 WARN mapreduce.Counters: Group
org.apache.hadoop.mapred.Task$Counter is deprecated. Use
org.apache.hadoop.mapreduce.TaskCounter instead
13/02/20 22:38:46 INFO mapreduce.ImportJobBase: Retrieved 0 records.
13/02/20 22:38:46 ERROR tool.ImportTool: Error during import: Import job
failed!


On Thu, Feb 21, 2013 at 12:03 PM, Harsh J <ha...@cloudera.com> wrote:

> The error is truncated, check the actual failed task's logs for complete
> info:
>
> Caused by: com.sap… what?
>
> Seems more like a SAP side fault than a Hadoop side one and you should
> ask on their forums with the stacktrace posted.
>
> On Thu, Feb 21, 2013 at 11:58 AM, samir das mohapatra
> <sa...@gmail.com> wrote:
> > Hi All
> >     Can you please tell me why I am getting an error while loading data from
> > SAP HANA to Hadoop HDFS using Sqoop (4.1.2).
> >
> > Error Log:
> >
> > java.io.IOException: SQLException in nextKeyValue
> >       at
> >
> org.apache.sqoop.mapreduce.db.DBRecordReader.nextKeyValue(DBRecordReader.java:265)
> >       at
> >
> org.apache.hadoop.mapred.MapTask$NewTrackingRecordReader.nextKeyValue(MapTask.java:458)
> >       at
> >
> org.apache.hadoop.mapreduce.task.MapContextImpl.nextKeyValue(MapContextImpl.java:76)
> >       at
> >
> org.apache.hadoop.mapreduce.lib.map.WrappedMapper$Context.nextKeyValue(WrappedMapper.java:85)
> >       at org.apache.hadoop.mapreduce.Mapper.run(Mapper.java:139)
> >       at
> >
> org.apache.sqoop.mapreduce.AutoProgressMapper.run(AutoProgressMapper.java:182)
> >       at org.apache.hadoop.mapred.MapTask.runNewMapper(MapTask.java:645)
> >       at org.apache.hadoop.mapred.MapTask.run(MapTask.java:325)
> >       at org.apache.hadoop.mapred.Child$4.run(Child.java:268)
> >       at java.security.AccessController.doPrivileged(Native Method)
> >       at javax.security.auth.Subject.doAs(Subject.java:416)
> >       at
> >
> org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1332)
> >       at org.apache.hadoop.mapred.Child.main(Child.java:262)
> > Caused by: com.sap
> >
> > Regards,
> > samir.
> >
> >
> >
> > --
> >
> >
> >
>
>
>
> --
> Harsh J
>

Re: ISSUE :Hadoop with HANA using sqoop

Posted by samir das mohapatra <sa...@gmail.com>.
Harsh,
    I copied the whole log and pasted it here; all it shows is "Caused
by: com.sap".
One thing I did not get is why it is running "SELECT t.* FROM
hgopalan.hana_training AS t WHERE 1=0". WHERE 1=0 means no rows, but the
database does have records.
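The WHERE 1=0 query is not a bug: it is Sqoop's standard trick for reading a table's column metadata without transferring any rows, which it needs for code generation. A minimal sketch of the idea, using Python's built-in sqlite3 in place of HANA (the table and column names here are made up for illustration):

```python
import sqlite3

# An in-memory stand-in for the real HANA table.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE hana_training (id INTEGER, name TEXT)")
conn.execute("INSERT INTO hana_training VALUES (1, 'alice')")

# WHERE 1=0 matches nothing, but the cursor still carries the
# result-set metadata (column names and types).
cur = conn.execute("SELECT t.* FROM hana_training AS t WHERE 1=0")
columns = [d[0] for d in cur.description]
rows = cur.fetchall()

print(columns)    # ['id', 'name']
print(len(rows))  # 0
```

So seeing that statement in the log is expected; the actual import runs a different query afterwards.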


Error:

hadoop@hadoophost2:~/Desktop$ sqoop import  --connect  jdbc:sap://
sj1svm010.corp.adobe.com:30015/hd2  --driver com.sap.db.jdbc.Driver
--table  hgopalan.hana_training  -m  1 --username hgopalan     --password
Adobe_23  --target-dir  /input/training
13/02/20 22:37:27 WARN tool.BaseSqoopTool: Setting your password on the
command-line is insecure. Consider using -P instead.
13/02/20 22:37:27 INFO manager.SqlManager: Using default fetchSize of 1000
13/02/20 22:37:27 INFO tool.CodeGenTool: Beginning code generation
13/02/20 22:37:28 INFO manager.SqlManager: Executing SQL statement: SELECT
t.* FROM hgopalan.hana_training AS t WHERE 1=0
13/02/20 22:37:29 INFO manager.SqlManager: Executing SQL statement: SELECT
t.* FROM hgopalan.hana_training AS t WHERE 1=0
13/02/20 22:37:30 INFO orm.CompilationManager: HADOOP_HOME is
/usr/local/hadoop/hadoop-2.0.0-mr1-cdh4.1.2
Note:
/tmp/sqoop-hadoop/compile/d66ed0385ac93eb37215515e4e0c2caf/hgopalan_hana_training.java
uses or overrides a deprecated API.
Note: Recompile with -Xlint:deprecation for details.
13/02/20 22:37:32 INFO orm.CompilationManager: Writing jar file:
/tmp/sqoop-hadoop/compile/d66ed0385ac93eb37215515e4e0c2caf/hgopalan.hana_training.jar
13/02/20 22:37:33 INFO mapreduce.ImportJobBase: Beginning import of
hgopalan.hana_training
13/02/20 22:37:34 INFO manager.SqlManager: Executing SQL statement: SELECT
t.* FROM hgopalan.hana_training AS t WHERE 1=0
13/02/20 22:37:36 WARN mapred.JobClient: Use GenericOptionsParser for
parsing the arguments. Applications should implement Tool for the same.
13/02/20 22:37:39 INFO mapred.JobClient: Running job: job_201302202127_0014
13/02/20 22:37:40 INFO mapred.JobClient:  map 0% reduce 0%
13/02/20 22:38:06 INFO mapred.JobClient: Task Id :
attempt_201302202127_0014_m_000000_0, Status : FAILED
java.io.IOException: SQLException in nextKeyValue
    at
org.apache.sqoop.mapreduce.db.DBRecordReader.nextKeyValue(DBRecordReader.java:265)
    at
org.apache.hadoop.mapred.MapTask$NewTrackingRecordReader.nextKeyValue(MapTask.java:458)
    at
org.apache.hadoop.mapreduce.task.MapContextImpl.nextKeyValue(MapContextImpl.java:76)
    at
org.apache.hadoop.mapreduce.lib.map.WrappedMapper$Context.nextKeyValue(WrappedMapper.java:85)
    at org.apache.hadoop.mapreduce.Mapper.run(Mapper.java:139)
    at
org.apache.sqoop.mapreduce.AutoProgressMapper.run(AutoProgressMapper.java:182)
    at org.apache.hadoop.mapred.MapTask.runNewMapper(MapTask.java:645)
    at org.apache.hadoop.mapred.MapTask.run(MapTask.java:325)
    at org.apache.hadoop.mapred.Child$4.run(Child.java:268)
    at java.security.AccessController.doPrivileged(Native Method)
    at javax.security.auth.Subject.doAs(Subject.java:416)
    at
org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1332)
    at org.apache.hadoop.mapred.Child.main(Child.java:262)
Caused by: com.sap
attempt_201302202127_0014_m_000000_0: log4j:WARN No appenders could be
found for logger (org.apache.hadoop.hdfs.DFSClient).
attempt_201302202127_0014_m_000000_0: log4j:WARN Please initialize the
log4j system properly.
attempt_201302202127_0014_m_000000_0: log4j:WARN See
http://logging.apache.org/log4j/1.2/faq.html#noconfig for more info.
13/02/20 22:38:22 INFO mapred.JobClient: Task Id :
attempt_201302202127_0014_m_000000_1, Status : FAILED
java.io.IOException: SQLException in nextKeyValue
    at
org.apache.sqoop.mapreduce.db.DBRecordReader.nextKeyValue(DBRecordReader.java:265)
    at
org.apache.hadoop.mapred.MapTask$NewTrackingRecordReader.nextKeyValue(MapTask.java:458)
    at
org.apache.hadoop.mapreduce.task.MapContextImpl.nextKeyValue(MapContextImpl.java:76)
    at
org.apache.hadoop.mapreduce.lib.map.WrappedMapper$Context.nextKeyValue(WrappedMapper.java:85)
    at org.apache.hadoop.mapreduce.Mapper.run(Mapper.java:139)
    at
org.apache.sqoop.mapreduce.AutoProgressMapper.run(AutoProgressMapper.java:182)
    at org.apache.hadoop.mapred.MapTask.runNewMapper(MapTask.java:645)
    at org.apache.hadoop.mapred.MapTask.run(MapTask.java:325)
    at org.apache.hadoop.mapred.Child$4.run(Child.java:268)
    at java.security.AccessController.doPrivileged(Native Method)
    at javax.security.auth.Subject.doAs(Subject.java:416)
    at
org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1332)
    at org.apache.hadoop.mapred.Child.main(Child.java:262)
Caused by: com.sap
attempt_201302202127_0014_m_000000_1: log4j:WARN No appenders could be
found for logger (org.apache.hadoop.hdfs.DFSClient).
attempt_201302202127_0014_m_000000_1: log4j:WARN Please initialize the
log4j system properly.
attempt_201302202127_0014_m_000000_1: log4j:WARN See
http://logging.apache.org/log4j/1.2/faq.html#noconfig for more info.
13/02/20 22:38:32 INFO mapred.JobClient: Task Id :
attempt_201302202127_0014_m_000000_2, Status : FAILED
java.io.IOException: SQLException in nextKeyValue
    at
org.apache.sqoop.mapreduce.db.DBRecordReader.nextKeyValue(DBRecordReader.java:265)
    at
org.apache.hadoop.mapred.MapTask$NewTrackingRecordReader.nextKeyValue(MapTask.java:458)
    at
org.apache.hadoop.mapreduce.task.MapContextImpl.nextKeyValue(MapContextImpl.java:76)
    at
org.apache.hadoop.mapreduce.lib.map.WrappedMapper$Context.nextKeyValue(WrappedMapper.java:85)
    at org.apache.hadoop.mapreduce.Mapper.run(Mapper.java:139)
    at
org.apache.sqoop.mapreduce.AutoProgressMapper.run(AutoProgressMapper.java:182)
    at org.apache.hadoop.mapred.MapTask.runNewMapper(MapTask.java:645)
    at org.apache.hadoop.mapred.MapTask.run(MapTask.java:325)
    at org.apache.hadoop.mapred.Child$4.run(Child.java:268)
    at java.security.AccessController.doPrivileged(Native Method)
    at javax.security.auth.Subject.doAs(Subject.java:416)
    at
org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1332)
    at org.apache.hadoop.mapred.Child.main(Child.java:262)
Caused by: com.sap
attempt_201302202127_0014_m_000000_2: log4j:WARN No appenders could be
found for logger (org.apache.hadoop.hdfs.DFSClient).
attempt_201302202127_0014_m_000000_2: log4j:WARN Please initialize the
log4j system properly.
attempt_201302202127_0014_m_000000_2: log4j:WARN See
http://logging.apache.org/log4j/1.2/faq.html#noconfig for more info.
13/02/20 22:38:46 INFO mapred.JobClient: Job complete: job_201302202127_0014
13/02/20 22:38:46 INFO mapred.JobClient: Counters: 6
13/02/20 22:38:46 INFO mapred.JobClient:   Job Counters
13/02/20 22:38:46 INFO mapred.JobClient:     Failed map tasks=1
13/02/20 22:38:46 INFO mapred.JobClient:     Launched map tasks=4
13/02/20 22:38:46 INFO mapred.JobClient:     Total time spent by all maps
in occupied slots (ms)=56775
13/02/20 22:38:46 INFO mapred.JobClient:     Total time spent by all
reduces in occupied slots (ms)=0
13/02/20 22:38:46 INFO mapred.JobClient:     Total time spent by all maps
waiting after reserving slots (ms)=0
13/02/20 22:38:46 INFO mapred.JobClient:     Total time spent by all
reduces waiting after reserving slots (ms)=0
13/02/20 22:38:46 INFO mapreduce.ImportJobBase: Transferred 0 bytes in
70.5203 seconds (0 bytes/sec)
13/02/20 22:38:46 WARN mapreduce.Counters: Group
org.apache.hadoop.mapred.Task$Counter is deprecated. Use
org.apache.hadoop.mapreduce.TaskCounter instead
13/02/20 22:38:46 INFO mapreduce.ImportJobBase: Retrieved 0 records.
13/02/20 22:38:46 ERROR tool.ImportTool: Error during import: Import job
failed!


On Thu, Feb 21, 2013 at 12:03 PM, Harsh J <ha...@cloudera.com> wrote:

> The error is truncated, check the actual failed task's logs for complete
> info:
>
> Caused by: com.sap… what?
>
> Seems more like a SAP side fault than a Hadoop side one and you should
> ask on their forums with the stacktrace posted.
>
> On Thu, Feb 21, 2013 at 11:58 AM, samir das mohapatra
> <sa...@gmail.com> wrote:
> > Hi All
> >     Can you please tell me why I am getting an error while loading data from
> > SAP HANA to Hadoop HDFS using Sqoop (4.1.2).
> >
> > Error Log:
> >
> > java.io.IOException: SQLException in nextKeyValue
> >       at
> >
> org.apache.sqoop.mapreduce.db.DBRecordReader.nextKeyValue(DBRecordReader.java:265)
> >       at
> >
> org.apache.hadoop.mapred.MapTask$NewTrackingRecordReader.nextKeyValue(MapTask.java:458)
> >       at
> >
> org.apache.hadoop.mapreduce.task.MapContextImpl.nextKeyValue(MapContextImpl.java:76)
> >       at
> >
> org.apache.hadoop.mapreduce.lib.map.WrappedMapper$Context.nextKeyValue(WrappedMapper.java:85)
> >       at org.apache.hadoop.mapreduce.Mapper.run(Mapper.java:139)
> >       at
> >
> org.apache.sqoop.mapreduce.AutoProgressMapper.run(AutoProgressMapper.java:182)
> >       at org.apache.hadoop.mapred.MapTask.runNewMapper(MapTask.java:645)
> >       at org.apache.hadoop.mapred.MapTask.run(MapTask.java:325)
> >       at org.apache.hadoop.mapred.Child$4.run(Child.java:268)
> >       at java.security.AccessController.doPrivileged(Native Method)
> >       at javax.security.auth.Subject.doAs(Subject.java:416)
> >       at
> >
> org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1332)
> >       at org.apache.hadoop.mapred.Child.main(Child.java:262)
> > Caused by: com.sap
> >
> > Regards,
> > samir.
> >
> >
> >
> > --
> >
> >
> >
>
>
>
> --
> Harsh J
>

Re: ISSUE :Hadoop with HANA using sqoop

Posted by be...@gmail.com.
Hi Samir

Looks like there is some syntax issue with the SQL query generated internally.

Can you try doing a Sqoop import by specifying the query with the --query option?
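A free-form import along those lines might look like the sketch below. The host, port, and paths are placeholders taken from the earlier command; with --query, Sqoop requires the literal $CONDITIONS token in the WHERE clause, and the single quotes keep the shell from expanding it. This is a sketch to adapt, not a verified HANA recipe:

```shell
# Hypothetical free-form Sqoop import against HANA; adjust the connect
# string, credentials, and target dir to your environment.
sqoop import \
  --connect 'jdbc:sap://your-hana-host:30015/hd2' \
  --driver com.sap.db.jdbc.Driver \
  --username hgopalan -P \
  --query 'SELECT * FROM hgopalan.hana_training WHERE $CONDITIONS' \
  -m 1 \
  --target-dir /input/training
```

With a single mapper (-m 1) no --split-by column is needed; Sqoop substitutes $CONDITIONS with each split's bounding predicate at run time. This sidesteps the generated "schema.table AS alias" statement that HANA appears to reject.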

Regards 
Bejoy KS

Sent from remote device, Please excuse typos

-----Original Message-----
From: samir das mohapatra <sa...@gmail.com>
Date: Thu, 21 Feb 2013 13:58:37 
To: <us...@hadoop.apache.org>
Reply-To: user@hadoop.apache.org
Subject: Re: ISSUE :Hadoop with HANA using sqoop

Posting the full task log below:

Task Logs: 'attempt_201302202127_0021_m_000000_0'

*stdout logs*
------------------------------


*stderr logs*

log4j:WARN No appenders could be found for logger (org.apache.hadoop.hdfs.
DFSClient).
log4j:WARN Please initialize the log4j system properly.
log4j:WARN See http://logging.apache.org/log4j/1.2/faq.html#noconfig
for more info.

------------------------------


*syslog logs*

2013-02-20 23:10:19,391 WARN mapreduce.Counters: Group
org.apache.hadoop.mapred.Task$Counter is deprecated. Use
org.apache.hadoop.mapreduce.TaskCounter instead
2013-02-20 23:10:19,525 WARN org.apache.hadoop.util.NativeCodeLoader:
Unable to load native-hadoop library for your platform... using
builtin-java classes where applicable
2013-02-20 23:10:20,262 WARN org.apache.hadoop.conf.Configuration:
session.id is deprecated. Instead, use dfs.metrics.session-id
2013-02-20 23:10:20,270 INFO org.apache.hadoop.metrics.jvm.JvmMetrics:
Initializing JVM Metrics with processName=MAP, sessionId=
2013-02-20 23:10:20,919 INFO org.apache.hadoop.util.ProcessTree:
setsid exited with exit code 0
2013-02-20 23:10:20,955 INFO org.apache.hadoop.mapred.Task:  Using
ResourceCalculatorPlugin :
org.apache.hadoop.util.LinuxResourceCalculatorPlugin@64a7c45e
2013-02-20 23:10:23,134 ERROR
org.apache.sqoop.mapreduce.db.DBRecordReader: Top level exception:
com.sap.db.jdbc.exceptions.JDBCDriverException: SAP DBTech JDBC:
[257]: sql syntax error: incorrect syntax near ".": line 1 col 46 (at
pos 46)
	at com.sap.db.jdbc.exceptions.SQLExceptionSapDB.createException(SQLExceptionSapDB.java:334)
	at com.sap.db.jdbc.exceptions.SQLExceptionSapDB.generateDatabaseException(SQLExceptionSapDB.java:174)
	at com.sap.db.jdbc.packet.ReplyPacket.buildExceptionChain(ReplyPacket.java:103)
	at com.sap.db.jdbc.ConnectionSapDB.execute(ConnectionSapDB.java:848)
	at com.sap.db.jdbc.CallableStatementSapDB.sendCommand(CallableStatementSapDB.java:1874)
	at com.sap.db.jdbc.StatementSapDB.sendSQL(StatementSapDB.java:945)
	at com.sap.db.jdbc.CallableStatementSapDB.doParse(CallableStatementSapDB.java:230)
	at com.sap.db.jdbc.CallableStatementSapDB.constructor(CallableStatementSapDB.java:190)
	at com.sap.db.jdbc.CallableStatementSapDB.<init>(CallableStatementSapDB.java:101)
	at com.sap.db.jdbc.CallableStatementSapDBFinalize.<init>(CallableStatementSapDBFinalize.java:31)
	at com.sap.db.jdbc.ConnectionSapDB.prepareStatement(ConnectionSapDB.java:1088)
	at com.sap.db.jdbc.trace.Connection.prepareStatement(Connection.java:347)
	at org.apache.sqoop.mapreduce.db.DBRecordReader.executeQuery(DBRecordReader.java:101)
	at org.apache.sqoop.mapreduce.db.DBRecordReader.nextKeyValue(DBRecordReader.java:236)
	at org.apache.hadoop.mapred.MapTask$NewTrackingRecordReader.nextKeyValue(MapTask.java:458)
	at org.apache.hadoop.mapreduce.task.MapContextImpl.nextKeyValue(MapContextImpl.java:76)
	at org.apache.hadoop.mapreduce.lib.map.WrappedMapper$Context.nextKeyValue(WrappedMapper.java:85)
	at org.apache.hadoop.mapreduce.Mapper.run(Mapper.java:139)
	at org.apache.sqoop.mapreduce.AutoProgressMapper.run(AutoProgressMapper.java:182)
	at org.apache.hadoop.mapred.MapTask.runNewMapper(MapTask.java:645)
	at org.apache.hadoop.mapred.MapTask.run(MapTask.java:325)
	at org.apache.hadoop.mapred.Child$4.run(Child.java:268)
	at java.security.AccessController.doPrivileged(Native Method)
	at javax.security.auth.Subject.doAs(Subject.java:416)
	at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1332)
	at org.apache.hadoop.mapred.Child.main(Child.java:262)
2013-02-20 23:10:23,688 INFO
org.apache.sqoop.mapreduce.AutoProgressMapper: Auto-progress thread is
finished. keepGoing=false
2013-02-20 23:10:23,738 INFO
org.apache.hadoop.mapred.TaskLogsTruncater: Initializing logs'
truncater with mapRetainSize=-1 and reduceRetainSize=-1
2013-02-20 23:10:23,861 ERROR
org.apache.hadoop.security.UserGroupInformation:
PriviledgedActionException as:hadoop (auth:SIMPLE)
cause:java.io.IOException: SQLException in nextKeyValue
2013-02-20 23:10:23,861 WARN org.apache.hadoop.mapred.Child: Error running child
java.io.IOException: SQLException in nextKeyValue
	at org.apache.sqoop.mapreduce.db.DBRecordReader.nextKeyValue(DBRecordReader.java:265)
	at org.apache.hadoop.mapred.MapTask$NewTrackingRecordReader.nextKeyValue(MapTask.java:458)
	at org.apache.hadoop.mapreduce.task.MapContextImpl.nextKeyValue(MapContextImpl.java:76)
	at org.apache.hadoop.mapreduce.lib.map.WrappedMapper$Context.nextKeyValue(WrappedMapper.java:85)
	at org.apache.hadoop.mapreduce.Mapper.run(Mapper.java:139)
	at org.apache.sqoop.mapreduce.AutoProgressMapper.run(AutoProgressMapper.java:182)
	at org.apache.hadoop.mapred.MapTask.runNewMapper(MapTask.java:645)
	at org.apache.hadoop.mapred.MapTask.run(MapTask.java:325)
	at org.apache.hadoop.mapred.Child$4.run(Child.java:268)
	at java.security.AccessController.doPrivileged(Native Method)
	at javax.security.auth.Subject.doAs(Subject.java:416)
	at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1332)
	at org.apache.hadoop.mapred.Child.main(Child.java:262)
Caused by: com.sap.db.jdbc.exceptions.JDBCDriverException: SAP DBTech
JDBC: [257]: sql syntax error: incorrect syntax near ".": line 1 col
46 (at pos 46)
	at com.sap.db.jdbc.exceptions.SQLExceptionSapDB.createException(SQLExceptionSapDB.java:334)
	at com.sap.db.jdbc.exceptions.SQLExceptionSapDB.generateDatabaseException(SQLExceptionSapDB.java:174)
	at com.sap.db.jdbc.packet.ReplyPacket.buildExceptionChain(ReplyPacket.java:103)
	at com.sap.db.jdbc.ConnectionSapDB.execute(ConnectionSapDB.java:848)
	at com.sap.db.jdbc.CallableStatementSapDB.sendCommand(CallableStatementSapDB.java:1874)
	at com.sap.db.jdbc.StatementSapDB.sendSQL(StatementSapDB.java:945)
	at com.sap.db.jdbc.CallableStatementSapDB.doParse(CallableStatementSapDB.java:230)
	at com.sap.db.jdbc.CallableStatementSapDB.constructor(CallableStatementSapDB.java:190)
	at com.sap.db.jdbc.CallableStatementSapDB.<init>(CallableStatementSapDB.java:101)
	at com.sap.db.jdbc.CallableStatementSapDBFinalize.<init>(CallableStatementSapDBFinalize.java:31)
	at com.sap.db.jdbc.ConnectionSapDB.prepareStatement(ConnectionSapDB.java:1088)
	at com.sap.db.jdbc.trace.Connection.prepareStatement(Connection.java:347)
	at org.apache.sqoop.mapreduce.db.DBRecordReader.executeQuery(DBRecordReader.java:101)
	at org.apache.sqoop.mapreduce.db.DBRecordReader.nextKeyValue(DBRecordReader.java:236)
	... 12 more
2013-02-20 23:10:23,906 INFO org.apache.hadoop.mapred.Task: Runnning
cleanup for the task



On Thu, Feb 21, 2013 at 12:03 PM, Harsh J <ha...@cloudera.com> wrote:

> The error is truncated, check the actual failed task's logs for complete
> info:
>
> Caused by: com.sap… what?
>
> Seems more like a SAP side fault than a Hadoop side one and you should
> ask on their forums with the stacktrace posted.
>
> On Thu, Feb 21, 2013 at 11:58 AM, samir das mohapatra
> <sa...@gmail.com> wrote:
> > Hi All
> >     Can you please tell me why I am getting an error while loading data from
> > SAP HANA to Hadoop HDFS using Sqoop (4.1.2).
> >
> > Error Log:
> >
> > java.io.IOException: SQLException in nextKeyValue
> >       at
> >
> org.apache.sqoop.mapreduce.db.DBRecordReader.nextKeyValue(DBRecordReader.java:265)
> >       at
> >
> org.apache.hadoop.mapred.MapTask$NewTrackingRecordReader.nextKeyValue(MapTask.java:458)
> >       at
> >
> org.apache.hadoop.mapreduce.task.MapContextImpl.nextKeyValue(MapContextImpl.java:76)
> >       at
> >
> org.apache.hadoop.mapreduce.lib.map.WrappedMapper$Context.nextKeyValue(WrappedMapper.java:85)
> >       at org.apache.hadoop.mapreduce.Mapper.run(Mapper.java:139)
> >       at
> >
> org.apache.sqoop.mapreduce.AutoProgressMapper.run(AutoProgressMapper.java:182)
> >       at org.apache.hadoop.mapred.MapTask.runNewMapper(MapTask.java:645)
> >       at org.apache.hadoop.mapred.MapTask.run(MapTask.java:325)
> >       at org.apache.hadoop.mapred.Child$4.run(Child.java:268)
> >       at java.security.AccessController.doPrivileged(Native Method)
> >       at javax.security.auth.Subject.doAs(Subject.java:416)
> >       at
> >
> org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1332)
> >       at org.apache.hadoop.mapred.Child.main(Child.java:262)
> > Caused by: com.sap
> >
> > Regards,
> > samir.
> >
> >
> >
> > --
> >
> >
> >
>
>
>
> --
> Harsh J
>


Re: ISSUE :Hadoop with HANA using sqoop

Posted by be...@gmail.com.
Hi Samir

Looks like there is some syntax issue with the sql query generated internally .

Can you try doing a Sqoop import by specifying the query with -query option.

Regards 
Bejoy KS

Sent from remote device, Please excuse typos

-----Original Message-----
From: samir das mohapatra <sa...@gmail.com>
Date: Thu, 21 Feb 2013 13:58:37 
To: <us...@hadoop.apache.org>
Reply-To: user@hadoop.apache.org
Subject: Re: ISSUE :Hadoop with HANA using sqoop

Putting whole Logs from Task now
------------------------------
-----------------------

Task Logs: 'attempt_201302202127_0021_m_000000_0'

*stdout logs*
------------------------------


*stderr logs*

log4j:WARN No appenders could be found for logger (org.apache.hadoop.hdfs.
DFSClient).
log4j:WARN Please initialize the log4j system properly.
log4j:WARN See http://logging.apache.org/log4j/1.2/faq.html#noconfig
for more info.

------------------------------


*syslog logs*

2013-02-20 23:10:19,391 WARN mapreduce.Counters: Group
org.apache.hadoop.mapred.Task$Counter is deprecated. Use
org.apache.hadoop.mapreduce.TaskCounter instead
2013-02-20 23:10:19,525 WARN org.apache.hadoop.util.NativeCodeLoader:
Unable to load native-hadoop library for your platform... using
builtin-java classes where applicable
2013-02-20 23:10:20,262 WARN org.apache.hadoop.conf.Configuration:
session.id is deprecated. Instead, use dfs.metrics.session-id
2013-02-20 23:10:20,270 INFO org.apache.hadoop.metrics.jvm.JvmMetrics:
Initializing JVM Metrics with processName=MAP, sessionId=
2013-02-20 23:10:20,919 INFO org.apache.hadoop.util.ProcessTree:
setsid exited with exit code 0
2013-02-20 23:10:20,955 INFO org.apache.hadoop.mapred.Task:  Using
ResourceCalculatorPlugin :
org.apache.hadoop.util.LinuxResourceCalculatorPlugin@64a7c45e
2013-02-20 23:10:23,134 ERROR
org.apache.sqoop.mapreduce.db.DBRecordReader: Top level exception:
com.sap.db.jdbc.exceptions.JDBCDriverException: SAP DBTech JDBC:
[257]: sql syntax error: incorrect syntax near ".": line 1 col 46 (at
pos 46)
	at com.sap.db.jdbc.exceptions.SQLExceptionSapDB.createException(SQLExceptionSapDB.java:334)
	at com.sap.db.jdbc.exceptions.SQLExceptionSapDB.generateDatabaseException(SQLExceptionSapDB.java:174)
	at com.sap.db.jdbc.packet.ReplyPacket.buildExceptionChain(ReplyPacket.java:103)
	at com.sap.db.jdbc.ConnectionSapDB.execute(ConnectionSapDB.java:848)
	at com.sap.db.jdbc.CallableStatementSapDB.sendCommand(CallableStatementSapDB.java:1874)
	at com.sap.db.jdbc.StatementSapDB.sendSQL(StatementSapDB.java:945)
	at com.sap.db.jdbc.CallableStatementSapDB.doParse(CallableStatementSapDB.java:230)
	at com.sap.db.jdbc.CallableStatementSapDB.constructor(CallableStatementSapDB.java:190)
	at com.sap.db.jdbc.CallableStatementSapDB.<init>(CallableStatementSapDB.java:101)
	at com.sap.db.jdbc.CallableStatementSapDBFinalize.<init>(CallableStatementSapDBFinalize.java:31)
	at com.sap.db.jdbc.ConnectionSapDB.prepareStatement(ConnectionSapDB.java:1088)
	at com.sap.db.jdbc.trace.Connection.prepareStatement(Connection.java:347)
	at org.apache.sqoop.mapreduce.db.DBRecordReader.executeQuery(DBRecordReader.java:101)
	at org.apache.sqoop.mapreduce.db.DBRecordReader.nextKeyValue(DBRecordReader.java:236)
	at org.apache.hadoop.mapred.MapTask$NewTrackingRecordReader.nextKeyValue(MapTask.java:458)
	at org.apache.hadoop.mapreduce.task.MapContextImpl.nextKeyValue(MapContextImpl.java:76)
	at org.apache.hadoop.mapreduce.lib.map.WrappedMapper$Context.nextKeyValue(WrappedMapper.java:85)
	at org.apache.hadoop.mapreduce.Mapper.run(Mapper.java:139)
	at org.apache.sqoop.mapreduce.AutoProgressMapper.run(AutoProgressMapper.java:182)
	at org.apache.hadoop.mapred.MapTask.runNewMapper(MapTask.java:645)
	at org.apache.hadoop.mapred.MapTask.run(MapTask.java:325)
	at org.apache.hadoop.mapred.Child$4.run(Child.java:268)
	at java.security.AccessController.doPrivileged(Native Method)
	at javax.security.auth.Subject.doAs(Subject.java:416)
	at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1332)
	at org.apache.hadoop.mapred.Child.main(Child.java:262)
2013-02-20 23:10:23,688 INFO
org.apache.sqoop.mapreduce.AutoProgressMapper: Auto-progress thread is
finished. keepGoing=false
2013-02-20 23:10:23,738 INFO
org.apache.hadoop.mapred.TaskLogsTruncater: Initializing logs'
truncater with mapRetainSize=-1 and reduceRetainSize=-1
2013-02-20 23:10:23,861 ERROR
org.apache.hadoop.security.UserGroupInformation:
PriviledgedActionException as:hadoop (auth:SIMPLE)
cause:java.io.IOException: SQLException in nextKeyValue
2013-02-20 23:10:23,861 WARN org.apache.hadoop.mapred.Child: Error running child
java.io.IOException: SQLException in nextKeyValue
	at org.apache.sqoop.mapreduce.db.DBRecordReader.nextKeyValue(DBRecordReader.java:265)
	at org.apache.hadoop.mapred.MapTask$NewTrackingRecordReader.nextKeyValue(MapTask.java:458)
	at org.apache.hadoop.mapreduce.task.MapContextImpl.nextKeyValue(MapContextImpl.java:76)
	at org.apache.hadoop.mapreduce.lib.map.WrappedMapper$Context.nextKeyValue(WrappedMapper.java:85)
	at org.apache.hadoop.mapreduce.Mapper.run(Mapper.java:139)
	at org.apache.sqoop.mapreduce.AutoProgressMapper.run(AutoProgressMapper.java:182)
	at org.apache.hadoop.mapred.MapTask.runNewMapper(MapTask.java:645)
	at org.apache.hadoop.mapred.MapTask.run(MapTask.java:325)
	at org.apache.hadoop.mapred.Child$4.run(Child.java:268)
	at java.security.AccessController.doPrivileged(Native Method)
	at javax.security.auth.Subject.doAs(Subject.java:416)
	at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1332)
	at org.apache.hadoop.mapred.Child.main(Child.java:262)
Caused by: com.sap.db.jdbc.exceptions.JDBCDriverException: SAP DBTech
JDBC: [257]: sql syntax error: incorrect syntax near ".": line 1 col
46 (at pos 46)
	at com.sap.db.jdbc.exceptions.SQLExceptionSapDB.createException(SQLExceptionSapDB.java:334)
	at com.sap.db.jdbc.exceptions.SQLExceptionSapDB.generateDatabaseException(SQLExceptionSapDB.java:174)
	at com.sap.db.jdbc.packet.ReplyPacket.buildExceptionChain(ReplyPacket.java:103)
	at com.sap.db.jdbc.ConnectionSapDB.execute(ConnectionSapDB.java:848)
	at com.sap.db.jdbc.CallableStatementSapDB.sendCommand(CallableStatementSapDB.java:1874)
	at com.sap.db.jdbc.StatementSapDB.sendSQL(StatementSapDB.java:945)
	at com.sap.db.jdbc.CallableStatementSapDB.doParse(CallableStatementSapDB.java:230)
	at com.sap.db.jdbc.CallableStatementSapDB.constructor(CallableStatementSapDB.java:190)
	at com.sap.db.jdbc.CallableStatementSapDB.<init>(CallableStatementSapDB.java:101)
	at com.sap.db.jdbc.CallableStatementSapDBFinalize.<init>(CallableStatementSapDBFinalize.java:31)
	at com.sap.db.jdbc.ConnectionSapDB.prepareStatement(ConnectionSapDB.java:1088)
	at com.sap.db.jdbc.trace.Connection.prepareStatement(Connection.java:347)
	at org.apache.sqoop.mapreduce.db.DBRecordReader.executeQuery(DBRecordReader.java:101)
	at org.apache.sqoop.mapreduce.db.DBRecordReader.nextKeyValue(DBRecordReader.java:236)
	... 12 more
2013-02-20 23:10:23,906 INFO org.apache.hadoop.mapred.Task: Runnning
cleanup for the task



On Thu, Feb 21, 2013 at 12:03 PM, Harsh J <ha...@cloudera.com> wrote:

> The error is truncated, check the actual failed task's logs for complete
> info:
>
> Caused by: com.sap… what?
>
> Seems more like a SAP side fault than a Hadoop side one and you should
> ask on their forums with the stacktrace posted.
>
> On Thu, Feb 21, 2013 at 11:58 AM, samir das mohapatra
> <sa...@gmail.com> wrote:
> > Hi All
> >     Can you please tell me why I am getting error while loading data from
> > SAP HANA   to Hadoop HDFS using sqoop (4.1.2).
> >
> > Error Log:
> >
> > java.io.IOException: SQLException in nextKeyValue
> >       at
> >
> org.apache.sqoop.mapreduce.db.DBRecordReader.nextKeyValue(DBRecordReader.java:265)
> >       at
> >
> org.apache.hadoop.mapred.MapTask$NewTrackingRecordReader.nextKeyValue(MapTask.java:458)
> >       at
> >
> org.apache.hadoop.mapreduce.task.MapContextImpl.nextKeyValue(MapContextImpl.java:76)
> >       at
> >
> org.apache.hadoop.mapreduce.lib.map.WrappedMapper$Context.nextKeyValue(WrappedMapper.java:85)
> >       at org.apache.hadoop.mapreduce.Mapper.run(Mapper.java:139)
> >       at
> >
> org.apache.sqoop.mapreduce.AutoProgressMapper.run(AutoProgressMapper.java:182)
> >       at org.apache.hadoop.mapred.MapTask.runNewMapper(MapTask.java:645)
> >       at org.apache.hadoop.mapred.MapTask.run(MapTask.java:325)
> >       at org.apache.hadoop.mapred.Child$4.run(Child.java:268)
> >       at java.security.AccessController.doPrivileged(Native Method)
> >       at javax.security.auth.Subject.doAs(Subject.java:416)
> >       at
> >
> org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1332)
> >       at org.apache.hadoop.mapred.Child.main(Child.java:262)
> > Caused by: com.sap
> >
> > Regards,
> > samir.
> >
> >
> >
> > --
> >
> >
> >
>
>
>
> --
> Harsh J
>


Re: ISSUE :Hadoop with HANA using sqoop

Posted by be...@gmail.com.
Hi Samir

Looks like there is some syntax issue with the sql query generated internally .

Can you try doing a Sqoop import by specifying the query with -query option.

Regards 
Bejoy KS

Sent from remote device, Please excuse typos

-----Original Message-----
From: samir das mohapatra <sa...@gmail.com>
Date: Thu, 21 Feb 2013 13:58:37 
To: <us...@hadoop.apache.org>
Reply-To: user@hadoop.apache.org
Subject: Re: ISSUE :Hadoop with HANA using sqoop

Putting whole Logs from Task now
------------------------------
-----------------------

Task Logs: 'attempt_201302202127_0021_m_000000_0'

*stdout logs*
------------------------------


*stderr logs*

log4j:WARN No appenders could be found for logger (org.apache.hadoop.hdfs.
DFSClient).
log4j:WARN Please initialize the log4j system properly.
log4j:WARN See http://logging.apache.org/log4j/1.2/faq.html#noconfig
for more info.

------------------------------


*syslog logs*

2013-02-20 23:10:19,391 WARN mapreduce.Counters: Group
org.apache.hadoop.mapred.Task$Counter is deprecated. Use
org.apache.hadoop.mapreduce.TaskCounter instead
2013-02-20 23:10:19,525 WARN org.apache.hadoop.util.NativeCodeLoader:
Unable to load native-hadoop library for your platform... using
builtin-java classes where applicable
2013-02-20 23:10:20,262 WARN org.apache.hadoop.conf.Configuration:
session.id is deprecated. Instead, use dfs.metrics.session-id
2013-02-20 23:10:20,270 INFO org.apache.hadoop.metrics.jvm.JvmMetrics:
Initializing JVM Metrics with processName=MAP, sessionId=
2013-02-20 23:10:20,919 INFO org.apache.hadoop.util.ProcessTree:
setsid exited with exit code 0
2013-02-20 23:10:20,955 INFO org.apache.hadoop.mapred.Task:  Using
ResourceCalculatorPlugin :
org.apache.hadoop.util.LinuxResourceCalculatorPlugin@64a7c45e
2013-02-20 23:10:23,134 ERROR
org.apache.sqoop.mapreduce.db.DBRecordReader: Top level exception:
com.sap.db.jdbc.exceptions.JDBCDriverException: SAP DBTech JDBC:
[257]: sql syntax error: incorrect syntax near ".": line 1 col 46 (at
pos 46)
	at com.sap.db.jdbc.exceptions.SQLExceptionSapDB.createException(SQLExceptionSapDB.java:334)
	at com.sap.db.jdbc.exceptions.SQLExceptionSapDB.generateDatabaseException(SQLExceptionSapDB.java:174)
	at com.sap.db.jdbc.packet.ReplyPacket.buildExceptionChain(ReplyPacket.java:103)
	at com.sap.db.jdbc.ConnectionSapDB.execute(ConnectionSapDB.java:848)
	at com.sap.db.jdbc.CallableStatementSapDB.sendCommand(CallableStatementSapDB.java:1874)
	at com.sap.db.jdbc.StatementSapDB.sendSQL(StatementSapDB.java:945)
	at com.sap.db.jdbc.CallableStatementSapDB.doParse(CallableStatementSapDB.java:230)
	at com.sap.db.jdbc.CallableStatementSapDB.constructor(CallableStatementSapDB.java:190)
	at com.sap.db.jdbc.CallableStatementSapDB.<init>(CallableStatementSapDB.java:101)
	at com.sap.db.jdbc.CallableStatementSapDBFinalize.<init>(CallableStatementSapDBFinalize.java:31)
	at com.sap.db.jdbc.ConnectionSapDB.prepareStatement(ConnectionSapDB.java:1088)
	at com.sap.db.jdbc.trace.Connection.prepareStatement(Connection.java:347)
	at org.apache.sqoop.mapreduce.db.DBRecordReader.executeQuery(DBRecordReader.java:101)
	at org.apache.sqoop.mapreduce.db.DBRecordReader.nextKeyValue(DBRecordReader.java:236)
	at org.apache.hadoop.mapred.MapTask$NewTrackingRecordReader.nextKeyValue(MapTask.java:458)
	at org.apache.hadoop.mapreduce.task.MapContextImpl.nextKeyValue(MapContextImpl.java:76)
	at org.apache.hadoop.mapreduce.lib.map.WrappedMapper$Context.nextKeyValue(WrappedMapper.java:85)
	at org.apache.hadoop.mapreduce.Mapper.run(Mapper.java:139)
	at org.apache.sqoop.mapreduce.AutoProgressMapper.run(AutoProgressMapper.java:182)
	at org.apache.hadoop.mapred.MapTask.runNewMapper(MapTask.java:645)
	at org.apache.hadoop.mapred.MapTask.run(MapTask.java:325)
	at org.apache.hadoop.mapred.Child$4.run(Child.java:268)
	at java.security.AccessController.doPrivileged(Native Method)
	at javax.security.auth.Subject.doAs(Subject.java:416)
	at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1332)
	at org.apache.hadoop.mapred.Child.main(Child.java:262)
2013-02-20 23:10:23,688 INFO
org.apache.sqoop.mapreduce.AutoProgressMapper: Auto-progress thread is
finished. keepGoing=false
2013-02-20 23:10:23,738 INFO
org.apache.hadoop.mapred.TaskLogsTruncater: Initializing logs'
truncater with mapRetainSize=-1 and reduceRetainSize=-1
2013-02-20 23:10:23,861 ERROR
org.apache.hadoop.security.UserGroupInformation:
PriviledgedActionException as:hadoop (auth:SIMPLE)
cause:java.io.IOException: SQLException in nextKeyValue
2013-02-20 23:10:23,861 WARN org.apache.hadoop.mapred.Child: Error running child
java.io.IOException: SQLException in nextKeyValue
	at org.apache.sqoop.mapreduce.db.DBRecordReader.nextKeyValue(DBRecordReader.java:265)
	at org.apache.hadoop.mapred.MapTask$NewTrackingRecordReader.nextKeyValue(MapTask.java:458)
	at org.apache.hadoop.mapreduce.task.MapContextImpl.nextKeyValue(MapContextImpl.java:76)
	at org.apache.hadoop.mapreduce.lib.map.WrappedMapper$Context.nextKeyValue(WrappedMapper.java:85)
	at org.apache.hadoop.mapreduce.Mapper.run(Mapper.java:139)
	at org.apache.sqoop.mapreduce.AutoProgressMapper.run(AutoProgressMapper.java:182)
	at org.apache.hadoop.mapred.MapTask.runNewMapper(MapTask.java:645)
	at org.apache.hadoop.mapred.MapTask.run(MapTask.java:325)
	at org.apache.hadoop.mapred.Child$4.run(Child.java:268)
	at java.security.AccessController.doPrivileged(Native Method)
	at javax.security.auth.Subject.doAs(Subject.java:416)
	at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1332)
	at org.apache.hadoop.mapred.Child.main(Child.java:262)
Caused by: com.sap.db.jdbc.exceptions.JDBCDriverException: SAP DBTech
JDBC: [257]: sql syntax error: incorrect syntax near ".": line 1 col
46 (at pos 46)
	at com.sap.db.jdbc.exceptions.SQLExceptionSapDB.createException(SQLExceptionSapDB.java:334)
	at com.sap.db.jdbc.exceptions.SQLExceptionSapDB.generateDatabaseException(SQLExceptionSapDB.java:174)
	at com.sap.db.jdbc.packet.ReplyPacket.buildExceptionChain(ReplyPacket.java:103)
	at com.sap.db.jdbc.ConnectionSapDB.execute(ConnectionSapDB.java:848)
	at com.sap.db.jdbc.CallableStatementSapDB.sendCommand(CallableStatementSapDB.java:1874)
	at com.sap.db.jdbc.StatementSapDB.sendSQL(StatementSapDB.java:945)
	at com.sap.db.jdbc.CallableStatementSapDB.doParse(CallableStatementSapDB.java:230)
	at com.sap.db.jdbc.CallableStatementSapDB.constructor(CallableStatementSapDB.java:190)
	at com.sap.db.jdbc.CallableStatementSapDB.<init>(CallableStatementSapDB.java:101)
	at com.sap.db.jdbc.CallableStatementSapDBFinalize.<init>(CallableStatementSapDBFinalize.java:31)
	at com.sap.db.jdbc.ConnectionSapDB.prepareStatement(ConnectionSapDB.java:1088)
	at com.sap.db.jdbc.trace.Connection.prepareStatement(Connection.java:347)
	at org.apache.sqoop.mapreduce.db.DBRecordReader.executeQuery(DBRecordReader.java:101)
	at org.apache.sqoop.mapreduce.db.DBRecordReader.nextKeyValue(DBRecordReader.java:236)
	... 12 more
2013-02-20 23:10:23,906 INFO org.apache.hadoop.mapred.Task: Runnning
cleanup for the task



On Thu, Feb 21, 2013 at 12:03 PM, Harsh J <ha...@cloudera.com> wrote:

> The error is truncated, check the actual failed task's logs for complete
> info:
>
> Caused by: com.sap… what?
>
> Seems more like a SAP side fault than a Hadoop side one and you should
> ask on their forums with the stacktrace posted.
>
> On Thu, Feb 21, 2013 at 11:58 AM, samir das mohapatra
> <sa...@gmail.com> wrote:
> > Hi All
> >     Can you plese tell me why I am getting error while loading data from
> > SAP HANA   to Hadoop HDFS using sqoop (4.1.2).
> >
> > Error Log:
> >
> > java.io.IOException: SQLException in nextKeyValue
> >       at
> >
> org.apache.sqoop.mapreduce.db.DBRecordReader.nextKeyValue(DBRecordReader.java:265)
> >       at
> >
> org.apache.hadoop.mapred.MapTask$NewTrackingRecordReader.nextKeyValue(MapTask.java:458)
> >       at
> >
> org.apache.hadoop.mapreduce.task.MapContextImpl.nextKeyValue(MapContextImpl.java:76)
> >       at
> >
> org.apache.hadoop.mapreduce.lib.map.WrappedMapper$Context.nextKeyValue(WrappedMapper.java:85)
> >       at org.apache.hadoop.mapreduce.Mapper.run(Mapper.java:139)
> >       at
> >
> org.apache.sqoop.mapreduce.AutoProgressMapper.run(AutoProgressMapper.java:182)
> >       at org.apache.hadoop.mapred.MapTask.runNewMapper(MapTask.java:645)
> >       at org.apache.hadoop.mapred.MapTask.run(MapTask.java:325)
> >       at org.apache.hadoop.mapred.Child$4.run(Child.java:268)
> >       at java.security.AccessController.doPrivileged(Native Method)
> >       at javax.security.auth.Subject.doAs(Subject.java:416)
> >       at
> >
> org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1332)
> >       at org.apache.hadoop.mapred.Child.main(Child.java:262)
> > Caused by: com.sap
> >
> > Regards,
> > samir.
> >
> >
> >
> > --
> >
> >
> >
>
>
>
> --
> Harsh J
>


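For reference, a free-form query import would look roughly like the sketch below. This is only an illustration, not the original command from the thread: the connection URL, schema, table, and split column are placeholder assumptions, and `$CONDITIONS` plus `--split-by` (or `-m 1`) are required by Sqoop whenever `--query` is used.

```shell
# Hypothetical sketch of a Sqoop free-form query import against HANA.
# Host, port, credentials, schema, table and split column are placeholders.
sqoop import \
  --driver com.sap.db.jdbc.Driver \
  --connect "jdbc:sap://hana-host:30015/" \
  --username SYSTEM -P \
  --query 'SELECT * FROM MYSCHEMA.MYTABLE WHERE $CONDITIONS' \
  --split-by ID \
  --target-dir /user/hadoop/hana_import
```

Using `--query` sidesteps the SQL that Sqoop generates from `--table`, which is what appears to be tripping up HANA's parser here.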
Re: ISSUE :Hadoop with HANA using sqoop

Posted by samir das mohapatra <sa...@gmail.com>.
Putting whole Logs from Task now
------------------------------
-----------------------

Task Logs: 'attempt_201302202127_0021_m_000000_0'

*stdout logs*
------------------------------


*stderr logs*

log4j:WARN No appenders could be found for logger (org.apache.hadoop.hdfs.
DFSClient).
log4j:WARN Please initialize the log4j system properly.
log4j:WARN See http://logging.apache.org/log4j/1.2/faq.html#noconfig
for more info.

------------------------------


*syslog logs*

2013-02-20 23:10:19,391 WARN mapreduce.Counters: Group
org.apache.hadoop.mapred.Task$Counter is deprecated. Use
org.apache.hadoop.mapreduce.TaskCounter instead
2013-02-20 23:10:19,525 WARN org.apache.hadoop.util.NativeCodeLoader:
Unable to load native-hadoop library for your platform... using
builtin-java classes where applicable
2013-02-20 23:10:20,262 WARN org.apache.hadoop.conf.Configuration:
session.id is deprecated. Instead, use dfs.metrics.session-id
2013-02-20 23:10:20,270 INFO org.apache.hadoop.metrics.jvm.JvmMetrics:
Initializing JVM Metrics with processName=MAP, sessionId=
2013-02-20 23:10:20,919 INFO org.apache.hadoop.util.ProcessTree:
setsid exited with exit code 0
2013-02-20 23:10:20,955 INFO org.apache.hadoop.mapred.Task:  Using
ResourceCalculatorPlugin :
org.apache.hadoop.util.LinuxResourceCalculatorPlugin@64a7c45e
2013-02-20 23:10:23,134 ERROR
org.apache.sqoop.mapreduce.db.DBRecordReader: Top level exception:
com.sap.db.jdbc.exceptions.JDBCDriverException: SAP DBTech JDBC:
[257]: sql syntax error: incorrect syntax near ".": line 1 col 46 (at
pos 46)
	at com.sap.db.jdbc.exceptions.SQLExceptionSapDB.createException(SQLExceptionSapDB.java:334)
	at com.sap.db.jdbc.exceptions.SQLExceptionSapDB.generateDatabaseException(SQLExceptionSapDB.java:174)
	at com.sap.db.jdbc.packet.ReplyPacket.buildExceptionChain(ReplyPacket.java:103)
	at com.sap.db.jdbc.ConnectionSapDB.execute(ConnectionSapDB.java:848)
	at com.sap.db.jdbc.CallableStatementSapDB.sendCommand(CallableStatementSapDB.java:1874)
	at com.sap.db.jdbc.StatementSapDB.sendSQL(StatementSapDB.java:945)
	at com.sap.db.jdbc.CallableStatementSapDB.doParse(CallableStatementSapDB.java:230)
	at com.sap.db.jdbc.CallableStatementSapDB.constructor(CallableStatementSapDB.java:190)
	at com.sap.db.jdbc.CallableStatementSapDB.<init>(CallableStatementSapDB.java:101)
	at com.sap.db.jdbc.CallableStatementSapDBFinalize.<init>(CallableStatementSapDBFinalize.java:31)
	at com.sap.db.jdbc.ConnectionSapDB.prepareStatement(ConnectionSapDB.java:1088)
	at com.sap.db.jdbc.trace.Connection.prepareStatement(Connection.java:347)
	at org.apache.sqoop.mapreduce.db.DBRecordReader.executeQuery(DBRecordReader.java:101)
	at org.apache.sqoop.mapreduce.db.DBRecordReader.nextKeyValue(DBRecordReader.java:236)
	at org.apache.hadoop.mapred.MapTask$NewTrackingRecordReader.nextKeyValue(MapTask.java:458)
	at org.apache.hadoop.mapreduce.task.MapContextImpl.nextKeyValue(MapContextImpl.java:76)
	at org.apache.hadoop.mapreduce.lib.map.WrappedMapper$Context.nextKeyValue(WrappedMapper.java:85)
	at org.apache.hadoop.mapreduce.Mapper.run(Mapper.java:139)
	at org.apache.sqoop.mapreduce.AutoProgressMapper.run(AutoProgressMapper.java:182)
	at org.apache.hadoop.mapred.MapTask.runNewMapper(MapTask.java:645)
	at org.apache.hadoop.mapred.MapTask.run(MapTask.java:325)
	at org.apache.hadoop.mapred.Child$4.run(Child.java:268)
	at java.security.AccessController.doPrivileged(Native Method)
	at javax.security.auth.Subject.doAs(Subject.java:416)
	at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1332)
	at org.apache.hadoop.mapred.Child.main(Child.java:262)
2013-02-20 23:10:23,688 INFO
org.apache.sqoop.mapreduce.AutoProgressMapper: Auto-progress thread is
finished. keepGoing=false
2013-02-20 23:10:23,738 INFO
org.apache.hadoop.mapred.TaskLogsTruncater: Initializing logs'
truncater with mapRetainSize=-1 and reduceRetainSize=-1
2013-02-20 23:10:23,861 ERROR
org.apache.hadoop.security.UserGroupInformation:
PriviledgedActionException as:hadoop (auth:SIMPLE)
cause:java.io.IOException: SQLException in nextKeyValue
2013-02-20 23:10:23,861 WARN org.apache.hadoop.mapred.Child: Error running child
java.io.IOException: SQLException in nextKeyValue
	at org.apache.sqoop.mapreduce.db.DBRecordReader.nextKeyValue(DBRecordReader.java:265)
	at org.apache.hadoop.mapred.MapTask$NewTrackingRecordReader.nextKeyValue(MapTask.java:458)
	at org.apache.hadoop.mapreduce.task.MapContextImpl.nextKeyValue(MapContextImpl.java:76)
	at org.apache.hadoop.mapreduce.lib.map.WrappedMapper$Context.nextKeyValue(WrappedMapper.java:85)
	at org.apache.hadoop.mapreduce.Mapper.run(Mapper.java:139)
	at org.apache.sqoop.mapreduce.AutoProgressMapper.run(AutoProgressMapper.java:182)
	at org.apache.hadoop.mapred.MapTask.runNewMapper(MapTask.java:645)
	at org.apache.hadoop.mapred.MapTask.run(MapTask.java:325)
	at org.apache.hadoop.mapred.Child$4.run(Child.java:268)
	at java.security.AccessController.doPrivileged(Native Method)
	at javax.security.auth.Subject.doAs(Subject.java:416)
	at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1332)
	at org.apache.hadoop.mapred.Child.main(Child.java:262)
Caused by: com.sap.db.jdbc.exceptions.JDBCDriverException: SAP DBTech
JDBC: [257]: sql syntax error: incorrect syntax near ".": line 1 col
46 (at pos 46)
	at com.sap.db.jdbc.exceptions.SQLExceptionSapDB.createException(SQLExceptionSapDB.java:334)
	at com.sap.db.jdbc.exceptions.SQLExceptionSapDB.generateDatabaseException(SQLExceptionSapDB.java:174)
	at com.sap.db.jdbc.packet.ReplyPacket.buildExceptionChain(ReplyPacket.java:103)
	at com.sap.db.jdbc.ConnectionSapDB.execute(ConnectionSapDB.java:848)
	at com.sap.db.jdbc.CallableStatementSapDB.sendCommand(CallableStatementSapDB.java:1874)
	at com.sap.db.jdbc.StatementSapDB.sendSQL(StatementSapDB.java:945)
	at com.sap.db.jdbc.CallableStatementSapDB.doParse(CallableStatementSapDB.java:230)
	at com.sap.db.jdbc.CallableStatementSapDB.constructor(CallableStatementSapDB.java:190)
	at com.sap.db.jdbc.CallableStatementSapDB.<init>(CallableStatementSapDB.java:101)
	at com.sap.db.jdbc.CallableStatementSapDBFinalize.<init>(CallableStatementSapDBFinalize.java:31)
	at com.sap.db.jdbc.ConnectionSapDB.prepareStatement(ConnectionSapDB.java:1088)
	at com.sap.db.jdbc.trace.Connection.prepareStatement(Connection.java:347)
	at org.apache.sqoop.mapreduce.db.DBRecordReader.executeQuery(DBRecordReader.java:101)
	at org.apache.sqoop.mapreduce.db.DBRecordReader.nextKeyValue(DBRecordReader.java:236)
	... 12 more
2013-02-20 23:10:23,906 INFO org.apache.hadoop.mapred.Task: Runnning
cleanup for the task



On Thu, Feb 21, 2013 at 12:03 PM, Harsh J <ha...@cloudera.com> wrote:

> The error is truncated, check the actual failed task's logs for complete
> info:
>
> Caused by: com.sap… what?
>
> Seems more like a SAP side fault than a Hadoop side one and you should
> ask on their forums with the stacktrace posted.
>
> On Thu, Feb 21, 2013 at 11:58 AM, samir das mohapatra
> <sa...@gmail.com> wrote:
> > Hi All
> >     Can you plese tell me why I am getting error while loading data from
> > SAP HANA   to Hadoop HDFS using sqoop (4.1.2).
> >
> > Error Log:
> >
> > java.io.IOException: SQLException in nextKeyValue
> >       at
> >
> org.apache.sqoop.mapreduce.db.DBRecordReader.nextKeyValue(DBRecordReader.java:265)
> >       at
> >
> org.apache.hadoop.mapred.MapTask$NewTrackingRecordReader.nextKeyValue(MapTask.java:458)
> >       at
> >
> org.apache.hadoop.mapreduce.task.MapContextImpl.nextKeyValue(MapContextImpl.java:76)
> >       at
> >
> org.apache.hadoop.mapreduce.lib.map.WrappedMapper$Context.nextKeyValue(WrappedMapper.java:85)
> >       at org.apache.hadoop.mapreduce.Mapper.run(Mapper.java:139)
> >       at
> >
> org.apache.sqoop.mapreduce.AutoProgressMapper.run(AutoProgressMapper.java:182)
> >       at org.apache.hadoop.mapred.MapTask.runNewMapper(MapTask.java:645)
> >       at org.apache.hadoop.mapred.MapTask.run(MapTask.java:325)
> >       at org.apache.hadoop.mapred.Child$4.run(Child.java:268)
> >       at java.security.AccessController.doPrivileged(Native Method)
> >       at javax.security.auth.Subject.doAs(Subject.java:416)
> >       at
> >
> org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1332)
> >       at org.apache.hadoop.mapred.Child.main(Child.java:262)
> > Caused by: com.sap
> >
> > Regards,
> > samir.
> >
> >
> >
> > --
> >
> >
> >
>
>
>
> --
> Harsh J
>

	at com.sap.db.jdbc.CallableStatementSapDB.constructor(CallableStatementSapDB.java:190)
	at com.sap.db.jdbc.CallableStatementSapDB.<init>(CallableStatementSapDB.java:101)
	at com.sap.db.jdbc.CallableStatementSapDBFinalize.<init>(CallableStatementSapDBFinalize.java:31)
	at com.sap.db.jdbc.ConnectionSapDB.prepareStatement(ConnectionSapDB.java:1088)
	at com.sap.db.jdbc.trace.Connection.prepareStatement(Connection.java:347)
	at org.apache.sqoop.mapreduce.db.DBRecordReader.executeQuery(DBRecordReader.java:101)
	at org.apache.sqoop.mapreduce.db.DBRecordReader.nextKeyValue(DBRecordReader.java:236)
	... 12 more
2013-02-20 23:10:23,906 INFO org.apache.hadoop.mapred.Task: Runnning
cleanup for the task



On Thu, Feb 21, 2013 at 12:03 PM, Harsh J <ha...@cloudera.com> wrote:

> The error is truncated, check the actual failed task's logs for complete
> info:
>
> Caused by: com.sap… what?
>
> Seems more like a SAP side fault than a Hadoop side one and you should
> ask on their forums with the stacktrace posted.
>
> On Thu, Feb 21, 2013 at 11:58 AM, samir das mohapatra
> <sa...@gmail.com> wrote:
> > Hi All
> >     Can you plese tell me why I am getting error while loading data from
> > SAP HANA   to Hadoop HDFS using sqoop (4.1.2).
> >
> > Error Log:
> >
> > java.io.IOException: SQLException in nextKeyValue
> >       at
> >
> org.apache.sqoop.mapreduce.db.DBRecordReader.nextKeyValue(DBRecordReader.java:265)
> >       at
> >
> org.apache.hadoop.mapred.MapTask$NewTrackingRecordReader.nextKeyValue(MapTask.java:458)
> >       at
> >
> org.apache.hadoop.mapreduce.task.MapContextImpl.nextKeyValue(MapContextImpl.java:76)
> >       at
> >
> org.apache.hadoop.mapreduce.lib.map.WrappedMapper$Context.nextKeyValue(WrappedMapper.java:85)
> >       at org.apache.hadoop.mapreduce.Mapper.run(Mapper.java:139)
> >       at
> >
> org.apache.sqoop.mapreduce.AutoProgressMapper.run(AutoProgressMapper.java:182)
> >       at org.apache.hadoop.mapred.MapTask.runNewMapper(MapTask.java:645)
> >       at org.apache.hadoop.mapred.MapTask.run(MapTask.java:325)
> >       at org.apache.hadoop.mapred.Child$4.run(Child.java:268)
> >       at java.security.AccessController.doPrivileged(Native Method)
> >       at javax.security.auth.Subject.doAs(Subject.java:416)
> >       at
> >
> org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1332)
> >       at org.apache.hadoop.mapred.Child.main(Child.java:262)
> > Caused by: com.sap
> >
> > Regards,
> > samir.
> >
> >
> >
> > --
> >
> >
> >
>
>
>
> --
> Harsh J
>

Re: ISSUE :Hadoop with HANA using sqoop

Posted by samir das mohapatra <sa...@gmail.com>.
Harsh,
    I copied the whole log and pasted it here; even there it only shows
"Caused by: com.sap".
One thing I don't understand is why it runs  "SELECT t.* FROM
hgopalan.hana_training AS t WHERE 1=0" . The WHERE 1=0 clause means no
rows are returned, but the database does have records.
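
For context, the WHERE 1=0 query is expected behavior: Sqoop issues it once
on the client purely to read column names and types from the result-set
metadata, so it is designed to return zero rows. The per-map-task query is
assembled differently, and, judging from the DBRecordReader frames in the
stacktrace (an inference, not confirmed against Sqoop's source), it reuses
the table name as its own SQL alias, which produces an illegal dotted alias
for a schema-qualified table like hgopalan.hana_training. A rough shell
sketch of the two generated statements:

```shell
# Illustrative reconstruction of the SQL Sqoop generates (a sketch, not Sqoop code).
TABLE="hgopalan.hana_training"

# 1. Client-side metadata probe: WHERE 1=0 intentionally matches no rows;
#    Sqoop reads only the column names and types from the empty result set.
META_QUERY="SELECT t.* FROM ${TABLE} AS t WHERE 1=0"

# 2. Per-map-task data query (inferred form): the table name doubles as its
#    own alias, so a schema-qualified name yields "AS hgopalan.hana_training",
#    an alias containing a dot that the database may reject.
DATA_QUERY="SELECT * FROM ${TABLE} AS ${TABLE} WHERE 1=1"

echo "$META_QUERY"
echo "$DATA_QUERY"
```

That dotted alias would be consistent with a syntax error reported at a "."
in the generated statement.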


Error:

hadoop@hadoophost2:~/Desktop$ sqoop import  --connect  jdbc:sap://
sj1svm010.corp.adobe.com:30015/hd2  --driver com.sap.db.jdbc.Driver
--table  hgopalan.hana_training  -m  1 --username hgopalan     --password
Adobe_23  --target-dir  /input/training
13/02/20 22:37:27 WARN tool.BaseSqoopTool: Setting your password on the
command-line is insecure. Consider using -P instead.
13/02/20 22:37:27 INFO manager.SqlManager: Using default fetchSize of 1000
13/02/20 22:37:27 INFO tool.CodeGenTool: Beginning code generation
13/02/20 22:37:28 INFO manager.SqlManager: Executing SQL statement: SELECT
t.* FROM hgopalan.hana_training AS t WHERE 1=0
13/02/20 22:37:29 INFO manager.SqlManager: Executing SQL statement: SELECT
t.* FROM hgopalan.hana_training AS t WHERE 1=0
13/02/20 22:37:30 INFO orm.CompilationManager: HADOOP_HOME is
/usr/local/hadoop/hadoop-2.0.0-mr1-cdh4.1.2
Note:
/tmp/sqoop-hadoop/compile/d66ed0385ac93eb37215515e4e0c2caf/hgopalan_hana_training.java
uses or overrides a deprecated API.
Note: Recompile with -Xlint:deprecation for details.
13/02/20 22:37:32 INFO orm.CompilationManager: Writing jar file:
/tmp/sqoop-hadoop/compile/d66ed0385ac93eb37215515e4e0c2caf/hgopalan.hana_training.jar
13/02/20 22:37:33 INFO mapreduce.ImportJobBase: Beginning import of
hgopalan.hana_training
13/02/20 22:37:34 INFO manager.SqlManager: Executing SQL statement: SELECT
t.* FROM hgopalan.hana_training AS t WHERE 1=0
13/02/20 22:37:36 WARN mapred.JobClient: Use GenericOptionsParser for
parsing the arguments. Applications should implement Tool for the same.
13/02/20 22:37:39 INFO mapred.JobClient: Running job: job_201302202127_0014
13/02/20 22:37:40 INFO mapred.JobClient:  map 0% reduce 0%
13/02/20 22:38:06 INFO mapred.JobClient: Task Id :
attempt_201302202127_0014_m_000000_0, Status : FAILED
java.io.IOException: SQLException in nextKeyValue
    at
org.apache.sqoop.mapreduce.db.DBRecordReader.nextKeyValue(DBRecordReader.java:265)
    at
org.apache.hadoop.mapred.MapTask$NewTrackingRecordReader.nextKeyValue(MapTask.java:458)
    at
org.apache.hadoop.mapreduce.task.MapContextImpl.nextKeyValue(MapContextImpl.java:76)
    at
org.apache.hadoop.mapreduce.lib.map.WrappedMapper$Context.nextKeyValue(WrappedMapper.java:85)
    at org.apache.hadoop.mapreduce.Mapper.run(Mapper.java:139)
    at
org.apache.sqoop.mapreduce.AutoProgressMapper.run(AutoProgressMapper.java:182)
    at org.apache.hadoop.mapred.MapTask.runNewMapper(MapTask.java:645)
    at org.apache.hadoop.mapred.MapTask.run(MapTask.java:325)
    at org.apache.hadoop.mapred.Child$4.run(Child.java:268)
    at java.security.AccessController.doPrivileged(Native Method)
    at javax.security.auth.Subject.doAs(Subject.java:416)
    at
org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1332)
    at org.apache.hadoop.mapred.Child.main(Child.java:262)
Caused by: com.sap
attempt_201302202127_0014_m_000000_0: log4j:WARN No appenders could be
found for logger (org.apache.hadoop.hdfs.DFSClient).
attempt_201302202127_0014_m_000000_0: log4j:WARN Please initialize the
log4j system properly.
attempt_201302202127_0014_m_000000_0: log4j:WARN See
http://logging.apache.org/log4j/1.2/faq.html#noconfig for more info.
13/02/20 22:38:22 INFO mapred.JobClient: Task Id :
attempt_201302202127_0014_m_000000_1, Status : FAILED
java.io.IOException: SQLException in nextKeyValue
    at
org.apache.sqoop.mapreduce.db.DBRecordReader.nextKeyValue(DBRecordReader.java:265)
    at
org.apache.hadoop.mapred.MapTask$NewTrackingRecordReader.nextKeyValue(MapTask.java:458)
    at
org.apache.hadoop.mapreduce.task.MapContextImpl.nextKeyValue(MapContextImpl.java:76)
    at
org.apache.hadoop.mapreduce.lib.map.WrappedMapper$Context.nextKeyValue(WrappedMapper.java:85)
    at org.apache.hadoop.mapreduce.Mapper.run(Mapper.java:139)
    at
org.apache.sqoop.mapreduce.AutoProgressMapper.run(AutoProgressMapper.java:182)
    at org.apache.hadoop.mapred.MapTask.runNewMapper(MapTask.java:645)
    at org.apache.hadoop.mapred.MapTask.run(MapTask.java:325)
    at org.apache.hadoop.mapred.Child$4.run(Child.java:268)
    at java.security.AccessController.doPrivileged(Native Method)
    at javax.security.auth.Subject.doAs(Subject.java:416)
    at
org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1332)
    at org.apache.hadoop.mapred.Child.main(Child.java:262)
Caused by: com.sap
attempt_201302202127_0014_m_000000_1: log4j:WARN No appenders could be
found for logger (org.apache.hadoop.hdfs.DFSClient).
attempt_201302202127_0014_m_000000_1: log4j:WARN Please initialize the
log4j system properly.
attempt_201302202127_0014_m_000000_1: log4j:WARN See
http://logging.apache.org/log4j/1.2/faq.html#noconfig for more info.
13/02/20 22:38:32 INFO mapred.JobClient: Task Id :
attempt_201302202127_0014_m_000000_2, Status : FAILED
java.io.IOException: SQLException in nextKeyValue
    at
org.apache.sqoop.mapreduce.db.DBRecordReader.nextKeyValue(DBRecordReader.java:265)
    at
org.apache.hadoop.mapred.MapTask$NewTrackingRecordReader.nextKeyValue(MapTask.java:458)
    at
org.apache.hadoop.mapreduce.task.MapContextImpl.nextKeyValue(MapContextImpl.java:76)
    at
org.apache.hadoop.mapreduce.lib.map.WrappedMapper$Context.nextKeyValue(WrappedMapper.java:85)
    at org.apache.hadoop.mapreduce.Mapper.run(Mapper.java:139)
    at
org.apache.sqoop.mapreduce.AutoProgressMapper.run(AutoProgressMapper.java:182)
    at org.apache.hadoop.mapred.MapTask.runNewMapper(MapTask.java:645)
    at org.apache.hadoop.mapred.MapTask.run(MapTask.java:325)
    at org.apache.hadoop.mapred.Child$4.run(Child.java:268)
    at java.security.AccessController.doPrivileged(Native Method)
    at javax.security.auth.Subject.doAs(Subject.java:416)
    at
org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1332)
    at org.apache.hadoop.mapred.Child.main(Child.java:262)
Caused by: com.sap
attempt_201302202127_0014_m_000000_2: log4j:WARN No appenders could be
found for logger (org.apache.hadoop.hdfs.DFSClient).
attempt_201302202127_0014_m_000000_2: log4j:WARN Please initialize the
log4j system properly.
attempt_201302202127_0014_m_000000_2: log4j:WARN See
http://logging.apache.org/log4j/1.2/faq.html#noconfig for more info.
13/02/20 22:38:46 INFO mapred.JobClient: Job complete: job_201302202127_0014
13/02/20 22:38:46 INFO mapred.JobClient: Counters: 6
13/02/20 22:38:46 INFO mapred.JobClient:   Job Counters
13/02/20 22:38:46 INFO mapred.JobClient:     Failed map tasks=1
13/02/20 22:38:46 INFO mapred.JobClient:     Launched map tasks=4
13/02/20 22:38:46 INFO mapred.JobClient:     Total time spent by all maps
in occupied slots (ms)=56775
13/02/20 22:38:46 INFO mapred.JobClient:     Total time spent by all
reduces in occupied slots (ms)=0
13/02/20 22:38:46 INFO mapred.JobClient:     Total time spent by all maps
waiting after reserving slots (ms)=0
13/02/20 22:38:46 INFO mapred.JobClient:     Total time spent by all
reduces waiting after reserving slots (ms)=0
13/02/20 22:38:46 INFO mapreduce.ImportJobBase: Transferred 0 bytes in
70.5203 seconds (0 bytes/sec)
13/02/20 22:38:46 WARN mapreduce.Counters: Group
org.apache.hadoop.mapred.Task$Counter is deprecated. Use
org.apache.hadoop.mapreduce.TaskCounter instead
13/02/20 22:38:46 INFO mapreduce.ImportJobBase: Retrieved 0 records.
13/02/20 22:38:46 ERROR tool.ImportTool: Error during import: Import job
failed!
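
The syntax error near "." in the failed task logs suggests the generated
per-split query aliases the schema-qualified table name (something like
"AS hgopalan.hana_training"), which many databases reject. If so, one
workaround worth trying (a suggestion only, not verified against HANA here)
is a free-form --query import, which bypasses Sqoop's table aliasing
entirely. $CONDITIONS is a literal placeholder Sqoop requires in free-form
queries and substitutes per split; with -m 1 no --split-by is needed, and -P
prompts for the password instead of exposing it on the command line:

```shell
# Hypothetical rework of the failing command using a free-form query.
# Connection details are copied from the original invocation above.
sqoop import \
  --connect jdbc:sap://sj1svm010.corp.adobe.com:30015/hd2 \
  --driver com.sap.db.jdbc.Driver \
  --query 'SELECT * FROM hgopalan.hana_training WHERE $CONDITIONS' \
  -m 1 \
  --username hgopalan -P \
  --target-dir /input/training
```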


On Thu, Feb 21, 2013 at 12:03 PM, Harsh J <ha...@cloudera.com> wrote:

> The error is truncated, check the actual failed task's logs for complete
> info:
>
> Caused by: com.sap… what?
>
> Seems more like a SAP side fault than a Hadoop side one and you should
> ask on their forums with the stacktrace posted.
>
> On Thu, Feb 21, 2013 at 11:58 AM, samir das mohapatra
> <sa...@gmail.com> wrote:
> > Hi All
> >     Can you plese tell me why I am getting error while loading data from
> > SAP HANA   to Hadoop HDFS using sqoop (4.1.2).
> >
> > Error Log:
> >
> > java.io.IOException: SQLException in nextKeyValue
> >       at
> >
> org.apache.sqoop.mapreduce.db.DBRecordReader.nextKeyValue(DBRecordReader.java:265)
> >       at
> >
> org.apache.hadoop.mapred.MapTask$NewTrackingRecordReader.nextKeyValue(MapTask.java:458)
> >       at
> >
> org.apache.hadoop.mapreduce.task.MapContextImpl.nextKeyValue(MapContextImpl.java:76)
> >       at
> >
> org.apache.hadoop.mapreduce.lib.map.WrappedMapper$Context.nextKeyValue(WrappedMapper.java:85)
> >       at org.apache.hadoop.mapreduce.Mapper.run(Mapper.java:139)
> >       at
> >
> org.apache.sqoop.mapreduce.AutoProgressMapper.run(AutoProgressMapper.java:182)
> >       at org.apache.hadoop.mapred.MapTask.runNewMapper(MapTask.java:645)
> >       at org.apache.hadoop.mapred.MapTask.run(MapTask.java:325)
> >       at org.apache.hadoop.mapred.Child$4.run(Child.java:268)
> >       at java.security.AccessController.doPrivileged(Native Method)
> >       at javax.security.auth.Subject.doAs(Subject.java:416)
> >       at
> >
> org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1332)
> >       at org.apache.hadoop.mapred.Child.main(Child.java:262)
> > Caused by: com.sap
> >
> > Regards,
> > samir.
> >
> >
> >
> > --
> >
> >
> >
>
>
>
> --
> Harsh J
>

Re: ISSUE :Hadoop with HANA using sqoop

Posted by samir das mohapatra <sa...@gmail.com>.
Posting the complete task log now.
------------------------------

Task Logs: 'attempt_201302202127_0021_m_000000_0'

*stdout logs*
------------------------------


*stderr logs*

log4j:WARN No appenders could be found for logger (org.apache.hadoop.hdfs.
DFSClient).
log4j:WARN Please initialize the log4j system properly.
log4j:WARN See http://logging.apache.org/log4j/1.2/faq.html#noconfig
for more info.

------------------------------


*syslog logs*

2013-02-20 23:10:19,391 WARN mapreduce.Counters: Group
org.apache.hadoop.mapred.Task$Counter is deprecated. Use
org.apache.hadoop.mapreduce.TaskCounter instead
2013-02-20 23:10:19,525 WARN org.apache.hadoop.util.NativeCodeLoader:
Unable to load native-hadoop library for your platform... using
builtin-java classes where applicable
2013-02-20 23:10:20,262 WARN org.apache.hadoop.conf.Configuration:
session.id is deprecated. Instead, use dfs.metrics.session-id
2013-02-20 23:10:20,270 INFO org.apache.hadoop.metrics.jvm.JvmMetrics:
Initializing JVM Metrics with processName=MAP, sessionId=
2013-02-20 23:10:20,919 INFO org.apache.hadoop.util.ProcessTree:
setsid exited with exit code 0
2013-02-20 23:10:20,955 INFO org.apache.hadoop.mapred.Task:  Using
ResourceCalculatorPlugin :
org.apache.hadoop.util.LinuxResourceCalculatorPlugin@64a7c45e
2013-02-20 23:10:23,134 ERROR
org.apache.sqoop.mapreduce.db.DBRecordReader: Top level exception:
com.sap.db.jdbc.exceptions.JDBCDriverException: SAP DBTech JDBC:
[257]: sql syntax error: incorrect syntax near ".": line 1 col 46 (at
pos 46)
	at com.sap.db.jdbc.exceptions.SQLExceptionSapDB.createException(SQLExceptionSapDB.java:334)
	at com.sap.db.jdbc.exceptions.SQLExceptionSapDB.generateDatabaseException(SQLExceptionSapDB.java:174)
	at com.sap.db.jdbc.packet.ReplyPacket.buildExceptionChain(ReplyPacket.java:103)
	at com.sap.db.jdbc.ConnectionSapDB.execute(ConnectionSapDB.java:848)
	at com.sap.db.jdbc.CallableStatementSapDB.sendCommand(CallableStatementSapDB.java:1874)
	at com.sap.db.jdbc.StatementSapDB.sendSQL(StatementSapDB.java:945)
	at com.sap.db.jdbc.CallableStatementSapDB.doParse(CallableStatementSapDB.java:230)
	at com.sap.db.jdbc.CallableStatementSapDB.constructor(CallableStatementSapDB.java:190)
	at com.sap.db.jdbc.CallableStatementSapDB.<init>(CallableStatementSapDB.java:101)
	at com.sap.db.jdbc.CallableStatementSapDBFinalize.<init>(CallableStatementSapDBFinalize.java:31)
	at com.sap.db.jdbc.ConnectionSapDB.prepareStatement(ConnectionSapDB.java:1088)
	at com.sap.db.jdbc.trace.Connection.prepareStatement(Connection.java:347)
	at org.apache.sqoop.mapreduce.db.DBRecordReader.executeQuery(DBRecordReader.java:101)
	at org.apache.sqoop.mapreduce.db.DBRecordReader.nextKeyValue(DBRecordReader.java:236)
	at org.apache.hadoop.mapred.MapTask$NewTrackingRecordReader.nextKeyValue(MapTask.java:458)
	at org.apache.hadoop.mapreduce.task.MapContextImpl.nextKeyValue(MapContextImpl.java:76)
	at org.apache.hadoop.mapreduce.lib.map.WrappedMapper$Context.nextKeyValue(WrappedMapper.java:85)
	at org.apache.hadoop.mapreduce.Mapper.run(Mapper.java:139)
	at org.apache.sqoop.mapreduce.AutoProgressMapper.run(AutoProgressMapper.java:182)
	at org.apache.hadoop.mapred.MapTask.runNewMapper(MapTask.java:645)
	at org.apache.hadoop.mapred.MapTask.run(MapTask.java:325)
	at org.apache.hadoop.mapred.Child$4.run(Child.java:268)
	at java.security.AccessController.doPrivileged(Native Method)
	at javax.security.auth.Subject.doAs(Subject.java:416)
	at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1332)
	at org.apache.hadoop.mapred.Child.main(Child.java:262)
2013-02-20 23:10:23,688 INFO
org.apache.sqoop.mapreduce.AutoProgressMapper: Auto-progress thread is
finished. keepGoing=false
2013-02-20 23:10:23,738 INFO
org.apache.hadoop.mapred.TaskLogsTruncater: Initializing logs'
truncater with mapRetainSize=-1 and reduceRetainSize=-1
2013-02-20 23:10:23,861 ERROR
org.apache.hadoop.security.UserGroupInformation:
PriviledgedActionException as:hadoop (auth:SIMPLE)
cause:java.io.IOException: SQLException in nextKeyValue
2013-02-20 23:10:23,861 WARN org.apache.hadoop.mapred.Child: Error running child
java.io.IOException: SQLException in nextKeyValue
	at org.apache.sqoop.mapreduce.db.DBRecordReader.nextKeyValue(DBRecordReader.java:265)
	at org.apache.hadoop.mapred.MapTask$NewTrackingRecordReader.nextKeyValue(MapTask.java:458)
	at org.apache.hadoop.mapreduce.task.MapContextImpl.nextKeyValue(MapContextImpl.java:76)
	at org.apache.hadoop.mapreduce.lib.map.WrappedMapper$Context.nextKeyValue(WrappedMapper.java:85)
	at org.apache.hadoop.mapreduce.Mapper.run(Mapper.java:139)
	at org.apache.sqoop.mapreduce.AutoProgressMapper.run(AutoProgressMapper.java:182)
	at org.apache.hadoop.mapred.MapTask.runNewMapper(MapTask.java:645)
	at org.apache.hadoop.mapred.MapTask.run(MapTask.java:325)
	at org.apache.hadoop.mapred.Child$4.run(Child.java:268)
	at java.security.AccessController.doPrivileged(Native Method)
	at javax.security.auth.Subject.doAs(Subject.java:416)
	at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1332)
	at org.apache.hadoop.mapred.Child.main(Child.java:262)
Caused by: com.sap.db.jdbc.exceptions.JDBCDriverException: SAP DBTech
JDBC: [257]: sql syntax error: incorrect syntax near ".": line 1 col
46 (at pos 46)
	at com.sap.db.jdbc.exceptions.SQLExceptionSapDB.createException(SQLExceptionSapDB.java:334)
	at com.sap.db.jdbc.exceptions.SQLExceptionSapDB.generateDatabaseException(SQLExceptionSapDB.java:174)
	at com.sap.db.jdbc.packet.ReplyPacket.buildExceptionChain(ReplyPacket.java:103)
	at com.sap.db.jdbc.ConnectionSapDB.execute(ConnectionSapDB.java:848)
	at com.sap.db.jdbc.CallableStatementSapDB.sendCommand(CallableStatementSapDB.java:1874)
	at com.sap.db.jdbc.StatementSapDB.sendSQL(StatementSapDB.java:945)
	at com.sap.db.jdbc.CallableStatementSapDB.doParse(CallableStatementSapDB.java:230)
	at com.sap.db.jdbc.CallableStatementSapDB.constructor(CallableStatementSapDB.java:190)
	at com.sap.db.jdbc.CallableStatementSapDB.<init>(CallableStatementSapDB.java:101)
	at com.sap.db.jdbc.CallableStatementSapDBFinalize.<init>(CallableStatementSapDBFinalize.java:31)
	at com.sap.db.jdbc.ConnectionSapDB.prepareStatement(ConnectionSapDB.java:1088)
	at com.sap.db.jdbc.trace.Connection.prepareStatement(Connection.java:347)
	at org.apache.sqoop.mapreduce.db.DBRecordReader.executeQuery(DBRecordReader.java:101)
	at org.apache.sqoop.mapreduce.db.DBRecordReader.nextKeyValue(DBRecordReader.java:236)
	... 12 more
2013-02-20 23:10:23,906 INFO org.apache.hadoop.mapred.Task: Runnning
cleanup for the task



On Thu, Feb 21, 2013 at 12:03 PM, Harsh J <ha...@cloudera.com> wrote:

> The error is truncated, check the actual failed task's logs for complete
> info:
>
> Caused by: com.sap… what?
>
> Seems more like a SAP side fault than a Hadoop side one and you should
> ask on their forums with the stacktrace posted.
>
> On Thu, Feb 21, 2013 at 11:58 AM, samir das mohapatra
> <sa...@gmail.com> wrote:
> > Hi All
> >     Can you plese tell me why I am getting error while loading data from
> > SAP HANA   to Hadoop HDFS using sqoop (4.1.2).
> >
> > Error Log:
> >
> > java.io.IOException: SQLException in nextKeyValue
> >       at
> >
> org.apache.sqoop.mapreduce.db.DBRecordReader.nextKeyValue(DBRecordReader.java:265)
> >       at
> >
> org.apache.hadoop.mapred.MapTask$NewTrackingRecordReader.nextKeyValue(MapTask.java:458)
> >       at
> >
> org.apache.hadoop.mapreduce.task.MapContextImpl.nextKeyValue(MapContextImpl.java:76)
> >       at
> >
> org.apache.hadoop.mapreduce.lib.map.WrappedMapper$Context.nextKeyValue(WrappedMapper.java:85)
> >       at org.apache.hadoop.mapreduce.Mapper.run(Mapper.java:139)
> >       at
> >
> org.apache.sqoop.mapreduce.AutoProgressMapper.run(AutoProgressMapper.java:182)
> >       at org.apache.hadoop.mapred.MapTask.runNewMapper(MapTask.java:645)
> >       at org.apache.hadoop.mapred.MapTask.run(MapTask.java:325)
> >       at org.apache.hadoop.mapred.Child$4.run(Child.java:268)
> >       at java.security.AccessController.doPrivileged(Native Method)
> >       at javax.security.auth.Subject.doAs(Subject.java:416)
> >       at
> >
> org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1332)
> >       at org.apache.hadoop.mapred.Child.main(Child.java:262)
> > Caused by: com.sap
> >
> > Regards,
> > samir.
> >
> >
> >
> > --
> >
> >
> >
>
>
>
> --
> Harsh J
>

    at org.apache.hadoop.mapred.Child$4.run(Child.java:268)
    at java.security.AccessController.doPrivileged(Native Method)
    at javax.security.auth.Subject.doAs(Subject.java:416)
    at
org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1332)
    at org.apache.hadoop.mapred.Child.main(Child.java:262)
Caused by: com.sap
attempt_201302202127_0014_m_000000_0: log4j:WARN No appenders could be
found for logger (org.apache.hadoop.hdfs.DFSClient).
attempt_201302202127_0014_m_000000_0: log4j:WARN Please initialize the
log4j system properly.
attempt_201302202127_0014_m_000000_0: log4j:WARN See
http://logging.apache.org/log4j/1.2/faq.html#noconfig for more info.
13/02/20 22:38:22 INFO mapred.JobClient: Task Id :
attempt_201302202127_0014_m_000000_1, Status : FAILED
java.io.IOException: SQLException in nextKeyValue
    at
org.apache.sqoop.mapreduce.db.DBRecordReader.nextKeyValue(DBRecordReader.java:265)
    at
org.apache.hadoop.mapred.MapTask$NewTrackingRecordReader.nextKeyValue(MapTask.java:458)
    at
org.apache.hadoop.mapreduce.task.MapContextImpl.nextKeyValue(MapContextImpl.java:76)
    at
org.apache.hadoop.mapreduce.lib.map.WrappedMapper$Context.nextKeyValue(WrappedMapper.java:85)
    at org.apache.hadoop.mapreduce.Mapper.run(Mapper.java:139)
    at
org.apache.sqoop.mapreduce.AutoProgressMapper.run(AutoProgressMapper.java:182)
    at org.apache.hadoop.mapred.MapTask.runNewMapper(MapTask.java:645)
    at org.apache.hadoop.mapred.MapTask.run(MapTask.java:325)
    at org.apache.hadoop.mapred.Child$4.run(Child.java:268)
    at java.security.AccessController.doPrivileged(Native Method)
    at javax.security.auth.Subject.doAs(Subject.java:416)
    at
org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1332)
    at org.apache.hadoop.mapred.Child.main(Child.java:262)
Caused by: com.sap
attempt_201302202127_0014_m_000000_1: log4j:WARN No appenders could be
found for logger (org.apache.hadoop.hdfs.DFSClient).
attempt_201302202127_0014_m_000000_1: log4j:WARN Please initialize the
log4j system properly.
attempt_201302202127_0014_m_000000_1: log4j:WARN See
http://logging.apache.org/log4j/1.2/faq.html#noconfig for more info.
13/02/20 22:38:32 INFO mapred.JobClient: Task Id :
attempt_201302202127_0014_m_000000_2, Status : FAILED
java.io.IOException: SQLException in nextKeyValue
    at
org.apache.sqoop.mapreduce.db.DBRecordReader.nextKeyValue(DBRecordReader.java:265)
    at
org.apache.hadoop.mapred.MapTask$NewTrackingRecordReader.nextKeyValue(MapTask.java:458)
    at
org.apache.hadoop.mapreduce.task.MapContextImpl.nextKeyValue(MapContextImpl.java:76)
    at
org.apache.hadoop.mapreduce.lib.map.WrappedMapper$Context.nextKeyValue(WrappedMapper.java:85)
    at org.apache.hadoop.mapreduce.Mapper.run(Mapper.java:139)
    at
org.apache.sqoop.mapreduce.AutoProgressMapper.run(AutoProgressMapper.java:182)
    at org.apache.hadoop.mapred.MapTask.runNewMapper(MapTask.java:645)
    at org.apache.hadoop.mapred.MapTask.run(MapTask.java:325)
    at org.apache.hadoop.mapred.Child$4.run(Child.java:268)
    at java.security.AccessController.doPrivileged(Native Method)
    at javax.security.auth.Subject.doAs(Subject.java:416)
    at
org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1332)
    at org.apache.hadoop.mapred.Child.main(Child.java:262)
Caused by: com.sap
attempt_201302202127_0014_m_000000_2: log4j:WARN No appenders could be
found for logger (org.apache.hadoop.hdfs.DFSClient).
attempt_201302202127_0014_m_000000_2: log4j:WARN Please initialize the
log4j system properly.
attempt_201302202127_0014_m_000000_2: log4j:WARN See
http://logging.apache.org/log4j/1.2/faq.html#noconfig for more info.
13/02/20 22:38:46 INFO mapred.JobClient: Job complete: job_201302202127_0014
13/02/20 22:38:46 INFO mapred.JobClient: Counters: 6
13/02/20 22:38:46 INFO mapred.JobClient:   Job Counters
13/02/20 22:38:46 INFO mapred.JobClient:     Failed map tasks=1
13/02/20 22:38:46 INFO mapred.JobClient:     Launched map tasks=4
13/02/20 22:38:46 INFO mapred.JobClient:     Total time spent by all maps
in occupied slots (ms)=56775
13/02/20 22:38:46 INFO mapred.JobClient:     Total time spent by all
reduces in occupied slots (ms)=0
13/02/20 22:38:46 INFO mapred.JobClient:     Total time spent by all maps
waiting after reserving slots (ms)=0
13/02/20 22:38:46 INFO mapred.JobClient:     Total time spent by all
reduces waiting after reserving slots (ms)=0
13/02/20 22:38:46 INFO mapreduce.ImportJobBase: Transferred 0 bytes in
70.5203 seconds (0 bytes/sec)
13/02/20 22:38:46 WARN mapreduce.Counters: Group
org.apache.hadoop.mapred.Task$Counter is deprecated. Use
org.apache.hadoop.mapreduce.TaskCounter instead
13/02/20 22:38:46 INFO mapreduce.ImportJobBase: Retrieved 0 records.
13/02/20 22:38:46 ERROR tool.ImportTool: Error during import: Import job
failed!


On Thu, Feb 21, 2013 at 12:03 PM, Harsh J <ha...@cloudera.com> wrote:

> The error is truncated, check the actual failed task's logs for complete
> info:
>
> Caused by: com.sap… what?
>
> Seems more like a SAP side fault than a Hadoop side one and you should
> ask on their forums with the stacktrace posted.
>
> On Thu, Feb 21, 2013 at 11:58 AM, samir das mohapatra
> <sa...@gmail.com> wrote:
> > Hi All
> >     Can you please tell me why I am getting an error while loading data from
> > SAP HANA   to Hadoop HDFS using sqoop (4.1.2).
> >
> > Error Log:
> >
> > java.io.IOException: SQLException in nextKeyValue
> >       at
> >
> org.apache.sqoop.mapreduce.db.DBRecordReader.nextKeyValue(DBRecordReader.java:265)
> >       at
> >
> org.apache.hadoop.mapred.MapTask$NewTrackingRecordReader.nextKeyValue(MapTask.java:458)
> >       at
> >
> org.apache.hadoop.mapreduce.task.MapContextImpl.nextKeyValue(MapContextImpl.java:76)
> >       at
> >
> org.apache.hadoop.mapreduce.lib.map.WrappedMapper$Context.nextKeyValue(WrappedMapper.java:85)
> >       at org.apache.hadoop.mapreduce.Mapper.run(Mapper.java:139)
> >       at
> >
> org.apache.sqoop.mapreduce.AutoProgressMapper.run(AutoProgressMapper.java:182)
> >       at org.apache.hadoop.mapred.MapTask.runNewMapper(MapTask.java:645)
> >       at org.apache.hadoop.mapred.MapTask.run(MapTask.java:325)
> >       at org.apache.hadoop.mapred.Child$4.run(Child.java:268)
> >       at java.security.AccessController.doPrivileged(Native Method)
> >       at javax.security.auth.Subject.doAs(Subject.java:416)
> >       at
> >
> org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1332)
> >       at org.apache.hadoop.mapred.Child.main(Child.java:262)
> > Caused by: com.sap
> >
> > Regards,
> > samir.
> >
> >
> >
> > --
> >
> >
> >
>
>
>
> --
> Harsh J
>

Re: ISSUE :Hadoop with HANA using sqoop

Posted by Harsh J <ha...@cloudera.com>.
The error is truncated, check the actual failed task's logs for complete info:

Caused by: com.sap… what?

Seems more like a SAP side fault than a Hadoop side one and you should
ask on their forums with the stacktrace posted.

On Thu, Feb 21, 2013 at 11:58 AM, samir das mohapatra
<sa...@gmail.com> wrote:
> Hi All
>     Can you please tell me why I am getting an error while loading data from
> SAP HANA   to Hadoop HDFS using sqoop (4.1.2).
>
> Error Log:
>
> java.io.IOException: SQLException in nextKeyValue
> 	at
> org.apache.sqoop.mapreduce.db.DBRecordReader.nextKeyValue(DBRecordReader.java:265)
> 	at
> org.apache.hadoop.mapred.MapTask$NewTrackingRecordReader.nextKeyValue(MapTask.java:458)
> 	at
> org.apache.hadoop.mapreduce.task.MapContextImpl.nextKeyValue(MapContextImpl.java:76)
> 	at
> org.apache.hadoop.mapreduce.lib.map.WrappedMapper$Context.nextKeyValue(WrappedMapper.java:85)
> 	at org.apache.hadoop.mapreduce.Mapper.run(Mapper.java:139)
> 	at
> org.apache.sqoop.mapreduce.AutoProgressMapper.run(AutoProgressMapper.java:182)
> 	at org.apache.hadoop.mapred.MapTask.runNewMapper(MapTask.java:645)
> 	at org.apache.hadoop.mapred.MapTask.run(MapTask.java:325)
> 	at org.apache.hadoop.mapred.Child$4.run(Child.java:268)
> 	at java.security.AccessController.doPrivileged(Native Method)
> 	at javax.security.auth.Subject.doAs(Subject.java:416)
> 	at
> org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1332)
> 	at org.apache.hadoop.mapred.Child.main(Child.java:262)
> Caused by: com.sap
>
> Regards,
> samir.
>
>
>
> --
>
>
>



--
Harsh J
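[Editor's note: to act on the suggestion above -- under MR1, the full stack trace for each failed attempt, including the complete "Caused by: com.sap..." line, is written to a per-attempt log directory on the TaskTracker node that ran it. A hedged sketch; the paths below are assumptions about a default MR1 layout, so adjust them to your install, or use the JobTracker web UI (job page -> failed task -> attempt logs) instead:]

```shell
# Sketch: locate the failed attempt's full logs on the TaskTracker node.
# HADOOP_LOG_DIR and the userlogs layout are assumptions for a default
# MR1 install; substitute your own job and attempt ids.
JOB=job_201302202127_0014
ATTEMPT=attempt_201302202127_0014_m_000000_0
LOG_DIR=${HADOOP_LOG_DIR:-/var/log/hadoop}/userlogs/$JOB/$ATTEMPT

# Print the root cause with surrounding context from syslog/stderr:
grep -B2 -A10 'Caused by' "$LOG_DIR/syslog" "$LOG_DIR/stderr" 2>/dev/null || \
  echo "No logs found under $LOG_DIR -- check the TaskTracker web UI instead"
```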
