Posted to common-user@hadoop.apache.org by Kumar Jayapal <kj...@gmail.com> on 2015/04/25 07:21:45 UTC

YARN Exceptions

Hi,

I am getting the following error while running sqoop import script.

Can anyone please help in resolving this issue?



15/04/25 03:12:34 INFO mapreduce.Job: Running job: job_1429930456969_0006
15/04/25 03:12:48 INFO mapreduce.Job: Job job_1429930456969_0006 running in
uber mode : false
15/04/25 03:12:48 INFO mapreduce.Job:  map 0% reduce 0%
15/04/25 03:12:48 INFO mapreduce.Job: Job job_1429930456969_0006 failed
with state FAILED due to: Application application_1429930456969_0006 failed
2 times due to AM Container for appattempt_1429930456969_0006_000002 exited
with  exitCode: -1000 due to: Application application_1429930456969_0006
initialization failed (exitCode=255) with output: User edhdtaesvc not found

.Failing this attempt.. Failing the application.
15/04/25 03:12:48 INFO mapreduce.Job: Counters: 0
15/04/25 03:12:48 WARN mapreduce.Counters: Group FileSystemCounters is
deprecated. Use org.apache.hadoop.mapreduce.FileSystemCounter instead
15/04/25 03:12:48 INFO mapreduce.ImportJobBase: Transferred 0 bytes in
72.8944 seconds (0 bytes/sec)
15/04/25 03:12:48 WARN mapreduce.Counters: Group
org.apache.hadoop.mapred.Task$Counter is deprecated. Use
org.apache.hadoop.mapreduce.TaskCounter instead
15/04/25 03:12:48 INFO mapreduce.ImportJobBase: Retrieved 0 records.
15/04/25 03:12:48 ERROR tool.ImportTool: Error during import: Import job
failed!

Thanks
Kumar

RE: YARN Exceptions

Posted by Rohith Sharma K S <ro...@huawei.com>.
Are you running a secured Hadoop cluster (Kerberos), with the YARN container executor set to LinuxContainerExecutor?
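For anyone checking the same thing, a minimal sketch of how to see which container executor the NodeManagers are configured with (the /etc/hadoop/conf path is a typical CDH client-config location and is an assumption; adjust for your cluster):

```shell
# Sketch: report which YARN container executor is configured on this host.
# The config path below is an assumption (typical CDH layout).
conf="${HADOOP_CONF_DIR:-/etc/hadoop/conf}/yarn-site.xml"
if [ -f "$conf" ]; then
  # LinuxContainerExecutor launches containers as the submitting OS user,
  # which is why that user must be resolvable on every NodeManager host.
  grep -A1 'yarn.nodemanager.container-executor.class' "$conf"
else
  echo "no yarn-site.xml found at $conf"
fi
```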

Thanks & Regards
Rohith Sharma K S
From: Kumar Jayapal [mailto:kjayapal17@gmail.com]
Sent: 25 April 2015 20:10
To: user@hadoop.apache.org
Subject: Re: YARN Exceptions

Yes. Here is the complete log and the sqoop import command used to get the data from Oracle.

[root@sqpcdh01094p001 ~]# sqoop import  --connect "jdbc:oracle:thin:@lorct101094t01a.qat.np.costco.com:1521/CT1" --username "edhdtaesvc" --password "xxxxxxxx" --table "SAPSR3.AUSP"  --target-dir "/data/crmdq/CT1" --table "SAPSR3.AUSP" --split-by PARTNER_GUID --as-avrodatafile --compression-codec org.apache.hadoop.io.compress.SnappyCodec --m 1

Warning: /opt/cloudera/parcels/CDH-5.3.2-1.cdh5.3.2.p654.326/bin/../lib/sqoop/../accumulo does not exist! Accumulo imports will fail.
Please set $ACCUMULO_HOME to the root of your Accumulo installation.
15/04/25 13:37:19 INFO sqoop.Sqoop: Running Sqoop version: 1.4.5-cdh5.3.2
15/04/25 13:37:19 WARN tool.BaseSqoopTool: Setting your password on the command-line is insecure. Consider using -P instead.
15/04/25 13:37:20 INFO oracle.OraOopManagerFactory: Data Connector for Oracle and Hadoop is disabled.
15/04/25 13:37:20 INFO manager.SqlManager: Using default fetchSize of 1000
15/04/25 13:37:20 INFO tool.CodeGenTool: Beginning code generation
15/04/25 13:37:20 INFO manager.OracleManager: Time zone has been set to GMT
15/04/25 13:37:20 INFO manager.SqlManager: Executing SQL statement: SELECT t.* FROM SAPSR3.AUSP t WHERE 1=0
15/04/25 13:37:20 INFO orm.CompilationManager: HADOOP_MAPRED_HOME is /opt/cloudera/parcels/CDH/lib/hadoop-mapreduce
Note: /tmp/sqoop-root/compile/dbe5b6d69507ee60c249062c54813557/SAPSR3_AUSP.java uses or overrides a deprecated API.
Note: Recompile with -Xlint:deprecation for details.
15/04/25 13:37:22 INFO orm.CompilationManager: Writing jar file: /tmp/sqoop-root/compile/dbe5b6d69507ee60c249062c54813557/SAPSR3.AUSP.jar
15/04/25 13:37:22 INFO mapreduce.ImportJobBase: Beginning import of SAPSR3.AUSP
15/04/25 13:37:22 INFO Configuration.deprecation: mapred.jar is deprecated. Instead, use mapreduce.job.jar
15/04/25 13:37:22 INFO manager.OracleManager: Time zone has been set to GMT
15/04/25 13:37:23 INFO manager.OracleManager: Time zone has been set to GMT
15/04/25 13:37:23 INFO manager.SqlManager: Executing SQL statement: SELECT t.* FROM SAPSR3.AUSP t WHERE 1=0
15/04/25 13:37:23 INFO mapreduce.DataDrivenImportJob: Writing Avro schema file: /tmp/sqoop-root/compile/dbe5b6d69507ee60c249062c54813557/sqoop_import_SAPSR3_AUSP.avsc
15/04/25 13:37:23 INFO Configuration.deprecation: mapred.map.tasks is deprecated. Instead, use mapreduce.job.maps
15/04/25 13:37:23 INFO hdfs.DFSClient: Created HDFS_DELEGATION_TOKEN token 14047 for edhdtaesvc on ha-hdfs:nameservice1
15/04/25 13:37:23 ERROR hdfs.KeyProviderCache: Could not find uri with key [dfs.encryption.key.provider.uri] to create a keyProvider !!
15/04/25 13:37:23 INFO security.TokenCache: Got dt for hdfs://nameservice1; Kind: HDFS_DELEGATION_TOKEN, Service: ha-hdfs:nameservice1, Ident: (HDFS_DELEGATION_TOKEN token 14047 for edhdtaesvc)
15/04/25 13:37:23 ERROR hdfs.KeyProviderCache: Could not find uri with key [dfs.encryption.key.provider.uri] to create a keyProvider !!
15/04/25 13:37:23 ERROR hdfs.KeyProviderCache: Could not find uri with key [dfs.encryption.key.provider.uri] to create a keyProvider !!
15/04/25 13:37:25 ERROR hdfs.KeyProviderCache: Could not find uri with key [dfs.encryption.key.provider.uri] to create a keyProvider !!
15/04/25 13:37:25 INFO db.DBInputFormat: Using read commited transaction isolation
15/04/25 13:37:25 INFO mapreduce.JobSubmitter: number of splits:1
15/04/25 13:37:26 INFO mapreduce.JobSubmitter: Submitting tokens for job: job_1429968417065_0004
15/04/25 13:37:26 INFO mapreduce.JobSubmitter: Kind: HDFS_DELEGATION_TOKEN, Service: ha-hdfs:nameservice1, Ident: (HDFS_DELEGATION_TOKEN token 14047 for edhdtaesvc)
15/04/25 13:37:26 INFO impl.YarnClientImpl: Submitted application application_1429968417065_0004
15/04/25 13:37:26 INFO mapreduce.Job: The url to track the job: http://yrncdh01094p001.corp.costco.com:8088/proxy/application_1429968417065_0004/
15/04/25 13:37:26 INFO mapreduce.Job: Running job: job_1429968417065_0004
15/04/25 13:37:40 INFO mapreduce.Job: Job job_1429968417065_0004 running in uber mode : false
15/04/25 13:37:40 INFO mapreduce.Job:  map 0% reduce 0%
15/04/25 13:37:40 INFO mapreduce.Job: Job job_1429968417065_0004 failed with state FAILED due to: Application application_1429968417065_0004 failed 2 times due to AM Container for appattempt_1429968417065_0004_000002 exited with  exitCode: -1000 due to: Application application_1429968417065_0004 initialization failed (exitCode=255) with output: User edhdtaesvc not found

.Failing this attempt.. Failing the application.
15/04/25 13:37:40 INFO mapreduce.Job: Counters: 0
15/04/25 13:37:40 WARN mapreduce.Counters: Group FileSystemCounters is deprecated. Use org.apache.hadoop.mapreduce.FileSystemCounter instead
15/04/25 13:37:40 INFO mapreduce.ImportJobBase: Transferred 0 bytes in 17.5273 seconds (0 bytes/sec)
15/04/25 13:37:40 WARN mapreduce.Counters: Group org.apache.hadoop.mapred.Task$Counter is deprecated. Use org.apache.hadoop.mapreduce.TaskCounter instead
15/04/25 13:37:40 INFO mapreduce.ImportJobBase: Retrieved 0 records.
15/04/25 13:37:40 ERROR tool.ImportTool: Error during import: Import job failed!

thanks
Sajid

On Fri, Apr 24, 2015 at 10:52 PM, Jagat Singh <ja...@gmail.com>> wrote:
What is this error?

User edhdtaesvc not found

Are you using any user with that name?



On Sat, Apr 25, 2015 at 3:21 PM, Kumar Jayapal <kj...@gmail.com>> wrote:
Hi,

I am getting the following error while running sqoop import script.

can any one please help in resolving this issue.



15/04/25 03:12:34 INFO mapreduce.Job: Running job: job_1429930456969_0006
15/04/25 03:12:48 INFO mapreduce.Job: Job job_1429930456969_0006 running in uber mode : false
15/04/25 03:12:48 INFO mapreduce.Job:  map 0% reduce 0%
15/04/25 03:12:48 INFO mapreduce.Job: Job job_1429930456969_0006 failed with state FAILED due to: Application application_1429930456969_0006 failed 2 times due to AM Container for appattempt_1429930456969_0006_000002 exited with  exitCode: -1000 due to: Application application_1429930456969_0006 initialization failed (exitCode=255) with output: User edhdtaesvc not found

.Failing this attempt.. Failing the application.
15/04/25 03:12:48 INFO mapreduce.Job: Counters: 0
15/04/25 03:12:48 WARN mapreduce.Counters: Group FileSystemCounters is deprecated. Use org.apache.hadoop.mapreduce.FileSystemCounter instead
15/04/25 03:12:48 INFO mapreduce.ImportJobBase: Transferred 0 bytes in 72.8944 seconds (0 bytes/sec)
15/04/25 03:12:48 WARN mapreduce.Counters: Group org.apache.hadoop.mapred.Task$Counter is deprecated. Use org.apache.hadoop.mapreduce.TaskCounter instead
15/04/25 03:12:48 INFO mapreduce.ImportJobBase: Retrieved 0 records.
15/04/25 03:12:48 ERROR tool.ImportTool: Error during import: Import job failed!

Thanks
Kumar




Re: YARN Exceptions

Posted by Alexander Alten-Lorenz <wg...@gmail.com>.
Please have a closer look at the quoted error: the user (User edhdtaesvc not found) doesn't exist in your Hadoop installation, which makes the MR job fail.
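A minimal sketch of a check to run on every NodeManager host (with LinuxContainerExecutor the user is resolved locally, e.g. via /etc/passwd, LDAP, or SSSD; the useradd suggestion in the comment assumes local accounts are acceptable on the cluster):

```shell
# Sketch: verify the job-submitting user exists on this NodeManager host.
# "edhdtaesvc" is the service account from the failing job above.
user="edhdtaesvc"
if id "$user" >/dev/null 2>&1; then
  status="exists"
else
  status="missing"
fi
echo "user $user: $status"
# If missing, one possible fix (assumption: local accounts are acceptable)
# is to create the user with the same UID on every node, e.g.:
#   useradd -u <uid> "$user"
```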

BR,
 AL

--
Alexander Alten-Lorenz
m: wget.null@gmail.com
b: mapredit.blogspot.com

> On Apr 25, 2015, at 4:40 PM, Kumar Jayapal <kj...@gmail.com> wrote:
> 
> 15/04/25 13:37:40 INFO mapreduce.Job: Job job_1429968417065_0004 failed with state FAILED due to: Application application_1429968417065_0004 failed 2 times due to AM Container for appattempt_1429968417065_0004_000002 exited with  exitCode: -1000 due to: Application application_1429968417065_0004 initialization failed (exitCode=255) with output: User edhdtaesvc not found



RE: YARN Exceptions

Posted by yves callaert <yv...@hotmail.com>.
Hi,

I think you are running into the following error: https://issues.apache.org/jira/browse/HDFS-7931

The problem is probably due to the error in hdfs.KeyProviderCache, which you get before you get the name error.

With regards,
Yves
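For reference, a quick sketch of how to see whether a KMS key provider is configured at all (per HDFS-7931 the noisy KeyProviderCache ERROR appears when the property is simply unset; `hdfs getconf -confKey` is part of the standard HDFS CLI):

```shell
# Sketch: check whether an HDFS encryption key provider is configured.
prop="dfs.encryption.key.provider.uri"
if command -v hdfs >/dev/null 2>&1; then
  uri=$(hdfs getconf -confKey "$prop" 2>/dev/null)
  echo "$prop = ${uri:-<not set>}"
else
  echo "hdfs CLI not on PATH; check $prop in hdfs-site.xml instead"
fi
```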

Date: Sat, 25 Apr 2015 07:40:08 -0700
Subject: Re: YARN Exceptions
From: kjayapal17@gmail.com
To: user@hadoop.apache.org

Yes Here is the complete log and sqoop import command to get the data from oracle.
[root@sqpcdh01094p001 ~]# sqoop import  --connect "jdbc:oracle:thin:@lorct101094t01a.qat.np.costco.com:1521/CT1" --username "edhdtaesvc" --password "xxxxxxxx" --table "SAPSR3.AUSP"  --target-dir "/data/crmdq/CT1" --table "SAPSR3.AUSP" --split-by PARTNER_GUID --as-avrodatafile --compression-codec org.apache.hadoop.io.compress.SnappyCodec --m 1
Warning: /opt/cloudera/parcels/CDH-5.3.2-1.cdh5.3.2.p654.326/bin/../lib/sqoop/../accumulo does not exist! Accumulo imports will fail.
Please set $ACCUMULO_HOME to the root of your Accumulo installation.
15/04/25 13:37:19 INFO sqoop.Sqoop: Running Sqoop version: 1.4.5-cdh5.3.2
15/04/25 13:37:19 WARN tool.BaseSqoopTool: Setting your password on the command-line is insecure. Consider using -P instead.
15/04/25 13:37:20 INFO oracle.OraOopManagerFactory: Data Connector for Oracle and Hadoop is disabled.
15/04/25 13:37:20 INFO manager.SqlManager: Using default fetchSize of 1000
15/04/25 13:37:20 INFO tool.CodeGenTool: Beginning code generation
15/04/25 13:37:20 INFO manager.OracleManager: Time zone has been set to GMT
15/04/25 13:37:20 INFO manager.SqlManager: Executing SQL statement: SELECT t.* FROM SAPSR3.AUSP t WHERE 1=0
15/04/25 13:37:20 INFO orm.CompilationManager: HADOOP_MAPRED_HOME is /opt/cloudera/parcels/CDH/lib/hadoop-mapreduce
Note: /tmp/sqoop-root/compile/dbe5b6d69507ee60c249062c54813557/SAPSR3_AUSP.java uses or overrides a deprecated API.
Note: Recompile with -Xlint:deprecation for details.
15/04/25 13:37:22 INFO orm.CompilationManager: Writing jar file: /tmp/sqoop-root/compile/dbe5b6d69507ee60c249062c54813557/SAPSR3.AUSP.jar
15/04/25 13:37:22 INFO mapreduce.ImportJobBase: Beginning import of SAPSR3.AUSP
15/04/25 13:37:22 INFO Configuration.deprecation: mapred.jar is deprecated. Instead, use mapreduce.job.jar
15/04/25 13:37:22 INFO manager.OracleManager: Time zone has been set to GMT
15/04/25 13:37:23 INFO manager.OracleManager: Time zone has been set to GMT
15/04/25 13:37:23 INFO manager.SqlManager: Executing SQL statement: SELECT t.* FROM SAPSR3.AUSP t WHERE 1=0
15/04/25 13:37:23 INFO mapreduce.DataDrivenImportJob: Writing Avro schema file: /tmp/sqoop-root/compile/dbe5b6d69507ee60c249062c54813557/sqoop_import_SAPSR3_AUSP.avsc
15/04/25 13:37:23 INFO Configuration.deprecation: mapred.map.tasks is deprecated. Instead, use mapreduce.job.maps
15/04/25 13:37:23 INFO hdfs.DFSClient: Created HDFS_DELEGATION_TOKEN token 14047 for edhdtaesvc on ha-hdfs:nameservice1
15/04/25 13:37:23 ERROR hdfs.KeyProviderCache: Could not find uri with key [dfs.encryption.key.provider.uri] to create a keyProvider !!
15/04/25 13:37:23 INFO security.TokenCache: Got dt for hdfs://nameservice1; Kind: HDFS_DELEGATION_TOKEN, Service: ha-hdfs:nameservice1, Ident: (HDFS_DELEGATION_TOKEN token 14047 for edhdtaesvc)
15/04/25 13:37:23 ERROR hdfs.KeyProviderCache: Could not find uri with key [dfs.encryption.key.provider.uri] to create a keyProvider !!
15/04/25 13:37:23 ERROR hdfs.KeyProviderCache: Could not find uri with key [dfs.encryption.key.provider.uri] to create a keyProvider !!
15/04/25 13:37:25 ERROR hdfs.KeyProviderCache: Could not find uri with key [dfs.encryption.key.provider.uri] to create a keyProvider !!
15/04/25 13:37:25 INFO db.DBInputFormat: Using read commited transaction isolation
15/04/25 13:37:25 INFO mapreduce.JobSubmitter: number of splits:1
15/04/25 13:37:26 INFO mapreduce.JobSubmitter: Submitting tokens for job: job_1429968417065_0004
15/04/25 13:37:26 INFO mapreduce.JobSubmitter: Kind: HDFS_DELEGATION_TOKEN, Service: ha-hdfs:nameservice1, Ident: (HDFS_DELEGATION_TOKEN token 14047 for edhdtaesvc)
15/04/25 13:37:26 INFO impl.YarnClientImpl: Submitted application application_1429968417065_0004
15/04/25 13:37:26 INFO mapreduce.Job: The url to track the job: http://yrncdh01094p001.corp.costco.com:8088/proxy/application_1429968417065_0004/
15/04/25 13:37:26 INFO mapreduce.Job: Running job: job_1429968417065_0004
15/04/25 13:37:40 INFO mapreduce.Job: Job job_1429968417065_0004 running in uber mode : false
15/04/25 13:37:40 INFO mapreduce.Job:  map 0% reduce 0%
15/04/25 13:37:40 INFO mapreduce.Job: Job job_1429968417065_0004 failed with state FAILED due to: Application application_1429968417065_0004 failed 2 times due to AM Container for appattempt_1429968417065_0004_000002 exited with  exitCode: -1000 due to: Application application_1429968417065_0004 initialization failed (exitCode=255) with output: User edhdtaesvc not found

.Failing this attempt.. Failing the application.
15/04/25 13:37:40 INFO mapreduce.Job: Counters: 0
15/04/25 13:37:40 WARN mapreduce.Counters: Group FileSystemCounters is deprecated. Use org.apache.hadoop.mapreduce.FileSystemCounter instead
15/04/25 13:37:40 INFO mapreduce.ImportJobBase: Transferred 0 bytes in 17.5273 seconds (0 bytes/sec)
15/04/25 13:37:40 WARN mapreduce.Counters: Group org.apache.hadoop.mapred.Task$Counter is deprecated. Use org.apache.hadoop.mapreduce.TaskCounter instead
15/04/25 13:37:40 INFO mapreduce.ImportJobBase: Retrieved 0 records.
15/04/25 13:37:40 ERROR tool.ImportTool: Error during import: Import job failed!
thanks
Sajid
On Fri, Apr 24, 2015 at 10:52 PM, Jagat Singh <ja...@gmail.com> wrote:
What is this error
User edhdtaesvc not found

Are you using any user with that name ?


On Sat, Apr 25, 2015 at 3:21 PM, Kumar Jayapal <kj...@gmail.com> wrote:
Hi,
I am getting the following error while running sqoop import script.
can any one please help in resolving this issue.


15/04/25 03:12:34 INFO mapreduce.Job: Running job: job_1429930456969_0006
15/04/25 03:12:48 INFO mapreduce.Job: Job job_1429930456969_0006 running in uber mode : false
15/04/25 03:12:48 INFO mapreduce.Job:  map 0% reduce 0%
15/04/25 03:12:48 INFO mapreduce.Job: Job job_1429930456969_0006 failed with state FAILED due to: Application application_1429930456969_0006 failed 2 times due to AM Container for appattempt_1429930456969_0006_000002 exited with  exitCode: -1000 due to: Application application_1429930456969_0006 initialization failed (exitCode=255) with output: User edhdtaesvc not found

.Failing this attempt.. Failing the application.
15/04/25 03:12:48 INFO mapreduce.Job: Counters: 0
15/04/25 03:12:48 WARN mapreduce.Counters: Group FileSystemCounters is deprecated. Use org.apache.hadoop.mapreduce.FileSystemCounter instead
15/04/25 03:12:48 INFO mapreduce.ImportJobBase: Transferred 0 bytes in 72.8944 seconds (0 bytes/sec)
15/04/25 03:12:48 WARN mapreduce.Counters: Group org.apache.hadoop.mapred.Task$Counter is deprecated. Use org.apache.hadoop.mapreduce.TaskCounter instead
15/04/25 03:12:48 INFO mapreduce.ImportJobBase: Retrieved 0 records.
15/04/25 03:12:48 ERROR tool.ImportTool: Error during import: Import job failed!
Thanks
Kumar





RE: YARN Exceptions

Posted by yves callaert <yv...@hotmail.com>.
Hi,I think you are running into the following error: https://issues.apache.org/jira/browse/HDFS-7931The problem is probably due to the error in the hdfs.KeyProviderCache which you get before you get the name error.with Regards,Yves

Date: Sat, 25 Apr 2015 07:40:08 -0700
Subject: Re: YARN Exceptions
From: kjayapal17@gmail.com
To: user@hadoop.apache.org

Yes Here is the complete log and sqoop import command to get the data from oracle.
[root@sqpcdh01094p001 ~]# sqoop import  --connect "jdbc:oracle:thin:@lorct101094t01a.qat.np.costco.com:1521/CT1" --username "edhdtaesvc" --password "xxxxxxxx" --table "SAPSR3.AUSP"  --target-dir "/data/crmdq/CT1" --table "SAPSR3.AUSP" --split-by PARTNER_GUID --as-avrodatafile --compression-codec org.apache.hadoop.io.compress.SnappyCodec --m 1

Warning: /opt/cloudera/parcels/CDH-5.3.2-1.cdh5.3.2.p654.326/bin/../lib/sqoop/../accumulo does not exist! Accumulo imports will fail.
Please set $ACCUMULO_HOME to the root of your Accumulo installation.
15/04/25 13:37:19 INFO sqoop.Sqoop: Running Sqoop version: 1.4.5-cdh5.3.2
15/04/25 13:37:19 WARN tool.BaseSqoopTool: Setting your password on the command-line is insecure. Consider using -P instead.
15/04/25 13:37:20 INFO oracle.OraOopManagerFactory: Data Connector for Oracle and Hadoop is disabled.
15/04/25 13:37:20 INFO manager.SqlManager: Using default fetchSize of 1000
15/04/25 13:37:20 INFO tool.CodeGenTool: Beginning code generation
15/04/25 13:37:20 INFO manager.OracleManager: Time zone has been set to GMT
15/04/25 13:37:20 INFO manager.SqlManager: Executing SQL statement: SELECT t.* FROM SAPSR3.AUSP t WHERE 1=0
15/04/25 13:37:20 INFO orm.CompilationManager: HADOOP_MAPRED_HOME is /opt/cloudera/parcels/CDH/lib/hadoop-mapreduce
Note: /tmp/sqoop-root/compile/dbe5b6d69507ee60c249062c54813557/SAPSR3_AUSP.java uses or overrides a deprecated API.
Note: Recompile with -Xlint:deprecation for details.
15/04/25 13:37:22 INFO orm.CompilationManager: Writing jar file: /tmp/sqoop-root/compile/dbe5b6d69507ee60c249062c54813557/SAPSR3.AUSP.jar
15/04/25 13:37:22 INFO mapreduce.ImportJobBase: Beginning import of SAPSR3.AUSP
15/04/25 13:37:22 INFO Configuration.deprecation: mapred.jar is deprecated. Instead, use mapreduce.job.jar
15/04/25 13:37:22 INFO manager.OracleManager: Time zone has been set to GMT
15/04/25 13:37:23 INFO manager.OracleManager: Time zone has been set to GMT
15/04/25 13:37:23 INFO manager.SqlManager: Executing SQL statement: SELECT t.* FROM SAPSR3.AUSP t WHERE 1=0
15/04/25 13:37:23 INFO mapreduce.DataDrivenImportJob: Writing Avro schema file: /tmp/sqoop-root/compile/dbe5b6d69507ee60c249062c54813557/sqoop_import_SAPSR3_AUSP.avsc
15/04/25 13:37:23 INFO Configuration.deprecation: mapred.map.tasks is deprecated. Instead, use mapreduce.job.maps
15/04/25 13:37:23 INFO hdfs.DFSClient: Created HDFS_DELEGATION_TOKEN token 14047 for edhdtaesvc on ha-hdfs:nameservice1
15/04/25 13:37:23 ERROR hdfs.KeyProviderCache: Could not find uri with key [dfs.encryption.key.provider.uri] to create a keyProvider !!
15/04/25 13:37:23 INFO security.TokenCache: Got dt for hdfs://nameservice1; Kind: HDFS_DELEGATION_TOKEN, Service: ha-hdfs:nameservice1, Ident: (HDFS_DELEGATION_TOKEN token 14047 for edhdtaesvc)
15/04/25 13:37:23 ERROR hdfs.KeyProviderCache: Could not find uri with key [dfs.encryption.key.provider.uri] to create a keyProvider !!
15/04/25 13:37:23 ERROR hdfs.KeyProviderCache: Could not find uri with key [dfs.encryption.key.provider.uri] to create a keyProvider !!
15/04/25 13:37:25 ERROR hdfs.KeyProviderCache: Could not find uri with key [dfs.encryption.key.provider.uri] to create a keyProvider !!
15/04/25 13:37:25 INFO db.DBInputFormat: Using read commited transaction isolation
15/04/25 13:37:25 INFO mapreduce.JobSubmitter: number of splits:1
15/04/25 13:37:26 INFO mapreduce.JobSubmitter: Submitting tokens for job: job_1429968417065_0004
15/04/25 13:37:26 INFO mapreduce.JobSubmitter: Kind: HDFS_DELEGATION_TOKEN, Service: ha-hdfs:nameservice1, Ident: (HDFS_DELEGATION_TOKEN token 14047 for edhdtaesvc)
15/04/25 13:37:26 INFO impl.YarnClientImpl: Submitted application application_1429968417065_0004
15/04/25 13:37:26 INFO mapreduce.Job: The url to track the job: http://yrncdh01094p001.corp.costco.com:8088/proxy/application_1429968417065_0004/
15/04/25 13:37:26 INFO mapreduce.Job: Running job: job_1429968417065_0004
15/04/25 13:37:40 INFO mapreduce.Job: Job job_1429968417065_0004 running in uber mode : false
15/04/25 13:37:40 INFO mapreduce.Job:  map 0% reduce 0%
15/04/25 13:37:40 INFO mapreduce.Job: Job job_1429968417065_0004 failed with state FAILED due to: Application application_1429968417065_0004 failed 2 times due to AM Container for appattempt_1429968417065_0004_000002 exited with  exitCode: -1000 due to: Application application_1429968417065_0004 initialization failed (exitCode=255) with output: User edhdtaesvc not found
.Failing this attempt.. Failing the application.
15/04/25 13:37:40 INFO mapreduce.Job: Counters: 0
15/04/25 13:37:40 WARN mapreduce.Counters: Group FileSystemCounters is deprecated. Use org.apache.hadoop.mapreduce.FileSystemCounter instead
15/04/25 13:37:40 INFO mapreduce.ImportJobBase: Transferred 0 bytes in 17.5273 seconds (0 bytes/sec)
15/04/25 13:37:40 WARN mapreduce.Counters: Group org.apache.hadoop.mapred.Task$Counter is deprecated. Use org.apache.hadoop.mapreduce.TaskCounter instead
15/04/25 13:37:40 INFO mapreduce.ImportJobBase: Retrieved 0 records.
15/04/25 13:37:40 ERROR tool.ImportTool: Error during import: Import job failed!

thanks
Sajid
On Fri, Apr 24, 2015 at 10:52 PM, Jagat Singh <ja...@gmail.com> wrote:
What is this error
User edhdtaesvc not found

Are you using any user with that name ?


On Sat, Apr 25, 2015 at 3:21 PM, Kumar Jayapal <kj...@gmail.com> wrote:
Hi,
I am getting the following error while running sqoop import script.
can any one please help in resolving this issue.


15/04/25 03:12:34 INFO mapreduce.Job: Running job: job_1429930456969_0006
15/04/25 03:12:48 INFO mapreduce.Job: Job job_1429930456969_0006 running in uber mode : false
15/04/25 03:12:48 INFO mapreduce.Job:  map 0% reduce 0%
15/04/25 03:12:48 INFO mapreduce.Job: Job job_1429930456969_0006 failed with state FAILED due to: Application application_1429930456969_0006 failed 2 times due to AM Container for appattempt_1429930456969_0006_000002 exited with  exitCode: -1000 due to: Application application_1429930456969_0006 initialization failed (exitCode=255) with output: User edhdtaesvc not found
.Failing this attempt.. Failing the application.
15/04/25 03:12:48 INFO mapreduce.Job: Counters: 0
15/04/25 03:12:48 WARN mapreduce.Counters: Group FileSystemCounters is deprecated. Use org.apache.hadoop.mapreduce.FileSystemCounter instead
15/04/25 03:12:48 INFO mapreduce.ImportJobBase: Transferred 0 bytes in 72.8944 seconds (0 bytes/sec)
15/04/25 03:12:48 WARN mapreduce.Counters: Group org.apache.hadoop.mapred.Task$Counter is deprecated. Use org.apache.hadoop.mapreduce.TaskCounter instead
15/04/25 03:12:48 INFO mapreduce.ImportJobBase: Retrieved 0 records.
15/04/25 03:12:48 ERROR tool.ImportTool: Error during import: Import job failed!

Thanks
Kumar




 		 	   		  

Re: YARN Exceptions

Posted by Alexander Alten-Lorenz <wg...@gmail.com>.
Please have a closer look at the quoted error - the user (User edhdtaesvc not found) doesn't exist in your Hadoop installation, which makes the MR job fail.

BR,
 AL

--
Alexander Alten-Lorenz
m: wget.null@gmail.com
b: mapredit.blogspot.com

> On Apr 25, 2015, at 4:40 PM, Kumar Jayapal <kj...@gmail.com> wrote:
> 
> 15/04/25 13:37:40 INFO mapreduce.Job: Job job_1429968417065_0004 failed with state FAILED due to: Application application_1429968417065_0004 failed 2 times due to AM Container for appattempt_1429968417065_0004_000002 exited with  exitCode: -1000 due to: Application application_1429968417065_0004 initialization failed (exitCode=255) with output: User edhdtaesvc not found
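
[Editorial note: this diagnosis can be checked directly on each NodeManager host. With YARN's LinuxContainerExecutor, containers are launched as the submitting user, so that account must resolve locally on every worker node. A minimal sketch — the `user_exists` helper name and echo messages are illustrative, and the fix (creating the account, or wiring the nodes to LDAP/SSSD) depends on your site:]

```shell
# Check whether the job-submitting user resolves on this host.
# getent consults /etc/passwd plus any configured NSS back ends (LDAP, SSSD).
user_exists() {
  getent passwd "$1" > /dev/null
}

if user_exists edhdtaesvc; then
  echo "edhdtaesvc resolves on this host"
else
  echo "edhdtaesvc missing - containers for this user will fail (exitCode=255)"
fi
```

Running this on every NodeManager (e.g. via pdsh) identifies the hosts that would produce the "User ... not found" initialization failure quoted above.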



RE: YARN Exceptions

Posted by Rohith Sharma K S <ro...@huawei.com>.
Are you running a secured Hadoop cluster (Kerberos), with the YARN container executor set to LinuxContainerExecutor?

Thanks & Regards
Rohith Sharma K S
From: Kumar Jayapal [mailto:kjayapal17@gmail.com]
Sent: 25 April 2015 20:10
To: user@hadoop.apache.org
Subject: Re: YARN Exceptions

Yes Here is the complete log and sqoop import command to get the data from oracle.

[root@sqpcdh01094p001 ~]# sqoop import  --connect "jdbc:oracle:thin:@lorct101094t01a.qat.np.costco.com:1521/CT1<http://jdbc:oracle:thin:@lorct101094t01a.qat.np.costco.com:1521/CT1>" --username "edhdtaesvc" --password "xxxxxxxx" --table "SAPSR3.AUSP"  --target-dir "/data/crmdq/CT1" --table "SAPSR3.AUSP" --split-by PARTNER_GUID --as-avrodatafile --compression-codec org.apache.hadoop.io.compress.SnappyCodec --m 1

Warning: /opt/cloudera/parcels/CDH-5.3.2-1.cdh5.3.2.p654.326/bin/../lib/sqoop/../accumulo does not exist! Accumulo imports will fail.
Please set $ACCUMULO_HOME to the root of your Accumulo installation.
15/04/25 13:37:19 INFO sqoop.Sqoop: Running Sqoop version: 1.4.5-cdh5.3.2
15/04/25 13:37:19 WARN tool.BaseSqoopTool: Setting your password on the command-line is insecure. Consider using -P instead.
15/04/25 13:37:20 INFO oracle.OraOopManagerFactory: Data Connector for Oracle and Hadoop is disabled.
15/04/25 13:37:20 INFO manager.SqlManager: Using default fetchSize of 1000
15/04/25 13:37:20 INFO tool.CodeGenTool: Beginning code generation
15/04/25 13:37:20 INFO manager.OracleManager: Time zone has been set to GMT
15/04/25 13:37:20 INFO manager.SqlManager: Executing SQL statement: SELECT t.* FROM SAPSR3.AUSP t WHERE 1=0
15/04/25 13:37:20 INFO orm.CompilationManager: HADOOP_MAPRED_HOME is /opt/cloudera/parcels/CDH/lib/hadoop-mapreduce
Note: /tmp/sqoop-root/compile/dbe5b6d69507ee60c249062c54813557/SAPSR3_AUSP.java uses or overrides a deprecated API.
Note: Recompile with -Xlint:deprecation for details.
15/04/25 13:37:22 INFO orm.CompilationManager: Writing jar file: /tmp/sqoop-root/compile/dbe5b6d69507ee60c249062c54813557/SAPSR3.AUSP.jar
15/04/25 13:37:22 INFO mapreduce.ImportJobBase: Beginning import of SAPSR3.AUSP
15/04/25 13:37:22 INFO Configuration.deprecation: mapred.jar is deprecated. Instead, use mapreduce.job.jar
15/04/25 13:37:22 INFO manager.OracleManager: Time zone has been set to GMT
15/04/25 13:37:23 INFO manager.OracleManager: Time zone has been set to GMT
15/04/25 13:37:23 INFO manager.SqlManager: Executing SQL statement: SELECT t.* FROM SAPSR3.AUSP t WHERE 1=0
15/04/25 13:37:23 INFO mapreduce.DataDrivenImportJob: Writing Avro schema file: /tmp/sqoop-root/compile/dbe5b6d69507ee60c249062c54813557/sqoop_import_SAPSR3_AUSP.avsc
15/04/25 13:37:23 INFO Configuration.deprecation: mapred.map.tasks is deprecated. Instead, use mapreduce.job.maps
15/04/25 13:37:23 INFO hdfs.DFSClient: Created HDFS_DELEGATION_TOKEN token 14047 for edhdtaesvc on ha-hdfs:nameservice1
15/04/25 13:37:23 ERROR hdfs.KeyProviderCache: Could not find uri with key [dfs.encryption.key.provider.uri] to create a keyProvider !!
15/04/25 13:37:23 INFO security.TokenCache: Got dt for hdfs://nameservice1; Kind: HDFS_DELEGATION_TOKEN, Service: ha-hdfs:nameservice1, Ident: (HDFS_DELEGATION_TOKEN token 14047 for edhdtaesvc)
15/04/25 13:37:23 ERROR hdfs.KeyProviderCache: Could not find uri with key [dfs.encryption.key.provider.uri] to create a keyProvider !!
15/04/25 13:37:23 ERROR hdfs.KeyProviderCache: Could not find uri with key [dfs.encryption.key.provider.uri] to create a keyProvider !!
15/04/25 13:37:25 ERROR hdfs.KeyProviderCache: Could not find uri with key [dfs.encryption.key.provider.uri] to create a keyProvider !!
15/04/25 13:37:25 INFO db.DBInputFormat: Using read commited transaction isolation
15/04/25 13:37:25 INFO mapreduce.JobSubmitter: number of splits:1
15/04/25 13:37:26 INFO mapreduce.JobSubmitter: Submitting tokens for job: job_1429968417065_0004
15/04/25 13:37:26 INFO mapreduce.JobSubmitter: Kind: HDFS_DELEGATION_TOKEN, Service: ha-hdfs:nameservice1, Ident: (HDFS_DELEGATION_TOKEN token 14047 for edhdtaesvc)
15/04/25 13:37:26 INFO impl.YarnClientImpl: Submitted application application_1429968417065_0004
15/04/25 13:37:26 INFO mapreduce.Job: The url to track the job: http://yrncdh01094p001.corp.costco.com:8088/proxy/application_1429968417065_0004/
15/04/25 13:37:26 INFO mapreduce.Job: Running job: job_1429968417065_0004
15/04/25 13:37:40 INFO mapreduce.Job: Job job_1429968417065_0004 running in uber mode : false
15/04/25 13:37:40 INFO mapreduce.Job:  map 0% reduce 0%
15/04/25 13:37:40 INFO mapreduce.Job: Job job_1429968417065_0004 failed with state FAILED due to: Application application_1429968417065_0004 failed 2 times due to AM Container for appattempt_1429968417065_0004_000002 exited with  exitCode: -1000 due to: Application application_1429968417065_0004 initialization failed (exitCode=255) with output: User edhdtaesvc not found

.Failing this attempt.. Failing the application.
15/04/25 13:37:40 INFO mapreduce.Job: Counters: 0
15/04/25 13:37:40 WARN mapreduce.Counters: Group FileSystemCounters is deprecated. Use org.apache.hadoop.mapreduce.FileSystemCounter instead
15/04/25 13:37:40 INFO mapreduce.ImportJobBase: Transferred 0 bytes in 17.5273 seconds (0 bytes/sec)
15/04/25 13:37:40 WARN mapreduce.Counters: Group org.apache.hadoop.mapred.Task$Counter is deprecated. Use org.apache.hadoop.mapreduce.TaskCounter instead
15/04/25 13:37:40 INFO mapreduce.ImportJobBase: Retrieved 0 records.
15/04/25 13:37:40 ERROR tool.ImportTool: Error during import: Import job failed!

thanks
Sajid

On Fri, Apr 24, 2015 at 10:52 PM, Jagat Singh <ja...@gmail.com>> wrote:
What is this error

User edhdtaesvc not found

Are you using any user with that name ?



On Sat, Apr 25, 2015 at 3:21 PM, Kumar Jayapal <kj...@gmail.com>> wrote:
Hi,

I am getting the following error while running sqoop import script.

can any one please help in resolving this issue.



15/04/25 03:12:34 INFO mapreduce.Job: Running job: job_1429930456969_0006
15/04/25 03:12:48 INFO mapreduce.Job: Job job_1429930456969_0006 running in uber mode : false
15/04/25 03:12:48 INFO mapreduce.Job:  map 0% reduce 0%
15/04/25 03:12:48 INFO mapreduce.Job: Job job_1429930456969_0006 failed with state FAILED due to: Application application_1429930456969_0006 failed 2 times due to AM Container for appattempt_1429930456969_0006_000002 exited with  exitCode: -1000 due to: Application application_1429930456969_0006 initialization failed (exitCode=255) with output: User edhdtaesvc not found

.Failing this attempt.. Failing the application.
15/04/25 03:12:48 INFO mapreduce.Job: Counters: 0
15/04/25 03:12:48 WARN mapreduce.Counters: Group FileSystemCounters is deprecated. Use org.apache.hadoop.mapreduce.FileSystemCounter instead
15/04/25 03:12:48 INFO mapreduce.ImportJobBase: Transferred 0 bytes in 72.8944 seconds (0 bytes/sec)
15/04/25 03:12:48 WARN mapreduce.Counters: Group org.apache.hadoop.mapred.Task$Counter is deprecated. Use org.apache.hadoop.mapreduce.TaskCounter instead
15/04/25 03:12:48 INFO mapreduce.ImportJobBase: Retrieved 0 records.
15/04/25 03:12:48 ERROR tool.ImportTool: Error during import: Import job failed!

Thanks
Kumar





Re: YARN Exceptions

Posted by Kumar Jayapal <kj...@gmail.com>.
Yes Here is the complete log and sqoop import command to get the data from
oracle.

[root@sqpcdh01094p001 ~]# sqoop import  --connect "
jdbc:oracle:thin:@lorct101094t01a.qat.np.costco.com:1521/CT1" --username
"edhdtaesvc" --password "xxxxxxxx" --table "SAPSR3.AUSP"  --target-dir
"/data/crmdq/CT1" --table "SAPSR3.AUSP" --split-by PARTNER_GUID
--as-avrodatafile --compression-codec
org.apache.hadoop.io.compress.SnappyCodec --m 1

Warning:
/opt/cloudera/parcels/CDH-5.3.2-1.cdh5.3.2.p654.326/bin/../lib/sqoop/../accumulo
does not exist! Accumulo imports will fail.
Please set $ACCUMULO_HOME to the root of your Accumulo installation.
15/04/25 13:37:19 INFO sqoop.Sqoop: Running Sqoop version: 1.4.5-cdh5.3.2
15/04/25 13:37:19 WARN tool.BaseSqoopTool: Setting your password on the
command-line is insecure. Consider using -P instead.
15/04/25 13:37:20 INFO oracle.OraOopManagerFactory: Data Connector for
Oracle and Hadoop is disabled.
15/04/25 13:37:20 INFO manager.SqlManager: Using default fetchSize of 1000
15/04/25 13:37:20 INFO tool.CodeGenTool: Beginning code generation
15/04/25 13:37:20 INFO manager.OracleManager: Time zone has been set to GMT
15/04/25 13:37:20 INFO manager.SqlManager: Executing SQL statement: SELECT
t.* FROM SAPSR3.AUSP t WHERE 1=0
15/04/25 13:37:20 INFO orm.CompilationManager: HADOOP_MAPRED_HOME is
/opt/cloudera/parcels/CDH/lib/hadoop-mapreduce
Note:
/tmp/sqoop-root/compile/dbe5b6d69507ee60c249062c54813557/SAPSR3_AUSP.java
uses or overrides a deprecated API.
Note: Recompile with -Xlint:deprecation for details.
15/04/25 13:37:22 INFO orm.CompilationManager: Writing jar file:
/tmp/sqoop-root/compile/dbe5b6d69507ee60c249062c54813557/SAPSR3.AUSP.jar
15/04/25 13:37:22 INFO mapreduce.ImportJobBase: Beginning import of
SAPSR3.AUSP
15/04/25 13:37:22 INFO Configuration.deprecation: mapred.jar is deprecated.
Instead, use mapreduce.job.jar
15/04/25 13:37:22 INFO manager.OracleManager: Time zone has been set to GMT
15/04/25 13:37:23 INFO manager.OracleManager: Time zone has been set to GMT
15/04/25 13:37:23 INFO manager.SqlManager: Executing SQL statement: SELECT
t.* FROM SAPSR3.AUSP t WHERE 1=0
15/04/25 13:37:23 INFO mapreduce.DataDrivenImportJob: Writing Avro schema
file:
/tmp/sqoop-root/compile/dbe5b6d69507ee60c249062c54813557/sqoop_import_SAPSR3_AUSP.avsc
15/04/25 13:37:23 INFO Configuration.deprecation: mapred.map.tasks is
deprecated. Instead, use mapreduce.job.maps
15/04/25 13:37:23 INFO hdfs.DFSClient: Created HDFS_DELEGATION_TOKEN token
14047 for edhdtaesvc on ha-hdfs:nameservice1
15/04/25 13:37:23 ERROR hdfs.KeyProviderCache: Could not find uri with key
[dfs.encryption.key.provider.uri] to create a keyProvider !!
15/04/25 13:37:23 INFO security.TokenCache: Got dt for hdfs://nameservice1;
Kind: HDFS_DELEGATION_TOKEN, Service: ha-hdfs:nameservice1, Ident:
(HDFS_DELEGATION_TOKEN token 14047 for edhdtaesvc)
15/04/25 13:37:23 ERROR hdfs.KeyProviderCache: Could not find uri with key
[dfs.encryption.key.provider.uri] to create a keyProvider !!
15/04/25 13:37:23 ERROR hdfs.KeyProviderCache: Could not find uri with key
[dfs.encryption.key.provider.uri] to create a keyProvider !!
15/04/25 13:37:25 ERROR hdfs.KeyProviderCache: Could not find uri with key
[dfs.encryption.key.provider.uri] to create a keyProvider !!
15/04/25 13:37:25 INFO db.DBInputFormat: Using read commited transaction
isolation
15/04/25 13:37:25 INFO mapreduce.JobSubmitter: number of splits:1
15/04/25 13:37:26 INFO mapreduce.JobSubmitter: Submitting tokens for job:
job_1429968417065_0004
15/04/25 13:37:26 INFO mapreduce.JobSubmitter: Kind: HDFS_DELEGATION_TOKEN,
Service: ha-hdfs:nameservice1, Ident: (HDFS_DELEGATION_TOKEN token 14047
for edhdtaesvc)
15/04/25 13:37:26 INFO impl.YarnClientImpl: Submitted application
application_1429968417065_0004
15/04/25 13:37:26 INFO mapreduce.Job: The url to track the job:
http://yrncdh01094p001.corp.costco.com:8088/proxy/application_1429968417065_0004/
15/04/25 13:37:26 INFO mapreduce.Job: Running job: job_1429968417065_0004
15/04/25 13:37:40 INFO mapreduce.Job: Job job_1429968417065_0004 running in
uber mode : false
15/04/25 13:37:40 INFO mapreduce.Job:  map 0% reduce 0%
15/04/25 13:37:40 INFO mapreduce.Job: Job job_1429968417065_0004 failed
with state FAILED due to: Application application_1429968417065_0004 failed
2 times due to AM Container for appattempt_1429968417065_0004_000002 exited
with  exitCode: -1000 due to: Application application_1429968417065_0004
initialization failed (exitCode=255) with output: User edhdtaesvc not found

.Failing this attempt.. Failing the application.
15/04/25 13:37:40 INFO mapreduce.Job: Counters: 0
15/04/25 13:37:40 WARN mapreduce.Counters: Group FileSystemCounters is
deprecated. Use org.apache.hadoop.mapreduce.FileSystemCounter instead
15/04/25 13:37:40 INFO mapreduce.ImportJobBase: Transferred 0 bytes in
17.5273 seconds (0 bytes/sec)
15/04/25 13:37:40 WARN mapreduce.Counters: Group
org.apache.hadoop.mapred.Task$Counter is deprecated. Use
org.apache.hadoop.mapreduce.TaskCounter instead
15/04/25 13:37:40 INFO mapreduce.ImportJobBase: Retrieved 0 records.
15/04/25 13:37:40 ERROR tool.ImportTool: Error during import: Import job
failed!

thanks
Sajid

On Fri, Apr 24, 2015 at 10:52 PM, Jagat Singh <ja...@gmail.com> wrote:

> What is this error
>
> User edhdtaesvc not found
>
> Are you using any user with that name ?
>
>
>
> On Sat, Apr 25, 2015 at 3:21 PM, Kumar Jayapal <kj...@gmail.com>
> wrote:
>
>> Hi,
>>
>> I am getting the following error while running sqoop import script.
>>
>> can any one please help in resolving this issue.
>>
>>
>>
>> 15/04/25 03:12:34 INFO mapreduce.Job: Running job: job_1429930456969_0006
>> 15/04/25 03:12:48 INFO mapreduce.Job: Job job_1429930456969_0006 running
>> in uber mode : false
>> 15/04/25 03:12:48 INFO mapreduce.Job:  map 0% reduce 0%
>> 15/04/25 03:12:48 INFO mapreduce.Job: Job job_1429930456969_0006 failed
>> with state FAILED due to: Application application_1429930456969_0006 failed
>> 2 times due to AM Container for appattempt_1429930456969_0006_000002 exited
>> with  exitCode: -1000 due to: Application application_1429930456969_0006
>> initialization failed (exitCode=255) with output: User edhdtaesvc not found
>>
>> .Failing this attempt.. Failing the application.
>> 15/04/25 03:12:48 INFO mapreduce.Job: Counters: 0
>> 15/04/25 03:12:48 WARN mapreduce.Counters: Group FileSystemCounters is
>> deprecated. Use org.apache.hadoop.mapreduce.FileSystemCounter instead
>> 15/04/25 03:12:48 INFO mapreduce.ImportJobBase: Transferred 0 bytes in
>> 72.8944 seconds (0 bytes/sec)
>> 15/04/25 03:12:48 WARN mapreduce.Counters: Group
>> org.apache.hadoop.mapred.Task$Counter is deprecated. Use
>> org.apache.hadoop.mapreduce.TaskCounter instead
>> 15/04/25 03:12:48 INFO mapreduce.ImportJobBase: Retrieved 0 records.
>> 15/04/25 03:12:48 ERROR tool.ImportTool: Error during import: Import job
>> failed!
>>
>> Thanks
>> Kumar
>>
>>
>
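The failing step is container localization: the AM attempt exits with code -1000 and "User edhdtaesvc not found (exitCode=255)", which on a secured cluster running LinuxContainerExecutor means the submitting user's OS account is missing on the NodeManager host. A minimal check (a sketch, not an official diagnostic — the user name is taken from the log above) is:

```shell
# Check whether the job-submitting user's OS account exists on this node.
# With LinuxContainerExecutor, the account must exist on EVERY NodeManager
# host, otherwise localization fails with "User ... not found".
user=edhdtaesvc
if getent passwd "$user" > /dev/null; then
  status="$user exists"
else
  status="$user MISSING on $(hostname)"
fi
echo "$status"
```

Run this on each NodeManager host (not just the edge node where sqoop is launched); the account can exist where the job is submitted yet be absent on the workers.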

Re: YARN Exceptions

Posted by Jagat Singh <ja...@gmail.com>.
What is this error

User edhdtaesvc not found

Are you using any user with that name ?



On Sat, Apr 25, 2015 at 3:21 PM, Kumar Jayapal <kj...@gmail.com> wrote:

> Hi,
>
> I am getting the following error while running sqoop import script.
>
> can any one please help in resolving this issue.
>
>
>
> 15/04/25 03:12:34 INFO mapreduce.Job: Running job: job_1429930456969_0006
> 15/04/25 03:12:48 INFO mapreduce.Job: Job job_1429930456969_0006 running
> in uber mode : false
> 15/04/25 03:12:48 INFO mapreduce.Job:  map 0% reduce 0%
> 15/04/25 03:12:48 INFO mapreduce.Job: Job job_1429930456969_0006 failed
> with state FAILED due to: Application application_1429930456969_0006 failed
> 2 times due to AM Container for appattempt_1429930456969_0006_000002 exited
> with  exitCode: -1000 due to: Application application_1429930456969_0006
> initialization failed (exitCode=255) with output: User edhdtaesvc not found
>
> .Failing this attempt.. Failing the application.
> 15/04/25 03:12:48 INFO mapreduce.Job: Counters: 0
> 15/04/25 03:12:48 WARN mapreduce.Counters: Group FileSystemCounters is
> deprecated. Use org.apache.hadoop.mapreduce.FileSystemCounter instead
> 15/04/25 03:12:48 INFO mapreduce.ImportJobBase: Transferred 0 bytes in
> 72.8944 seconds (0 bytes/sec)
> 15/04/25 03:12:48 WARN mapreduce.Counters: Group
> org.apache.hadoop.mapred.Task$Counter is deprecated. Use
> org.apache.hadoop.mapreduce.TaskCounter instead
> 15/04/25 03:12:48 INFO mapreduce.ImportJobBase: Retrieved 0 records.
> 15/04/25 03:12:48 ERROR tool.ImportTool: Error during import: Import job
> failed!
>
> Thanks
> Kumar
>
>
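If the account really is absent from the worker nodes, the remediation is to provision it on every NodeManager host. The sketch below only prints the commands (hostnames are placeholders, and useradd requires root); many sites provision such service accounts centrally via LDAP/SSSD instead of local useradd, so treat this as an illustration rather than the definitive fix:

```shell
# Print the per-host provisioning commands for the missing account.
# Hostnames and the "hadoop" group are placeholder assumptions.
hosts="nm1.example.com nm2.example.com"
for host in $hosts; do
  cmd="ssh $host sudo useradd -g hadoop edhdtaesvc"
  echo "$cmd"
done
```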
