Posted to user@oozie.apache.org by Venkatesan Ramachandran <me...@gmail.com> on 2015/07/21 09:15:14 UTC

Sqoop action writing to HCATALOG via Oozie throws class not found exception

Hi All,

I'm running the following Sqoop command to write to HCATALOG on the command
line, and it works as expected:

sqoop import --connect jdbc:mysql://c6402/test --table simple --username
sqoop_user --password-file /user/ambari-qa/datastore/testdb_password
--num-mappers 2 --split-by id --hcatalog-database default --hcatalog-table
simple --create-hcatalog-table

But when I run it via an Oozie Sqoop action as below, it throws a class not
found exception for org/apache/hive/hcatalog/mapreduce/HCatOutputFormat.

I have all the hcatalog jars in the oozie shared lib in the following path:

-rw-r--r--   3 oozie hdfs    1997485 2015-07-21 01:08
/user/oozie/share/lib/lib_20150721010816/hcatalog/ant-1.9.1.jar
-rw-r--r--   3 oozie hdfs      18336 2015-07-21 01:09
/user/oozie/share/lib/lib_20150721010816/hcatalog/ant-launcher-1.9.1.jar
-rw-r--r--   3 oozie hdfs     346729 2015-07-21 01:09
/user/oozie/share/lib/lib_20150721010816/hcatalog/apache-log4j-extras-1.1.jar
-rw-r--r--   3 oozie hdfs     110600 2015-07-21 01:08
/user/oozie/share/lib/lib_20150721010816/hcatalog/bonecp-0.8.0.RELEASE.jar
-rw-r--r--   3 oozie hdfs     339666 2015-07-21 01:08
/user/oozie/share/lib/lib_20150721010816/hcatalog/datanucleus-api-jdo-3.2.6.jar
-rw-r--r--   3 oozie hdfs     294321 2015-07-21 01:08
/user/oozie/share/lib/lib_20150721010816/hcatalog/hive-common-1.2.1.2.3.0.0-2557.jar
-rw-r--r--   3 oozie hdfs     257633 2015-07-21 01:09
/user/oozie/share/lib/lib_20150721010816/hcatalog/hive-hcatalog-core-1.2.1.2.3.0.0-2557.jar
-rw-r--r--   3 oozie hdfs      54328 2015-07-21 01:08
/user/oozie/share/lib/lib_20150721010816/hcatalog/hive-hcatalog-server-extensions-1.2.1.2.3.0.0-2557.jar
-rw-r--r--   3 oozie hdfs    5505255 2015-07-21 01:09
/user/oozie/share/lib/lib_20150721010816/hcatalog/hive-metastore-1.2.1.2.3.0.0-2557.jar
-rw-r--r--   3 oozie hdfs     916926 2015-07-21 01:08
/user/oozie/share/lib/lib_20150721010816/hcatalog/hive-serde-1.2.1.2.3.0.0-2557.jar
-rw-r--r--   3 oozie hdfs     108030 2015-07-21 01:08
/user/oozie/share/lib/lib_20150721010816/hcatalog/hive-webhcat-java-client-1.2.1.2.3.0.0-2557.jar
-rw-r--r--   3 oozie hdfs     232248 2015-07-21 01:08
/user/oozie/share/lib/lib_20150721010816/hcatalog/jackson-core-asl-1.9.13.jar
-rw-r--r--   3 oozie hdfs     780664 2015-07-21 01:09
/user/oozie/share/lib/lib_20150721010816/hcatalog/jackson-mapper-asl-1.9.13.jar
-rw-r--r--   3 oozie hdfs     201124 2015-07-21 01:09
/user/oozie/share/lib/lib_20150721010816/hcatalog/jdo-api-3.0.1.jar
-rw-r--r--   3 oozie hdfs     570478 2015-07-21 01:08
/user/oozie/share/lib/lib_20150721010816/hcatalog/joda-time-2.1.jar
-rw-r--r--   3 oozie hdfs     313686 2015-07-21 01:08
/user/oozie/share/lib/lib_20150721010816/hcatalog/libfb303-0.9.2.jar
-rw-r--r--   3 oozie hdfs     481535 2015-07-21 01:08
/user/oozie/share/lib/lib_20150721010816/hcatalog/log4j-1.2.16.jar
-rw-r--r--   3 oozie hdfs      16134 2015-07-21 01:08
/user/oozie/share/lib/lib_20150721010816/hcatalog/oozie-sharelib-hcatalog-4.2.0.2.3.0.0-2557.jar
-rw-r--r--   3 oozie hdfs      19827 2015-07-21 01:08
/user/oozie/share/lib/lib_20150721010816/hcatalog/opencsv-2.3.jar
-rw-r--r--   3 oozie hdfs    2796935 2015-07-21 01:09
/user/oozie/share/lib/lib_20150721010816/hcatalog/parquet-hadoop-bundle-1.6.0.jar
-rw-r--r--   3 oozie hdfs      26176 2015-07-21 01:08
/user/oozie/share/lib/lib_20150721010816/hcatalog/slf4j-api-1.6.6.jar
-rw-r--r--   3 oozie hdfs       9711 2015-07-21 01:08
/user/oozie/share/lib/lib_20150721010816/hcatalog/slf4j-log4j12-1.6.6.jar


SQOOP command and the exception

Sqoop command arguments :
             import
             --connect
             jdbc:mysql://c6402/test
             --table
             simple
             --username
             sqoop_user
             --password-file
             /user/ambari-qa/datastore/testdb_password
             --num-mappers
             2
             --split-by
             id
             --hcatalog-database
             default
             --hcatalog-table
             simple
             --create-hcatalog-table
Fetching child yarn jobs
tag id : oozie-a6b97cd85c4eba81b5653acf1dd341ad
2015-07-21 06:38:17,708 INFO  [main] client.RMProxy
(RMProxy.java:createRMProxy(98)) - Connecting to ResourceManager at
c6402.ambari.apache.org/192.168.64.102:8050
Child yarn jobs are found -
=================================================================

>>> Invoking Sqoop command line now >>>

5433 [main] WARN  org.apache.sqoop.tool.SqoopTool  - $SQOOP_CONF_DIR
has not been set in the environment. Cannot check for additional
configuration.
2015-07-21 06:38:18,204 WARN  [main] tool.SqoopTool
(SqoopTool.java:loadPluginsFromConfDir(177)) - $SQOOP_CONF_DIR has not
been set in the environment. Cannot check for additional
configuration.
5467 [main] INFO  org.apache.sqoop.Sqoop  - Running Sqoop version:
1.4.6.2.3.0.0-2557
2015-07-21 06:38:18,238 INFO  [main] sqoop.Sqoop
(Sqoop.java:<init>(92)) - Running Sqoop version: 1.4.6.2.3.0.0-2557
5668 [main] WARN  org.apache.sqoop.ConnFactory  - $SQOOP_CONF_DIR has
not been set in the environment. Cannot check for additional
configuration.
2015-07-21 06:38:18,439 WARN  [main] sqoop.ConnFactory
(ConnFactory.java:loadManagersFromConfDir(273)) - $SQOOP_CONF_DIR has
not been set in the environment. Cannot check for additional
configuration.
5797 [main] INFO  org.apache.sqoop.manager.MySQLManager  - Preparing
to use a MySQL streaming resultset.
2015-07-21 06:38:18,568 INFO  [main] manager.MySQLManager
(MySQLManager.java:initOptionDefaults(69)) - Preparing to use a MySQL
streaming resultset.
5797 [main] INFO  org.apache.sqoop.tool.CodeGenTool  - Beginning code generation
2015-07-21 06:38:18,568 INFO  [main] tool.CodeGenTool
(CodeGenTool.java:generateORM(92)) - Beginning code generation
6492 [main] INFO  org.apache.sqoop.manager.SqlManager  - Executing SQL
statement: SELECT t.* FROM `simple` AS t LIMIT 1
2015-07-21 06:38:19,263 INFO  [main] manager.SqlManager
(SqlManager.java:execute(757)) - Executing SQL statement: SELECT t.*
FROM `simple` AS t LIMIT 1
6542 [main] INFO  org.apache.sqoop.manager.SqlManager  - Executing SQL
statement: SELECT t.* FROM `simple` AS t LIMIT 1
2015-07-21 06:38:19,313 INFO  [main] manager.SqlManager
(SqlManager.java:execute(757)) - Executing SQL statement: SELECT t.*
FROM `simple` AS t LIMIT 1
6560 [main] INFO  org.apache.sqoop.orm.CompilationManager  -
HADOOP_MAPRED_HOME is /usr/hdp/2.3.0.0-2557/hadoop-mapreduce
2015-07-21 06:38:19,331 INFO  [main] orm.CompilationManager
(CompilationManager.java:findHadoopJars(94)) - HADOOP_MAPRED_HOME is
/usr/hdp/2.3.0.0-2557/hadoop-mapreduce
9744 [main] INFO  org.apache.sqoop.orm.CompilationManager  - Writing
jar file: /tmp/sqoop-yarn/compile/457f0736705f75f68537447520d1a916/simple.jar
2015-07-21 06:38:22,515 INFO  [main] orm.CompilationManager
(CompilationManager.java:jar(330)) - Writing jar file:
/tmp/sqoop-yarn/compile/457f0736705f75f68537447520d1a916/simple.jar
9797 [main] WARN  org.apache.sqoop.manager.MySQLManager  - It looks
like you are importing from mysql.
2015-07-21 06:38:22,568 WARN  [main] manager.MySQLManager
(MySQLManager.java:importTable(107)) - It looks like you are importing
from mysql.
9797 [main] WARN  org.apache.sqoop.manager.MySQLManager  - This
transfer can be faster! Use the --direct
2015-07-21 06:38:22,568 WARN  [main] manager.MySQLManager
(MySQLManager.java:importTable(108)) - This transfer can be faster!
Use the --direct
9797 [main] WARN  org.apache.sqoop.manager.MySQLManager  - option to
exercise a MySQL-specific fast path.
2015-07-21 06:38:22,568 WARN  [main] manager.MySQLManager
(MySQLManager.java:importTable(109)) - option to exercise a
MySQL-specific fast path.
9797 [main] INFO  org.apache.sqoop.manager.MySQLManager  - Setting
zero DATETIME behavior to convertToNull (mysql)
2015-07-21 06:38:22,568 INFO  [main] manager.MySQLManager
(MySQLManager.java:checkDateTimeBehavior(189)) - Setting zero DATETIME
behavior to convertToNull (mysql)
9813 [main] INFO  org.apache.sqoop.mapreduce.ImportJobBase  -
Beginning import of simple
2015-07-21 06:38:22,584 INFO  [main] mapreduce.ImportJobBase
(ImportJobBase.java:runImport(235)) - Beginning import of simple
2015-07-21 06:38:22,585 INFO  [main] Configuration.deprecation
(Configuration.java:warnOnceIfDeprecated(1173)) - mapred.job.tracker
is deprecated. Instead, use mapreduce.jobtracker.address
2015-07-21 06:38:22,594 INFO  [main] Configuration.deprecation
(Configuration.java:warnOnceIfDeprecated(1173)) - mapred.jar is
deprecated. Instead, use mapreduce.job.jar

<<< Invocation of Main class completed <<<

Failing Oozie Launcher, Main class
[org.apache.oozie.action.hadoop.SqoopMain], main() threw exception,
org/apache/hive/hcatalog/mapreduce/HCatOutputFormat
java.lang.NoClassDefFoundError:
org/apache/hive/hcatalog/mapreduce/HCatOutputFormat
	at org.apache.sqoop.mapreduce.DataDrivenImportJob.getOutputFormatClass(DataDrivenImportJob.java:178)
	at org.apache.sqoop.mapreduce.ImportJobBase.configureOutputFormat(ImportJobBase.java:94)
	at org.apache.sqoop.mapreduce.ImportJobBase.runImport(ImportJobBase.java:259)
	at org.apache.sqoop.manager.SqlManager.importTable(SqlManager.java:673)
	at org.apache.sqoop.manager.MySQLManager.importTable(MySQLManager.java:118)
	at org.apache.sqoop.tool.ImportTool.importTable(ImportTool.java:497)
	at org.apache.sqoop.tool.ImportTool.run(ImportTool.java:605)
	at org.apache.sqoop.Sqoop.run(Sqoop.java:143)
	at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:70)
	at org.apache.sqoop.Sqoop.runSqoop(Sqoop.java:179)
	at org.apache.sqoop.Sqoop.runTool(Sqoop.java:218)
	at org.apache.sqoop.Sqoop.runTool(Sqoop.java:227)
	at org.apache.sqoop.Sqoop.main(Sqoop.java:236)
	at org.apache.oozie.action.hadoop.SqoopMain.runSqoopJob(SqoopMain.java:197)
	at org.apache.oozie.action.hadoop.SqoopMain.run(SqoopMain.java:177)
	at org.apache.oozie.action.hadoop.LauncherMain.run(LauncherMain.java:47)
	at org.apache.oozie.action.hadoop.SqoopMain.main(SqoopMain.java:46)
	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
	at java.lang.reflect.Method.invoke(Method.java:497)
	at org.apache.oozie.action.hadoop.LauncherMapper.map(LauncherMapper.java:236)
	at org.apache.hadoop.mapred.MapRunner.run(MapRunner.java:54)
	at org.apache.hadoop.mapred.MapTask.runOldMapper(MapTask.java:453)
	at org.apache.hadoop.mapred.MapTask.run(MapTask.java:343)
	at org.apache.hadoop.mapred.YarnChild$2.run(YarnChild.java:164)
	at java.security.AccessController.doPrivileged(Native Method)
	at javax.security.auth.Subject.doAs(Subject.java:422)
	at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1657)
	at org.apache.hadoop.mapred.YarnChild.main(YarnChild.java:158)
Caused by: java.lang.ClassNotFoundException:
org.apache.hive.hcatalog.mapreduce.HCatOutputFormat
	at java.net.URLClassLoader.findClass(URLClassLoader.java:381)
	at java.lang.ClassLoader.loadClass(ClassLoader.java:424)
	at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:331)
	at java.lang.ClassLoader.loadClass(ClassLoader.java:357)
	... 30 more

Oozie Launcher failed, finishing Hadoop job gracefully

Oozie Launcher, uploading action data to HDFS sequence file:
hdfs://c6401.ambari.apache.org:8020/user/ambari-qa/oozie-oozi/0000005-150721024948042-oozie-oozi-W/sqoop-node--sqoop/action-data.seq
2015-07-21
06:38:23,062 INFO  [main] zlib.ZlibFactory
(ZlibFactory.java:<clinit>(49)) - Successfully loaded & initialized
native-zlib library
2015-07-21 06:38:23,063 INFO  [main] compress.CodecPool
(CodecPool.java:getCompressor(153)) - Got brand-new compressor
[.deflate]

Oozie Launcher ends
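
(For readers hitting the same NoClassDefFoundError: as the replies further down in this thread establish, the fix is to make the hcatalog/hive sharelibs visible to the Sqoop action. A minimal sketch of the job configuration change, using the property named in the replies; the exact list of sharelib names to include may vary with your Oozie and HDP versions:

```properties
# job.properties for the workflow (or equivalent <property> entries
# in the action's <configuration> block)
oozie.use.system.libpath=true
# Put the HCatalog/Hive jars on the Sqoop action's classpath.
# "sqoop" keeps the default Sqoop sharelib; the rest are additions.
oozie.action.sharelib.for.sqoop=sqoop,hcatalog,hive,hive2
```

With this set, the launcher localizes the listed sharelib directories, so HCatOutputFormat is found without copying jars around manually.)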

Re: Sqoop action writing to HCATALOG via Oozie throws class not found exception

Posted by Laurent H <la...@gmail.com>.
I remember that someone has already posted a similar issue.

I had an issue when I tried to import data from an RDBMS this way, so I
decided to bring the data into HDFS first (with Sqoop and the text-file
import option), then use LOAD DATA with Hive.
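
(The two-step workaround described above can be sketched as follows; the host, paths, and staging directory are illustrative, reusing names from the commands earlier in the thread:

```shell
# Step 1: land the data in HDFS as plain text files (no HCatalog involved,
# so the HCatOutputFormat classpath problem never arises)
sqoop import \
  --connect jdbc:mysql://c6402/test \
  --table simple \
  --username sqoop_user \
  --password-file /user/ambari-qa/datastore/testdb_password \
  --num-mappers 2 --split-by id \
  --as-textfile \
  --target-dir /user/ambari-qa/staging/simple

# Step 2: move the staged files into the Hive table
hive -e "LOAD DATA INPATH '/user/ambari-qa/staging/simple' INTO TABLE default.simple;"
```

Note that LOAD DATA only works cleanly here if the staged text files match the table's expected field delimiters.)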

--
Laurent HATIER - Consultant Big Data & Business Intelligence chez CapGemini
fr.linkedin.com/pub/laurent-hatier/25/36b/a86/

2015-07-21 11:47 GMT+02:00 Venkat Ramachandran <me...@gmail.com>:

> Makes sense. Let me try
>
> On Tuesday, July 21, 2015, Shwetha Shivalingamurthy <
> sshivalingamurthy@hortonworks.com> wrote:
>
> > Hive-site.xml should be in the class path like
> >
> https://github.com/apache/oozie/blob/master/examples/src/main/apps/hcatalog
> > /workflow.xml
> >
> > Alternatively, you can probably add it to hive sharelib, I am not sure
> >
> > -Shwetha
> >
> > On 21/07/15 1:57 pm, "Venkatesan Ramachandran" <me.venkatr@gmail.com
> > <javascript:;>>
> > wrote:
> >
> > >Thanks Shwetha. Adding hcatalog,hive,hive2 to
> > >oozie.action.sharelib.for.sqoop
> > >in job configuration makes it move beyond the class not found exception.
> > >
> > >However, now Sqoop throws *java.io.IOException:
> > >NoSuchObjectException(message:default.simple table not found) *even
> though
> > >I have manually created the table in default.simple.
> > >
> > >How does the Sqoop action (or oozie) know how to talk to HCAT/metastore
> > >server? is there some other config I'm missing:
> > >
> > >
> > >
> > >*Sqoop action workflow.xml*<workflow-app xmlns="uri:oozie:workflow:0.2"
> > >name="sqoop-wf">
> > >    <start to="sqoop-node"/>
> > >    <action name="sqoop-node">
> > >        <sqoop xmlns="uri:oozie:sqoop-action:0.2">
> > >            <job-tracker>${jobTracker}</job-tracker>
> > >            <name-node>${nameNode}</name-node>
> > >            <configuration>
> > >                <property>
> > >                    <name>mapred.job.queue.name</name>
> > >                    <value>${queueName}</value>
> > >                </property>
> > >            </configuration>
> > >        <command>
> > >        import --connect jdbc:mysql://c6402/test --table simple
> --username
> > >sqoop_user --password-file /user/ambari-qa/datastore/testdb_password
> > >--num-mappers 2 --split-by id --hcatalog-database default
> --hcatalog-table
> > >simple
> > >          </command>
> > >        </sqoop>
> > >        <ok to="end"/>
> > >        <error to="fail"/>
> > >    </action>
> > >
> > >    <kill name="fail">
> > >        <message>Sqoop failed, error
> > >message[${wf:errorMessage(wf:lastErrorNode())}]</message>
> > >    </kill>
> > >    <end name="end"/>
> > ></workflow-app>
> > >
> > >*Exception*:
> > >
> > >Sqoop command arguments :
> > >             import
> > >             --connect
> > >             jdbc:mysql://c6402/test
> > >             --table
> > >             simple
> > >             --username
> > >             sqoop_user
> > >             --password-file
> > >             /user/ambari-qa/datastore/testdb_password
> > >             --num-mappers
> > >             2
> > >             --split-by
> > >             id
> > >             --hcatalog-database
> > >             default
> > >             --hcatalog-table
> > >             simple
> > >
> > >
> > >
> > >7511 [main] INFO  org.apache.sqoop.mapreduce.hcat.SqoopHCatUtilities
> > >- Database column names projected : [id, name, value, modified_ts]
> > >2015-07-21 08:18:43,718 INFO  [main] hcat.SqoopHCatUtilities
> > >(SqoopHCatUtilities.java:initDBColumnInfo(519)) - Database column
> > >names projected : [id, name, value, modified_ts]
> > >7511 [main] INFO  org.apache.sqoop.mapreduce.hcat.SqoopHCatUtilities
> > >- Database column name - info map :
> > >       modified_ts : [Type : 93,Precision : 19,Scale : 0]
> > >       name : [Type : 12,Precision : 20,Scale : 0]
> > >       id : [Type : 4,Precision : 11,Scale : 0]
> > >       value : [Type : 4,Precision : 11,Scale : 0]
> > >
> > >2015-07-21 08:18:43,718 INFO  [main] hcat.SqoopHCatUtilities
> > >(SqoopHCatUtilities.java:initDBColumnInfo(530)) - Database column name
> > >- info map :
> > >       modified_ts : [Type : 93,Precision : 19,Scale : 0]
> > >       name : [Type : 12,Precision : 20,Scale : 0]
> > >       id : [Type : 4,Precision : 11,Scale : 0]
> > >       value : [Type : 4,Precision : 11,Scale : 0]
> > >
> > >2015-07-21 08:18:44,244 INFO  [main] metastore.HiveMetaStore
> > >(HiveMetaStore.java:newRawStore(589)) - 0: Opening raw store with
> > >implemenation class:org.apache.hadoop.hive.metastore.ObjectStore
> > >2015-07-21 08:18:44,286 INFO  [main] metastore.ObjectStore
> > >(ObjectStore.java:initialize(289)) - ObjectStore, initialize called
> > >2015-07-21 08:18:44,682 INFO  [main] DataNucleus.Persistence
> > >(Log4JLogger.java:info(77)) - Property
> > >hive.metastore.integral.jdo.pushdown unknown - will be ignored
> > >2015-07-21 08:18:44,682 INFO  [main] DataNucleus.Persistence
> > >(Log4JLogger.java:info(77)) - Property datanucleus.cache.level2
> > >unknown - will be ignored
> > >2015-07-21 08:18:49,025 INFO  [main] metastore.ObjectStore
> > >(ObjectStore.java:getPMF(370)) - Setting MetaStore object pin classes
> > >with
> >
> >hive.metastore.cache.pinobjtypes="Table,StorageDescriptor,SerDeInfo,Partit
> > >ion,Database,Type,FieldSchema,Order"
> > >2015-07-21 08:18:51,339 INFO  [main] DataNucleus.Datastore
> > >(Log4JLogger.java:info(77)) - The class
> > >"org.apache.hadoop.hive.metastore.model.MFieldSchema" is tagged as
> > >"embedded-only" so does not have its own datastore table.
> > >2015-07-21 08:18:51,341 INFO  [main] DataNucleus.Datastore
> > >(Log4JLogger.java:info(77)) - The class
> > >"org.apache.hadoop.hive.metastore.model.MOrder" is tagged as
> > >"embedded-only" so does not have its own datastore table.
> > >2015-07-21 08:18:53,638 INFO  [main] DataNucleus.Datastore
> > >(Log4JLogger.java:info(77)) - The class
> > >"org.apache.hadoop.hive.metastore.model.MFieldSchema" is tagged as
> > >"embedded-only" so does not have its own datastore table.
> > >2015-07-21 08:18:53,638 INFO  [main] DataNucleus.Datastore
> > >(Log4JLogger.java:info(77)) - The class
> > >"org.apache.hadoop.hive.metastore.model.MOrder" is tagged as
> > >"embedded-only" so does not have its own datastore table.
> > >2015-07-21 08:18:54,116 INFO  [main] metastore.MetaStoreDirectSql
> > >(MetaStoreDirectSql.java:<init>(139)) - Using direct SQL, underlying
> > >DB is DERBY
> > >2015-07-21 08:18:54,120 INFO  [main] metastore.ObjectStore
> > >(ObjectStore.java:setConf(272)) - Initialized ObjectStore
> > >2015-07-21 08:18:54,303 WARN  [main] metastore.ObjectStore
> > >(ObjectStore.java:checkSchema(6658)) - Version information not found
> > >in metastore. hive.metastore.schema.verification is not enabled so
> > >recording the schema version 1.2.0
> > >2015-07-21 08:18:54,508 WARN  [main] metastore.ObjectStore
> > >(ObjectStore.java:getDatabase(568)) - Failed to get database default,
> > >returning NoSuchObjectException
> > >2015-07-21 08:18:54,694 INFO  [main] metastore.HiveMetaStore
> > >(HiveMetaStore.java:createDefaultRoles_core(663)) - Added admin role
> > >in metastore
> > >2015-07-21 08:18:54,701 INFO  [main] metastore.HiveMetaStore
> > >(HiveMetaStore.java:createDefaultRoles_core(672)) - Added public role
> > >in metastore
> > >2015-07-21 08:18:54,813 INFO  [main] metastore.HiveMetaStore
> > >(HiveMetaStore.java:addAdminUsers_core(712)) - No user is added in
> > >admin role, since config is empty
> > >2015-07-21 08:18:54,980 INFO  [main] metastore.HiveMetaStore
> > >(HiveMetaStore.java:logInfo(746)) - 0: get_databases:
> > >NonExistentDatabaseUsedForHealthCheck
> > >2015-07-21 08:18:54,981 INFO  [main] HiveMetaStore.audit
> > >(HiveMetaStore.java:logAuditEvent(371)) -
> > >ugi=ambari-qa  ip=unknown-ip-addr      cmd=get_databases:
> > >NonExistentDatabaseUsedForHealthCheck
> > >2015-07-21 08:18:55,019 INFO  [main] metastore.HiveMetaStore
> > >(HiveMetaStore.java:logInfo(746)) - 0: get_table : db=default
> > >tbl=simple
> > >2015-07-21 08:18:55,022 INFO  [main] HiveMetaStore.audit
> > >(HiveMetaStore.java:logAuditEvent(371)) -
> > >ugi=ambari-qa  ip=unknown-ip-addr      cmd=get_table : db=default
> > tbl=simple
> > >18847 [main] ERROR org.apache.sqoop.tool.ImportTool  - Encountered
> > >IOException running import job: java.io.IOException:
> > >NoSuchObjectException(message:default.simple table not found)
> > >       at
> >
> >org.apache.hive.hcatalog.mapreduce.HCatInputFormat.setInput(HCatInputForma
> > >t.java:97)
> > >       at
> >
> >org.apache.hive.hcatalog.mapreduce.HCatInputFormat.setInput(HCatInputForma
> > >t.java:51)
> > >       at
> >
> >org.apache.sqoop.mapreduce.hcat.SqoopHCatUtilities.configureHCat(SqoopHCat
> > >Utilities.java:343)
> > >       at
> >
> >org.apache.sqoop.mapreduce.hcat.SqoopHCatUtilities.configureImportOutputFo
> > >rmat(SqoopHCatUtilities.java:783)
> > >       at
> >
> >org.apache.sqoop.mapreduce.ImportJobBase.configureOutputFormat(ImportJobBa
> > >se.java:98)
> > >       at
> >
> >org.apache.sqoop.mapreduce.ImportJobBase.runImport(ImportJobBase.java:259)
> > >       at
> > org.apache.sqoop.manager.SqlManager.importTable(SqlManager.java:673)
> > >       at
> > >org.apache.sqoop.manager.MySQLManager.importTable(MySQLManager.java:118)
> > >       at
> > org.apache.sqoop.tool.ImportTool.importTable(ImportTool.java:497)
> > >       at org.apache.sqoop.tool.ImportTool.run(ImportTool.java:605)
> > >       at org.apache.sqoop.Sqoop.run(Sqoop.java:143)
> > >       at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:70)
> > >       at org.apache.sqoop.Sqoop.runSqoop(Sqoop.java:179)
> > >       at org.apache.sqoop.Sqoop.runTool(Sqoop.java:218)
> > >       at org.apache.sqoop.Sqoop.runTool(Sqoop.java:227)
> > >       at org.apache.sqoop.Sqoop.main(Sqoop.java:236)
> > >       at
> > >org.apache.oozie.action.hadoop.SqoopMain.runSqoopJob(SqoopMain.java:197)
> > >       at
> org.apache.oozie.action.hadoop.SqoopMain.run(SqoopMain.java:177)
> > >       at
> > org.apache.oozie.action.hadoop.LauncherMain.run(LauncherMain.java:47)
> > >       at
> org.apache.oozie.action.hadoop.SqoopMain.main(SqoopMain.java:46)
> > >       at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
> > >       at
> >
> >sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:
> > >62)
> > >       at
> >
> >sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorIm
> > >pl.java:43)
> > >       at java.lang.reflect.Method.invoke(Method.java:497)
> > >       at
> >
> >org.apache.oozie.action.hadoop.LauncherMapper.map(LauncherMapper.java:236)
> > >       at org.apache.hadoop.mapred.MapRunner.run(MapRunner.java:54)
> > >       at
> org.apache.hadoop.mapred.MapTask.runOldMapper(MapTask.java:453)
> > >       at org.apache.hadoop.mapred.MapTask.run(MapTask.java:343)
> > >       at org.apache.hadoop.mapred.YarnChild$2.run(YarnChild.java:164)
> > >       at java.security.AccessController.doPrivileged(Native Method)
> > >       at javax.security.auth.Subject.doAs(Subject.java:422)
> > >       at
> >
> >org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.
> > >java:1657)
> > >       at org.apache.hadoop.mapred.YarnChild.main(YarnChild.java:158)
> > >Caused by: NoSuchObjectException(message:default.simple table not found)
> > >       at
> >
> >org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.get_table_core(H
> > >iveMetaStore.java:1808)
> > >       at
> >
> >org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.get_table(HiveMe
> > >taStore.java:1778)
> > >       at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
> > >       at
> >
> >sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:
> > >62)
> > >       at
> >
> >sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorIm
> > >pl.java:43)
> > >       at java.lang.reflect.Method.invoke(Method.java:497)
> > >       at
> >
> >org.apache.hadoop.hive.metastore.RetryingHMSHandler.invoke(RetryingHMSHand
> > >ler.java:107)
> > >       at com.sun.proxy.$Proxy20.get_table(Unknown Source)
> > >       at
> >
> >org.apache.hadoop.hive.metastore.HiveMetaStoreClient.getTable(HiveMetaStor
> > >eClient.java:1208)
> > >       at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
> > >       at
> >
> >sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:
> > >62)
> > >       at
> >
> >sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorIm
> > >pl.java:43)
> > >       at java.lang.reflect.Method.invoke(Method.java:497)
> > >       at
> >
> >org.apache.hadoop.hive.metastore.RetryingMetaStoreClient.invoke(RetryingMe
> > >taStoreClient.java:152)
> > >       at com.sun.proxy.$Proxy21.getTable(Unknown Source)
> > >       at
> > org.apache.hive.hcatalog.common.HCatUtil.getTable(HCatUtil.java:180)
> > >       at
> >
> >org.apache.hive.hcatalog.mapreduce.InitializeInput.getInputJobInfo(Initial
> > >izeInput.java:105)
> > >       at
> >
> >org.apache.hive.hcatalog.mapreduce.InitializeInput.setInput(InitializeInpu
> > >t.java:86)
> > >       at
> >
> >org.apache.hive.hcatalog.mapreduce.HCatInputFormat.setInput(HCatInputForma
> > >t.java:95)
> > >       ... 32 more
> >
> >
>
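
(Shwetha's suggestion above, putting hive-site.xml on the action's classpath as in the linked example workflow, would look roughly like this in the Sqoop action; the HDFS path to hive-site.xml is illustrative:

```xml
<sqoop xmlns="uri:oozie:sqoop-action:0.2">
    <job-tracker>${jobTracker}</job-tracker>
    <name-node>${nameNode}</name-node>
    <command>import --connect jdbc:mysql://c6402/test --table simple ...</command>
    <!-- Ship hive-site.xml with the launcher so Sqoop/HCatalog picks up
         hive.metastore.uris, instead of falling back to a local embedded
         Derby metastore (visible in the log above: "underlying DB is DERBY"),
         which is why the table appears not to exist -->
    <file>/user/ambari-qa/conf/hive-site.xml#hive-site.xml</file>
</sqoop>
```

The "Failed to get database default" and NoSuchObjectException entries in the log are consistent with the job talking to a fresh embedded metastore rather than the cluster's metastore server.)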

Re: Sqoop action writing to HCATALOG via Oozie throws class not found exception

Posted by Venkat Ramachandran <me...@gmail.com>.
Makes sense. Let me try

> >       at
> org.apache.sqoop.tool.ImportTool.importTable(ImportTool.java:497)
> >       at org.apache.sqoop.tool.ImportTool.run(ImportTool.java:605)
> >       at org.apache.sqoop.Sqoop.run(Sqoop.java:143)
> >       at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:70)
> >       at org.apache.sqoop.Sqoop.runSqoop(Sqoop.java:179)
> >       at org.apache.sqoop.Sqoop.runTool(Sqoop.java:218)
> >       at org.apache.sqoop.Sqoop.runTool(Sqoop.java:227)
> >       at org.apache.sqoop.Sqoop.main(Sqoop.java:236)
> >       at
> >org.apache.oozie.action.hadoop.SqoopMain.runSqoopJob(SqoopMain.java:197)
> >       at org.apache.oozie.action.hadoop.SqoopMain.run(SqoopMain.java:177)
> >       at
> org.apache.oozie.action.hadoop.LauncherMain.run(LauncherMain.java:47)
> >       at org.apache.oozie.action.hadoop.SqoopMain.main(SqoopMain.java:46)
> >       at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
> >       at
> >sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:
> >62)
> >       at
> >sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorIm
> >pl.java:43)
> >       at java.lang.reflect.Method.invoke(Method.java:497)
> >       at
> >org.apache.oozie.action.hadoop.LauncherMapper.map(LauncherMapper.java:236)
> >       at org.apache.hadoop.mapred.MapRunner.run(MapRunner.java:54)
> >       at org.apache.hadoop.mapred.MapTask.runOldMapper(MapTask.java:453)
> >       at org.apache.hadoop.mapred.MapTask.run(MapTask.java:343)
> >       at org.apache.hadoop.mapred.YarnChild$2.run(YarnChild.java:164)
> >       at java.security.AccessController.doPrivileged(Native Method)
> >       at javax.security.auth.Subject.doAs(Subject.java:422)
> >       at
> >org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.
> >java:1657)
> >       at org.apache.hadoop.mapred.YarnChild.main(YarnChild.java:158)
> >Caused by: NoSuchObjectException(message:default.simple table not found)
> >       at
> >org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.get_table_core(H
> >iveMetaStore.java:1808)
> >       at
> >org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.get_table(HiveMe
> >taStore.java:1778)
> >       at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
> >       at
> >sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:
> >62)
> >       at
> >sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorIm
> >pl.java:43)
> >       at java.lang.reflect.Method.invoke(Method.java:497)
> >       at
> >org.apache.hadoop.hive.metastore.RetryingHMSHandler.invoke(RetryingHMSHand
> >ler.java:107)
> >       at com.sun.proxy.$Proxy20.get_table(Unknown Source)
> >       at
> >org.apache.hadoop.hive.metastore.HiveMetaStoreClient.getTable(HiveMetaStor
> >eClient.java:1208)
> >       at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
> >       at
> >sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:
> >62)
> >       at
> >sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorIm
> >pl.java:43)
> >       at java.lang.reflect.Method.invoke(Method.java:497)
> >       at
> >org.apache.hadoop.hive.metastore.RetryingMetaStoreClient.invoke(RetryingMe
> >taStoreClient.java:152)
> >       at com.sun.proxy.$Proxy21.getTable(Unknown Source)
> >       at
> org.apache.hive.hcatalog.common.HCatUtil.getTable(HCatUtil.java:180)
> >       at
> >org.apache.hive.hcatalog.mapreduce.InitializeInput.getInputJobInfo(Initial
> >izeInput.java:105)
> >       at
> >org.apache.hive.hcatalog.mapreduce.InitializeInput.setInput(InitializeInpu
> >t.java:86)
> >       at
> >org.apache.hive.hcatalog.mapreduce.HCatInputFormat.setInput(HCatInputForma
> >t.java:95)
> >       ... 32 more
>
>

Re: Sqoop action writing to HCATALOG via Oozie throws class not found exception

Posted by Shwetha Shivalingamurthy <ss...@hortonworks.com>.
hive-site.xml should be on the classpath, as in
https://github.com/apache/oozie/blob/master/examples/src/main/apps/hcatalog/workflow.xml
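For example, the action can ship its own copy of hive-site.xml via a <file> element, so the Sqoop launcher picks up the real metastore settings instead of falling back to an empty embedded Derby metastore. This is a sketch, not a tested workflow: it assumes hive-site.xml has been copied from the cluster's Hive conf directory and uploaded next to workflow.xml in the application directory.

```xml
<sqoop xmlns="uri:oozie:sqoop-action:0.2">
    <job-tracker>${jobTracker}</job-tracker>
    <name-node>${nameNode}</name-node>
    <command>
    import --connect jdbc:mysql://c6402/test --table simple --username
    sqoop_user --password-file /user/ambari-qa/datastore/testdb_password
    --num-mappers 2 --split-by id --hcatalog-database default
    --hcatalog-table simple
    </command>
    <!-- ship hive-site.xml into the launcher's working directory so the
         HCatalog client can find the real metastore -->
    <file>hive-site.xml</file>
</sqoop>
```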

Alternatively, you can probably add it to the hive sharelib, but I am not sure

-Shwetha

On 21/07/15 1:57 pm, "Venkatesan Ramachandran" <me...@gmail.com>
wrote:

>Thanks Shwetha. Adding hcatalog,hive,hive2 to
>oozie.action.sharelib.for.sqoop
>in job configuration makes it move beyond the class not found exception.
>
>However, now Sqoop throws *java.io.IOException:
>NoSuchObjectException(message:default.simple table not found) *even though
>I have manually created the table in default.simple.
>
>How does the Sqoop action (or oozie) know how to talk to HCAT/metastore
>server? is there some other config I'm missing:


Re: Sqoop action writing to HCATALOG via Oozie throws class not found exception

Posted by Venkatesan Ramachandran <me...@gmail.com>.
Thanks Shwetha. Adding hcatalog,hive,hive2 to oozie.action.sharelib.for.sqoop
in job configuration makes it move beyond the class not found exception.

However, now Sqoop throws *java.io.IOException:
NoSuchObjectException(message:default.simple table not found)* even though
I have manually created the table simple in the default database.

How does the Sqoop action (or Oozie) know how to talk to the
HCatalog/metastore server? Is there some other config I'm missing?



*Sqoop action workflow.xml*

<workflow-app xmlns="uri:oozie:workflow:0.2" name="sqoop-wf">
    <start to="sqoop-node"/>
    <action name="sqoop-node">
        <sqoop xmlns="uri:oozie:sqoop-action:0.2">
            <job-tracker>${jobTracker}</job-tracker>
            <name-node>${nameNode}</name-node>
            <configuration>
                <property>
                    <name>mapred.job.queue.name</name>
                    <value>${queueName}</value>
                </property>
            </configuration>
        <command>
        import --connect jdbc:mysql://c6402/test --table simple --username
sqoop_user --password-file /user/ambari-qa/datastore/testdb_password
--num-mappers 2 --split-by id --hcatalog-database default --hcatalog-table
simple
          </command>
        </sqoop>
        <ok to="end"/>
        <error to="fail"/>
    </action>

    <kill name="fail">
        <message>Sqoop failed, error
message[${wf:errorMessage(wf:lastErrorNode())}]</message>
    </kill>
    <end name="end"/>
</workflow-app>
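The log below hints at the answer to the metastore question: "Using direct SQL, underlying DB is DERBY" means the launcher found no hive-site.xml and started a local embedded Derby metastore, which is empty, so default.simple can never be found there. A hedged sketch of the hive-site.xml fragment that would need to be on the action's classpath follows; the Thrift host and port are assumptions and should be copied from the cluster's actual Hive configuration.

```xml
<!-- hive-site.xml fragment; the thrift host/port below are assumptions -->
<configuration>
  <property>
    <name>hive.metastore.uris</name>
    <value>thrift://c6402.ambari.apache.org:9083</value>
  </property>
</configuration>
```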

*Exception*:

Sqoop command arguments :
             import
             --connect
             jdbc:mysql://c6402/test
             --table
             simple
             --username
             sqoop_user
             --password-file
             /user/ambari-qa/datastore/testdb_password
             --num-mappers
             2
             --split-by
             id
             --hcatalog-database
             default
             --hcatalog-table
             simple



7511 [main] INFO  org.apache.sqoop.mapreduce.hcat.SqoopHCatUtilities
- Database column names projected : [id, name, value, modified_ts]
2015-07-21 08:18:43,718 INFO  [main] hcat.SqoopHCatUtilities
(SqoopHCatUtilities.java:initDBColumnInfo(519)) - Database column
names projected : [id, name, value, modified_ts]
7511 [main] INFO  org.apache.sqoop.mapreduce.hcat.SqoopHCatUtilities
- Database column name - info map :
	modified_ts : [Type : 93,Precision : 19,Scale : 0]
	name : [Type : 12,Precision : 20,Scale : 0]
	id : [Type : 4,Precision : 11,Scale : 0]
	value : [Type : 4,Precision : 11,Scale : 0]

2015-07-21 08:18:43,718 INFO  [main] hcat.SqoopHCatUtilities
(SqoopHCatUtilities.java:initDBColumnInfo(530)) - Database column name
- info map :
	modified_ts : [Type : 93,Precision : 19,Scale : 0]
	name : [Type : 12,Precision : 20,Scale : 0]
	id : [Type : 4,Precision : 11,Scale : 0]
	value : [Type : 4,Precision : 11,Scale : 0]

2015-07-21 08:18:44,244 INFO  [main] metastore.HiveMetaStore
(HiveMetaStore.java:newRawStore(589)) - 0: Opening raw store with
implemenation class:org.apache.hadoop.hive.metastore.ObjectStore
2015-07-21 08:18:44,286 INFO  [main] metastore.ObjectStore
(ObjectStore.java:initialize(289)) - ObjectStore, initialize called
2015-07-21 08:18:44,682 INFO  [main] DataNucleus.Persistence
(Log4JLogger.java:info(77)) - Property
hive.metastore.integral.jdo.pushdown unknown - will be ignored
2015-07-21 08:18:44,682 INFO  [main] DataNucleus.Persistence
(Log4JLogger.java:info(77)) - Property datanucleus.cache.level2
unknown - will be ignored
2015-07-21 08:18:49,025 INFO  [main] metastore.ObjectStore
(ObjectStore.java:getPMF(370)) - Setting MetaStore object pin classes
with hive.metastore.cache.pinobjtypes="Table,StorageDescriptor,SerDeInfo,Partition,Database,Type,FieldSchema,Order"
2015-07-21 08:18:51,339 INFO  [main] DataNucleus.Datastore
(Log4JLogger.java:info(77)) - The class
"org.apache.hadoop.hive.metastore.model.MFieldSchema" is tagged as
"embedded-only" so does not have its own datastore table.
2015-07-21 08:18:51,341 INFO  [main] DataNucleus.Datastore
(Log4JLogger.java:info(77)) - The class
"org.apache.hadoop.hive.metastore.model.MOrder" is tagged as
"embedded-only" so does not have its own datastore table.
2015-07-21 08:18:53,638 INFO  [main] DataNucleus.Datastore
(Log4JLogger.java:info(77)) - The class
"org.apache.hadoop.hive.metastore.model.MFieldSchema" is tagged as
"embedded-only" so does not have its own datastore table.
2015-07-21 08:18:53,638 INFO  [main] DataNucleus.Datastore
(Log4JLogger.java:info(77)) - The class
"org.apache.hadoop.hive.metastore.model.MOrder" is tagged as
"embedded-only" so does not have its own datastore table.
2015-07-21 08:18:54,116 INFO  [main] metastore.MetaStoreDirectSql
(MetaStoreDirectSql.java:<init>(139)) - Using direct SQL, underlying
DB is DERBY
2015-07-21 08:18:54,120 INFO  [main] metastore.ObjectStore
(ObjectStore.java:setConf(272)) - Initialized ObjectStore
2015-07-21 08:18:54,303 WARN  [main] metastore.ObjectStore
(ObjectStore.java:checkSchema(6658)) - Version information not found
in metastore. hive.metastore.schema.verification is not enabled so
recording the schema version 1.2.0
2015-07-21 08:18:54,508 WARN  [main] metastore.ObjectStore
(ObjectStore.java:getDatabase(568)) - Failed to get database default,
returning NoSuchObjectException
2015-07-21 08:18:54,694 INFO  [main] metastore.HiveMetaStore
(HiveMetaStore.java:createDefaultRoles_core(663)) - Added admin role
in metastore
2015-07-21 08:18:54,701 INFO  [main] metastore.HiveMetaStore
(HiveMetaStore.java:createDefaultRoles_core(672)) - Added public role
in metastore
2015-07-21 08:18:54,813 INFO  [main] metastore.HiveMetaStore
(HiveMetaStore.java:addAdminUsers_core(712)) - No user is added in
admin role, since config is empty
2015-07-21 08:18:54,980 INFO  [main] metastore.HiveMetaStore
(HiveMetaStore.java:logInfo(746)) - 0: get_databases:
NonExistentDatabaseUsedForHealthCheck
2015-07-21 08:18:54,981 INFO  [main] HiveMetaStore.audit
(HiveMetaStore.java:logAuditEvent(371)) -
ugi=ambari-qa	ip=unknown-ip-addr	cmd=get_databases:
NonExistentDatabaseUsedForHealthCheck
2015-07-21 08:18:55,019 INFO  [main] metastore.HiveMetaStore
(HiveMetaStore.java:logInfo(746)) - 0: get_table : db=default
tbl=simple
2015-07-21 08:18:55,022 INFO  [main] HiveMetaStore.audit
(HiveMetaStore.java:logAuditEvent(371)) -
ugi=ambari-qa	ip=unknown-ip-addr	cmd=get_table : db=default tbl=simple
18847 [main] ERROR org.apache.sqoop.tool.ImportTool  - Encountered
IOException running import job: java.io.IOException:
NoSuchObjectException(message:default.simple table not found)
	at org.apache.hive.hcatalog.mapreduce.HCatInputFormat.setInput(HCatInputFormat.java:97)
	at org.apache.hive.hcatalog.mapreduce.HCatInputFormat.setInput(HCatInputFormat.java:51)
	at org.apache.sqoop.mapreduce.hcat.SqoopHCatUtilities.configureHCat(SqoopHCatUtilities.java:343)
	at org.apache.sqoop.mapreduce.hcat.SqoopHCatUtilities.configureImportOutputFormat(SqoopHCatUtilities.java:783)
	at org.apache.sqoop.mapreduce.ImportJobBase.configureOutputFormat(ImportJobBase.java:98)
	at org.apache.sqoop.mapreduce.ImportJobBase.runImport(ImportJobBase.java:259)
	at org.apache.sqoop.manager.SqlManager.importTable(SqlManager.java:673)
	at org.apache.sqoop.manager.MySQLManager.importTable(MySQLManager.java:118)
	at org.apache.sqoop.tool.ImportTool.importTable(ImportTool.java:497)
	at org.apache.sqoop.tool.ImportTool.run(ImportTool.java:605)
	at org.apache.sqoop.Sqoop.run(Sqoop.java:143)
	at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:70)
	at org.apache.sqoop.Sqoop.runSqoop(Sqoop.java:179)
	at org.apache.sqoop.Sqoop.runTool(Sqoop.java:218)
	at org.apache.sqoop.Sqoop.runTool(Sqoop.java:227)
	at org.apache.sqoop.Sqoop.main(Sqoop.java:236)
	at org.apache.oozie.action.hadoop.SqoopMain.runSqoopJob(SqoopMain.java:197)
	at org.apache.oozie.action.hadoop.SqoopMain.run(SqoopMain.java:177)
	at org.apache.oozie.action.hadoop.LauncherMain.run(LauncherMain.java:47)
	at org.apache.oozie.action.hadoop.SqoopMain.main(SqoopMain.java:46)
	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
	at java.lang.reflect.Method.invoke(Method.java:497)
	at org.apache.oozie.action.hadoop.LauncherMapper.map(LauncherMapper.java:236)
	at org.apache.hadoop.mapred.MapRunner.run(MapRunner.java:54)
	at org.apache.hadoop.mapred.MapTask.runOldMapper(MapTask.java:453)
	at org.apache.hadoop.mapred.MapTask.run(MapTask.java:343)
	at org.apache.hadoop.mapred.YarnChild$2.run(YarnChild.java:164)
	at java.security.AccessController.doPrivileged(Native Method)
	at javax.security.auth.Subject.doAs(Subject.java:422)
	at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1657)
	at org.apache.hadoop.mapred.YarnChild.main(YarnChild.java:158)
Caused by: NoSuchObjectException(message:default.simple table not found)
	at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.get_table_core(HiveMetaStore.java:1808)
	at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.get_table(HiveMetaStore.java:1778)
	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
	at java.lang.reflect.Method.invoke(Method.java:497)
	at org.apache.hadoop.hive.metastore.RetryingHMSHandler.invoke(RetryingHMSHandler.java:107)
	at com.sun.proxy.$Proxy20.get_table(Unknown Source)
	at org.apache.hadoop.hive.metastore.HiveMetaStoreClient.getTable(HiveMetaStoreClient.java:1208)
	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
	at java.lang.reflect.Method.invoke(Method.java:497)
	at org.apache.hadoop.hive.metastore.RetryingMetaStoreClient.invoke(RetryingMetaStoreClient.java:152)
	at com.sun.proxy.$Proxy21.getTable(Unknown Source)
	at org.apache.hive.hcatalog.common.HCatUtil.getTable(HCatUtil.java:180)
	at org.apache.hive.hcatalog.mapreduce.InitializeInput.getInputJobInfo(InitializeInput.java:105)
	at org.apache.hive.hcatalog.mapreduce.InitializeInput.setInput(InitializeInput.java:86)
	at org.apache.hive.hcatalog.mapreduce.HCatInputFormat.setInput(HCatInputFormat.java:95)
	... 32 more

Re: Sqoop action writing to HCATALOG via Oozie throws class not found exception

Posted by Shwetha Shivalingamurthy <ss...@hortonworks.com>.
By default, only the sqoop sharelib is loaded for sqoop actions. Add the
hcatalog sharelib as well:
http://oozie.apache.org/docs/4.2.0/WorkflowFunctionalSpec.html#a17.1_Action_Share_Library_Override_since_Oozie_3.3
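In job.properties the override is a single property; the exact sharelib names here are an assumption and should be checked against what `oozie admin -shareliblist` reports on the cluster.

```properties
# load the hcatalog (and hive) sharelibs in addition to sqoop
oozie.use.system.libpath=true
oozie.action.sharelib.for.sqoop=sqoop,hcatalog,hive
```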

-Shwetha

On 21/07/15 12:45 pm, "Venkatesan Ramachandran" <me...@gmail.com>
wrote:

>Hi All,
>
>I'm running the following Sqoop command to write to HCATALOG on the
>command
>line and it works as expected;
>
>sqoop import --connect jdbc:mysql://c6402/test --table simple --username
>sqoop_user --password-file /user/ambari-qa/datastore/testdb_password
>--num-mappers 2 --split-by id --hcatalog-database default --hcatalog-table
>simple --create-hcatalog-table
>
>But when I run it via Oozie Sqoop action as below, it throws a class not
>found exception for org/apache/hive/hcatalog/mapreduce/HCatOutputFormat.
>
>I have all the hcatalog jars in the oozie shared lib in the following
>path:
>
>-rw-r--r--   3 oozie hdfs    1997485 2015-07-21 01:08
>/user/oozie/share/lib/lib_20150721010816/hcatalog/ant-1.9.1.jar
>-rw-r--r--   3 oozie hdfs      18336 2015-07-21 01:09
>/user/oozie/share/lib/lib_20150721010816/hcatalog/ant-launcher-1.9.1.jar
>-rw-r--r--   3 oozie hdfs     346729 2015-07-21 01:09
>/user/oozie/share/lib/lib_20150721010816/hcatalog/apache-log4j-extras-1.1.jar
>-rw-r--r--   3 oozie hdfs     110600 2015-07-21 01:08
>/user/oozie/share/lib/lib_20150721010816/hcatalog/bonecp-0.8.0.RELEASE.jar
>-rw-r--r--   3 oozie hdfs     339666 2015-07-21 01:08
>/user/oozie/share/lib/lib_20150721010816/hcatalog/datanucleus-api-jdo-3.2.6.jar
>-rw-r--r--   3 oozie hdfs     294321 2015-07-21 01:08
>/user/oozie/share/lib/lib_20150721010816/hcatalog/hive-common-1.2.1.2.3.0.0-2557.jar
>-rw-r--r--   3 oozie hdfs     257633 2015-07-21 01:09
>/user/oozie/share/lib/lib_20150721010816/hcatalog/hive-hcatalog-core-1.2.1.2.3.0.0-2557.jar
>-rw-r--r--   3 oozie hdfs      54328 2015-07-21 01:08
>/user/oozie/share/lib/lib_20150721010816/hcatalog/hive-hcatalog-server-extensions-1.2.1.2.3.0.0-2557.jar
>-rw-r--r--   3 oozie hdfs    5505255 2015-07-21 01:09
>/user/oozie/share/lib/lib_20150721010816/hcatalog/hive-metastore-1.2.1.2.3.0.0-2557.jar
>-rw-r--r--   3 oozie hdfs     916926 2015-07-21 01:08
>/user/oozie/share/lib/lib_20150721010816/hcatalog/hive-serde-1.2.1.2.3.0.0-2557.jar
>-rw-r--r--   3 oozie hdfs     108030 2015-07-21 01:08
>/user/oozie/share/lib/lib_20150721010816/hcatalog/hive-webhcat-java-client-1.2.1.2.3.0.0-2557.jar
>-rw-r--r--   3 oozie hdfs     232248 2015-07-21 01:08
>/user/oozie/share/lib/lib_20150721010816/hcatalog/jackson-core-asl-1.9.13.jar
>-rw-r--r--   3 oozie hdfs     780664 2015-07-21 01:09
>/user/oozie/share/lib/lib_20150721010816/hcatalog/jackson-mapper-asl-1.9.13.jar
>-rw-r--r--   3 oozie hdfs     201124 2015-07-21 01:09
>/user/oozie/share/lib/lib_20150721010816/hcatalog/jdo-api-3.0.1.jar
>-rw-r--r--   3 oozie hdfs     570478 2015-07-21 01:08
>/user/oozie/share/lib/lib_20150721010816/hcatalog/joda-time-2.1.jar
>-rw-r--r--   3 oozie hdfs     313686 2015-07-21 01:08
>/user/oozie/share/lib/lib_20150721010816/hcatalog/libfb303-0.9.2.jar
>-rw-r--r--   3 oozie hdfs     481535 2015-07-21 01:08
>/user/oozie/share/lib/lib_20150721010816/hcatalog/log4j-1.2.16.jar
>-rw-r--r--   3 oozie hdfs      16134 2015-07-21 01:08
>/user/oozie/share/lib/lib_20150721010816/hcatalog/oozie-sharelib-hcatalog-4.2.0.2.3.0.0-2557.jar
>-rw-r--r--   3 oozie hdfs      19827 2015-07-21 01:08
>/user/oozie/share/lib/lib_20150721010816/hcatalog/opencsv-2.3.jar
>-rw-r--r--   3 oozie hdfs    2796935 2015-07-21 01:09
>/user/oozie/share/lib/lib_20150721010816/hcatalog/parquet-hadoop-bundle-1.6.0.jar
>-rw-r--r--   3 oozie hdfs      26176 2015-07-21 01:08
>/user/oozie/share/lib/lib_20150721010816/hcatalog/slf4j-api-1.6.6.jar
>-rw-r--r--   3 oozie hdfs       9711 2015-07-21 01:08
>/user/oozie/share/lib/lib_20150721010816/hcatalog/slf4j-log4j12-1.6.6.jar
>
>
>SQOOP command and the exception
>
>Sqoop command arguments :
>             import
>             --connect
>             jdbc:mysql://c6402/test
>             --table
>             simple
>             --username
>             sqoop_user
>             --password-file
>             /user/ambari-qa/datastore/testdb_password
>             --num-mappers
>             2
>             --split-by
>             id
>             --hcatalog-database
>             default
>             --hcatalog-table
>             simple
>             --create-hcatalog-table
>Fetching child yarn jobs
>tag id : oozie-a6b97cd85c4eba81b5653acf1dd341ad
>2015-07-21 06:38:17,708 INFO  [main] client.RMProxy
>(RMProxy.java:createRMProxy(98)) - Connecting to ResourceManager at
>c6402.ambari.apache.org/192.168.64.102:8050
>Child yarn jobs are found -
>=================================================================
>
>>>> Invoking Sqoop command line now >>>
>
>5433 [main] WARN  org.apache.sqoop.tool.SqoopTool  - $SQOOP_CONF_DIR
>has not been set in the environment. Cannot check for additional
>configuration.
>2015-07-21 06:38:18,204 WARN  [main] tool.SqoopTool
>(SqoopTool.java:loadPluginsFromConfDir(177)) - $SQOOP_CONF_DIR has not
>been set in the environment. Cannot check for additional
>configuration.
>5467 [main] INFO  org.apache.sqoop.Sqoop  - Running Sqoop version:
>1.4.6.2.3.0.0-2557
>2015-07-21 06:38:18,238 INFO  [main] sqoop.Sqoop
>(Sqoop.java:<init>(92)) - Running Sqoop version: 1.4.6.2.3.0.0-2557
>5668 [main] WARN  org.apache.sqoop.ConnFactory  - $SQOOP_CONF_DIR has
>not been set in the environment. Cannot check for additional
>configuration.
>2015-07-21 06:38:18,439 WARN  [main] sqoop.ConnFactory
>(ConnFactory.java:loadManagersFromConfDir(273)) - $SQOOP_CONF_DIR has
>not been set in the environment. Cannot check for additional
>configuration.
>5797 [main] INFO  org.apache.sqoop.manager.MySQLManager  - Preparing
>to use a MySQL streaming resultset.
>2015-07-21 06:38:18,568 INFO  [main] manager.MySQLManager
>(MySQLManager.java:initOptionDefaults(69)) - Preparing to use a MySQL
>streaming resultset.
>5797 [main] INFO  org.apache.sqoop.tool.CodeGenTool  - Beginning code
>generation
>2015-07-21 06:38:18,568 INFO  [main] tool.CodeGenTool
>(CodeGenTool.java:generateORM(92)) - Beginning code generation
>6492 [main] INFO  org.apache.sqoop.manager.SqlManager  - Executing SQL
>statement: SELECT t.* FROM `simple` AS t LIMIT 1
>2015-07-21 06:38:19,263 INFO  [main] manager.SqlManager
>(SqlManager.java:execute(757)) - Executing SQL statement: SELECT t.*
>FROM `simple` AS t LIMIT 1
>6542 [main] INFO  org.apache.sqoop.manager.SqlManager  - Executing SQL
>statement: SELECT t.* FROM `simple` AS t LIMIT 1
>2015-07-21 06:38:19,313 INFO  [main] manager.SqlManager
>(SqlManager.java:execute(757)) - Executing SQL statement: SELECT t.*
>FROM `simple` AS t LIMIT 1
>6560 [main] INFO  org.apache.sqoop.orm.CompilationManager  -
>HADOOP_MAPRED_HOME is /usr/hdp/2.3.0.0-2557/hadoop-mapreduce
>2015-07-21 06:38:19,331 INFO  [main] orm.CompilationManager
>(CompilationManager.java:findHadoopJars(94)) - HADOOP_MAPRED_HOME is
>/usr/hdp/2.3.0.0-2557/hadoop-mapreduce
>9744 [main] INFO  org.apache.sqoop.orm.CompilationManager  - Writing
>jar file: 
>/tmp/sqoop-yarn/compile/457f0736705f75f68537447520d1a916/simple.jar
>2015-07-21 06:38:22,515 INFO  [main] orm.CompilationManager
>(CompilationManager.java:jar(330)) - Writing jar file:
>/tmp/sqoop-yarn/compile/457f0736705f75f68537447520d1a916/simple.jar
>9797 [main] WARN  org.apache.sqoop.manager.MySQLManager  - It looks
>like you are importing from mysql.
>2015-07-21 06:38:22,568 WARN  [main] manager.MySQLManager
>(MySQLManager.java:importTable(107)) - It looks like you are importing
>from mysql.
>9797 [main] WARN  org.apache.sqoop.manager.MySQLManager  - This
>transfer can be faster! Use the --direct
>2015-07-21 06:38:22,568 WARN  [main] manager.MySQLManager
>(MySQLManager.java:importTable(108)) - This transfer can be faster!
>Use the --direct
>9797 [main] WARN  org.apache.sqoop.manager.MySQLManager  - option to
>exercise a MySQL-specific fast path.
>2015-07-21 06:38:22,568 WARN  [main] manager.MySQLManager
>(MySQLManager.java:importTable(109)) - option to exercise a
>MySQL-specific fast path.
>9797 [main] INFO  org.apache.sqoop.manager.MySQLManager  - Setting
>zero DATETIME behavior to convertToNull (mysql)
>2015-07-21 06:38:22,568 INFO  [main] manager.MySQLManager
>(MySQLManager.java:checkDateTimeBehavior(189)) - Setting zero DATETIME
>behavior to convertToNull (mysql)
>9813 [main] INFO  org.apache.sqoop.mapreduce.ImportJobBase  -
>Beginning import of simple
>2015-07-21 06:38:22,584 INFO  [main] mapreduce.ImportJobBase
>(ImportJobBase.java:runImport(235)) - Beginning import of simple
>2015-07-21 06:38:22,585 INFO  [main] Configuration.deprecation
>(Configuration.java:warnOnceIfDeprecated(1173)) - mapred.job.tracker
>is deprecated. Instead, use mapreduce.jobtracker.address
>2015-07-21 06:38:22,594 INFO  [main] Configuration.deprecation
>(Configuration.java:warnOnceIfDeprecated(1173)) - mapred.jar is
>deprecated. Instead, use mapreduce.job.jar
>
><<< Invocation of Main class completed <<<
>
>Failing Oozie Launcher, Main class
>[org.apache.oozie.action.hadoop.SqoopMain], main() threw exception,
>org/apache/hive/hcatalog/mapreduce/HCatOutputFormat
>java.lang.NoClassDefFoundError: org/apache/hive/hcatalog/mapreduce/HCatOutputFormat
>	at org.apache.sqoop.mapreduce.DataDrivenImportJob.getOutputFormatClass(DataDrivenImportJob.java:178)
>	at org.apache.sqoop.mapreduce.ImportJobBase.configureOutputFormat(ImportJobBase.java:94)
>	at org.apache.sqoop.mapreduce.ImportJobBase.runImport(ImportJobBase.java:259)
>	at org.apache.sqoop.manager.SqlManager.importTable(SqlManager.java:673)
>	at org.apache.sqoop.manager.MySQLManager.importTable(MySQLManager.java:118)
>	at org.apache.sqoop.tool.ImportTool.importTable(ImportTool.java:497)
>	at org.apache.sqoop.tool.ImportTool.run(ImportTool.java:605)
>	at org.apache.sqoop.Sqoop.run(Sqoop.java:143)
>	at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:70)
>	at org.apache.sqoop.Sqoop.runSqoop(Sqoop.java:179)
>	at org.apache.sqoop.Sqoop.runTool(Sqoop.java:218)
>	at org.apache.sqoop.Sqoop.runTool(Sqoop.java:227)
>	at org.apache.sqoop.Sqoop.main(Sqoop.java:236)
>	at org.apache.oozie.action.hadoop.SqoopMain.runSqoopJob(SqoopMain.java:197)
>	at org.apache.oozie.action.hadoop.SqoopMain.run(SqoopMain.java:177)
>	at org.apache.oozie.action.hadoop.LauncherMain.run(LauncherMain.java:47)
>	at org.apache.oozie.action.hadoop.SqoopMain.main(SqoopMain.java:46)
>	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
>	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
>	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
>	at java.lang.reflect.Method.invoke(Method.java:497)
>	at org.apache.oozie.action.hadoop.LauncherMapper.map(LauncherMapper.java:236)
>	at org.apache.hadoop.mapred.MapRunner.run(MapRunner.java:54)
>	at org.apache.hadoop.mapred.MapTask.runOldMapper(MapTask.java:453)
>	at org.apache.hadoop.mapred.MapTask.run(MapTask.java:343)
>	at org.apache.hadoop.mapred.YarnChild$2.run(YarnChild.java:164)
>	at java.security.AccessController.doPrivileged(Native Method)
>	at javax.security.auth.Subject.doAs(Subject.java:422)
>	at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1657)
>	at org.apache.hadoop.mapred.YarnChild.main(YarnChild.java:158)
>Caused by: java.lang.ClassNotFoundException: org.apache.hive.hcatalog.mapreduce.HCatOutputFormat
>	at java.net.URLClassLoader.findClass(URLClassLoader.java:381)
>	at java.lang.ClassLoader.loadClass(ClassLoader.java:424)
>	at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:331)
>	at java.lang.ClassLoader.loadClass(ClassLoader.java:357)
>	... 30 more
>
>Oozie Launcher failed, finishing Hadoop job gracefully
>
>Oozie Launcher, uploading action data to HDFS sequence file:
>hdfs://c6401.ambari.apache.org:8020/user/ambari-qa/oozie-oozi/0000005-150721024948042-oozie-oozi-W/sqoop-node--sqoop/action-data.seq
>2015-07-21 06:38:23,062 INFO  [main] zlib.ZlibFactory
>(ZlibFactory.java:<clinit>(49)) - Successfully loaded & initialized
>native-zlib library
>2015-07-21 06:38:23,063 INFO  [main] compress.CodecPool
>(CodecPool.java:getCompressor(153)) - Got brand-new compressor
>[.deflate]
>
>Oozie Launcher ends
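
For context on what I have tried: my understanding from the Oozie sharelib docs is that a Sqoop action only loads the "sqoop" sharelib directory by default, so the hcatalog jars listed above would not reach the launcher classpath unless the job opts in. This is a sketch of the job.properties settings I believe are needed (the oozie.* property names are from the Oozie documentation; whether this is sufficient on my cluster is exactly what I am unsure about):

```properties
# Enable the Oozie system sharelib for this workflow
oozie.use.system.libpath=true

# Ask the Sqoop action to load both the sqoop and hcatalog
# sharelib directories onto the launcher classpath
oozie.action.sharelib.for.sqoop=sqoop,hcatalog
```

If that is the right approach, I would expect HCatOutputFormat to resolve from hive-hcatalog-core-1.2.1.2.3.0.0-2557.jar in the hcatalog directory shown above.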