Posted to user@oozie.apache.org by Liping Zhang <zl...@gmail.com> on 2016/02/24 10:43:40 UTC

Class org.apache.oozie.action.hadoop.SparkMain not found

Dear oozie user and dev,

I set the following values in job.properties and in the action properties in
CDH 5.5.0 Hue Oozie:
oozie.use.system.libpath=true
oozie.libpath=hdfs://ip-10-0-4-248.us-west-1.compute.internal:8020/user/oozie/share/lib/lib_20151201085935
oozie.action.sharelib.for.spark=hdfs://ip-10-0-4-248.us-west-1.compute.internal:8020/user/oozie/share/lib/lib_20151201085935/spark

*Here is the workflow file:*
<workflow-app name="sparktest-cassandra" xmlns="uri:oozie:workflow:0.5">
    <start to="spark-b23b"/>
    <kill name="Kill">
        <message>Action failed, error
message[${wf:errorMessage(wf:lastErrorNode())}]</message>
    </kill>
    <action name="spark-b23b">
        <spark xmlns="uri:oozie:spark-action:0.1">
            <job-tracker>${jobTracker}</job-tracker>
            <name-node>${nameNode}</name-node>
            <configuration>
                <property>
                    <name>spark.executor.extraClassPath</name>
                    <value>lib/guava-16.0.1.jar</value>
                </property>
                <property>
                    <name>oozie.action.sharelib.for.spark</name>
                    <value>/user/oozie/share/lib/lib_20151201085935/spark</value>
                </property>
                <property>
                    <name>oozie.use.system.libpath</name>
                    <value>true</value>
                </property>
                <property>
                    <name>oozie.libpath</name>
                    <value>/user/oozie/share/lib/lib_20151201085935</value>
                </property>
            </configuration>
            <master>local[4]</master>
            <mode>client</mode>
            <name>sparktest-cassandra</name>
              <class>TestCassandra</class>
            <jar>lib/sparktest.jar</jar>
              <spark-opts>--driver-class-path
/opt/cloudera/parcels/CDH/jars/guava-16.0.1.jar</spark-opts>
              <arg>s3n://gridx-output/sparktest/ </arg>
              <arg>10</arg>
              <arg>3</arg>
              <arg>2</arg>
        </spark>
        <ok to="End"/>
        <error to="Kill"/>
    </action>
    <end name="End"/>
</workflow-app>

*job.properties file:*
oozie.use.system.libpath=true
security_enabled=False
oozie.libpath=hdfs://ip-10-0-4-248.us-west-1.compute.internal:8020/user/oozie/share/lib/lib_20151201085935
oozie.action.sharelib.for.spark=hdfs://ip-10-0-4-248.us-west-1.compute.internal:8020/user/oozie/share/lib/lib_20151201085935/spark
dryrun=False
jobTracker=ip-10-0-4-248.us-west-1.compute.internal:8032
nameNode=hdfs://ip-10-0-4-248.us-west-1.compute.internal:8020


But I got the following SparkMain class not found error:

<<< Invocation of Main class completed <<<

Failing Oozie Launcher, Main class
[org.apache.oozie.action.hadoop.SparkMain], exception invoking main(),
java.lang.ClassNotFoundException: Class
org.apache.oozie.action.hadoop.SparkMain not found
java.lang.RuntimeException: java.lang.ClassNotFoundException: Class
org.apache.oozie.action.hadoop.SparkMain not found
	at org.apache.hadoop.conf.Configuration.getClass(Configuration.java:2199)
	at org.apache.oozie.action.hadoop.LauncherMapper.map(LauncherMapper.java:234)
	at org.apache.hadoop.mapred.MapRunner.run(MapRunner.java:54)
	at org.apache.hadoop.mapred.MapTask.runOldMapper(MapTask.java:453)
	at org.apache.hadoop.mapred.MapTask.run(MapTask.java:343)
	at org.apache.hadoop.mapred.LocalContainerLauncher$EventHandler.runSubtask(LocalContainerLauncher.java:378)
	at org.apache.hadoop.mapred.LocalContainerLauncher$EventHandler.runTask(LocalContainerLauncher.java:296)
	at org.apache.hadoop.mapred.LocalContainerLauncher$EventHandler.access$200(LocalContainerLauncher.java:181)
	at org.apache.hadoop.mapred.LocalContainerLauncher$EventHandler$1.run(LocalContainerLauncher.java:224)
	at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:471)
	at java.util.concurrent.FutureTask.run(FutureTask.java:262)
	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1145)
	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:615)
	at java.lang.Thread.run(Thread.java:745)
Caused by: java.lang.ClassNotFoundException: Class
org.apache.oozie.action.hadoop.SparkMain not found
	at org.apache.hadoop.conf.Configuration.getClassByName(Configuration.java:2105)
	at org.apache.hadoop.conf.Configuration.getClass(Configuration.java:2197)
	... 13 more

Oozie Launcher failed, finishing Hadoop job gracefully

Oozie Launcher, uploading action data to HDFS sequence file:
hdfs://ip-10-0-4-248.us-west-1.compute.internal:8020/user/admin/oozie-oozi/0000006-160224085347053-oozie-oozi-W/spark-b23b--spark/action-data.seq
<http://ip-10-0-4-248.us-west-1.compute.internal:8888/filebrowser/view=/user/admin/oozie-oozi/0000006-160224085347053-oozie-oozi-W/spark-b23b--spark/action-data.seq>

Oozie Launcher ends

Can you give any suggestions? Thanks a lot!

-- 
Cheers,
-----
Big Data - Big Wisdom - Big Value
--------------
Michelle Zhang (张莉苹)

Re: Class org.apache.oozie.action.hadoop.SparkMain not found

Posted by Robert Kanter <rk...@cloudera.com>.
That means that Spark is returning an exit code of 101 to Oozie.  You need
to look at the stdout, stderr, and syslog logs from the Launcher Job and
see if Spark said anything helpful.  From a quick Google search, an exit
code of 101 is probably a ClassNotFoundException or something like that,
which is probably the issue we're looking at in the other thread/Cloudera
forum.
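
If it helps, one way to pull those logs is with the Oozie and YARN command
line tools, roughly along these lines (the Oozie URL, workflow ID, and
application ID are placeholders for your own values, and "yarn logs" only
works once log aggregation is enabled):

# look up the workflow and note the external (YARN) ID of the launcher job
oozie job -oozie http://<oozie-host>:11000/oozie -info <workflow-id>

# dump the launcher container logs (stdout, stderr, syslog)
yarn logs -applicationId <application-id>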

On Wed, Feb 24, 2016 at 2:57 PM, Liping Zhang <zl...@gmail.com> wrote:

> Thanks Robert!
>
> oozie.use.system.libpath=True is the default in job.properties,
> and job.properties is generated by CDH Hue oozie:
> oozie.use.system.libpath=true
> security_enabled=False
> dryrun=False
> jobTracker=ip-10-0-4-248.us-west-1.compute.internal:8032
> nameNode=hdfs://ip-10-0-4-248.us-west-1.compute.internal:8020
>
> If I don't add other values in job.properties, it will throw previous
> exception again:
>
> Failing Oozie Launcher, Main class [org.apache.oozie.action.hadoop.SparkMain], exit code [101]
>
>
>
> On Wed, Feb 24, 2016 at 2:15 PM, Robert Kanter <rk...@cloudera.com>
> wrote:
>
>> Hi Liping,
>>
>> You don't need all 3 of these:
>> oozie.use.system.libpath=true
>>
>> oozie.libpath=hdfs://ip-10-0-4-248.us-west-1.compute.internal:8020/user/oozie/share/lib/lib_20151201085935
>>
>> oozie.action.sharelib.for.spark=hdfs://ip-10-0-4-248.us-west-1.compute.internal:8020/user/oozie/share/lib/lib_20151201085935/spark
>>
>> In fact, you may run into problems with the latter two if you ever upgrade
>> the sharelib, as the timestamped directory will change.  Plus, your
>> setting
>> for oozie.libpath is including all sharelib subdirectories, which is way
>> more jars than you need and may also cause conflicts.
>>
>> All you need is
>> oozie.use.system.libpath=true
>>
>> This tells Oozie to get the appropriate sharelib for the action; in this
>> case, spark.
>>
>> Please read this blog post to get a better understanding of how the
>> sharelib works:
>>
>> http://blog.cloudera.com/blog/2014/05/how-to-use-the-sharelib-in-apache-oozie-cdh-5/
>>
>> - Robert
>>
>> On Wed, Feb 24, 2016 at 11:40 AM, Liping Zhang <zl...@gmail.com>
>> wrote:
>>
>> > Thanks Jaydeep for your quick response!
>> >
>> > After adding oozie.action.sharelib.for.spark = spark, it threw
>> > NoSuchMethodError exception.
>> >
>> > Do you know how I can add guava-16.0.1 into Oozie's classpath?
>> >
>> >
>> > >>> Invoking Spark class now >>>
>> >
>> >
>> > <<< Invocation of Main class completed <<<
>> >
>> > Failing Oozie Launcher, Main class
>> > [org.apache.oozie.action.hadoop.SparkMain], main() threw exception,
>> > com.google.common.reflect.TypeToken.isPrimitive()Z
>> > java.lang.NoSuchMethodError:
>> > com.google.common.reflect.TypeToken.isPrimitive()Z
>> >         at com.datastax.driver.core.TypeCodec.<init>(TypeCodec.java:142)
>> >         at com.datastax.driver.core.TypeCodec.<init>(TypeCodec.java:136)
>> >         at
>> > com.datastax.driver.core.TypeCodec$BlobCodec.<init>(TypeCodec.java:609)
>> >         at
>> >
>> com.datastax.driver.core.TypeCodec$BlobCodec.<clinit>(TypeCodec.java:606)
>> >         at
>> > com.datastax.driver.core.CodecRegistry.<clinit>(CodecRegistry.java:147)
>> >         at
>> >
>> com.datastax.driver.core.Configuration$Builder.build(Configuration.java:259)
>> >         at
>> >
>> com.datastax.driver.core.Cluster$Builder.getConfiguration(Cluster.java:1135)
>> >         at com.datastax.driver.core.Cluster.<init>(Cluster.java:111)
>> >         at com.datastax.driver.core.Cluster.buildFrom(Cluster.java:178)
>> >         at
>> > com.datastax.driver.core.Cluster$Builder.build(Cluster.java:1152)
>> >         at
>> >
>> com.datastax.spark.connector.cql.DefaultConnectionFactory$.createCluster(CassandraConnectionFactory.scala:85)
>> >         at
>> >
>> com.datastax.spark.connector.cql.CassandraConnector$.com$datastax$spark$connector$cql$CassandraConnector$$createSession(CassandraConnector.scala:155)
>> >         at
>> >
>> com.datastax.spark.connector.cql.CassandraConnector$$anonfun$2.apply(CassandraConnector.scala:150)
>> >         at
>> >
>> com.datastax.spark.connector.cql.CassandraConnector$$anonfun$2.apply(CassandraConnector.scala:150)
>> >         at
>> >
>> com.datastax.spark.connector.cql.RefCountedCache.createNewValueAndKeys(RefCountedCache.scala:31)
>> >         at
>> >
>> com.datastax.spark.connector.cql.RefCountedCache.acquire(RefCountedCache.scala:56)
>> >         at
>> >
>> com.datastax.spark.connector.cql.CassandraConnector.openSession(CassandraConnector.scala:81)
>> >         at
>> >
>> com.datastax.spark.connector.cql.CassandraConnector.withSessionDo(CassandraConnector.scala:109)
>> >         at
>> >
>> com.datastax.spark.connector.cql.CassandraConnector.withClusterDo(CassandraConnector.scala:120)
>> >         at
>> > com.datastax.spark.connector.cql.Schema$.fromCassandra(Schema.scala:241)
>> >         at
>> >
>> com.datastax.spark.connector.writer.TableWriter$.apply(TableWriter.scala:263)
>> >         at
>> >
>> com.datastax.spark.connector.RDDFunctions.saveToCassandra(RDDFunctions.scala:36)
>> >         at TestCassandra$.main(TestCassandra.scala:44)
>> >         at TestCassandra.main(TestCassandra.scala)
>> >         at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
>> >
>> >
>> > On Wed, Feb 24, 2016 at 5:00 AM, Jaydeep Vishwakarma <
>> > jaydeep.vishwakarma@inmobi.com> wrote:
>> >
>> > > Add spark sharelib in oozie setup it will work.
>> > >
>> > > Regards,
>> > > Jaydep
>> > >
>> > > On Wed, Feb 24, 2016 at 3:13 PM, Liping Zhang <zl...@gmail.com>
>> > > wrote:
>> > >
>> > > > Dear oozie user and dev,
>> > > >
>> > > > I set following values in job.properties and in action properties in
>> > CDH
>> > > > 5.5.0 Hue oozie:
>> > > > oozie.use.system.libpath=true
>> > > >
>> > > >
>> > >
>> >
>> oozie.libpath=hdfs://ip-10-0-4-248.us-west-1.compute.internal:8020/user/oozie/share/lib/lib_20151201085935
>> > > >
>> > > >
>> > >
>> >
>> oozie.action.sharelib.for.spark=hdfs://ip-10-0-4-248.us-west-1.compute.internal:8020/user/oozie/share/lib/lib_20151201085935/spark
>> > > >
>> > > > *Here is the workflow file:*
>> > > > <workflow-app name="sparktest-cassandra"
>> > xmlns="uri:oozie:workflow:0.5">
>> > > >     <start to="spark-b23b"/>
>> > > >     <kill name="Kill">
>> > > >         <message>Action failed, error
>> > > > message[${wf:errorMessage(wf:lastErrorNode())}]</message>
>> > > >     </kill>
>> > > >     <action name="spark-b23b">
>> > > >         <spark xmlns="uri:oozie:spark-action:0.1">
>> > > >             <job-tracker>${jobTracker}</job-tracker>
>> > > >             <name-node>${nameNode}</name-node>
>> > > >             <configuration>
>> > > >                 <property>
>> > > >                     <name>spark.executor.extraClassPath</name>
>> > > >                     <value>lib/guava-16.0.1.jar</value>
>> > > >                 </property>
>> > > >                 <property>
>> > > >                     <name>oozie.action.sharelib.for.spark</name>
>> > > >
>> > > > <value>/user/oozie/share/lib/lib_20151201085935/spark</value>
>> > > >                 </property>
>> > > >                 <property>
>> > > >                     <name>oozie.use.system.libpath</name>
>> > > >                     <value>true</value>
>> > > >                 </property>
>> > > >                 <property>
>> > > >                     <name>oozie.libpath</name>
>> > > >
>> > >  <value>/user/oozie/share/lib/lib_20151201085935</value>
>> > > >                 </property>
>> > > >             </configuration>
>> > > >             <master>local[4]</master>
>> > > >             <mode>client</mode>
>> > > >             <name>sparktest-cassandra</name>
>> > > >               <class>TestCassandra</class>
>> > > >             <jar>lib/sparktest.jar</jar>
>> > > >               <spark-opts>--driver-class-path
>> > > > /opt/cloudera/parcels/CDH/jars/guava-16.0.1.jar</spark-opts>
>> > > >               <arg>s3n://gridx-output/sparktest/ </arg>
>> > > >               <arg>10</arg>
>> > > >               <arg>3</arg>
>> > > >               <arg>2</arg>
>> > > >         </spark>
>> > > >         <ok to="End"/>
>> > > >         <error to="Kill"/>
>> > > >     </action>
>> > > >     <end name="End"/>
>> > > > </workflow-app>
>> > > >
>> > > > *job.properties file:*
>> > > > oozie.use.system.libpath=true
>> > > > security_enabled=False
>> > > >
>> > > >
>> > >
>> >
>> oozie.libpath=hdfs://ip-10-0-4-248.us-west-1.compute.internal:8020/user/oozie/share/lib/lib_20151201085935
>> > > >
>> > > >
>> > >
>> >
>> oozie.action.sharelib.for.spark=hdfs://ip-10-0-4-248.us-west-1.compute.internal:8020/user/oozie/share/lib/lib_20151201085935/spark
>> > > > dryrun=False
>> > > > jobTracker=ip-10-0-4-248.us-west-1.compute.internal:8032
>> > > > nameNode=hdfs://ip-10-0-4-248.us-west-1.compute.internal:8020
>> > > >
>> > > >
>> > > > But I got following SparkMain class not found issue:
>> > > >
>> > > > <<< Invocation of Main class completed <<<
>> > > >
>> > > > Failing Oozie Launcher, Main class
>> > > > [org.apache.oozie.action.hadoop.SparkMain], exception invoking
>> main(),
>> > > > java.lang.ClassNotFoundException: Class
>> > > > org.apache.oozie.action.hadoop.SparkMain not found
>> > > > java.lang.RuntimeException: java.lang.ClassNotFoundException: Class
>> > > > org.apache.oozie.action.hadoop.SparkMain not found
>> > > >         at
>> > > >
>> org.apache.hadoop.conf.Configuration.getClass(Configuration.java:2199)
>> > > >         at
>> > > >
>> > >
>> >
>> org.apache.oozie.action.hadoop.LauncherMapper.map(LauncherMapper.java:234)
>> > > >         at org.apache.hadoop.mapred.MapRunner.run(MapRunner.java:54)
>> > > >         at
>> > > org.apache.hadoop.mapred.MapTask.runOldMapper(MapTask.java:453)
>> > > >         at org.apache.hadoop.mapred.MapTask.run(MapTask.java:343)
>> > > >         at
>> > > >
>> > >
>> >
>> org.apache.hadoop.mapred.LocalContainerLauncher$EventHandler.runSubtask(LocalContainerLauncher.java:378)
>> > > >         at
>> > > >
>> > >
>> >
>> org.apache.hadoop.mapred.LocalContainerLauncher$EventHandler.runTask(LocalContainerLauncher.java:296)
>> > > >         at
>> > > >
>> > >
>> >
>> org.apache.hadoop.mapred.LocalContainerLauncher$EventHandler.access$200(LocalContainerLauncher.java:181)
>> > > >         at
>> > > >
>> > >
>> >
>> org.apache.hadoop.mapred.LocalContainerLauncher$EventHandler$1.run(LocalContainerLauncher.java:224)
>> > > >         at
>> > > >
>> java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:471)
>> > > >         at java.util.concurrent.FutureTask.run(FutureTask.java:262)
>> > > >         at
>> > > >
>> > >
>> >
>> java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1145)
>> > > >         at
>> > > >
>> > >
>> >
>> java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:615)
>> > > >         at java.lang.Thread.run(Thread.java:745)
>> > > > Caused by: java.lang.ClassNotFoundException: Class
>> > > > org.apache.oozie.action.hadoop.SparkMain not found
>> > > >         at
>> > > >
>> > >
>> >
>> org.apache.hadoop.conf.Configuration.getClassByName(Configuration.java:2105)
>> > > >         at
>> > > >
>> org.apache.hadoop.conf.Configuration.getClass(Configuration.java:2197)
>> > > >         ... 13 more
>> > > >
>> > > > Oozie Launcher failed, finishing Hadoop job gracefully
>> > > >
>> > > > Oozie Launcher, uploading action data to HDFS sequence file:
>> > > >
>> > > >
>> > >
>> >
>> hdfs://ip-10-0-4-248.us-west-1.compute.internal:8020/user/admin/oozie-oozi/0000006-160224085347053-oozie-oozi-W/spark-b23b--spark/action-data.seq
>> > > > <
>> > > >
>> > >
>> >
>> http://ip-10-0-4-248.us-west-1.compute.internal:8888/filebrowser/view=/user/admin/oozie-oozi/0000006-160224085347053-oozie-oozi-W/spark-b23b--spark/action-data.seq
>> > > > >
>> > > >
>> > > > Oozie Launcher ends
>> > > >
>> > > > Can you help to give any suggestion? Thanks a lot!
>> > > >
>> > > > --
>> > > > Cheers,
>> > > > -----
>> > > > Big Data - Big Wisdom - Big Value
>> > > > --------------
>> > > > Michelle Zhang (张莉苹)
>> > > >
>> > >
>> > > --
>> > > _____________________________________________________________
>> > > The information contained in this communication is intended solely for
>> > the
>> > > use of the individual or entity to whom it is addressed and others
>> > > authorized to receive it. It may contain confidential or legally
>> > privileged
>> > > information. If you are not the intended recipient you are hereby
>> > notified
>> > > that any disclosure, copying, distribution or taking any action in
>> > reliance
>> > > on the contents of this information is strictly prohibited and may be
>> > > unlawful. If you have received this communication in error, please
>> notify
>> > > us immediately by responding to this email and then delete it from
>> your
>> > > system. The firm is neither liable for the proper and complete
>> > transmission
>> > > of the information contained in this communication nor for any delay
>> in
>> > its
>> > > receipt.
>> > >
>> >
>> >
>> >
>> > --
>> > Cheers,
>> > -----
>> > Big Data - Big Wisdom - Big Value
>> > --------------
>> > Michelle Zhang (张莉苹)
>> >
>>
>
>
>
> --
> Cheers,
> -----
> Big Data - Big Wisdom - Big Value
> --------------
> Michelle Zhang (张莉苹)
>

Re: Class org.apache.oozie.action.hadoop.SparkMain not found

Posted by Liping Zhang <zl...@gmail.com>.
Thanks Robert!

oozie.use.system.libpath=true is the default in job.properties,
and job.properties is generated by CDH Hue Oozie:
oozie.use.system.libpath=true
security_enabled=False
dryrun=False
jobTracker=ip-10-0-4-248.us-west-1.compute.internal:8032
nameNode=hdfs://ip-10-0-4-248.us-west-1.compute.internal:8020

If I don't add the other values in job.properties, it throws the previous
exception again:

Failing Oozie Launcher, Main class
[org.apache.oozie.action.hadoop.SparkMain], exit code [101]



On Wed, Feb 24, 2016 at 2:15 PM, Robert Kanter <rk...@cloudera.com> wrote:

> Hi Liping,
>
> You don't need all 3 of these:
> oozie.use.system.libpath=true
>
> oozie.libpath=hdfs://ip-10-0-4-248.us-west-1.compute.internal:8020/user/oozie/share/lib/lib_20151201085935
>
> oozie.action.sharelib.for.spark=hdfs://ip-10-0-4-248.us-west-1.compute.internal:8020/user/oozie/share/lib/lib_20151201085935/spark
>
> In fact, you may run into problems with the latter two if you ever upgrade
> the sharelib, as the timestamped directory will change.  Plus, your setting
> for oozie.libpath is including all sharelib subdirectories, which is way
> more jars than you need and may also cause conflicts.
>
> All you need is
> oozie.use.system.libpath=true
>
> This tells Oozie to get the appropriate sharelib for the action; in this
> case, spark.
>
> Please read this blog post to get a better understanding of how the
> sharelib works:
>
> http://blog.cloudera.com/blog/2014/05/how-to-use-the-sharelib-in-apache-oozie-cdh-5/
>
> - Robert
>
> On Wed, Feb 24, 2016 at 11:40 AM, Liping Zhang <zl...@gmail.com>
> wrote:
>
> > Thanks Jaydeep for your quick response!
> >
> > After adding oozie.action.sharelib.for.spark = spark, it threw
> > NoSuchMethodError exception.
> >
> > Do you know how I can add guava-16.0.1 into Oozie's classpath?
> >
> >
> > >>> Invoking Spark class now >>>
> >
> >
> > <<< Invocation of Main class completed <<<
> >
> > Failing Oozie Launcher, Main class
> > [org.apache.oozie.action.hadoop.SparkMain], main() threw exception,
> > com.google.common.reflect.TypeToken.isPrimitive()Z
> > java.lang.NoSuchMethodError:
> > com.google.common.reflect.TypeToken.isPrimitive()Z
> >         at com.datastax.driver.core.TypeCodec.<init>(TypeCodec.java:142)
> >         at com.datastax.driver.core.TypeCodec.<init>(TypeCodec.java:136)
> >         at
> > com.datastax.driver.core.TypeCodec$BlobCodec.<init>(TypeCodec.java:609)
> >         at
> > com.datastax.driver.core.TypeCodec$BlobCodec.<clinit>(TypeCodec.java:606)
> >         at
> > com.datastax.driver.core.CodecRegistry.<clinit>(CodecRegistry.java:147)
> >         at
> >
> com.datastax.driver.core.Configuration$Builder.build(Configuration.java:259)
> >         at
> >
> com.datastax.driver.core.Cluster$Builder.getConfiguration(Cluster.java:1135)
> >         at com.datastax.driver.core.Cluster.<init>(Cluster.java:111)
> >         at com.datastax.driver.core.Cluster.buildFrom(Cluster.java:178)
> >         at
> > com.datastax.driver.core.Cluster$Builder.build(Cluster.java:1152)
> >         at
> >
> com.datastax.spark.connector.cql.DefaultConnectionFactory$.createCluster(CassandraConnectionFactory.scala:85)
> >         at
> >
> com.datastax.spark.connector.cql.CassandraConnector$.com$datastax$spark$connector$cql$CassandraConnector$$createSession(CassandraConnector.scala:155)
> >         at
> >
> com.datastax.spark.connector.cql.CassandraConnector$$anonfun$2.apply(CassandraConnector.scala:150)
> >         at
> >
> com.datastax.spark.connector.cql.CassandraConnector$$anonfun$2.apply(CassandraConnector.scala:150)
> >         at
> >
> com.datastax.spark.connector.cql.RefCountedCache.createNewValueAndKeys(RefCountedCache.scala:31)
> >         at
> >
> com.datastax.spark.connector.cql.RefCountedCache.acquire(RefCountedCache.scala:56)
> >         at
> >
> com.datastax.spark.connector.cql.CassandraConnector.openSession(CassandraConnector.scala:81)
> >         at
> >
> com.datastax.spark.connector.cql.CassandraConnector.withSessionDo(CassandraConnector.scala:109)
> >         at
> >
> com.datastax.spark.connector.cql.CassandraConnector.withClusterDo(CassandraConnector.scala:120)
> >         at
> > com.datastax.spark.connector.cql.Schema$.fromCassandra(Schema.scala:241)
> >         at
> >
> com.datastax.spark.connector.writer.TableWriter$.apply(TableWriter.scala:263)
> >         at
> >
> com.datastax.spark.connector.RDDFunctions.saveToCassandra(RDDFunctions.scala:36)
> >         at TestCassandra$.main(TestCassandra.scala:44)
> >         at TestCassandra.main(TestCassandra.scala)
> >         at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
> >
> >
> > On Wed, Feb 24, 2016 at 5:00 AM, Jaydeep Vishwakarma <
> > jaydeep.vishwakarma@inmobi.com> wrote:
> >
> > > Add spark sharelib in oozie setup it will work.
> > >
> > > Regards,
> > > Jaydep
> > >
> > > On Wed, Feb 24, 2016 at 3:13 PM, Liping Zhang <zl...@gmail.com>
> > > wrote:
> > >
> > > > Dear oozie user and dev,
> > > >
> > > > I set following values in job.properties and in action properties in
> > CDH
> > > > 5.5.0 Hue oozie:
> > > > oozie.use.system.libpath=true
> > > >
> > > >
> > >
> >
> oozie.libpath=hdfs://ip-10-0-4-248.us-west-1.compute.internal:8020/user/oozie/share/lib/lib_20151201085935
> > > >
> > > >
> > >
> >
> oozie.action.sharelib.for.spark=hdfs://ip-10-0-4-248.us-west-1.compute.internal:8020/user/oozie/share/lib/lib_20151201085935/spark
> > > >
> > > > *Here is the workflow file:*
> > > > <workflow-app name="sparktest-cassandra"
> > xmlns="uri:oozie:workflow:0.5">
> > > >     <start to="spark-b23b"/>
> > > >     <kill name="Kill">
> > > >         <message>Action failed, error
> > > > message[${wf:errorMessage(wf:lastErrorNode())}]</message>
> > > >     </kill>
> > > >     <action name="spark-b23b">
> > > >         <spark xmlns="uri:oozie:spark-action:0.1">
> > > >             <job-tracker>${jobTracker}</job-tracker>
> > > >             <name-node>${nameNode}</name-node>
> > > >             <configuration>
> > > >                 <property>
> > > >                     <name>spark.executor.extraClassPath</name>
> > > >                     <value>lib/guava-16.0.1.jar</value>
> > > >                 </property>
> > > >                 <property>
> > > >                     <name>oozie.action.sharelib.for.spark</name>
> > > >
> > > > <value>/user/oozie/share/lib/lib_20151201085935/spark</value>
> > > >                 </property>
> > > >                 <property>
> > > >                     <name>oozie.use.system.libpath</name>
> > > >                     <value>true</value>
> > > >                 </property>
> > > >                 <property>
> > > >                     <name>oozie.libpath</name>
> > > >
> > >  <value>/user/oozie/share/lib/lib_20151201085935</value>
> > > >                 </property>
> > > >             </configuration>
> > > >             <master>local[4]</master>
> > > >             <mode>client</mode>
> > > >             <name>sparktest-cassandra</name>
> > > >               <class>TestCassandra</class>
> > > >             <jar>lib/sparktest.jar</jar>
> > > >               <spark-opts>--driver-class-path
> > > > /opt/cloudera/parcels/CDH/jars/guava-16.0.1.jar</spark-opts>
> > > >               <arg>s3n://gridx-output/sparktest/ </arg>
> > > >               <arg>10</arg>
> > > >               <arg>3</arg>
> > > >               <arg>2</arg>
> > > >         </spark>
> > > >         <ok to="End"/>
> > > >         <error to="Kill"/>
> > > >     </action>
> > > >     <end name="End"/>
> > > > </workflow-app>
> > > >
> > > > *job.properties file:*
> > > > oozie.use.system.libpath=true
> > > > security_enabled=False
> > > >
> > > >
> > >
> >
> oozie.libpath=hdfs://ip-10-0-4-248.us-west-1.compute.internal:8020/user/oozie/share/lib/lib_20151201085935
> > > >
> > > >
> > >
> >
> oozie.action.sharelib.for.spark=hdfs://ip-10-0-4-248.us-west-1.compute.internal:8020/user/oozie/share/lib/lib_20151201085935/spark
> > > > dryrun=False
> > > > jobTracker=ip-10-0-4-248.us-west-1.compute.internal:8032
> > > > nameNode=hdfs://ip-10-0-4-248.us-west-1.compute.internal:8020
> > > >
> > > >
> > > > But I got following SparkMain class not found issue:
> > > >
> > > > <<< Invocation of Main class completed <<<
> > > >
> > > > Failing Oozie Launcher, Main class
> > > > [org.apache.oozie.action.hadoop.SparkMain], exception invoking
> main(),
> > > > java.lang.ClassNotFoundException: Class
> > > > org.apache.oozie.action.hadoop.SparkMain not found
> > > > java.lang.RuntimeException: java.lang.ClassNotFoundException: Class
> > > > org.apache.oozie.action.hadoop.SparkMain not found
> > > >         at
> > > >
> org.apache.hadoop.conf.Configuration.getClass(Configuration.java:2199)
> > > >         at
> > > >
> > >
> >
> org.apache.oozie.action.hadoop.LauncherMapper.map(LauncherMapper.java:234)
> > > >         at org.apache.hadoop.mapred.MapRunner.run(MapRunner.java:54)
> > > >         at
> > > org.apache.hadoop.mapred.MapTask.runOldMapper(MapTask.java:453)
> > > >         at org.apache.hadoop.mapred.MapTask.run(MapTask.java:343)
> > > >         at
> > > >
> > >
> >
> org.apache.hadoop.mapred.LocalContainerLauncher$EventHandler.runSubtask(LocalContainerLauncher.java:378)
> > > >         at
> > > >
> > >
> >
> org.apache.hadoop.mapred.LocalContainerLauncher$EventHandler.runTask(LocalContainerLauncher.java:296)
> > > >         at
> > > >
> > >
> >
> org.apache.hadoop.mapred.LocalContainerLauncher$EventHandler.access$200(LocalContainerLauncher.java:181)
> > > >         at
> > > >
> > >
> >
> org.apache.hadoop.mapred.LocalContainerLauncher$EventHandler$1.run(LocalContainerLauncher.java:224)
> > > >         at
> > > >
> java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:471)
> > > >         at java.util.concurrent.FutureTask.run(FutureTask.java:262)
> > > >         at
> > > >
> > >
> >
> java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1145)
> > > >         at
> > > >
> > >
> >
> java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:615)
> > > >         at java.lang.Thread.run(Thread.java:745)
> > > > Caused by: java.lang.ClassNotFoundException: Class
> > > > org.apache.oozie.action.hadoop.SparkMain not found
> > > >         at
> > > >
> > >
> >
> org.apache.hadoop.conf.Configuration.getClassByName(Configuration.java:2105)
> > > >         at
> > > >
> org.apache.hadoop.conf.Configuration.getClass(Configuration.java:2197)
> > > >         ... 13 more
> > > >
> > > > Oozie Launcher failed, finishing Hadoop job gracefully
> > > >
> > > > Oozie Launcher, uploading action data to HDFS sequence file:
> > > >
> > > >
> > >
> >
> hdfs://ip-10-0-4-248.us-west-1.compute.internal:8020/user/admin/oozie-oozi/0000006-160224085347053-oozie-oozi-W/spark-b23b--spark/action-data.seq
> > > > <
> > > >
> > >
> >
> http://ip-10-0-4-248.us-west-1.compute.internal:8888/filebrowser/view=/user/admin/oozie-oozi/0000006-160224085347053-oozie-oozi-W/spark-b23b--spark/action-data.seq
> > > > >
> > > >
> > > > Oozie Launcher ends
> > > >
> > > > Can you help to give any suggestion? Thanks a lot!
> > > >
> > > > --
> > > > Cheers,
> > > > -----
> > > > Big Data - Big Wisdom - Big Value
> > > > --------------
> > > > Michelle Zhang (张莉苹)
> > > >
> > >
> > > --
> > > _____________________________________________________________
> > > The information contained in this communication is intended solely for
> > the
> > > use of the individual or entity to whom it is addressed and others
> > > authorized to receive it. It may contain confidential or legally
> > privileged
> > > information. If you are not the intended recipient you are hereby
> > notified
> > > that any disclosure, copying, distribution or taking any action in
> > reliance
> > > on the contents of this information is strictly prohibited and may be
> > > unlawful. If you have received this communication in error, please
> notify
> > > us immediately by responding to this email and then delete it from your
> > > system. The firm is neither liable for the proper and complete
> > transmission
> > > of the information contained in this communication nor for any delay in
> > its
> > > receipt.
> > >
> >
> >
> >
> > --
> > Cheers,
> > -----
> > Big Data - Big Wisdom - Big Value
> > --------------
> > Michelle Zhang (张莉苹)
> >
>



-- 
Cheers,
-----
Big Data - Big Wisdom - Big Value
--------------
Michelle Zhang (张莉苹)

Re: Class org.apache.oozie.action.hadoop.SparkMain not found

Posted by Robert Kanter <rk...@cloudera.com>.
Hi Liping,

You don't need all 3 of these:
oozie.use.system.libpath=true
oozie.libpath=hdfs://ip-10-0-4-248.us-west-1.compute.internal:8020/user/oozie/share/lib/lib_20151201085935
oozie.action.sharelib.for.spark=hdfs://ip-10-0-4-248.us-west-1.compute.internal:8020/user/oozie/share/lib/lib_20151201085935/spark

In fact, you may run into problems with the latter two if you ever upgrade
the sharelib, as the timestamped directory will change.  Plus, your setting
for oozie.libpath is including all sharelib subdirectories, which is way
more jars than you need and may also cause conflicts.

All you need is
oozie.use.system.libpath=true

This tells Oozie to get the appropriate sharelib for the action; in this
case, spark.
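
For example (sketch only; the host names below are just the ones already used
in this thread), the job.properties could be trimmed down to:

oozie.use.system.libpath=true
jobTracker=ip-10-0-4-248.us-west-1.compute.internal:8032
nameNode=hdfs://ip-10-0-4-248.us-west-1.compute.internal:8020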

Please read this blog post to get a better understanding of how the
sharelib works:
http://blog.cloudera.com/blog/2014/05/how-to-use-the-sharelib-in-apache-oozie-cdh-5/

- Robert

On Wed, Feb 24, 2016 at 11:40 AM, Liping Zhang <zl...@gmail.com>
wrote:

> Thanks Jaydeep for you quick response!
>
> After adding oozie.action.sharelib.for.spark = spark, it threw
> NoSuchMethodError exception.
>
> Do you know how I can I add guava-16.0.1 into the oozie's class path?
>
>
> >>> Invoking Spark class now >>>
>
>
> <<< Invocation of Main class completed <<<
>
> Failing Oozie Launcher, Main class
> [org.apache.oozie.action.hadoop.SparkMain], main() threw exception,
> com.google.common.reflect.TypeToken.isPrimitive()Z
> java.lang.NoSuchMethodError:
> com.google.common.reflect.TypeToken.isPrimitive()Z
>         at com.datastax.driver.core.TypeCodec.<init>(TypeCodec.java:142)
>         at com.datastax.driver.core.TypeCodec.<init>(TypeCodec.java:136)
>         at
> com.datastax.driver.core.TypeCodec$BlobCodec.<init>(TypeCodec.java:609)
>         at
> com.datastax.driver.core.TypeCodec$BlobCodec.<clinit>(TypeCodec.java:606)
>         at
> com.datastax.driver.core.CodecRegistry.<clinit>(CodecRegistry.java:147)
>         at
> com.datastax.driver.core.Configuration$Builder.build(Configuration.java:259)
>         at
> com.datastax.driver.core.Cluster$Builder.getConfiguration(Cluster.java:1135)
>         at com.datastax.driver.core.Cluster.<init>(Cluster.java:111)
>         at com.datastax.driver.core.Cluster.buildFrom(Cluster.java:178)
>         at
> com.datastax.driver.core.Cluster$Builder.build(Cluster.java:1152)
>         at
> com.datastax.spark.connector.cql.DefaultConnectionFactory$.createCluster(CassandraConnectionFactory.scala:85)
>         at
> com.datastax.spark.connector.cql.CassandraConnector$.com$datastax$spark$connector$cql$CassandraConnector$$createSession(CassandraConnector.scala:155)
>         at
> com.datastax.spark.connector.cql.CassandraConnector$$anonfun$2.apply(CassandraConnector.scala:150)
>         at
> com.datastax.spark.connector.cql.CassandraConnector$$anonfun$2.apply(CassandraConnector.scala:150)
>         at
> com.datastax.spark.connector.cql.RefCountedCache.createNewValueAndKeys(RefCountedCache.scala:31)
>         at
> com.datastax.spark.connector.cql.RefCountedCache.acquire(RefCountedCache.scala:56)
>         at
> com.datastax.spark.connector.cql.CassandraConnector.openSession(CassandraConnector.scala:81)
>         at
> com.datastax.spark.connector.cql.CassandraConnector.withSessionDo(CassandraConnector.scala:109)
>         at
> com.datastax.spark.connector.cql.CassandraConnector.withClusterDo(CassandraConnector.scala:120)
>         at
> com.datastax.spark.connector.cql.Schema$.fromCassandra(Schema.scala:241)
>         at
> com.datastax.spark.connector.writer.TableWriter$.apply(TableWriter.scala:263)
>         at
> com.datastax.spark.connector.RDDFunctions.saveToCassandra(RDDFunctions.scala:36)
>         at TestCassandra$.main(TestCassandra.scala:44)
>         at TestCassandra.main(TestCassandra.scala)
>         at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
>
>
> On Wed, Feb 24, 2016 at 5:00 AM, Jaydeep Vishwakarma <
> jaydeep.vishwakarma@inmobi.com> wrote:
>
> > Add spark sharelib in oozie setup it will work.
> >
> > Regards,
> > Jaydep
> >
> > On Wed, Feb 24, 2016 at 3:13 PM, Liping Zhang <zl...@gmail.com>
> > wrote:
> >
> > > Dear oozie user and dev,
> > >
> > > I set following values in job.properties and in action properties in
> CDH
> > > 5.5.0 Hue oozie:
> > > oozie.use.system.libpath=true
> > >
> > >
> >
> oozie.libpath=hdfs://ip-10-0-4-248.us-west-1.compute.internal:8020/user/oozie/share/lib/lib_20151201085935
> > >
> > >
> >
> oozie.action.sharelib.for.spark=hdfs://ip-10-0-4-248.us-west-1.compute.internal:8020/user/oozie/share/lib/lib_20151201085935/spark
> > >
> > > *Here is the workflow file:*
> > > <workflow-app name="sparktest-cassandra"
> xmlns="uri:oozie:workflow:0.5">
> > >     <start to="spark-b23b"/>
> > >     <kill name="Kill">
> > >         <message>Action failed, error
> > > message[${wf:errorMessage(wf:lastErrorNode())}]</message>
> > >     </kill>
> > >     <action name="spark-b23b">
> > >         <spark xmlns="uri:oozie:spark-action:0.1">
> > >             <job-tracker>${jobTracker}</job-tracker>
> > >             <name-node>${nameNode}</name-node>
> > >             <configuration>
> > >                 <property>
> > >                     <name>spark.executor.extraClassPath</name>
> > >                     <value>lib/guava-16.0.1.jar</value>
> > >                 </property>
> > >                 <property>
> > >                     <name>oozie.action.sharelib.for.spark</name>
> > >
> > > <value>/user/oozie/share/lib/lib_20151201085935/spark</value>
> > >                 </property>
> > >                 <property>
> > >                     <name>oozie.use.system.libpath</name>
> > >                     <value>true</value>
> > >                 </property>
> > >                 <property>
> > >                     <name>oozie.libpath</name>
> > >
> >  <value>/user/oozie/share/lib/lib_20151201085935</value>
> > >                 </property>
> > >             </configuration>
> > >             <master>local[4]</master>
> > >             <mode>client</mode>
> > >             <name>sparktest-cassandra</name>
> > >               <class>TestCassandra</class>
> > >             <jar>lib/sparktest.jar</jar>
> > >               <spark-opts>--driver-class-path
> > > /opt/cloudera/parcels/CDH/jars/guava-16.0.1.jar</spark-opts>
> > >               <arg>s3n://gridx-output/sparktest/ </arg>
> > >               <arg>10</arg>
> > >               <arg>3</arg>
> > >               <arg>2</arg>
> > >         </spark>
> > >         <ok to="End"/>
> > >         <error to="Kill"/>
> > >     </action>
> > >     <end name="End"/>
> > > </workflow-app>
> > >
> > > *job.properties file:*
> > > oozie.use.system.libpath=true
> > > security_enabled=False
> > >
> > >
> >
> oozie.libpath=hdfs://ip-10-0-4-248.us-west-1.compute.internal:8020/user/oozie/share/lib/lib_20151201085935
> > >
> > >
> >
> oozie.action.sharelib.for.spark=hdfs://ip-10-0-4-248.us-west-1.compute.internal:8020/user/oozie/share/lib/lib_20151201085935/spark
> > > dryrun=False
> > > jobTracker=ip-10-0-4-248.us-west-1.compute.internal:8032
> > > nameNode=hdfs://ip-10-0-4-248.us-west-1.compute.internal:8020
> > >
> > >
> > > But I got following SparkMain class not found issue:
> > >
> > > <<< Invocation of Main class completed <<<
> > >
> > > Failing Oozie Launcher, Main class
> > > [org.apache.oozie.action.hadoop.SparkMain], exception invoking main(),
> > > java.lang.ClassNotFoundException: Class
> > > org.apache.oozie.action.hadoop.SparkMain not found
> > > java.lang.RuntimeException: java.lang.ClassNotFoundException: Class
> > > org.apache.oozie.action.hadoop.SparkMain not found
> > >         at
> > > org.apache.hadoop.conf.Configuration.getClass(Configuration.java:2199)
> > >         at
> > >
> >
> org.apache.oozie.action.hadoop.LauncherMapper.map(LauncherMapper.java:234)
> > >         at org.apache.hadoop.mapred.MapRunner.run(MapRunner.java:54)
> > >         at
> > org.apache.hadoop.mapred.MapTask.runOldMapper(MapTask.java:453)
> > >         at org.apache.hadoop.mapred.MapTask.run(MapTask.java:343)
> > >         at
> > >
> >
> org.apache.hadoop.mapred.LocalContainerLauncher$EventHandler.runSubtask(LocalContainerLauncher.java:378)
> > >         at
> > >
> >
> org.apache.hadoop.mapred.LocalContainerLauncher$EventHandler.runTask(LocalContainerLauncher.java:296)
> > >         at
> > >
> >
> org.apache.hadoop.mapred.LocalContainerLauncher$EventHandler.access$200(LocalContainerLauncher.java:181)
> > >         at
> > >
> >
> org.apache.hadoop.mapred.LocalContainerLauncher$EventHandler$1.run(LocalContainerLauncher.java:224)
> > >         at
> > > java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:471)
> > >         at java.util.concurrent.FutureTask.run(FutureTask.java:262)
> > >         at
> > >
> >
> java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1145)
> > >         at
> > >
> >
> java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:615)
> > >         at java.lang.Thread.run(Thread.java:745)
> > > Caused by: java.lang.ClassNotFoundException: Class
> > > org.apache.oozie.action.hadoop.SparkMain not found
> > >         at
> > >
> >
> org.apache.hadoop.conf.Configuration.getClassByName(Configuration.java:2105)
> > >         at
> > > org.apache.hadoop.conf.Configuration.getClass(Configuration.java:2197)
> > >         ... 13 more
> > >
> > > Oozie Launcher failed, finishing Hadoop job gracefully
> > >
> > > Oozie Launcher, uploading action data to HDFS sequence file:
> > >
> > >
> >
> hdfs://ip-10-0-4-248.us-west-1.compute.internal:8020/user/admin/oozie-oozi/0000006-160224085347053-oozie-oozi-W/spark-b23b--spark/action-data.seq
> > > <
> > >
> >
> http://ip-10-0-4-248.us-west-1.compute.internal:8888/filebrowser/view=/user/admin/oozie-oozi/0000006-160224085347053-oozie-oozi-W/spark-b23b--spark/action-data.seq
> > > >
> > >
> > > Oozie Launcher ends
> > >
> > > Can you help to give any suggestion? Thanks a lot!
> > >
> > > --
> > > Cheers,
> > > -----
> > > Big Data - Big Wisdom - Big Value
> > > --------------
> > > Michelle Zhang (张莉苹)
> > >
> >
> >
>
>
>
> --
> Cheers,
> -----
> Big Data - Big Wisdom - Big Value
> --------------
> Michelle Zhang (张莉苹)
>

Re: Class org.apache.oozie.action.hadoop.SparkMain not found

Posted by Liping Zhang <zl...@gmail.com>.
Thanks Jaydeep for your quick response!

After adding oozie.action.sharelib.for.spark=spark, it threw a
NoSuchMethodError exception.

Do you know how I can add guava-16.0.1 to Oozie's classpath?


>>> Invoking Spark class now >>>


<<< Invocation of Main class completed <<<

Failing Oozie Launcher, Main class
[org.apache.oozie.action.hadoop.SparkMain], main() threw exception,
com.google.common.reflect.TypeToken.isPrimitive()Z
java.lang.NoSuchMethodError: com.google.common.reflect.TypeToken.isPrimitive()Z
	at com.datastax.driver.core.TypeCodec.<init>(TypeCodec.java:142)
	at com.datastax.driver.core.TypeCodec.<init>(TypeCodec.java:136)
	at com.datastax.driver.core.TypeCodec$BlobCodec.<init>(TypeCodec.java:609)
	at com.datastax.driver.core.TypeCodec$BlobCodec.<clinit>(TypeCodec.java:606)
	at com.datastax.driver.core.CodecRegistry.<clinit>(CodecRegistry.java:147)
	at com.datastax.driver.core.Configuration$Builder.build(Configuration.java:259)
	at com.datastax.driver.core.Cluster$Builder.getConfiguration(Cluster.java:1135)
	at com.datastax.driver.core.Cluster.<init>(Cluster.java:111)
	at com.datastax.driver.core.Cluster.buildFrom(Cluster.java:178)
	at com.datastax.driver.core.Cluster$Builder.build(Cluster.java:1152)
	at com.datastax.spark.connector.cql.DefaultConnectionFactory$.createCluster(CassandraConnectionFactory.scala:85)
	at com.datastax.spark.connector.cql.CassandraConnector$.com$datastax$spark$connector$cql$CassandraConnector$$createSession(CassandraConnector.scala:155)
	at com.datastax.spark.connector.cql.CassandraConnector$$anonfun$2.apply(CassandraConnector.scala:150)
	at com.datastax.spark.connector.cql.CassandraConnector$$anonfun$2.apply(CassandraConnector.scala:150)
	at com.datastax.spark.connector.cql.RefCountedCache.createNewValueAndKeys(RefCountedCache.scala:31)
	at com.datastax.spark.connector.cql.RefCountedCache.acquire(RefCountedCache.scala:56)
	at com.datastax.spark.connector.cql.CassandraConnector.openSession(CassandraConnector.scala:81)
	at com.datastax.spark.connector.cql.CassandraConnector.withSessionDo(CassandraConnector.scala:109)
	at com.datastax.spark.connector.cql.CassandraConnector.withClusterDo(CassandraConnector.scala:120)
	at com.datastax.spark.connector.cql.Schema$.fromCassandra(Schema.scala:241)
	at com.datastax.spark.connector.writer.TableWriter$.apply(TableWriter.scala:263)
	at com.datastax.spark.connector.RDDFunctions.saveToCassandra(RDDFunctions.scala:36)
	at TestCassandra$.main(TestCassandra.scala:44)
	at TestCassandra.main(TestCassandra.scala)
	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
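
One way this kind of Guava version clash is often worked around is to push the
newer jar to the front of both the driver and executor classpaths via
<spark-opts>. The following is only a sketch: it assumes the Spark bundled
with CDH 5.5 honours the userClassPathFirst settings, and the jar paths are
just the ones already used in the workflow:

  <spark-opts>--driver-class-path /opt/cloudera/parcels/CDH/jars/guava-16.0.1.jar
      --conf spark.executor.extraClassPath=lib/guava-16.0.1.jar
      --conf spark.driver.userClassPathFirst=true
      --conf spark.executor.userClassPathFirst=true</spark-opts>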


On Wed, Feb 24, 2016 at 5:00 AM, Jaydeep Vishwakarma <
jaydeep.vishwakarma@inmobi.com> wrote:

> Add spark sharelib in oozie setup it will work.
>
> Regards,
> Jaydep
>
> On Wed, Feb 24, 2016 at 3:13 PM, Liping Zhang <zl...@gmail.com>
> wrote:
>
> > Dear oozie user and dev,
> >
> > I set following values in job.properties and in action properties in CDH
> > 5.5.0 Hue oozie:
> > oozie.use.system.libpath=true
> >
> >
> oozie.libpath=hdfs://ip-10-0-4-248.us-west-1.compute.internal:8020/user/oozie/share/lib/lib_20151201085935
> >
> >
> oozie.action.sharelib.for.spark=hdfs://ip-10-0-4-248.us-west-1.compute.internal:8020/user/oozie/share/lib/lib_20151201085935/spark
> >
> > *Here is the workflow file:*
> > <workflow-app name="sparktest-cassandra" xmlns="uri:oozie:workflow:0.5">
> >     <start to="spark-b23b"/>
> >     <kill name="Kill">
> >         <message>Action failed, error
> > message[${wf:errorMessage(wf:lastErrorNode())}]</message>
> >     </kill>
> >     <action name="spark-b23b">
> >         <spark xmlns="uri:oozie:spark-action:0.1">
> >             <job-tracker>${jobTracker}</job-tracker>
> >             <name-node>${nameNode}</name-node>
> >             <configuration>
> >                 <property>
> >                     <name>spark.executor.extraClassPath</name>
> >                     <value>lib/guava-16.0.1.jar</value>
> >                 </property>
> >                 <property>
> >                     <name>oozie.action.sharelib.for.spark</name>
> >
> > <value>/user/oozie/share/lib/lib_20151201085935/spark</value>
> >                 </property>
> >                 <property>
> >                     <name>oozie.use.system.libpath</name>
> >                     <value>true</value>
> >                 </property>
> >                 <property>
> >                     <name>oozie.libpath</name>
> >
>  <value>/user/oozie/share/lib/lib_20151201085935</value>
> >                 </property>
> >             </configuration>
> >             <master>local[4]</master>
> >             <mode>client</mode>
> >             <name>sparktest-cassandra</name>
> >               <class>TestCassandra</class>
> >             <jar>lib/sparktest.jar</jar>
> >               <spark-opts>--driver-class-path
> > /opt/cloudera/parcels/CDH/jars/guava-16.0.1.jar</spark-opts>
> >               <arg>s3n://gridx-output/sparktest/ </arg>
> >               <arg>10</arg>
> >               <arg>3</arg>
> >               <arg>2</arg>
> >         </spark>
> >         <ok to="End"/>
> >         <error to="Kill"/>
> >     </action>
> >     <end name="End"/>
> > </workflow-app>
> >
> > *job.properties file:*
> > oozie.use.system.libpath=true
> > security_enabled=False
> >
> >
> oozie.libpath=hdfs://ip-10-0-4-248.us-west-1.compute.internal:8020/user/oozie/share/lib/lib_20151201085935
> >
> >
> oozie.action.sharelib.for.spark=hdfs://ip-10-0-4-248.us-west-1.compute.internal:8020/user/oozie/share/lib/lib_20151201085935/spark
> > dryrun=False
> > jobTracker=ip-10-0-4-248.us-west-1.compute.internal:8032
> > nameNode=hdfs://ip-10-0-4-248.us-west-1.compute.internal:8020
> >
> >
> > But I got following SparkMain class not found issue:
> >
> > <<< Invocation of Main class completed <<<
> >
> > Failing Oozie Launcher, Main class
> > [org.apache.oozie.action.hadoop.SparkMain], exception invoking main(),
> > java.lang.ClassNotFoundException: Class
> > org.apache.oozie.action.hadoop.SparkMain not found
> > java.lang.RuntimeException: java.lang.ClassNotFoundException: Class
> > org.apache.oozie.action.hadoop.SparkMain not found
> >         at
> > org.apache.hadoop.conf.Configuration.getClass(Configuration.java:2199)
> >         at
> >
> org.apache.oozie.action.hadoop.LauncherMapper.map(LauncherMapper.java:234)
> >         at org.apache.hadoop.mapred.MapRunner.run(MapRunner.java:54)
> >         at
> org.apache.hadoop.mapred.MapTask.runOldMapper(MapTask.java:453)
> >         at org.apache.hadoop.mapred.MapTask.run(MapTask.java:343)
> >         at
> >
> org.apache.hadoop.mapred.LocalContainerLauncher$EventHandler.runSubtask(LocalContainerLauncher.java:378)
> >         at
> >
> org.apache.hadoop.mapred.LocalContainerLauncher$EventHandler.runTask(LocalContainerLauncher.java:296)
> >         at
> >
> org.apache.hadoop.mapred.LocalContainerLauncher$EventHandler.access$200(LocalContainerLauncher.java:181)
> >         at
> >
> org.apache.hadoop.mapred.LocalContainerLauncher$EventHandler$1.run(LocalContainerLauncher.java:224)
> >         at
> > java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:471)
> >         at java.util.concurrent.FutureTask.run(FutureTask.java:262)
> >         at
> >
> java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1145)
> >         at
> >
> java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:615)
> >         at java.lang.Thread.run(Thread.java:745)
> > Caused by: java.lang.ClassNotFoundException: Class
> > org.apache.oozie.action.hadoop.SparkMain not found
> >         at
> >
> org.apache.hadoop.conf.Configuration.getClassByName(Configuration.java:2105)
> >         at
> > org.apache.hadoop.conf.Configuration.getClass(Configuration.java:2197)
> >         ... 13 more
> >
> > Oozie Launcher failed, finishing Hadoop job gracefully
> >
> > Oozie Launcher, uploading action data to HDFS sequence file:
> >
> >
> hdfs://ip-10-0-4-248.us-west-1.compute.internal:8020/user/admin/oozie-oozi/0000006-160224085347053-oozie-oozi-W/spark-b23b--spark/action-data.seq
> > <
> >
> http://ip-10-0-4-248.us-west-1.compute.internal:8888/filebrowser/view=/user/admin/oozie-oozi/0000006-160224085347053-oozie-oozi-W/spark-b23b--spark/action-data.seq
> > >
> >
> > Oozie Launcher ends
> >
> > Can you help to give any suggestion? Thanks a lot!
> >
> > --
> > Cheers,
> > -----
> > Big Data - Big Wisdom - Big Value
> > --------------
> > Michelle Zhang (张莉苹)
> >
>
>



-- 
Cheers,
-----
Big Data - Big Wisdom - Big Value
--------------
Michelle Zhang (张莉苹)

Re: Class org.apache.oozie.action.hadoop.SparkMain not found

Posted by Jaydeep Vishwakarma <ja...@inmobi.com>.
Add the spark sharelib to your Oozie setup and it will work.
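
In case it helps, the usual way to check and refresh it with the stock Oozie
CLI looks roughly like this (the Oozie URL is a placeholder; use your server's):

  # list the jars Oozie currently knows about for the spark sharelib
  oozie admin -oozie http://<oozie-host>:11000/oozie -shareliblist spark

  # after adding or changing jars under /user/oozie/share/lib, refresh the server
  oozie admin -oozie http://<oozie-host>:11000/oozie -sharelibupdate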

Regards,
Jaydep

On Wed, Feb 24, 2016 at 3:13 PM, Liping Zhang <zl...@gmail.com> wrote:

> Dear oozie user and dev,
>
> I set following values in job.properties and in action properties in CDH
> 5.5.0 Hue oozie:
> oozie.use.system.libpath=true
>
> oozie.libpath=hdfs://ip-10-0-4-248.us-west-1.compute.internal:8020/user/oozie/share/lib/lib_20151201085935
>
> oozie.action.sharelib.for.spark=hdfs://ip-10-0-4-248.us-west-1.compute.internal:8020/user/oozie/share/lib/lib_20151201085935/spark
>
> *Here is the workflow file:*
> <workflow-app name="sparktest-cassandra" xmlns="uri:oozie:workflow:0.5">
>     <start to="spark-b23b"/>
>     <kill name="Kill">
>         <message>Action failed, error
> message[${wf:errorMessage(wf:lastErrorNode())}]</message>
>     </kill>
>     <action name="spark-b23b">
>         <spark xmlns="uri:oozie:spark-action:0.1">
>             <job-tracker>${jobTracker}</job-tracker>
>             <name-node>${nameNode}</name-node>
>             <configuration>
>                 <property>
>                     <name>spark.executor.extraClassPath</name>
>                     <value>lib/guava-16.0.1.jar</value>
>                 </property>
>                 <property>
>                     <name>oozie.action.sharelib.for.spark</name>
>
> <value>/user/oozie/share/lib/lib_20151201085935/spark</value>
>                 </property>
>                 <property>
>                     <name>oozie.use.system.libpath</name>
>                     <value>true</value>
>                 </property>
>                 <property>
>                     <name>oozie.libpath</name>
>                     <value>/user/oozie/share/lib/lib_20151201085935</value>
>                 </property>
>             </configuration>
>             <master>local[4]</master>
>             <mode>client</mode>
>             <name>sparktest-cassandra</name>
>               <class>TestCassandra</class>
>             <jar>lib/sparktest.jar</jar>
>               <spark-opts>--driver-class-path
> /opt/cloudera/parcels/CDH/jars/guava-16.0.1.jar</spark-opts>
>               <arg>s3n://gridx-output/sparktest/ </arg>
>               <arg>10</arg>
>               <arg>3</arg>
>               <arg>2</arg>
>         </spark>
>         <ok to="End"/>
>         <error to="Kill"/>
>     </action>
>     <end name="End"/>
> </workflow-app>
>
> *job.properties file:*
> oozie.use.system.libpath=true
> security_enabled=False
>
> oozie.libpath=hdfs://ip-10-0-4-248.us-west-1.compute.internal:8020/user/oozie/share/lib/lib_20151201085935
>
> oozie.action.sharelib.for.spark=hdfs://ip-10-0-4-248.us-west-1.compute.internal:8020/user/oozie/share/lib/lib_20151201085935/spark
> dryrun=False
> jobTracker=ip-10-0-4-248.us-west-1.compute.internal:8032
> nameNode=hdfs://ip-10-0-4-248.us-west-1.compute.internal:8020
>
>
> But I got following SparkMain class not found issue:
>
> <<< Invocation of Main class completed <<<
>
> Failing Oozie Launcher, Main class
> [org.apache.oozie.action.hadoop.SparkMain], exception invoking main(),
> java.lang.ClassNotFoundException: Class
> org.apache.oozie.action.hadoop.SparkMain not found
> java.lang.RuntimeException: java.lang.ClassNotFoundException: Class
> org.apache.oozie.action.hadoop.SparkMain not found
>         at
> org.apache.hadoop.conf.Configuration.getClass(Configuration.java:2199)
>         at
> org.apache.oozie.action.hadoop.LauncherMapper.map(LauncherMapper.java:234)
>         at org.apache.hadoop.mapred.MapRunner.run(MapRunner.java:54)
>         at org.apache.hadoop.mapred.MapTask.runOldMapper(MapTask.java:453)
>         at org.apache.hadoop.mapred.MapTask.run(MapTask.java:343)
>         at
> org.apache.hadoop.mapred.LocalContainerLauncher$EventHandler.runSubtask(LocalContainerLauncher.java:378)
>         at
> org.apache.hadoop.mapred.LocalContainerLauncher$EventHandler.runTask(LocalContainerLauncher.java:296)
>         at
> org.apache.hadoop.mapred.LocalContainerLauncher$EventHandler.access$200(LocalContainerLauncher.java:181)
>         at
> org.apache.hadoop.mapred.LocalContainerLauncher$EventHandler$1.run(LocalContainerLauncher.java:224)
>         at
> java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:471)
>         at java.util.concurrent.FutureTask.run(FutureTask.java:262)
>         at
> java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1145)
>         at
> java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:615)
>         at java.lang.Thread.run(Thread.java:745)
> Caused by: java.lang.ClassNotFoundException: Class
> org.apache.oozie.action.hadoop.SparkMain not found
>         at
> org.apache.hadoop.conf.Configuration.getClassByName(Configuration.java:2105)
>         at
> org.apache.hadoop.conf.Configuration.getClass(Configuration.java:2197)
>         ... 13 more
>
> Oozie Launcher failed, finishing Hadoop job gracefully
>
> Oozie Launcher, uploading action data to HDFS sequence file:
>
> hdfs://ip-10-0-4-248.us-west-1.compute.internal:8020/user/admin/oozie-oozi/0000006-160224085347053-oozie-oozi-W/spark-b23b--spark/action-data.seq
> <
> http://ip-10-0-4-248.us-west-1.compute.internal:8888/filebrowser/view=/user/admin/oozie-oozi/0000006-160224085347053-oozie-oozi-W/spark-b23b--spark/action-data.seq
> >
>
> Oozie Launcher ends
>
> Can you help to give any suggestion? Thanks a lot!
>
> --
> Cheers,
> -----
> Big Data - Big Wisdom - Big Value
> --------------
> Michelle Zhang (张莉苹)
>

