Posted to user@kylin.apache.org by Jon Shoberg <jo...@gmail.com> on 2019/06/10 23:06:53 UTC

Fixed - Re: MetricRegistry - NoClassDefFoundError - Hbase Jar Issue - Spark - (Step - Convert Cuboid Data to HFile)?

This has been fixed now.  I had to manually search the jar files and include
the proper jars in the "jars" folder of my Spark setup.

find /opt -name "*.jar" | xargs grep org.apache.hadoop.hbase.metrics.MetricRegistry

Afterwards the step completed and finished the job on the upgraded cluster.
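
In case it helps anyone, here is roughly what the search and copy looked like. A sketch, assuming HBase lives under /opt/hbase and the Spark bundled with Kylin under /opt/kylin/spark (adjust paths for your layout); unzip -l lists the jar entries, which avoids grepping compressed jar bytes:

# Find the jar(s) that actually ship the missing class; inside a jar the
# entry path uses slashes and a .class suffix, not the dotted class name.
find /opt -name "*.jar" | while read -r j; do
  unzip -l "$j" 2>/dev/null | grep -q 'org/apache/hadoop/hbase/metrics/MetricRegistry.class' && echo "$j"
done

# In HBase 1.4.x that class ships in hbase-metrics-api; copy it, plus
# hbase-metrics which implements it, into Spark's jars folder so every
# driver and executor JVM picks them up.
cp /opt/hbase/lib/hbase-metrics-api-1.4.8.jar \
   /opt/hbase/lib/hbase-metrics-1.4.8.jar \
   /opt/kylin/spark/jars/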

J


On Mon, Jun 10, 2019 at 8:24 AM Jon Shoberg <jo...@gmail.com> wrote:

> Good Morning,
>
>
>  [On Error and Question]
>
>
>   Has anyone fixed the error in the "Convert Cuboid Data to HFile" step with Spark 2.3.2 and Kylin 2.6.2?
>
>
>    * java.lang.NoClassDefFoundError: Lorg/apache/hadoop/hbase/metrics/MetricRegistry;
>
>
>  I think this is a classpath issue. The library is present and on the classpath on all the master and slave servers. (The "Lorg/...;" form in the error is a JVM field descriptor, which suggests the class fails to load while Hadoop's metrics code reflects over a field of that type; see getDeclaredFields0 in the trace below.)
>
>
>  There is a JIRA issue that should have fixed this in 2.6.0, but I'm still getting the error message.
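>
>  To sanity-check whether the class is actually visible to the executor JVMs (the stack trace shows the failure happens inside a task, not in the driver), something like the following spark-shell probe should print the class name if loading succeeds. A sketch, not verified on this cluster; paths per my install:
>
>  # Run a one-element job so the Class.forName call executes inside an
>  # executor JVM rather than only on the driver.
>  echo 'sc.parallelize(1 to 1).map(_ => Class.forName("org.apache.hadoop.hbase.metrics.MetricRegistry").getName).collect().foreach(println)' \
>    | /opt/kylin/spark/bin/spark-shell --master yarn --jars /opt/hbase/lib/hbase-metrics-api-1.4.8.jar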
>
>
>
>  [On The Command]
>
>
>   Following is the command for the step. I've manually added what I think are the required JARs, but I get the same failure:
>
>
>  export HADOOP_CONF_DIR=/opt/kylin/hadoop-conf && \
>  /opt/kylin/spark/bin/spark-submit \
>    --class org.apache.kylin.common.util.SparkEntry \
>    --conf spark.max-partition=500 \
>    --conf spark.executor.instances=40 \
>    --conf spark.yarn.queue=default \
>    --conf spark.history.fs.logDirectory=hdfs:///kylin/spark-history \
>    --conf spark.io.compression.codec=snappy \
>    --conf spark.dynamicAllocation.enabled=true \
>    --conf spark.driver.extraClassPath=/opt/spark/jars/snappy*.jar \
>    --conf spark.master=yarn \
>    --conf spark.dynamicAllocation.schedulerBacklogTimeout=1 \
>    --conf spark.dynamicAllocation.executorIdleTimeout=300 \
>    --conf spark.hadoop.yarn.timeline-service.enabled=false \
>    --conf spark.executor.memory=10G \
>    --conf spark.eventLog.enabled=true \
>    --conf spark.eventLog.dir=hdfs:///kylin/spark-history \
>    --conf spark.dynamicAllocation.minExecutors=1 \
>    --conf spark.local.dir=/opt/volume/tmp1,/opt/volume/tmp2,/opt/volume/tmp3 \
>    --conf spark.executor.cores=1 \
>    --conf spark.hadoop.mapreduce.output.fileoutputformat.compress=true \
>    --conf spark.executor.memoryOverhead=1024 \
>    --conf spark.hadoop.dfs.replication=2 \
>    --conf spark.dynamicAllocation.maxExecutors=1000 \
>    --conf spark.hadoop.mapreduce.output.fileoutputformat.compress.codec=org.apache.hadoop.io.compress.DefaultCodec \
>    --conf spark.driver.extraLibraryPath=/opt/hadoop/lib/native \
>    --conf spark.driver.memory=10G \
>    --conf spark.submit.deployMode=cluster \
>    --conf spark.executor.extraLibraryPath=/opt/hadoop/lib/native \
>    --conf spark.shuffle.service.enabled=true \
>    --jars /opt/hbase/lib/hbase-common-1.4.8.jar,/opt/hbase/lib/hbase-server-1.4.8.jar,/opt/hbase/lib/hbase-client-1.4.8.jar,/opt/hbase/lib/hbase-protocol-1.4.8.jar,/opt/hbase/lib/hbase-hadoop-compat-1.4.8.jar,/opt/hbase/lib/htrace-core-3.1.0-incubating.jar,/opt/hbase/lib/hbase-metrics-1.4.8.jar,/opt/hbase/lib/metrics-core-2.2.0.jar,/opt/hbase/lib/hbase-hadoop-compat-1.4.8.jar,/opt/hbase/lib/hbase-hadoop2-compat-1.4.8.jar,/opt/hbase/lib/hbase-metrics-api-1.4.8.jar, \
>    /opt/kylin/lib/kylin-job-2.6.2.jar \
>    -className org.apache.kylin.storage.hbase.steps.SparkCubeHFile \
>    -partitions hdfs://master:9000/kylin/kylin_metadata/kylin-15389716-d96c-4778-ffe6-aa7ec8945c65/matt_vw_nodes/rowkey_stats/part-r-00000_hfile \
>    -counterOutput hdfs://master:9000/kylin/kylin_metadata/kylin-15389716-d96c-4778-ffe6-aa7ec8945c65/matt_vw_nodes/counter \
>    -cubename matt_vw_nodes \
>    -output hdfs://master:9000/kylin/kylin_metadata/kylin-15389716-d96c-4778-ffe6-aa7ec8945c65/matt_vw_nodes/hfile \
>    -input hdfs://master:9000/kylin/kylin_metadata/kylin-15389716-d96c-4778-ffe6-aa7ec8945c65/matt_vw_nodes/cuboid/ \
>    -segmentId 91dbfc05-1e54-8fd1-6550-855a6dd657da \
>    -metaUrl kylin_metadata@hdfs,path=hdfs://master:9000/kylin/kylin_metadata/kylin-15389716-d96c-4778-ffe6-aa7ec8945c65/matt_vw_nodes/metadata \
>    -hbaseConfPath hdfs://master:9000/kylin/kylin_metadata/kylin-15389716-d96c-4778-ffe6-aa7ec8945c65/hbase-conf.xml
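>
>  Since --jars alone did not help, another route I may try (untested) is pinning the HBase metrics jars on every JVM's classpath directly via Spark's standard extraClassPath properties. A sketch, assuming the jars exist at the same local path on every node; note the driver line extends the spark.driver.extraClassPath already set above rather than replacing it:
>
>    --conf spark.driver.extraClassPath=/opt/spark/jars/snappy*.jar:/opt/hbase/lib/hbase-metrics-api-1.4.8.jar:/opt/hbase/lib/hbase-metrics-1.4.8.jar \
>    --conf spark.executor.extraClassPath=/opt/hbase/lib/hbase-metrics-api-1.4.8.jar:/opt/hbase/lib/hbase-metrics-1.4.8.jar \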
>
>
>  [On Diagnostics]
>
>     Here is the relevant log output for the error.
>
>
>
>
>
> 2019-06-10 08:09:40 INFO  TransportClientFactory:184 - Found inactive connection to master.hadoop.lan/192.168.1.20:7337, creating a new one.
> 2019-06-10 08:09:40 INFO  TransportClientFactory:267 - Successfully created connection to master.hadoop.lan/192.168.1.20:7337 after 3 ms (0 ms spent in bootstraps)
> 2019-06-10 08:09:41 INFO  TransportClientFactory:184 - Found inactive connection to slave3.hadoop.lan/192.168.1.23:7337, creating a new one.
> 2019-06-10 08:09:41 INFO  TransportClientFactory:267 - Successfully created connection to slave3.hadoop.lan/192.168.1.23:7337 after 2 ms (0 ms spent in bootstraps)
> 2019-06-10 08:10:33 INFO  FileOutputCommitter:108 - File Output Committer Algorithm version is 1
> 2019-06-10 08:10:34 INFO  FileOutputCommitter:108 - File Output Committer Algorithm version is 1
> 2019-06-10 08:10:34 INFO  MetricsConfig:112 - loaded properties from hadoop-metrics2.properties
> 2019-06-10 08:10:35 INFO  MetricsSystemImpl:375 - Scheduled snapshot period at 10 second(s).
> 2019-06-10 08:10:35 INFO  MetricsSystemImpl:192 - HBase metrics system started
> 2019-06-10 08:10:35 ERROR Utils:91 - Aborting task
> java.lang.NoClassDefFoundError: Lorg/apache/hadoop/hbase/metrics/MetricRegistry;
> 	at java.lang.Class.getDeclaredFields0(Native Method)
> 	at java.lang.Class.privateGetDeclaredFields(Class.java:2583)
> 	at java.lang.Class.getDeclaredFields(Class.java:1916)
> 	at org.apache.hadoop.util.ReflectionUtils.getDeclaredFieldsIncludingInherited(ReflectionUtils.java:323)
> 	at org.apache.hadoop.metrics2.lib.MetricsSourceBuilder.initRegistry(MetricsSourceBuilder.java:92)
> 	at org.apache.hadoop.metrics2.lib.MetricsSourceBuilder.<init>(MetricsSourceBuilder.java:56)
> 	at org.apache.hadoop.metrics2.lib.MetricsAnnotations.newSourceBuilder(MetricsAnnotations.java:43)
> 	at org.apache.hadoop.metrics2.impl.MetricsSystemImpl.register(MetricsSystemImpl.java:224)
> 	at org.apache.hadoop.hbase.metrics.BaseSourceImpl.<init>(BaseSourceImpl.java:115)
> 	at org.apache.hadoop.hbase.io.MetricsIOSourceImpl.<init>(MetricsIOSourceImpl.java:44)
> 	at org.apache.hadoop.hbase.io.MetricsIOSourceImpl.<init>(MetricsIOSourceImpl.java:36)
> 	at org.apache.hadoop.hbase.regionserver.MetricsRegionServerSourceFactoryImpl.createIO(MetricsRegionServerSourceFactoryImpl.java:73)
> 	at org.apache.hadoop.hbase.io.MetricsIO.<init>(MetricsIO.java:31)
> 	at org.apache.hadoop.hbase.io.hfile.HFile.<clinit>(HFile.java:191)
> 	at org.apache.hadoop.hbase.mapreduce.HFileOutputFormat2$1.getNewWriter(HFileOutputFormat2.java:305)
> 	at org.apache.hadoop.hbase.mapreduce.HFileOutputFormat2$1.write(HFileOutputFormat2.java:229)
> 	at org.apache.hadoop.hbase.mapreduce.HFileOutputFormat2$1.write(HFileOutputFormat2.java:167)
> 	at org.apache.spark.internal.io.HadoopMapReduceWriteConfigUtil.write(SparkHadoopWriter.scala:356)
> 	at org.apache.spark.internal.io.SparkHadoopWriter$$anonfun$4.apply(SparkHadoopWriter.scala:130)
> 	at org.apache.spark.internal.io.SparkHadoopWriter$$anonfun$4.apply(SparkHadoopWriter.scala:127)
> 	at org.apache.spark.util.Utils$.tryWithSafeFinallyAndFailureCallbacks(Utils.scala:1415)
> 	at org.apache.spark.internal.io.SparkHadoopWriter$.org$apache$spark$internal$io$SparkHadoopWriter$$executeTask(SparkHadoopWriter.scala:139)
> 	at org.apache.spark.internal.io.SparkHadoopWriter$$anonfun$3.apply(SparkHadoopWriter.scala:83)
> 	at org.apache.spark.internal.io.SparkHadoopWriter$$anonfun$3.apply(SparkHadoopWriter.scala:78)
> 	at org.apache.spark.scheduler.ResultTask.runTask(ResultTask.scala:87)
> 	at org.apache.spark.scheduler.Task.run(Task.scala:109)
> 	at org.apache.spark.executor.Executor$TaskRunner.run(Executor.scala:345)
> 	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
> 	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
> 	at java.lang.Thread.run(Thread.java:748)
> Caused by: java.lang.ClassNotFoundException: org.apache.hadoop.hbase.metrics.MetricRegistry
> 	at java.net.URLClassLoader.findClass(URLClassLoader.java:382)
> 	at java.lang.ClassLoader.loadClass(ClassLoader.java:424)
> 	at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:349)
> 	at java.lang.ClassLoader.loadClass(ClassLoader.java:357)
> 	... 30 more
> 2019-06-10 08:10:35 ERROR SparkHadoopWriter:70 - Task attempt_20190610074916_0003_r_000000_0 aborted.
> 2019-06-10 08:10:35 ERROR Executor:91 - Exception in task 0.0 in stage 1.0 (TID 56)
> org.apache.spark.SparkException: Task failed while writing rows
> 	at org.apache.spark.internal.io.SparkHadoopWriter$.org$apache$spark$internal$io$SparkHadoopWriter$$executeTask(SparkHadoopWriter.scala:155)
> 	at org.apache.spark.internal.io.SparkHadoopWriter$$anonfun$3.apply(SparkHadoopWriter.scala:83)
> 	at org.apache.spark.internal.io.SparkHadoopWriter$$anonfun$3.apply(SparkHadoopWriter.scala:78)
> 	at org.apache.spark.scheduler.ResultTask.runTask(ResultTask.scala:87)
> 	at org.apache.spark.scheduler.Task.run(Task.scala:109)
> 	at org.apache.spark.executor.Executor$TaskRunner.run(Executor.scala:345)
> 	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
> 	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
> 	at java.lang.Thread.run(Thread.java:748)
> Caused by: java.lang.NoClassDefFoundError: Lorg/apache/hadoop/hbase/metrics/MetricRegistry;
> 	at java.lang.Class.getDeclaredFields0(Native Method)
> 	at java.lang.Class.privateGetDeclaredFields(Class.java:2583)
> 	at java.lang.Class.getDeclaredFields(Class.java:1916)
> 	at org.apache.hadoop.util.ReflectionUtils.getDeclaredFieldsIncludingInherited(ReflectionUtils.java:323)
> 	at org.apache.hadoop.metrics2.lib.MetricsSourceBuilder.initRegistry(MetricsSourceBuilder.java:92)
> 	at org.apache.hadoop.metrics2.lib.MetricsSourceBuilder.<init>(MetricsSourceBuilder.java:56)
> 	at org.apache.hadoop.metrics2.lib.MetricsAnnotations.newSourceBuilder(MetricsAnnotations.java:43)
> 	at org.apache.hadoop.metrics2.impl.MetricsSystemImpl.register(MetricsSystemImpl.java:224)
> 	at org.apache.hadoop.hbase.metrics.BaseSourceImpl.<init>(BaseSourceImpl.java:115)
> 	at org.apache.hadoop.hbase.io.MetricsIOSourceImpl.<init>(MetricsIOSourceImpl.java:44)
> 	at org.apache.hadoop.hbase.io.MetricsIOSourceImpl.<init>(MetricsIOSourceImpl.java:36)
> 	at org.apache.hadoop.hbase.regionserver.MetricsRegionServerSourceFactoryImpl.createIO(MetricsRegionServerSourceFactoryImpl.java:73)
> 	at org.apache.hadoop.hbase.io.MetricsIO.<init>(MetricsIO.java:31)
> 	at org.apache.hadoop.hbase.io.hfile.HFile.<clinit>(HFile.java:191)
> 	at org.apache.hadoop.hbase.mapreduce.HFileOutputFormat2$1.getNewWriter(HFileOutputFormat2.java:305)
> 	at org.apache.hadoop.hbase.mapreduce.HFileOutputFormat2$1.write(HFileOutputFormat2.java:229)
> 	at org.apache.hadoop.hbase.mapreduce.HFileOutputFormat2$1.write(HFileOutputFormat2.java:167)
> 	at org.apache.spark.internal.io.HadoopMapReduceWriteConfigUtil.write(SparkHadoopWriter.scala:356)
> 	at org.apache.spark.internal.io.SparkHadoopWriter$$anonfun$4.apply(SparkHadoopWriter.scala:130)
> 	at org.apache.spark.internal.io.SparkHadoopWriter$$anonfun$4.apply(SparkHadoopWriter.scala:127)
> 	at org.apache.spark.util.Utils$.tryWithSafeFinallyAndFailureCallbacks(Utils.scala:1415)
> 	at org.apache.spark.internal.io.SparkHadoopWriter$.org$apache$spark$internal$io$SparkHadoopWriter$$executeTask(SparkHadoopWriter.scala:139)
> 	... 8 more
> Caused by: java.lang.ClassNotFoundException: org.apache.hadoop.hbase.metrics.MetricRegistry
> 	at java.net.URLClassLoader.findClass(URLClassLoader.java:382)
> 	at java.lang.ClassLoader.loadClass(ClassLoader.java:424)
> 	at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:349)
> 	at java.lang.ClassLoader.loadClass(ClassLoader.java:357)
> 	... 30 more
> 2019-06-10 08:10:35 INFO  CoarseGrainedExecutorBackend:54 - Got assigned task 57
> 2019-06-10 08:10:35 INFO  Executor:54 - Running task 0.1 in stage 1.0 (TID 57)
> 2019-06-10 08:10:35 INFO  ShuffleBlockFetcherIterator:54 - Getting 56 non-empty blocks out of 56 blocks
> 2019-06-10 08:10:35 INFO  TransportClientFactory:184 - Found inactive connection to slave1.hadoop.lan/192.168.1.21:7337, creating a new one.
> 2019-06-10 08:10:35 INFO  TransportClientFactory:267 - Successfully created connection to slave1.hadoop.lan/192.168.1.21:7337 after 2 ms (0 ms spent in bootstraps)
> 2019-06-10 08:10:35 INFO  ShuffleBlockFetcherIterator:54 - Started 1 remote fetches in 9 ms
> 2019-06-10 08:10:35 INFO  TransportClientFactory:184 - Found inactive connection to slave2.hadoop.lan/192.168.1.22:7337, creating a new one.
> 2019-06-10 08:10:35 INFO  TransportClientFactory:267 - Successfully created connection to slave2.hadoop.lan/192.168.1.22:7337 after 2 ms (0 ms spent in bootstraps)
> 2019-06-10 08:11:07 INFO  ExternalSorter:54 - Thread 112 spilling in-memory map of 5.2 GB to disk (1 time so far)
>
>

Re: Fixed - Re: MetricRegistry - NoClassDefFoundError - Hbase Jar Issue - Spark - (Step - Convert Cuboid Data to HFile)?

Posted by ShaoFeng Shi <sh...@apache.org>.
Hi Jon, thank you for sharing. :)

Best regards,

Shaofeng Shi 史少锋
Apache Kylin PMC
Email: shaofengshi@apache.org

Apache Kylin FAQ: https://kylin.apache.org/docs/gettingstarted/faq.html
Join Kylin user mail group: user-subscribe@kylin.apache.org
Join Kylin dev mail group: dev-subscribe@kylin.apache.org




Jon Shoberg <jo...@gmail.com> wrote on Tue, Jun 11, 2019 at 7:07 AM:

> This has been fixed now.  I had to manually search the jar files and include
> the proper jars in the "jars" folder of my Spark setup.
>
> find /opt -name "*.jar" | xargs grep org.apache.hadoop.hbase.metrics.MetricRegistry
>
> Afterwards the step completed and finished the job on the upgraded cluster.
>
> J
>
> [rest of quoted thread snipped; it repeats the original message in full above]