Posted to user@phoenix.apache.org by "sunfl@certusnet.com.cn" <su...@certusnet.com.cn> on 2014/09/02 04:05:32 UTC

Unable to find cached index metadata

Hi, everyone,
   I used the latest 4.1 release to run some tests on local indexing. When I try to load data into a
   Phoenix table with a local index, I get the following error. I am not sure whether it is related to the HBase
   local index table, since the HBase local index table is uniformly prefixed with '_LOCAL_IDX_' + TableRef.
   Any hints? Please also correct me if I have misunderstood something. 
   Best Regards, Sun.
     org.apache.phoenix.execute.CommitException: java.sql.SQLException: ERROR 2008 (INT10): Unable to find cached index metadata. ERROR 2008 (INT10): ERROR 2008 (INT10): Unable to find cached index metadata. key=-8614688887238479432 region=RANAPSIGNAL,\x0D\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00,1409566437551.9e47a9f579f7cf3865d1148480a3b1b9. Index update failed
        org.apache.phoenix.execute.MutationState.commit(MutationState.java:433)
        org.apache.phoenix.jdbc.PhoenixConnection$3.call(PhoenixConnection.java:384)
        org.apache.phoenix.jdbc.PhoenixConnection$3.call(PhoenixConnection.java:381)
        org.apache.phoenix.call.CallRunner.run(CallRunner.java:53)
        org.apache.phoenix.jdbc.PhoenixConnection.commit(PhoenixConnection.java:381)
        com.certusnet.spark.bulkload.ranap.RanapSignalJdbcPhoenix$$anonfun$13$$anonfun$apply$1.apply(RanapSignalJdbcPhoenix.scala:113)
        com.certusnet.spark.bulkload.ranap.RanapSignalJdbcPhoenix$$anonfun$13$$anonfun$apply$1.apply(RanapSignalJdbcPhoenix.scala:104)
        scala.collection.Iterator$class.foreach(Iterator.scala:727)
        scala.collection.AbstractIterator.foreach(Iterator.scala:1157)
        com.certusnet.spark.bulkload.ranap.RanapSignalJdbcPhoenix$$anonfun$13.apply(RanapSignalJdbcPhoenix.scala:104)
        com.certusnet.spark.bulkload.ranap.RanapSignalJdbcPhoenix$$anonfun$13.apply(RanapSignalJdbcPhoenix.scala:89)
        scala.collection.Iterator$class.foreach(Iterator.scala:727)
        scala.collection.AbstractIterator.foreach(Iterator.scala:1157)
        org.apache.spark.rdd.RDD$$anonfun$foreach$1.apply(RDD.scala:759)
        org.apache.spark.rdd.RDD$$anonfun$foreach$1.apply(RDD.scala:759)
        org.apache.spark.SparkContext$$anonfun$runJob$4.apply(SparkContext.scala:1121)
        org.apache.spark.SparkContext$$anonfun$runJob$4.apply(SparkContext.scala:1121)
        org.apache.spark.scheduler.ResultTask.runTask(ResultTask.scala:62)
        org.apache.spark.scheduler.Task.run(Task.scala:54)
        org.apache.spark.executor.Executor$TaskRunner.run(Executor.scala:177)
        java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1145)
        java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:615)
        java.lang.Thread.run(Thread.java:744)





CertusNet 


Re: RE: Unable to find cached index metadata

Posted by "sunfl@certusnet.com.cn" <su...@certusnet.com.cn>.
Hi, Rajeshbabu,
  Thanks very much for your suggestion. I am actually running a Spark job to load data into Phoenix;
  the data is stored in HDFS as sequence files. I am trying to tune and optimize the data loading into
  Phoenix, since my current project requires heavy data writes. Previous tests with no index and with a
  global index both worked fine, though with different loading speeds. In the latest 4.1 release I noticed
  the local indexing feature, with the following use-case suggestion:
               Local indexing targets write heavy, space constrained use cases. 

  So I would like to test local indexing for my project. However, the data loading speed became extremely slow compared with
  my previous data loading. 
  The following is the Scala code snippet I use for loading data into Phoenix:
         

iter1.grouped(5000).zipWithIndex foreach { case (batch, batchIndex) =>
  batch foreach { v =>
    // upsert each record from the HDFS sequence file into the Phoenix table
    // via 'UPSERT INTO mytable VALUES (...)'
    hbaseUpsertExecutor.execute(v._2, true)
  }
  hbaseUpsertExecutor.executeBatch()
  conn.commit()   // <-- the error is thrown here: ERROR 2008 (INT10): Unable to find cached index metadata.
  // logger.info("inserted batch " + batchIndex + " with " + batch.size + " elements")
}
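
For reference, hbaseUpsertExecutor essentially issues one UPSERT per record; stripped of the project-specific
wrapper, the pattern boils down to plain Phoenix JDBC roughly as below (the table MYTABLE, its columns, the
"zk-host" quorum and the records iterator are placeholders, not my real schema):

import java.sql.DriverManager

// Placeholder for the rows decoded from the HDFS sequence files.
val records: Iterator[(Long, String)] = Iterator.empty

val conn = DriverManager.getConnection("jdbc:phoenix:zk-host")
conn.setAutoCommit(false)                 // buffer upserts on the client, flush once per batch
val stmt = conn.prepareStatement("UPSERT INTO MYTABLE (ID, VAL) VALUES (?, ?)")

records.grouped(5000) foreach { batch =>
  batch foreach { case (id, value) =>
    stmt.setLong(1, id)
    stmt.setString(2, value)
    stmt.executeUpdate()                  // queued client-side until commit()
  }
  conn.commit()                           // sends the batch (and its index updates) to the region servers
}

stmt.close()
conn.close()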
 
    I cannot tell the cause of the error from the stack trace, and would like to understand whether local indexing needs some 
   additional configuration or anything else I should pay attention to.
   Best regards, 
   Sun





CertusNet 

From: rajeshbabu chintaguntla
Date: 2014-09-02 16:47
To: user@phoenix.apache.org
Subject: RE: Re: Unable to find cached index metadata
bq. I am trying to load data into the Phoenix table; since Phoenix may not support index-related 
   data bulkload, I am trying to upsert data into Phoenix through JDBC statements. 

In the 4.1 release, CSVBulkLoadTool can be used to build indexes while loading data. See [1].
More work along these lines is in progress [2].

1. https://issues.apache.org/jira/browse/PHOENIX-1069
2. https://issues.apache.org/jira/browse/PHOENIX-1056

Are you getting the exception on the first upsert attempt or in the middle of loading the data?

Can you provide the code snippet (or statements) you are using to upsert data?

Thanks,
Rajeshbabu.




From: sunfl@certusnet.com.cn [sunfl@certusnet.com.cn]
Sent: Tuesday, September 02, 2014 8:27 AM
To: user
Subject: Re: Re: Unable to find cached index metadata

Hi,
   Thanks for your reply. Sorry for not completely describing my job setup.
   I had configured those properties in hbase-site.xml on the HMaster node and ran sqlline to
  create the table in Phoenix, and then created a local index on the table. 
   I am trying to load data into the Phoenix table; since Phoenix may not support index-related 
   data bulkload, I am trying to upsert data into Phoenix through JDBC statements. Then I got the 
   following error, and I am not sure about the reason. BTW, upserting data without the local index works fine.
   Looking forward to your reply, and thanks.  





CertusNet 
 
From: rajesh babu Chintaguntla
Date: 2014-09-02 10:55
To: user
Subject: Re: Unable to find cached index metadata
Hi Sun, 
Thanks for testing,

Have you configured the following properties on the master side and restarted it before creating local indexes?
<property>
  <name>hbase.master.loadbalancer.class</name>
  <value>org.apache.phoenix.hbase.index.balancer.IndexLoadBalancer</value>
</property>
<property>
  <name>hbase.coprocessor.master.classes</name>
  <value>org.apache.phoenix.hbase.index.master.IndexMasterObserver</value>
</property>




On Tue, Sep 2, 2014 at 7:35 AM, sunfl@certusnet.com.cn <su...@certusnet.com.cn> wrote:
Hi, everyone,
   I used the latest 4.1 release to run some tests on local indexing. When I try to load data into a
   Phoenix table with a local index, I get the following error. I am not sure whether it is related to the HBase
   local index table, since the HBase local index table is uniformly prefixed with '_LOCAL_IDX_' + TableRef.
   Any hints? Please also correct me if I have misunderstood something. 
   Best Regards, Sun.
     org.apache.phoenix.execute.CommitException: java.sql.SQLException: ERROR 2008 (INT10): Unable to find cached index metadata. ERROR 2008 (INT10): ERROR 2008 (INT10): Unable to find cached index metadata. key=-8614688887238479432 region=RANAPSIGNAL,\x0D\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00,1409566437551.9e47a9f579f7cf3865d1148480a3b1b9. Index update failed
        org.apache.phoenix.execute.MutationState.commit(MutationState.java:433)
        org.apache.phoenix.jdbc.PhoenixConnection$3.call(PhoenixConnection.java:384)
        org.apache.phoenix.jdbc.PhoenixConnection$3.call(PhoenixConnection.java:381)
        org.apache.phoenix.call.CallRunner.run(CallRunner.java:53)
        org.apache.phoenix.jdbc.PhoenixConnection.commit(PhoenixConnection.java:381)
        com.certusnet.spark.bulkload.ranap.RanapSignalJdbcPhoenix$$anonfun$13$$anonfun$apply$1.apply(RanapSignalJdbcPhoenix.scala:113)
        com.certusnet.spark.bulkload.ranap.RanapSignalJdbcPhoenix$$anonfun$13$$anonfun$apply$1.apply(RanapSignalJdbcPhoenix.scala:104)
        scala.collection.Iterator$class.foreach(Iterator.scala:727)
        scala.collection.AbstractIterator.foreach(Iterator.scala:1157)
        com.certusnet.spark.bulkload.ranap.RanapSignalJdbcPhoenix$$anonfun$13.apply(RanapSignalJdbcPhoenix.scala:104)
        com.certusnet.spark.bulkload.ranap.RanapSignalJdbcPhoenix$$anonfun$13.apply(RanapSignalJdbcPhoenix.scala:89)
        scala.collection.Iterator$class.foreach(Iterator.scala:727)
        scala.collection.AbstractIterator.foreach(Iterator.scala:1157)
        org.apache.spark.rdd.RDD$$anonfun$foreach$1.apply(RDD.scala:759)
        org.apache.spark.rdd.RDD$$anonfun$foreach$1.apply(RDD.scala:759)
        org.apache.spark.SparkContext$$anonfun$runJob$4.apply(SparkContext.scala:1121)
        org.apache.spark.SparkContext$$anonfun$runJob$4.apply(SparkContext.scala:1121)
        org.apache.spark.scheduler.ResultTask.runTask(ResultTask.scala:62)
        org.apache.spark.scheduler.Task.run(Task.scala:54)
        org.apache.spark.executor.Executor$TaskRunner.run(Executor.scala:177)
        java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1145)
        java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:615)
        java.lang.Thread.run(Thread.java:744)





CertusNet 



RE: Re: Unable to find cached index metadata

Posted by rajeshbabu chintaguntla <ra...@huawei.com>.
bq. I am trying to load data into the Phoenix table; since Phoenix may not support index-related
   data bulkload, I am trying to upsert data into Phoenix through JDBC statements.

In the 4.1 release, CSVBulkLoadTool can be used to build indexes while loading data. See [1].
More work along these lines is in progress [2].

1. https://issues.apache.org/jira/browse/PHOENIX-1069
2. https://issues.apache.org/jira/browse/PHOENIX-1056
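
For example, the tool can be run from the client roughly like this (illustrative only: the jar name, table,
input path and ZooKeeper quorum are placeholders; run it with --help to see the exact options in your build):

hadoop jar phoenix-4.1.0-client.jar org.apache.phoenix.mapreduce.CsvBulkLoadTool \
    --table MYTABLE \
    --input /data/mytable.csv \
    --zookeeper zk-host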

Are you getting the exception on the first upsert attempt or in the middle of loading the data?

Can you provide the code snippet (or statements) you are using to upsert data?

Thanks,
Rajeshbabu.
________________________________
________________________________
From: sunfl@certusnet.com.cn [sunfl@certusnet.com.cn]
Sent: Tuesday, September 02, 2014 8:27 AM
To: user
Subject: Re: Re: Unable to find cached index metadata

Hi,
   Thanks for your reply. Sorry for not completely describing my job setup.
   I had configured those properties in hbase-site.xml on the HMaster node and ran sqlline to
  create the table in Phoenix, and then created a local index on the table.
   I am trying to load data into the Phoenix table; since Phoenix may not support index-related
   data bulkload, I am trying to upsert data into Phoenix through JDBC statements. Then I got the
   following error, and I am not sure about the reason. BTW, upserting data without the local index works fine.
   Looking forward to your reply, and thanks.

________________________________
________________________________

CertusNet


From: rajesh babu Chintaguntla
Date: 2014-09-02 10:55
To: user
Subject: Re: Unable to find cached index metadata
Hi Sun,
Thanks for testing,

Have you configured the following properties on the master side and restarted it before creating local indexes?

<property>
  <name>hbase.master.loadbalancer.class</name>
  <value>org.apache.phoenix.hbase.index.balancer.IndexLoadBalancer</value>
</property>
<property>
  <name>hbase.coprocessor.master.classes</name>
  <value>org.apache.phoenix.hbase.index.master.IndexMasterObserver</value>
</property>




On Tue, Sep 2, 2014 at 7:35 AM, sunfl@certusnet.com.cn <su...@certusnet.com.cn> wrote:
Hi, everyone,
   I used the latest 4.1 release to run some tests on local indexing. When I try to load data into a
   Phoenix table with a local index, I get the following error. I am not sure whether it is related to the HBase
   local index table, since the HBase local index table is uniformly prefixed with '_LOCAL_IDX_' + TableRef.
   Any hints? Please also correct me if I have misunderstood something.
   Best Regards, Sun.
     org.apache.phoenix.execute.CommitException: java.sql.SQLException: ERROR 2008 (INT10): Unable to find cached index metadata. ERROR 2008 (INT10): ERROR 2008 (INT10): Unable to find cached index metadata. key=-8614688887238479432 region=RANAPSIGNAL,\x0D\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00,1409566437551.9e47a9f579f7cf3865d1148480a3b1b9. Index update failed

        org.apache.phoenix.execute.MutationState.commit(MutationState.java:433)
        org.apache.phoenix.jdbc.PhoenixConnection$3.call(PhoenixConnection.java:384)
        org.apache.phoenix.jdbc.PhoenixConnection$3.call(PhoenixConnection.java:381)
        org.apache.phoenix.call.CallRunner.run(CallRunner.java:53)
        org.apache.phoenix.jdbc.PhoenixConnection.commit(PhoenixConnection.java:381)
        com.certusnet.spark.bulkload.ranap.RanapSignalJdbcPhoenix$$anonfun$13$$anonfun$apply$1.apply(RanapSignalJdbcPhoenix.scala:113)
        com.certusnet.spark.bulkload.ranap.RanapSignalJdbcPhoenix$$anonfun$13$$anonfun$apply$1.apply(RanapSignalJdbcPhoenix.scala:104)
        scala.collection.Iterator$class.foreach(Iterator.scala:727)
        scala.collection.AbstractIterator.foreach(Iterator.scala:1157)
        com.certusnet.spark.bulkload.ranap.RanapSignalJdbcPhoenix$$anonfun$13.apply(RanapSignalJdbcPhoenix.scala:104)
        com.certusnet.spark.bulkload.ranap.RanapSignalJdbcPhoenix$$anonfun$13.apply(RanapSignalJdbcPhoenix.scala:89)
        scala.collection.Iterator$class.foreach(Iterator.scala:727)
        scala.collection.AbstractIterator.foreach(Iterator.scala:1157)
        org.apache.spark.rdd.RDD$$anonfun$foreach$1.apply(RDD.scala:759)
        org.apache.spark.rdd.RDD$$anonfun$foreach$1.apply(RDD.scala:759)
        org.apache.spark.SparkContext$$anonfun$runJob$4.apply(SparkContext.scala:1121)
        org.apache.spark.SparkContext$$anonfun$runJob$4.apply(SparkContext.scala:1121)
        org.apache.spark.scheduler.ResultTask.runTask(ResultTask.scala:62)
        org.apache.spark.scheduler.Task.run(Task.scala:54)
        org.apache.spark.executor.Executor$TaskRunner.run(Executor.scala:177)
        java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1145)
        java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:615)
        java.lang.Thread.run(Thread.java:744)

________________________________
________________________________

CertusNet



Re: Re: Unable to find cached index metadata

Posted by "sunfl@certusnet.com.cn" <su...@certusnet.com.cn>.
Hi,
   Thanks for your reply. Sorry for not completely describing my job setup.
   I had configured those properties in hbase-site.xml on the HMaster node and ran sqlline to
  create the table in Phoenix, and then created a local index on the table. 
   I am trying to load data into the Phoenix table; since Phoenix may not support index-related 
   data bulkload, I am trying to upsert data into Phoenix through JDBC statements. Then I got the 
   following error, and I am not sure about the reason. BTW, upserting data without the local index works fine.
   Looking forward to your reply, and thanks.  





CertusNet 
 
From: rajesh babu Chintaguntla
Date: 2014-09-02 10:55
To: user
Subject: Re: Unable to find cached index metadata
Hi Sun,
Thanks for testing,

Have you configured the following properties on the master side and restarted it before creating local indexes?
<property>
  <name>hbase.master.loadbalancer.class</name>
  <value>org.apache.phoenix.hbase.index.balancer.IndexLoadBalancer</value>
</property>
<property>
  <name>hbase.coprocessor.master.classes</name>
  <value>org.apache.phoenix.hbase.index.master.IndexMasterObserver</value>
</property>




On Tue, Sep 2, 2014 at 7:35 AM, sunfl@certusnet.com.cn <su...@certusnet.com.cn> wrote:
Hi, everyone,
   I used the latest 4.1 release to run some tests on local indexing. When I try to load data into a
   Phoenix table with a local index, I get the following error. I am not sure whether it is related to the HBase
   local index table, since the HBase local index table is uniformly prefixed with '_LOCAL_IDX_' + TableRef.
   Any hints? Please also correct me if I have misunderstood something. 
   Best Regards, Sun.
     org.apache.phoenix.execute.CommitException: java.sql.SQLException: ERROR 2008 (INT10): Unable to find cached index metadata. ERROR 2008 (INT10): ERROR 2008 (INT10): Unable to find cached index metadata. key=-8614688887238479432 region=RANAPSIGNAL,\x0D\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00,1409566437551.9e47a9f579f7cf3865d1148480a3b1b9. Index update failed
        org.apache.phoenix.execute.MutationState.commit(MutationState.java:433)
        org.apache.phoenix.jdbc.PhoenixConnection$3.call(PhoenixConnection.java:384)
        org.apache.phoenix.jdbc.PhoenixConnection$3.call(PhoenixConnection.java:381)
        org.apache.phoenix.call.CallRunner.run(CallRunner.java:53)
        org.apache.phoenix.jdbc.PhoenixConnection.commit(PhoenixConnection.java:381)
        com.certusnet.spark.bulkload.ranap.RanapSignalJdbcPhoenix$$anonfun$13$$anonfun$apply$1.apply(RanapSignalJdbcPhoenix.scala:113)
        com.certusnet.spark.bulkload.ranap.RanapSignalJdbcPhoenix$$anonfun$13$$anonfun$apply$1.apply(RanapSignalJdbcPhoenix.scala:104)
        scala.collection.Iterator$class.foreach(Iterator.scala:727)
        scala.collection.AbstractIterator.foreach(Iterator.scala:1157)
        com.certusnet.spark.bulkload.ranap.RanapSignalJdbcPhoenix$$anonfun$13.apply(RanapSignalJdbcPhoenix.scala:104)
        com.certusnet.spark.bulkload.ranap.RanapSignalJdbcPhoenix$$anonfun$13.apply(RanapSignalJdbcPhoenix.scala:89)
        scala.collection.Iterator$class.foreach(Iterator.scala:727)
        scala.collection.AbstractIterator.foreach(Iterator.scala:1157)
        org.apache.spark.rdd.RDD$$anonfun$foreach$1.apply(RDD.scala:759)
        org.apache.spark.rdd.RDD$$anonfun$foreach$1.apply(RDD.scala:759)
        org.apache.spark.SparkContext$$anonfun$runJob$4.apply(SparkContext.scala:1121)
        org.apache.spark.SparkContext$$anonfun$runJob$4.apply(SparkContext.scala:1121)
        org.apache.spark.scheduler.ResultTask.runTask(ResultTask.scala:62)
        org.apache.spark.scheduler.Task.run(Task.scala:54)
        org.apache.spark.executor.Executor$TaskRunner.run(Executor.scala:177)
        java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1145)
        java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:615)
        java.lang.Thread.run(Thread.java:744)





CertusNet 



Re: Unable to find cached index metadata

Posted by rajesh babu Chintaguntla <ch...@gmail.com>.
Hi Sun,
Thanks for testing,

Have you configured the following properties on the master side and restarted it
before creating local indexes?

<property>
  <name>hbase.master.loadbalancer.class</name>
  <value>org.apache.phoenix.hbase.index.balancer.IndexLoadBalancer</value>
</property>
<property>
  <name>hbase.coprocessor.master.classes</name>
  <value>org.apache.phoenix.hbase.index.master.IndexMasterObserver</value>
</property>




On Tue, Sep 2, 2014 at 7:35 AM, sunfl@certusnet.com.cn <
sunfl@certusnet.com.cn> wrote:

> Hi, everyone,
>    I used the latest 4.1 release to run some tests on local indexing.
> When I try to load data into a
>    Phoenix table with a local index, I get the following error. I am not sure
> whether it is related to the HBase
>    local index table, since the HBase local index table is uniformly prefixed
> with '_LOCAL_IDX_' + TableRef.
>    Any hints? Please also correct me if I have misunderstood something.
>    Best Regards, Sun.
>      org.apache.phoenix.execute.CommitException: java.sql.SQLException:
> ERROR 2008 (INT10): Unable to find cached index metadata. ERROR 2008
> (INT10): ERROR 2008 (INT10): Unable to find cached index metadata.
> key=-8614688887238479432
> region=RANAPSIGNAL,\x0D\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00,1409566437551.9e47a9f579f7cf3865d1148480a3b1b9.
> Index update failed
>
>         org.apache.phoenix.execute.MutationState.commit(MutationState.java:433)
>         org.apache.phoenix.jdbc.PhoenixConnection$3.call(PhoenixConnection.java:384)
>         org.apache.phoenix.jdbc.PhoenixConnection$3.call(PhoenixConnection.java:381)
>         org.apache.phoenix.call.CallRunner.run(CallRunner.java:53)
>         org.apache.phoenix.jdbc.PhoenixConnection.commit(PhoenixConnection.java:381)
>         com.certusnet.spark.bulkload.ranap.RanapSignalJdbcPhoenix$$anonfun$13$$anonfun$apply$1.apply(RanapSignalJdbcPhoenix.scala:113)
>         com.certusnet.spark.bulkload.ranap.RanapSignalJdbcPhoenix$$anonfun$13$$anonfun$apply$1.apply(RanapSignalJdbcPhoenix.scala:104)
>         scala.collection.Iterator$class.foreach(Iterator.scala:727)
>         scala.collection.AbstractIterator.foreach(Iterator.scala:1157)
>         com.certusnet.spark.bulkload.ranap.RanapSignalJdbcPhoenix$$anonfun$13.apply(RanapSignalJdbcPhoenix.scala:104)
>         com.certusnet.spark.bulkload.ranap.RanapSignalJdbcPhoenix$$anonfun$13.apply(RanapSignalJdbcPhoenix.scala:89)
>         scala.collection.Iterator$class.foreach(Iterator.scala:727)
>         scala.collection.AbstractIterator.foreach(Iterator.scala:1157)
>         org.apache.spark.rdd.RDD$$anonfun$foreach$1.apply(RDD.scala:759)
>         org.apache.spark.rdd.RDD$$anonfun$foreach$1.apply(RDD.scala:759)
>         org.apache.spark.SparkContext$$anonfun$runJob$4.apply(SparkContext.scala:1121)
>         org.apache.spark.SparkContext$$anonfun$runJob$4.apply(SparkContext.scala:1121)
>         org.apache.spark.scheduler.ResultTask.runTask(ResultTask.scala:62)
>         org.apache.spark.scheduler.Task.run(Task.scala:54)
>         org.apache.spark.executor.Executor$TaskRunner.run(Executor.scala:177)
>         java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1145)
>         java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:615)
>         java.lang.Thread.run(Thread.java:744)
>
>
> ------------------------------
> ------------------------------
>
> CertusNet
>
>
>