Posted to dev@hbase.apache.org by "Yuexin Zhang (JIRA)" <ji...@apache.org> on 2017/08/11 09:27:00 UTC
[jira] [Created] (HBASE-18570) use hbase-spark without HBaseContext runs into NPE
Yuexin Zhang created HBASE-18570:
------------------------------------
Summary: use hbase-spark without HBaseContext runs into NPE
Key: HBASE-18570
URL: https://issues.apache.org/jira/browse/HBASE-18570
Project: HBase
Issue Type: Improvement
Components: hbase
Affects Versions: 1.2.0
Reporter: Yuexin Zhang
Priority: Minor
I recently ran into the same issue described on Stack Overflow:
https://stackoverflow.com/questions/38865558/sparksql-dataframes-does-not-work-in-spark-shell-and-application#
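For reference, a minimal reproduction sketch, assuming the spark-shell-provided sqlContext and a hypothetical table "mytable" with column family "cf1" (the catalog JSON follows the HBaseTableCatalog format):
{code}
import org.apache.hadoop.hbase.spark.datasources.HBaseTableCatalog

// Hypothetical catalog for a table "mytable" with a single column family "cf1".
val catalog =
  """{"table":{"namespace":"default","name":"mytable"},
    |"rowkey":"key",
    |"columns":{
    |"key":{"cf":"rowkey","col":"key","type":"string"},
    |"col1":{"cf":"cf1","col":"col1","type":"string"}}}""".stripMargin

// No HBaseContext has been created and hbase.use.hbase.context is left at its
// default of true, so DefaultSource dereferences a null cached context below.
val df = sqlContext.read
  .format("org.apache.hadoop.hbase.spark")
  .options(Map(HBaseTableCatalog.tableCatalog -> catalog))
  .load()
{code}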
If we don't explicitly initialize an HBaseContext and don't set the hbase.use.hbase.context option to false, the read runs into an NPE at:
{code}
val wrappedConf = new SerializableConfiguration(hbaseContext.config)
{code}
https://github.com/apache/hbase/blob/master/hbase-spark/src/main/scala/org/apache/hadoop/hbase/spark/DefaultSource.scala#L140
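Until a guard lands, two workarounds seem to apply: creating the HBaseContext up front (its constructor caches itself in LatestHBaseContextCache), or disabling the cached-context lookup via the hbase.use.hbase.context option. A sketch of both, reusing the hypothetical catalog from above (either one alone should suffice):
{code}
import org.apache.hadoop.hbase.HBaseConfiguration
import org.apache.hadoop.hbase.spark.HBaseContext

// Workaround 1: construct an HBaseContext before the first read so that
// LatestHBaseContextCache.latest is populated when DefaultSource looks it up.
new HBaseContext(sqlContext.sparkContext, HBaseConfiguration.create())

// Workaround 2: tell DefaultSource not to use the cached context at all,
// so it builds its own from the configured resources.
val df = sqlContext.read
  .format("org.apache.hadoop.hbase.spark")
  .option("hbase.use.hbase.context", "false")
  .options(Map(HBaseTableCatalog.tableCatalog -> catalog))
  .load()
{code}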
Should we safeguard with a null check on hbaseContext?
Something like:
{code}
// create or get the latest HBaseContext
val hbaseContext: HBaseContext =
  if (useHBaseContext && null != LatestHBaseContextCache.latest) {
    LatestHBaseContextCache.latest
  } else {
    val config = HBaseConfiguration.create()
    configResources.split(",").foreach(r => config.addResource(r))
    new HBaseContext(sqlContext.sparkContext, config)
  }
{code}
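Alternatively, if silently building a fresh context when the cache is empty feels too implicit, a hypothetical fail-fast variant (not what the current code does) would surface the misconfiguration instead:
{code}
// Hypothetical alternative: raise a descriptive error rather than silently
// falling back, so a missing HBaseContext stays visible to the user.
val hbaseContext: HBaseContext =
  if (useHBaseContext) {
    require(LatestHBaseContextCache.latest != null,
      "hbase.use.hbase.context is true but no HBaseContext has been " +
        "initialized; create one first or set hbase.use.hbase.context to false")
    LatestHBaseContextCache.latest
  } else {
    val config = HBaseConfiguration.create()
    configResources.split(",").foreach(r => config.addResource(r))
    new HBaseContext(sqlContext.sparkContext, config)
  }
{code}
The null-guard version above seems preferable, though, since it matches the existing fallback behavior when useHBaseContext is false.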
--
This message was sent by Atlassian JIRA
(v6.4.14#64029)