Posted to issues@spark.apache.org by "eblaas (JIRA)" <ji...@apache.org> on 2014/10/17 16:50:34 UTC

[jira] [Created] (SPARK-3991) Not Serializable, NullPointer Exceptions in SQL server mode

eblaas created SPARK-3991:
-----------------------------

             Summary: Not Serializable, NullPointer Exceptions in SQL server mode
                 Key: SPARK-3991
                 URL: https://issues.apache.org/jira/browse/SPARK-3991
             Project: Spark
          Issue Type: Bug
          Components: SQL
    Affects Versions: 1.1.0
            Reporter: eblaas
            Priority: Blocker


I'm working on connecting Mondrian with Spark SQL via JDBC. Good news: it works, but there are some bugs to fix.

I customized the HiveThriftServer2 class to load, transform and register tables (ETL) with the HiveContext. Data tables are generated from Cassandra and from a relational database.

* 1st problem:
hiveContext.registerRDDAsTable(treeSchema, "tree") does not register the table in the Hive metastore ("show tables;" via JDBC does not list the table, but I can still query it, e.g. select * from tree). Dirty workaround: create a table with the same name and schema; this was necessary because Mondrian validates table existence (a sketch of the workaround follows the CREATE TABLE statement below).

hiveContext.sql("CREATE TABLE tree (dp_id BIGINT, h1 STRING, h2 STRING, h3 STRING)")

* 2nd problem:
Mondrian creates complex joins, which result in serialization exceptions.
Two classes in hiveUdfs.scala have to be made serializable (a small standalone illustration follows below):
- DeferredObjectAdapter and HiveGenericUdaf
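These objects end up inside task closures when the joined query is executed, so they need to be java.io.Serializable. The sketch below is not the actual hiveUdfs.scala change; it is only a self-contained illustration of the failure mode and of how marking a captured class Serializable fixes it (Adapter and SerializationDemo are hypothetical names):

    import org.apache.spark.{SparkConf, SparkContext}

    // Stand-in for a helper captured by a task closure, analogous to DeferredObjectAdapter /
    // HiveGenericUdaf. Without "extends Serializable" the job below fails with a
    // java.io.NotSerializableException when Spark serializes the task.
    class Adapter(val label: String) extends Serializable

    object SerializationDemo {
      def main(args: Array[String]): Unit = {
        val sc = new SparkContext(new SparkConf().setAppName("serialization-demo").setMaster("local[2]"))
        val adapter = new Adapter("udaf")
        // The closure references "adapter", so Spark has to serialize it to ship the task.
        val result = sc.parallelize(1 to 5).map(i => adapter.label + "-" + i).collect()
        println(result.mkString(", "))
        sc.stop()
      }
    }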

* 3rd problem:
NullPointerException in InMemoryRelation, line 42:
override lazy val statistics = Statistics(sizeInBytes = child.sqlContext.defaultSizeInBytes)

The sqlContext in child was null; quick fix: replace the lookup with a hard-coded default size (a slightly safer variant is sketched below):
override lazy val statistics = Statistics(sizeInBytes = 10000)
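A more defensive variant of the same quick fix (just a sketch against the line above, not the actual patch) keeps the configured default whenever the child plan does carry a SQLContext:

    override lazy val statistics = Statistics(
      sizeInBytes =
        // keep the configured default when the child has a SQLContext attached,
        // otherwise fall back to a constant instead of throwing a NullPointerException
        if (child.sqlContext != null) child.sqlContext.defaultSizeInBytes else 10000)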

I'm not sure this is the right way to fix these bugs, but with the patch file it works, at least.



