Posted to dev@spark.apache.org by Cinyoung Hur <ci...@gmail.com> on 2017/09/04 02:53:44 UTC

No rows when reading an Apache Tajo table (Spark SQL)

Hi,

I want to read an Apache Tajo table using Spark SQL.

The Tajo JDBC driver is added to spark-shell, but the Tajo table doesn't
return any rows.
Below are the Spark code and its result.

$ spark-shell --jars tajo-jdbc-0.11.3.jar

scala> val componentDF = spark.sqlContext.load("jdbc", Map(
         "url" -> "jdbc:tajo://tajo-master-ip:26002/analysis",
         "driver" -> "org.apache.tajo.jdbc.TajoDriver",
         "dbtable" -> "component_usage_2015"
       ))
scala> componentDF.registerTempTable("components")
scala> val allComponents = sqlContext.sql("select * from components")
scala> allComponents.show(5)


warning: there was one deprecation warning; re-run with -deprecation for details
componentDF: org.apache.spark.sql.DataFrame = [analysis.component_usage_2015.gnl_nm_cd: string, analysis.component_usage_2015.qty: double ... 1 more field]
warning: there was one deprecation warning; re-run with -deprecation for details
allComponents: org.apache.spark.sql.DataFrame = [analysis.component_usage_2015.gnl_nm_cd: string, analysis.component_usage_2015.qty: double ... 1 more field]
+---------------------------------------+---------------------------------+---------------------------------+
|analysis.component_usage_2015.gnl_nm_cd|analysis.component_usage_2015.qty|analysis.component_usage_2015.amt|
+---------------------------------------+---------------------------------+---------------------------------+
+---------------------------------------+---------------------------------+---------------------------------+
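
For reference, the two deprecation warnings come from sqlContext.load and
registerTempTable, which were replaced in Spark 2.x. A minimal sketch of the
same read using the current DataFrameReader API (with the same placeholder
tajo-master-ip URL as above) would be:

scala> // Same JDBC read via the non-deprecated DataFrameReader API
scala> val componentDF = spark.read.format("jdbc").
         option("url", "jdbc:tajo://tajo-master-ip:26002/analysis").
         option("driver", "org.apache.tajo.jdbc.TajoDriver").
         option("dbtable", "component_usage_2015").
         load()
scala> componentDF.createOrReplaceTempView("components")
scala> spark.sql("select * from components").show(5)

This removes the warnings, but since it builds the same JDBC relation
underneath, it is unlikely to change the empty result by itself.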

Regards,
Cinyoung Hur