Posted to user@spark.apache.org by salemi <al...@udo.edu> on 2014/09/04 18:56:07 UTC
spark streaming - saving DStream into HBASE doesn't work
Hi,
I am using the following code to write data to HBase. I can see the jobs being
sent off, but nothing ever shows up in my HBase table, and Spark doesn't throw
any errors. How can a problem like this be debugged? Is the code below correct
for writing data to HBase?
// Imports assumed (not shown in my original snippet). Note that
// saveAsHadoopDataset uses the old mapred API, so TableOutputFormat
// here is the org.apache.hadoop.hbase.mapred one.
import org.apache.hadoop.hbase.HBaseConfiguration
import org.apache.hadoop.hbase.mapred.TableOutputFormat
import org.apache.hadoop.mapred.JobConf
import org.apache.spark.rdd.PairRDDFunctions

val conf = HBaseConfiguration.create()
conf.set(TableOutputFormat.OUTPUT_TABLE, tableName)
conf.set("hbase.rootdir", hdfsNameNodeUrl + "/hbase")
conf.setBoolean("hbase.cluster.distributed", true)
conf.set("hbase.zookeeper.quorum", hbaseZooKeepers)
conf.setInt("hbase.client.scanner.caching", 10000)

agents.foreachRDD { rdd =>
  val jobConfig = new JobConf(conf)
  jobConfig.setOutputFormat(classOf[TableOutputFormat])
  jobConfig.set(TableOutputFormat.OUTPUT_TABLE, tableName)
  new PairRDDFunctions(rdd.map(convert)).saveAsHadoopDataset(jobConfig)
}
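For context, one check I could add (a sketch; it assumes the same `agents` DStream as above) is to log each micro-batch's record count, to tell an empty stream apart from a write that fails silently:

```scala
// Debugging sketch: confirm data is reaching the write stage at all.
// If every batch logs 0 records, the problem is upstream of the HBase
// write (e.g. the receiver or the transformations), not the save itself.
agents.foreachRDD { rdd =>
  val n = rdd.count() // forces evaluation of the batch
  println(s"[debug] batch has $n records")
}
```

If the counts are non-zero, the write step itself would be the next thing to inspect.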
Thanks,
Ali
--
View this message in context: http://apache-spark-user-list.1001560.n3.nabble.com/spark-streaming-saving-DStream-into-HBASE-doesn-t-work-tp13473.html
Sent from the Apache Spark User List mailing list archive at Nabble.com.