Posted to user@phoenix.apache.org by tkg_cangkul <yu...@gmail.com> on 2015/10/01 11:07:30 UTC

error phoenix-spark integration

Hi, I want to integrate Spark with Phoenix & HBase.

I tried an example from the Apache Phoenix website, like this:

import org.apache.spark.graphx._
import org.apache.phoenix.spark._
val rdd = sc.phoenixTableAsRDD("EMAIL_ENRON", Seq("MAIL_FROM", "MAIL_TO"), zkUrl=Some("localhost"))           // load from phoenix
val rawEdges = rdd.map{ e => (e("MAIL_FROM").asInstanceOf[VertexId], e("MAIL_TO").asInstanceOf[VertexId]) }   // map to vertexids
val graph = Graph.fromEdgeTuples(rawEdges, 1.0)                                                               // create a graph
val pr = graph.pageRank(0.001)                                                                                // run pagerank
pr.vertices.saveToPhoenix("EMAIL_ENRON_PAGERANK", Seq("ID", "RANK"), zkUrl = Some("localhost"))
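
Just to show the read step in isolation (a minimal sketch, assuming the same EMAIL_ENRON table and zkUrl as above), loading and sampling the RDD by itself would look roughly like this:

import org.apache.phoenix.spark._
val checkRdd = sc.phoenixTableAsRDD("EMAIL_ENRON", Seq("MAIL_FROM", "MAIL_TO"), zkUrl = Some("localhost"))     // same load as above
checkRdd.take(5).foreach(println)                                                                              // print a few rows
checkRdd.map(r => (r("MAIL_FROM").getClass.getName, r("MAIL_TO").getClass.getName)).take(1).foreach(println)   // runtime types of the two columns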

But it shows an error message when the process reaches

val graph = Graph.fromEdgeTuples(rawEdges, 1.0)

This is the message that I got:

java.lang.NoSuchMethodError: org.apache.hadoop.hbase.client.Put.setWriteToWAL(Z)Lorg/apache/hadoop/hbase/client/Put;
     at org.apache.phoenix.schema.PTableImpl$PRowImpl.newMutations(PTableImpl.java:639)
     at org.apache.phoenix.schema.PTableImpl$PRowImpl.<init>(PTableImpl.java:632)
     at org.apache.phoenix.schema.PTableImpl.newRow(PTableImpl.java:557)
     at org.apache.phoenix.schema.PTableImpl.newRow(PTableImpl.java:573)
     at org.apache.phoenix.execute.MutationState.addRowMutations(MutationState.java:185)
     at org.apache.phoenix.execute.MutationState.access$200(MutationState.java:79)
     at org.apache.phoenix.execute.MutationState$2.init(MutationState.java:258)
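
(For what it's worth, a NoSuchMethodError on Put.setWriteToWAL typically points at a mismatch between the hbase-client jars on the Spark classpath and the HBase version this Phoenix build was compiled against; that method no longer exists in newer HBase client APIs. A rough sketch of how to check, from inside spark-shell, which jar the Put class is actually loaded from:)

val putClass = Class.forName("org.apache.hadoop.hbase.client.Put")    // the class named in the error
println(putClass.getProtectionDomain.getCodeSource.getLocation)       // jar it was loaded from
println(putClass.getMethods.exists(_.getName == "setWriteToWAL"))     // does this hbase-client still have the method?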

Can anybody help?

-- 
Best Regards,
Yuza


Fwd: error phoenix-spark integration

Posted by tkg_cangkul <yu...@gmail.com>.
up

