Posted to user@spark.apache.org by dizzy5112 <da...@gmail.com> on 2015/08/17 07:19:05 UTC

graphx issue spark 1.3

Hi, I'm using Spark 1.3 and trying some sample code:

val users: RDD[(VertexId, (String, String))] =
  sc.parallelize(Array(
    (3L, ("rxin", "student")), (7L, ("jgonzal", "postdoc")),
    (5L, ("franklin", "prof")), (2L, ("istoica", "prof"))))
// Create an RDD for edges
val relationships: RDD[Edge[String]] =
  sc.parallelize(Array(
    Edge(3L, 7L, "collab"), Edge(5L, 3L, "advisor"),
    Edge(2L, 5L, "colleague"), Edge(5L, 7L, "pi")))
// Define a default user in case there are relationships with missing users
val defaultUser = ("John Doe", "Missing")
// Build the initial graph
val graph = Graph(users, relationships, defaultUser)

when I run:

graph.numEdges

all works well, but with

graph.numVertices

it falls over and I get a whole heap of errors:

Failed to open file: /tmp/spark..........shuffle_0_21_0.index

Is anyone else experiencing this? I've tried different graphs and always end
up with the same results.

thanks



--
View this message in context: http://apache-spark-user-list.1001560.n3.nabble.com/grpah-x-issue-spark-1-3-tp24292.html
Sent from the Apache Spark User List mailing list archive at Nabble.com.

---------------------------------------------------------------------
To unsubscribe, e-mail: user-unsubscribe@spark.apache.org
For additional commands, e-mail: user-help@spark.apache.org


Re: graphx issue spark 1.3

Posted by David Zeelen <da...@gmail.com>.
The code below is taken from the Spark website and generates the error
detailed.

Hi, I'm using Spark 1.3 and trying some sample code:
import org.apache.spark.graphx._
import org.apache.spark.rdd.RDD

val users: RDD[(VertexId, (String, String))] =
  sc.parallelize(Array(
    (3L, ("rxin", "student")), (7L, ("jgonzal", "postdoc")),
    (5L, ("franklin", "prof")), (2L, ("istoica", "prof"))))
// Create an RDD for edges
val relationships: RDD[Edge[String]] =
  sc.parallelize(Array(
    Edge(3L, 7L, "collab"), Edge(5L, 3L, "advisor"),
    Edge(2L, 5L, "colleague"), Edge(5L, 7L, "pi")))
// Define a default user in case there are relationships with missing users
val defaultUser = ("John Doe", "Missing")
// Build the initial graph
val graph = Graph(users, relationships, defaultUser)


when I run:

graph.numEdges

all works well, but with

graph.numVertices

it falls over and I get a whole heap of errors:
Failed to open file: /tmp/spark..........shuffle_0_21_0.index
    at org.apache.spark.network.shuffle.ExternalShuffleBlockManager.getSortBasedShuffleBlockData(ExternalShuffleBlockManager.java:202)
    at org.apache.spark.network.shuffle.ExternalShuffleBlockManager.getBlockData(ExternalShuffleBlockManager.java:112)
    at org.apache.spark.network.shuffle.ExternalShuffleBlockHandler.receive(ExternalShuffleBlockHandler.java:74)
    at org.apache.spark.network.server.Transpor...
SLF4J: Class path contains multiple SLF4J bindings.
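For context, here is a minimal plain-Scala sketch (no Spark, and with a hypothetical stand-in for GraphX's Edge class) of what the example above should compute, assuming GraphX's documented behavior of keeping every supplied vertex plus every edge endpoint and assigning defaultUser to endpoints with no attribute. Both counts come out to 4 on this data:

```scala
// Sketch only: models how Graph(users, relationships, defaultUser)
// assembles its vertex set, without any Spark dependency.
object GraphCountSketch {
  type VertexId = Long
  // Hypothetical stand-in for org.apache.spark.graphx.Edge
  final case class Edge[ED](srcId: VertexId, dstId: VertexId, attr: ED)

  val users: Map[VertexId, (String, String)] = Map(
    3L -> ("rxin", "student"), 7L -> ("jgonzal", "postdoc"),
    5L -> ("franklin", "prof"), 2L -> ("istoica", "prof"))

  val relationships: Seq[Edge[String]] = Seq(
    Edge(3L, 7L, "collab"), Edge(5L, 3L, "advisor"),
    Edge(2L, 5L, "colleague"), Edge(5L, 7L, "pi"))

  val defaultUser = ("John Doe", "Missing")

  // Every edge endpoint, whether or not it appeared in `users`
  def endpointIds: Set[VertexId] =
    relationships.flatMap(e => Seq(e.srcId, e.dstId)).toSet

  // Supplied vertices plus any endpoints filled in with the default
  def vertices: Map[VertexId, (String, String)] =
    (users.keySet ++ endpointIds)
      .map(id => id -> users.getOrElse(id, defaultUser)).toMap

  def numEdges: Long = relationships.size.toLong
  def numVertices: Long = vertices.size.toLong

  def main(args: Array[String]): Unit = {
    println(s"numEdges = $numEdges")       // 4
    println(s"numVertices = $numVertices") // 4
  }
}
```

Since both calls should succeed on this data, the shuffle index error above looks environmental (note the ExternalShuffleBlockManager frames) rather than a problem with the example itself.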

Is anyone else experiencing this? I've tried different graphs and always end
up with the same results.

thanks

On Tue, 18 Aug 2015 at 12:15 am, Sonal Goyal <so...@gmail.com> wrote:

> I have been using GraphX in production on 1.3 and 1.4 with no issues.
> What's the exception you see, and what are you trying to do?
> On Aug 17, 2015 10:49 AM, "dizzy5112" <da...@gmail.com> wrote:
>
>> Hi using spark 1.3 and trying some sample code:
>>
>>
>> when i run:
>>
>> all works well but with
>>
>> it falls over and i get a whole heap of errors:
>>
>> Is anyone else experiencing this? Ive tried different graphs and always
>> end
>> up with the same results.
>>
>> thanks
>>

Re: graphx issue spark 1.3

Posted by Sonal Goyal <so...@gmail.com>.
I have been using GraphX in production on 1.3 and 1.4 with no issues.
What's the exception you see, and what are you trying to do?
On Aug 17, 2015 10:49 AM, "dizzy5112" <da...@gmail.com> wrote:

> Hi using spark 1.3 and trying some sample code:
>
>
> when i run:
>
> all works well but with
>
> it falls over and i get a whole heap of errors:
>
> Is anyone else experiencing this? Ive tried different graphs and always end
> up with the same results.
>
> thanks
>