Posted to issues@flink.apache.org by "Stephan Ewen (JIRA)" <ji...@apache.org> on 2015/08/05 22:17:04 UTC

[jira] [Resolved] (FLINK-2361) CompactingHashTable loses entries

     [ https://issues.apache.org/jira/browse/FLINK-2361?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]

Stephan Ewen resolved FLINK-2361.
---------------------------------
       Resolution: Fixed
         Assignee: Stephan Ewen
    Fix Version/s: 0.9.1
                   0.10

Fixed in 0.10 via 925ac1f76bb84986764495407049a77552169d84 and in 0.9.1 via 9219dff34a1f5cfe489ab4c648c3ec55c9c1318f

> CompactingHashTable loses entries
> ---------------------------------
>
>                 Key: FLINK-2361
>                 URL: https://issues.apache.org/jira/browse/FLINK-2361
>             Project: Flink
>          Issue Type: Bug
>          Components: Gelly
>    Affects Versions: 0.10
>            Reporter: Andra Lungu
>            Assignee: Stephan Ewen
>            Priority: Critical
>             Fix For: 0.10, 0.9.1
>
>
> When running the simple Connected Components algorithm (currently in Gelly) on the Twitter follower graph, with 1, 100, or 10,000 iterations, I get the following error:
> Caused by: java.lang.Exception: Target vertex '657282846' does not exist!.
> 	at org.apache.flink.graph.spargel.VertexCentricIteration$VertexUpdateUdfSimpleVV.coGroup(VertexCentricIteration.java:300)
> 	at org.apache.flink.runtime.operators.CoGroupWithSolutionSetSecondDriver.run(CoGroupWithSolutionSetSecondDriver.java:220)
> 	at org.apache.flink.runtime.operators.RegularPactTask.run(RegularPactTask.java:496)
> 	at org.apache.flink.runtime.iterative.task.AbstractIterativePactTask.run(AbstractIterativePactTask.java:139)
> 	at org.apache.flink.runtime.iterative.task.IterationTailPactTask.run(IterationTailPactTask.java:107)
> 	at org.apache.flink.runtime.operators.RegularPactTask.invoke(RegularPactTask.java:362)
> 	at org.apache.flink.runtime.taskmanager.Task.run(Task.java:559)
> 	at java.lang.Thread.run(Thread.java:722)
> Now this is very bizarre, as the DataSet of vertices is produced from the DataSet of edges, which means there cannot be an edge with an invalid target id (sketched below). The method calls flatMap to isolate the src and trg ids and distinct to ensure their uniqueness.
> The algorithm works fine on smaller data sets.
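
For context, the vertex-set construction described above (flatMap over the edge endpoints, then distinct) can be sketched roughly as follows. This is a minimal, illustrative sketch against the Flink Java DataSet API, assuming edges are plain Tuple2<Long, Long> pairs; the class name, sample data, and variable names are hypothetical and this is not the actual Gelly code. It only shows why, by construction, every target id of an edge should also appear in the vertex DataSet, which is consistent with the resolved cause being the CompactingHashTable losing solution-set entries rather than the input being malformed.

import org.apache.flink.api.common.functions.FlatMapFunction;
import org.apache.flink.api.java.DataSet;
import org.apache.flink.api.java.ExecutionEnvironment;
import org.apache.flink.api.java.tuple.Tuple2;
import org.apache.flink.util.Collector;

public class VertexSetFromEdges {

    public static void main(String[] args) throws Exception {
        ExecutionEnvironment env = ExecutionEnvironment.getExecutionEnvironment();

        // Hypothetical sample edges as (sourceId, targetId) pairs.
        DataSet<Tuple2<Long, Long>> edges = env.fromElements(
                new Tuple2<Long, Long>(1L, 2L),
                new Tuple2<Long, Long>(2L, 3L),
                new Tuple2<Long, Long>(3L, 1L));

        // Emit both endpoints of every edge as (id, id), where the id doubles as the
        // initial component value, then deduplicate on the id field. Any id that
        // occurs as an edge target is therefore guaranteed to be in this set.
        DataSet<Tuple2<Long, Long>> vertices = edges
                .flatMap(new FlatMapFunction<Tuple2<Long, Long>, Tuple2<Long, Long>>() {
                    @Override
                    public void flatMap(Tuple2<Long, Long> edge, Collector<Tuple2<Long, Long>> out) {
                        out.collect(new Tuple2<Long, Long>(edge.f0, edge.f0)); // source id
                        out.collect(new Tuple2<Long, Long>(edge.f1, edge.f1)); // target id
                    }
                })
                .distinct(0);

        vertices.print();
    }
}

Because distinct(0) keys on the vertex id, ids that occur in many edges collapse to a single vertex entry, so a "Target vertex does not exist" failure cannot come from the way this input is built.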



--
This message was sent by Atlassian JIRA
(v6.3.4#6332)