Posted to issues@spark.apache.org by "hanyingjun (Jira)" <ji...@apache.org> on 2022/06/10 04:53:00 UTC

[jira] [Updated] (SPARK-39436) graph.connectedComponents(maxIterations) gets ArrayIndexOutOfBoundsException: -1

     [ https://issues.apache.org/jira/browse/SPARK-39436?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]

hanyingjun updated SPARK-39436:
-------------------------------
    Description: 
val graph = Graph(vertices, edges).partitionBy(PartitionStrategy.RandomVertexCut)

The following exception is reported during execution. There is no problem when the data volume is small, but the error appears when the data volume is large:
h1. java.lang.ArrayIndexOutOfBoundsException: -1
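
For reference, below is a minimal, self-contained sketch of the reported call pattern: build a Graph, repartition with RandomVertexCut, then run connectedComponents with an iteration bound. The sample vertices and edges, the local SparkContext, and the maxIterations value are illustrative assumptions; the report only states that the failure appears once the data volume is large.

{code:scala}
import org.apache.spark.{SparkConf, SparkContext}
import org.apache.spark.graphx.{Edge, Graph, PartitionStrategy, VertexId}
import org.apache.spark.rdd.RDD

object ConnectedComponentsRepro {
  def main(args: Array[String]): Unit = {
    // Local context for illustration only; the reported failure occurs at scale.
    val sc = new SparkContext(new SparkConf().setAppName("cc-repro").setMaster("local[*]"))

    // Hypothetical inputs standing in for the reporter's (unspecified) data.
    val vertices: RDD[(VertexId, String)] =
      sc.parallelize(Seq((1L, "a"), (2L, "b"), (3L, "c")))
    val edges: RDD[Edge[Int]] =
      sc.parallelize(Seq(Edge(1L, 2L, 1), Edge(2L, 3L, 1)))

    // The call pattern from the report.
    val graph = Graph(vertices, edges).partitionBy(PartitionStrategy.RandomVertexCut)
    val maxIterations = 10 // assumed bound; the report does not give a value
    val cc = graph.connectedComponents(maxIterations)

    // Each vertex is labeled with the smallest vertex id in its component.
    cc.vertices.collect().foreach(println)

    sc.stop()
  }
}
{code}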

> graph.connectedComponents(maxIterations) gets ArrayIndexOutOfBoundsException: -1
> --------------------------------------------------------------------------------
>
>                 Key: SPARK-39436
>                 URL: https://issues.apache.org/jira/browse/SPARK-39436
>             Project: Spark
>          Issue Type: Bug
>          Components: GraphX
>    Affects Versions: 2.4.3
>            Reporter: hanyingjun
>            Priority: Major
>
> val graph = Graph(vertices, edges).partitionBy(PartitionStrategy.RandomVertexCut)
> The following exception is reported during execution. There is no problem when the data volume is small, but the error appears when the data volume is large:
> h1. java.lang.ArrayIndexOutOfBoundsException: -1



--
This message was sent by Atlassian Jira
(v8.20.7#820007)

---------------------------------------------------------------------
To unsubscribe, e-mail: issues-unsubscribe@spark.apache.org
For additional commands, e-mail: issues-help@spark.apache.org