Posted to user@spark.apache.org by bearrito <j....@gmail.com> on 2014/03/28 04:38:47 UTC

ArrayIndexOutOfBoundsException in ALS.implicit

Using negative product IDs causes the exception above.

The cause is that product IDs are used to index into the in- and
out-block structures.

Specifically, on 0.9.0 it occurs at
org.apache.spark.mllib.recommendation.ALS$$anonfun$org$apache$spark$mllib$recommendation$ALS$$makeInLinkBlock$2.apply(ALS.scala:262)

It seems reasonable to expect product IDs to be positive, if a bit
opinionated. I ran across this because the hash function I was using on my
product IDs includes negatives in its range.
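To make the failure mode concrete, here is a minimal standalone sketch (plain Java, not Spark's actual block code; the class and variable names are made up for illustration). On the JVM, `%` returns a negative remainder when the left operand is negative, so using `id % numBlocks` directly as an array index blows up for negative IDs:

```java
public class NegativeIdDemo {
    public static void main(String[] args) {
        int numBlocks = 4;
        int[][] blocks = new int[numBlocks][];

        int productId = -7;                 // e.g. a hashed ID that came out negative
        int block = productId % numBlocks;  // -7 % 4 == -3 on the JVM
        System.out.println("block index = " + block);

        // Indexing with a negative value is the same failure seen in ALS:
        try {
            int[] inLinks = blocks[block];
        } catch (ArrayIndexOutOfBoundsException e) {
            System.out.println("caught ArrayIndexOutOfBoundsException");
        }
    }
}
```

With a positive ID (say 10), `10 % 4 == 2` and the lookup succeeds, which is why the problem only surfaces once a hash function emits negative values.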

--
View this message in context: http://apache-spark-user-list.1001560.n3.nabble.com/ArrayIndexOutOfBoundsException-in-ALS-implicit-tp3400.html
Sent from the Apache Spark User List mailing list archive at Nabble.com.

Re: ArrayIndexOutOfBoundsException in ALS.implicit

Posted by Xiangrui Meng <me...@gmail.com>.
Hi bearrito, this issue was fixed by Tor in
https://github.com/apache/spark/pull/407. You can either try the
master branch or wait for the 1.0 release. -Xiangrui

On Fri, Mar 28, 2014 at 12:19 AM, Xiangrui Meng <me...@gmail.com> wrote:
> Hi bearrito,
>
> This is a known issue
> (https://spark-project.atlassian.net/browse/SPARK-1281) and it should
> be easy to fix by switching to a hash partitioner.
>
> CC'ed dev list in case someone volunteers to work on it.
>
> Best,
> Xiangrui
>
> On Thu, Mar 27, 2014 at 8:38 PM, bearrito <j....@gmail.com> wrote:
>> Using negative product IDs causes the exception above.
>>
>> The cause is that product IDs are used to index into the in- and
>> out-block structures.
>>
>> Specifically, on 0.9.0 it occurs at
>> org.apache.spark.mllib.recommendation.ALS$$anonfun$org$apache$spark$mllib$recommendation$ALS$$makeInLinkBlock$2.apply(ALS.scala:262)
>>
>> It seems reasonable to expect product IDs to be positive, if a bit
>> opinionated. I ran across this because the hash function I was using on my
>> product IDs includes negatives in its range.

Re: ArrayIndexOutOfBoundsException in ALS.implicit

Posted by Xiangrui Meng <me...@gmail.com>.
Hi bearrito,

This is a known issue
(https://spark-project.atlassian.net/browse/SPARK-1281) and it should
be easy to fix by switching to a hash partitioner.
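A sketch of that fix (a minimal standalone illustration in plain Java, not Spark's actual partitioner code): normalize the remainder so every key, negative or not, lands in [0, numPartitions).

```java
public class NonNegativeModDemo {
    // Map any int key, including negatives, to a partition in [0, mod).
    static int nonNegativeMod(int x, int mod) {
        int raw = x % mod;                // may be negative when x is negative
        return raw < 0 ? raw + mod : raw;
    }

    public static void main(String[] args) {
        System.out.println(nonNegativeMod(-7, 4));  // 1
        System.out.println(nonNegativeMod(10, 4));  // 2
    }
}
```

This is the same idea a hash partitioner applies to `key.hashCode()`: the shift into non-negative range makes the block index safe regardless of the ID's sign.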

CC'ed dev list in case someone volunteers to work on it.

Best,
Xiangrui

On Thu, Mar 27, 2014 at 8:38 PM, bearrito <j....@gmail.com> wrote:
> Using negative product IDs causes the exception above.
>
> The cause is that product IDs are used to index into the in- and
> out-block structures.
>
> Specifically, on 0.9.0 it occurs at
> org.apache.spark.mllib.recommendation.ALS$$anonfun$org$apache$spark$mllib$recommendation$ALS$$makeInLinkBlock$2.apply(ALS.scala:262)
>
> It seems reasonable to expect product IDs to be positive, if a bit
> opinionated. I ran across this because the hash function I was using on my
> product IDs includes negatives in its range.
