Posted to issues@spark.apache.org by "Bryan Cutler (JIRA)" <ji...@apache.org> on 2015/10/29 18:59:27 UTC
[jira] [Commented] (SPARK-10158) ALS should print better errors when given Long IDs
[ https://issues.apache.org/jira/browse/SPARK-10158?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=14980920#comment-14980920 ]
Bryan Cutler commented on SPARK-10158:
--------------------------------------
I made a quick fix for this. When Long values are used for Rating user or product IDs, it now raises an exception with a more informative message instead of a bare ClassCastException. I'll post the PR soon.
{quote}
net.razorvine.pickle.PickleException: Ratings id 1205640308657491975 exceeds max value of 2147483647
at org.apache.spark.mllib.api.python.SerDe$RatingPickler.ratingsIdCheckLong(PythonMLLibAPI.scala:1454)
at org.apache.spark.mllib.api.python.SerDe$RatingPickler.construct(PythonMLLibAPI.scala:1445)
at net.razorvine.pickle.Unpickler.load_reduce(Unpickler.java:707)
...
at java.lang.Thread.run(Thread.java:745)
Caused by: java.lang.ClassCastException: java.lang.Long cannot be cast to java.lang.Integer
at scala.runtime.BoxesRunTime.unboxToInt(BoxesRunTime.java:106)
at org.apache.spark.mllib.api.python.SerDe$RatingPickler.ratingsIdCheckLong(PythonMLLibAPI.scala:1451)
{quote}
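The check behind the message above can be sketched roughly as follows. This is an illustrative Python sketch, not the actual Spark code (the real check lives in Scala in SerDe$RatingPickler.ratingsIdCheckLong, and the function name here is hypothetical):

```python
# Hypothetical sketch of the id range check described above.
# The real fix is implemented in Scala inside PythonMLLibAPI.scala.
INT_MAX = 2147483647  # java.lang.Integer.MAX_VALUE

def check_rating_id(rating_id):
    """Return the id as an int, or raise a descriptive error
    when it cannot fit in a 32-bit signed integer."""
    if rating_id > INT_MAX:
        raise ValueError(
            "Ratings id %d exceeds max value of %d" % (rating_id, INT_MAX))
    return int(rating_id)
```

The point of the fix is that the user sees which id overflowed and what the limit is, rather than an opaque "java.lang.Long cannot be cast to java.lang.Integer" from the unpickler.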
> ALS should print better errors when given Long IDs
> --------------------------------------------------
>
> Key: SPARK-10158
> URL: https://issues.apache.org/jira/browse/SPARK-10158
> Project: Spark
> Issue Type: Improvement
> Components: ML, MLlib, PySpark
> Reporter: Joseph K. Bradley
> Priority: Minor
>
> See [SPARK-10115] for the very confusing messages you get when you try to use ALS with Long IDs. We should catch these errors, identify the cause, and print meaningful error messages.
--
This message was sent by Atlassian JIRA
(v6.3.4#6332)
---------------------------------------------------------------------
To unsubscribe, e-mail: issues-unsubscribe@spark.apache.org
For additional commands, e-mail: issues-help@spark.apache.org