Posted to user@spark.apache.org by pun <pu...@gmail.com> on 2017/10/22 17:20:52 UTC
Spark ML - LogisticRegression interpreting prediction
Hello,
I have a LogisticRegression model for predicting a binary label. Once I
train the model, I run it to get some predictions. I get the following
values for rawPrediction. How should I interpret these? What do they mean?
+----------------------------------------+
|rawPrediction                           |
+----------------------------------------+
|[30.376879013053156,-30.376879013053156]|
|[32.08591062636529,-32.08591062636529]  |
|[34.67079346038218,-34.67079346038218]  |
+----------------------------------------+
From scikit-learn, I believe, the two values for each user add up to 1.
TIA
--
Sent from: http://apache-spark-user-list.1001560.n3.nabble.com/
Re: Spark ML - LogisticRegression interpreting prediction
Posted by Weichen Xu <we...@databricks.com>.
The values that add up to 1.0 are in the "probability" column, not
"rawPrediction".
Thanks!
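To make the distinction concrete: for Spark's binary LogisticRegressionModel, the "probability" column is (as I understand it) the softmax of the "rawPrediction" margins, which for two classes reduces to the logistic sigmoid. A minimal plain-Python sketch of that relationship, using one of the raw values from the thread (no Spark required; the function name is made up for illustration):

```python
import math

def raw_to_probability(raw):
    """Softmax over a rawPrediction vector -> probability vector.

    For a 2-element raw margin [m, -m] this is equivalent to
    [sigmoid(2m), 1 - sigmoid(2m)].
    """
    mx = max(raw)                          # subtract max for numerical stability
    exps = [math.exp(v - mx) for v in raw]
    total = sum(exps)
    return [e / total for e in exps]

# rawPrediction value taken from the question above
raw = [30.376879013053156, -30.376879013053156]
prob = raw_to_probability(raw)

print(prob)       # these two entries, unlike the raw margins, sum to 1.0
print(sum(prob))
```

With margins this large the first class gets essentially all of the probability mass, which is why the raw margins themselves looked nothing like probabilities.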
On Mon, Oct 23, 2017 at 1:20 AM, pun <pu...@gmail.com> wrote:
> Hello,
> I have a LogisticRegression model for predicting a binary label. Once I
> train the model, I run it to get some predictions. I get the following
> values for rawPrediction. How should I interpret these? What do they mean?
>
> +----------------------------------------+
> |rawPrediction |
> +----------------------------------------+
> |[30.376879013053156,-30.376879013053156]|
> |[32.08591062636529,-32.08591062636529] |
> |[34.67079346038218,-34.67079346038218] |
>
> From scikit-learn, I believe, the two values for each user add up to 1.
> TIA
> ------------------------------
> Sent from the Apache Spark User List mailing list archive
> <http://apache-spark-user-list.1001560.n3.nabble.com/> at Nabble.com.
>
Re: Spark ML - LogisticRegression interpreting prediction
Posted by pun <pu...@gmail.com>.
Thanks a lot! You are right!
--
Sent from: http://apache-spark-user-list.1001560.n3.nabble.com/
---------------------------------------------------------------------
To unsubscribe e-mail: user-unsubscribe@spark.apache.org