Posted to issues@flink.apache.org by "ASF GitHub Bot (JIRA)" <ji...@apache.org> on 2016/06/01 01:44:12 UTC
[jira] [Commented] (FLINK-1979) Implement Loss Functions
[ https://issues.apache.org/jira/browse/FLINK-1979?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=15309038#comment-15309038 ]
ASF GitHub Bot commented on FLINK-1979:
---------------------------------------
Github user chiwanpark commented on a diff in the pull request:
https://github.com/apache/flink/pull/1985#discussion_r65291353
--- Diff: flink-libraries/flink-ml/src/main/scala/org/apache/flink/ml/optimization/PartialLossFunction.scala ---
@@ -47,21 +47,106 @@ object SquaredLoss extends PartialLossFunction {
/** Calculates the loss depending on the label and the prediction
*
- * @param prediction
- * @param label
- * @return
+ * @param prediction The predicted value
+ * @param label The true value
+ * @return The loss
*/
override def loss(prediction: Double, label: Double): Double = {
0.5 * (prediction - label) * (prediction - label)
}
/** Calculates the derivative of the [[PartialLossFunction]]
*
- * @param prediction
- * @param label
- * @return
+ * @param prediction The predicted value
+ * @param label The true value
+ * @return The derivative of the loss function
*/
override def derivative(prediction: Double, label: Double): Double = {
(prediction - label)
}
}
+
+/** Logistic loss function which can be used with the [[GenericLossFunction]]
+ *
+ * The [[LogisticLoss]] function implements `log(1 + exp(-prediction * label))`
+ * for binary classification with label in {-1, 1}
+ */
+object LogisticLoss extends PartialLossFunction {
+
+ /** Calculates the loss depending on the label and the prediction
+ *
+ * @param prediction The predicted value
+ * @param label The true value
+ * @return The loss
+ */
+ override def loss(prediction: Double, label: Double): Double = {
+ val z = prediction * label
+
+ // based on implementation in scikit-learn
+ // approximately equal and saves the computation of the log
+ if (z > 18) {
+ return math.exp(-z)
+ }
+ else if (z < -18) {
+ return -z
+ }
+
+ math.log(1 + math.exp(-z))
--- End diff --
Using `return` is not recommended in Scala. Could you change this to the following?
```scala
if (z > 18) {
math.exp(-z)
} else if (z < -18) {
-z
} else {
math.log(1 + math.exp(-z))
}
```
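For context, Scala's `if`/`else if`/`else` is an expression, so the value of whichever branch runs becomes the method's result and no explicit `return` is needed. Below is a minimal, self-contained sketch of the same piecewise evaluation (an illustration only, not the Flink class itself), with the cutoff at |z| = 18 matching the approximation discussed in the diff:
```scala
object LogisticLossSketch {
  // Numerically stable logistic loss: log(1 + exp(-z)), with shortcuts for large |z|.
  def loss(prediction: Double, label: Double): Double = {
    val z = prediction * label
    if (z > 18) {
      math.exp(-z)              // log(1 + exp(-z)) ~= exp(-z) for large z
    } else if (z < -18) {
      -z                        // log(1 + exp(-z)) ~= -z for very negative z
    } else {
      math.log(1 + math.exp(-z))
    }
  }

  def main(args: Array[String]): Unit = {
    println(loss(2.0, 1.0))     // ~0.1269, exact branch
    println(loss(20.0, 1.0))    // ~2.06e-9, via the exp(-z) shortcut
  }
}
```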
> Implement Loss Functions
> ------------------------
>
> Key: FLINK-1979
> URL: https://issues.apache.org/jira/browse/FLINK-1979
> Project: Flink
> Issue Type: Improvement
> Components: Machine Learning Library
> Reporter: Johannes Günther
> Assignee: Johannes Günther
> Priority: Minor
> Labels: ML
>
> For convex optimization problems, optimizer methods like SGD rely on a pluggable implementation of a loss function and its first derivative.
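To illustrate the pluggability described above, here is a rough, hypothetical sketch of a loss-function interface exposing its first derivative and plugged into a single SGD weight update. The trait and object names below are illustrative assumptions, not the actual Flink ML API:
```scala
// Hypothetical, simplified interface: a loss and its first derivative w.r.t. the prediction.
trait PluggableLoss {
  def loss(prediction: Double, label: Double): Double
  def derivative(prediction: Double, label: Double): Double
}

object SquaredLossExample extends PluggableLoss {
  def loss(prediction: Double, label: Double): Double =
    0.5 * (prediction - label) * (prediction - label)
  def derivative(prediction: Double, label: Double): Double =
    prediction - label
}

object SgdStepExample {
  // One SGD step for a linear model: w <- w - eta * dL/dPrediction * x
  def step(weights: Array[Double], x: Array[Double], label: Double,
           lossFn: PluggableLoss, eta: Double): Array[Double] = {
    val prediction = weights.zip(x).map { case (w, xi) => w * xi }.sum
    val grad = lossFn.derivative(prediction, label)
    weights.zip(x).map { case (w, xi) => w - eta * grad * xi }
  }
}
```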
--
This message was sent by Atlassian JIRA
(v6.3.4#6332)