Posted to reviews@spark.apache.org by MechCoder <gi...@git.apache.org> on 2016/06/07 00:46:00 UTC

[GitHub] spark pull request #12370: [SPARK-14599][ML] BaggedPoint should support samp...

Github user MechCoder commented on a diff in the pull request:

    https://github.com/apache/spark/pull/12370#discussion_r65994490
  
    --- Diff: mllib/src/main/scala/org/apache/spark/ml/tree/impl/BaggedPoint.scala ---
    @@ -33,13 +33,20 @@ import org.apache.spark.util.random.XORShiftRandom
      * this datum has 1 copy, 0 copies, and 4 copies in the 3 subsamples, respectively.
      *
      * @param datum  Data instance
    - * @param subsampleWeights  Weight of this instance in each subsampled dataset.
    - *
    - * TODO: This does not currently support (Double) weighted instances.  Once MLlib has weighted
    - *       dataset support, update.  (We store subsampleWeights as Double for this future extension.)
    + * @param subsampleCounts  Number of samples of this instance in each subsampled dataset.
    + * @param sampleWeight The weight of this instance.
      */
    -private[spark] class BaggedPoint[Datum](val datum: Datum, val subsampleWeights: Array[Double])
    -  extends Serializable
    +private[spark] class BaggedPoint[Datum](
    +    val datum: Datum,
    +    val subsampleCounts: Array[Int],
    +    val sampleWeight: Double) extends Serializable {
    +
    +  /**
    +   * Subsample counts weighted by the sample weight.
    +   */
    +  def weightedCounts: Array[Double] = subsampleCounts.map(_ * sampleWeight)
    --- End diff ---
    
    Should this be a `val`?
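
For readers of the archive, here is a minimal self-contained sketch (not the actual Spark source) of the tradeoff behind that question: as a `def`, `weightedCounts` allocates and recomputes a fresh array on every call, whereas a `val` would compute it once at construction time, at the cost of keeping a second array alive for the lifetime of each `BaggedPoint`.

    // Hypothetical standalone sketch mirroring the diff above.
    class BaggedPoint[Datum](
        val datum: Datum,
        val subsampleCounts: Array[Int],
        val sampleWeight: Double) extends Serializable {

      // As written in the diff: a new Array[Double] is built on each call.
      def weightedCounts: Array[Double] = subsampleCounts.map(_ * sampleWeight)

      // The alternative the comment asks about: computed once and cached,
      // but roughly doubling the per-point array storage.
      // val weightedCounts: Array[Double] = subsampleCounts.map(_ * sampleWeight)
    }

    // Reusing the doc comment's example of counts (1, 0, 4):
    val bp = new BaggedPoint("some datum", Array(1, 0, 4), 2.0)
    println(bp.weightedCounts.mkString(", "))  // 2.0, 0.0, 8.0

If the counts are read many times per point during tree building, the `val` trades memory for fewer allocations; if they are read only once, the `def` is the cheaper choice.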

