Posted to reviews@spark.apache.org by GitBox <gi...@apache.org> on 2020/02/26 03:25:16 UTC

[GitHub] [spark] zhengruifeng commented on issue #27519: [SPARK-30770][ML] avoid vector conversion in GMM.transform

zhengruifeng commented on issue #27519: [SPARK-30770][ML] avoid vector conversion in GMM.transform
URL: https://github.com/apache/spark/pull/27519#issuecomment-591219332
 
 
   The current master impl and commit [7686e04](https://github.com/apache/spark/commit/7686e04c648384251b98c0c335c084b1f654188e) both need to create two vectors in `logpdf`,
   while the initial commit [bc1586e](https://github.com/apache/spark/pull/27519/commits/bc1586eafa58748b8ae7855184d903c22c1088a4) only needs to create one vector.
   
   All the Scala tests passed in `bc1586e`; however, it fails on the Python side. We can see that the model coefficients are almost the same; the only significant difference is the `logLikelihood`.
   
   The `logLikelihood` issue is the same one seen in https://github.com/apache/spark/pull/26735: @huaxingao helped test it and found that if we set `maxIter > 25`, all impls converge to the same cost.
   It looks like a small perturbation (in https://github.com/apache/spark/pull/26735, the way the sum of weights is accumulated; in `bc1586e`, the way `logpdf` is computed: `A*(x-mean) -> A*x - A*mean`) causes the py test to converge to `26.193922336279954` at iteration 5, so I am wondering if we can update the py test to use a larger `maxIter`?
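   To illustrate the kind of perturbation I mean, here is a minimal scalar sketch (not Spark's actual GMM code; `a`, `x`, and `mean` are made-up stand-ins for the whitening matrix, data point, and Gaussian mean). The two forms are algebraically identical, but in floating point the intermediate roundings differ, so the results can disagree in the last bits:

   ```python
   def quad_centered(a, x, mean):
       # A * (x - mean): subtract first, then scale.
       return a * (x - mean)

   def quad_distributed(a, x, mean):
       # A*x - A*mean: scale each term, then subtract.
       # (In the vector case, A*mean could be precomputed once per Gaussian,
       # which is the appeal of this form -- but the rounding order changes.)
       return a * x - a * mean

   a, x, mean = 0.1, 0.3, 0.2
   lhs = quad_centered(a, x, mean)
   rhs = quad_distributed(a, x, mean)
   # The two results agree only up to floating-point rounding error.
   print(lhs, rhs, abs(lhs - rhs))
   ```

   Such last-bit differences are harmless per call, but compounded over EM iterations they can shift exactly which iteration a loose convergence check fires at, which is why the py test lands on a different cost at a small `maxIter`.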

----------------------------------------------------------------
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
users@infra.apache.org


With regards,
Apache Git Services

---------------------------------------------------------------------
To unsubscribe, e-mail: reviews-unsubscribe@spark.apache.org
For additional commands, e-mail: reviews-help@spark.apache.org