Posted to dev@systemml.apache.org by Janardhan Pulivarthi <ja...@gmail.com> on 2017/10/09 15:20:26 UTC

Regarding `enable remote hyperparameter tuning`[BLOCKER issue]. Thanks.

@niketan - I don't have a time preference, please give me any time (or)
date for meeting at your convenience. Thanks.

Hi Mike,

This issue [https://issues.apache.org/jira/browse/SYSTEMML-1159] has been
marked as a blocker. I've gone through the reference paper you have
attached there.

To my reading, the paper stresses two points:
1. `random sampling` is as good as, or better than, `grid search`, and is
competitive with manual tuning by experts.
2. They demonstrate this empirically on several `mnist` variants.
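As a toy illustration of the paper's point (the objective function below is a made-up stand-in, not one from the paper), here is random search versus grid search given the same budget of 16 trials:

```python
import random

# Hypothetical objective: "validation accuracy" as a function of two
# hyperparameters (learning rate, regularization). Purely illustrative;
# its optimum (0.07, 0.02) deliberately falls between grid points.
def val_accuracy(lr, reg):
    return 1.0 - (lr - 0.07) ** 2 - (reg - 0.02) ** 2

random.seed(42)

# Grid search: a fixed 4 x 4 lattice = 16 evaluations.
grid = [(lr, reg) for lr in (0.001, 0.01, 0.1, 1.0)
                  for reg in (0.0001, 0.001, 0.01, 0.1)]
best_grid = max(val_accuracy(lr, reg) for lr, reg in grid)

# Random search: the same budget of 16 trials, sampled log-uniformly
# over the same ranges.
trials = [(10 ** random.uniform(-3, 0), 10 ** random.uniform(-4, -1))
          for _ in range(16)]
best_rand = max(val_accuracy(lr, reg) for lr, reg in trials)

print(best_grid, best_rand)
```

The grid can only ever get as close to the optimum as its fixed lattice allows, while random trials cover each axis with 16 distinct values rather than 4, which is the core of the paper's argument.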

This random sampling can be simulated by `Sobol sequence generation`, the
method we are trying to implement for the Bayesian optimization case.

Conclusion: Niketan, Sasha and I are trying to schedule a conversation;
could you please join us?

Thanks,
Janardhan

Re: Regarding `enable remote hyperparameter tuning`[BLOCKER issue]. Thanks.

Posted by Janardhan Pulivarthi <ja...@gmail.com>.
Hi all,

*Agenda: *First of all, I would like to get some decision support, since
Bayesian optimization is a large algorithm with many functions, and there
may be many ways to exploit parallelism.

Bayes:
1. An intro to Bayesian optimization, and how we are going to use it
(understanding the definition).
2. Around 20 functions need to be implemented in total (there are four
functions I did not understand).
3. Around 5 distributions are needed, some of which are already supported
by SystemML as DML builtin functions.
4. Inputs: observations and functions. Outputs: an optimized selection of
hyperparameters. Discussion of input & output behaviour.
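To make the input/output behaviour in point 4 concrete, here is a minimal, self-contained Bayesian optimization loop in Python. Everything here is illustrative and assumed (the objective, the RBF length-scale, the candidate grid), not the eventual DML implementation: observations go in, and a suggested hyperparameter comes out of an expected-improvement acquisition over a Gaussian-process surrogate.

```python
import math

# Squared-exponential (RBF) kernel; length-scale 0.3 is an arbitrary choice.
def rbf(a, b, ls=0.3):
    return math.exp(-0.5 * ((a - b) / ls) ** 2)

def solve(A, b):
    # Gaussian elimination with partial pivoting (fine for tiny systems).
    n = len(A)
    M = [row[:] + [b[i]] for i, row in enumerate(A)]
    for c in range(n):
        p = max(range(c, n), key=lambda r: abs(M[r][c]))
        M[c], M[p] = M[p], M[c]
        for r in range(c + 1, n):
            f = M[r][c] / M[c][c]
            for k in range(c, n + 1):
                M[r][k] -= f * M[c][k]
    x = [0.0] * n
    for r in range(n - 1, -1, -1):
        x[r] = (M[r][n] - sum(M[r][k] * x[k] for k in range(r + 1, n))) / M[r][r]
    return x

def gp_posterior(X, y, xs, noise=1e-4):
    # GP regression: posterior mean and variance at candidate points xs.
    K = [[rbf(a, b) + (noise if i == j else 0.0)
          for j, b in enumerate(X)] for i, a in enumerate(X)]
    alpha = solve(K, y)
    mu, var = [], []
    for x in xs:
        k = [rbf(x, a) for a in X]
        mu.append(sum(ki * ai for ki, ai in zip(k, alpha)))
        v = solve(K, k)
        var.append(max(1e-12, 1.0 - sum(ki * vi for ki, vi in zip(k, v))))
    return mu, var

def expected_improvement(mu, var, best):
    # EI for maximization: (m - best) * Phi(z) + s * phi(z), z = (m - best)/s.
    out = []
    for m, v in zip(mu, var):
        s = math.sqrt(v)
        z = (m - best) / s
        cdf = 0.5 * (1.0 + math.erf(z / math.sqrt(2.0)))
        pdf = math.exp(-0.5 * z * z) / math.sqrt(2.0 * math.pi)
        out.append((m - best) * cdf + s * pdf)
    return out

def objective(x):          # hypothetical function we want to maximize
    return -(x - 0.6) ** 2

X, y = [0.1, 0.9], [objective(0.1), objective(0.9)]   # initial observations
cands = [i / 50.0 for i in range(51)]
for _ in range(10):                                   # BO iterations
    mu, var = gp_posterior(X, y, cands)
    ei = expected_improvement(mu, var, max(y))
    x_next = cands[ei.index(max(ei))]                 # suggested point
    X.append(x_next)
    y.append(objective(x_next))

best_x = X[y.index(max(y))]
print(best_x, max(y))
```

The shape of the loop (surrogate fit, acquisition maximization, observe, repeat) is the part relevant to the meeting; each stage maps to one or more of the ~20 functions mentioned above.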

Sobol:
1. An intro to the sampling robustness of Sobol sequence generation.
2. The implementation approach, which depends on the primitive polynomial
used.
3. There are 5 steps in the implementation; a short discussion on how to
implement them.
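As a concrete reference point for that discussion, here is a minimal one-dimensional Sobol generator in Python using the Gray-code construction. This is only the first Sobol dimension (which coincides with the van der Corput sequence in base 2); further dimensions would add direction numbers derived from each dimension's primitive polynomial, which is where the implementation steps above come in.

```python
BITS = 30  # bits of precision for the sequence

def sobol_1d(n):
    # Direction numbers for dimension 1: v_k = 2^(BITS - k), k = 1..BITS.
    v = [1 << (BITS - k) for k in range(1, BITS + 1)]
    x, out = 0, []
    for i in range(n):
        out.append(x / float(1 << BITS))
        # Gray-code step: flip the direction number indexed by the
        # position of the lowest zero bit of i.
        c, j = 0, i
        while j & 1:
            j >>= 1
            c += 1
        x ^= v[c]
    return out

print(sobol_1d(8))
# -> [0.0, 0.5, 0.75, 0.25, 0.375, 0.875, 0.625, 0.125]
```

Note how each new point falls in the largest remaining gap, which is the low-discrepancy property that makes Sobol points a good stand-in for the random hyperparameter samples.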

Surrogate slice sampling:
1. The slice sampling algorithm, and a brief on how it actually works.
2. Discussion of some of the functions & how they actually need to be
implemented.
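For that discussion, here is a compact sketch of the slice sampler (Neal's algorithm with stepping-out and shrinkage) in Python. The standard-normal target below is just an illustration; the surrogate variant would evaluate a surrogate model's log-density instead of `logp`.

```python
import math
import random

def logp(x):
    # Illustrative target: log-density of a standard normal (up to a constant).
    return -0.5 * x * x

def slice_sample(x0, n, w=1.0, rng=random.Random(0)):
    samples, x = [], x0
    for _ in range(n):
        # 1. Draw the slice level u ~ Uniform(0, p(x)), done in log space.
        log_u = logp(x) + math.log(rng.random())
        # 2. Step out an interval [l, r] of width w until it brackets the slice.
        l = x - w * rng.random()
        r = l + w
        while logp(l) > log_u:
            l -= w
        while logp(r) > log_u:
            r += w
        # 3. Shrink: sample uniformly in [l, r], rejecting and narrowing
        #    the interval until a point lands inside the slice.
        while True:
            x1 = l + (r - l) * rng.random()
            if logp(x1) > log_u:
                x = x1
                break
            if x1 < x:
                l = x1
            else:
                r = x1
        samples.append(x)
    return samples

s = slice_sample(0.0, 5000)
print(sum(s) / len(s))  # should be near 0 for the standard normal
```

The appeal for hyperparameter inference is that only log-density evaluations are needed, with no gradient and no tuning beyond the step width `w`.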

Let's move this forward.

Thank you very much,
Janardhan


On Mon, Oct 9, 2017 at 11:40 PM, Niketan Pansare <np...@us.ibm.com> wrote:

> Hi Janardhan,
>
> I am available anytime on Thursday or Friday this week. I would
> recommend sending an agenda before scheduling the meeting.
>
> Thanks,
>
> Niketan.
>
> ----- Original message -----
> From: Janardhan Pulivarthi <ja...@gmail.com>
> To: Mike Dusenberry <du...@gmail.com>, dev@systemml.apache.org,
> Niketan Pansare <np...@us.ibm.com>, Alexandre V Evfimievski <
> evfimi@us.ibm.com>
> Cc:
> Subject: Regarding `enable remote hyperparameter tuning`[BLOCKER issue].
> Thanks.
> Date: Mon, Oct 9, 2017 8:21 AM
>
> @niketan - I don't have a time preference, please give me any time (or)
> date for meeting at your convenience. Thanks.
>
> Hi Mike,
>
> This issue [https://issues.apache.org/jira/browse/SYSTEMML-1159] has been
> marked as a blocker. I've gone through the reference paper you
> have attached there.
>
> To my reading, the paper stresses two points:
> 1. `random sampling` is as good as, or better than, `grid search`, and is
> competitive with manual tuning by experts.
> 2. They demonstrate this empirically on several `mnist` variants.
>
> This random sampling can be simulated by `Sobol sequence generation`, the
> method we are trying to implement for the Bayesian optimization case.
>
> Conclusion: Niketan, Sasha and I are trying to schedule a conversation;
> could you please join us?
>
> Thanks,
> Janardhan