Posted to commits@tvm.apache.org by GitBox <gi...@apache.org> on 2019/11/12 16:52:34 UTC

[GitHub] [incubator-tvm] mbarrett97 commented on issue #4150: [RFC] [AutoTVM] Implementing an auto-tuning library/cache

URL: https://github.com/apache/incubator-tvm/issues/4150#issuecomment-552982585
 
 
   @comaniac I've done some refactoring to disentangle 'TuningJob' from the ConfigLibrary. The tuning loop now looks like this:
   
    ```python
    # GridSearchTuner and autotvm are existing TVM imports; TuningJob and
    # ConfigLibrary come from the proposed library/cache in this RFC.
    from tvm import autotvm
    from tvm.autotvm.tuner import GridSearchTuner

    def tune_kernels(tasks,
                     target,
                     n_trial,
                     config_library,
                     measure_option,
                     log_filename='tuning.log'):
   
       # Create a tuning job and point it at a config library
       job = TuningJob(
           log_filename,
           target,
           config_library=config_library,
       )
       # Use the tuning job during the tuning loop
       with job:
           for i, tsk in enumerate(tasks):
               prefix = "[Task %2d/%2d] " % (i+1, len(tasks))
   
               # Convert conv2d tasks to conv2d_NCHWc tasks
               task = autotvm.task.create("topi_x86_conv2d_NCHWc", args=tsk.args,
                                          target=target, template_key='direct')
               task.workload = tsk.workload
   
               # Create tuner
               tuner_obj = GridSearchTuner(task)
   
               # Do tuning - the tuner will skip tasks which have already been tuned
               # in the config library
               tuner_obj.tune(
                   n_trial=n_trial,
                   early_stopping=n_trial,
                   measure_option=measure_option,
                   callbacks=[autotvm.callback.progress_bar(n_trial, prefix=prefix)],
               )
   ```
   
    Using `with job:` puts the job into the global tuning scope. The job then automatically registers its own callback with the tuners, and a new tuner method `load_library` is called if the TuningJob has a ConfigLibrary attached. This is where the resume logic can be implemented. Currently I have only implemented the basic logic to skip completed tasks, but more advanced resume logic should be straightforward to add, since you have access to the full ConfigLibrary.
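    To make the skip-completed-tasks behaviour concrete, here is a minimal stand-alone sketch of that resume logic. The names (`ConfigLibrary`, `record`, `is_complete`, `tasks_to_tune`) are illustrative stand-ins, not the actual API from the PR:

    ```python
    class ConfigLibrary:
        """Minimal in-memory stand-in: maps a task workload to its tuning state."""

        def __init__(self):
            self._entries = {}  # workload -> {"trials": int, "complete": bool}

        def record(self, workload, trials, complete):
            # Record how far tuning of this workload has progressed.
            self._entries[workload] = {"trials": trials, "complete": complete}

        def is_complete(self, workload):
            entry = self._entries.get(workload)
            return bool(entry and entry["complete"])


    def tasks_to_tune(tasks, library):
        """Filter out tasks whose workloads the library already marks complete."""
        return [t for t in tasks if not library.is_complete(t)]
    ```

    More advanced resume strategies (e.g. re-tuning partially completed workloads with the remaining trial budget) would hook in at the same point, since the full library state is available.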
   
   If you don't specify a ConfigLibrary with a job, it will just log all the results to the specified log file.
   
    Config files are indexed within the library by target, so to use configs from the library you can simply do `with config_library.load(target):`. This simply returns an ApplyHistoryBest context; it doesn't implement a new DispatchContext.
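    The target-indexed lookup can be sketched roughly as follows. This is a hypothetical stand-in (`TargetIndexedConfigs`, `save`) rather than the PR's code; the real `load` wraps the records in autotvm's ApplyHistoryBest instead of yielding them directly:

    ```python
    import contextlib


    class TargetIndexedConfigs:
        """Minimal stand-in: best-config records keyed by target string."""

        def __init__(self):
            self._by_target = {}  # target string -> list of config records

        def save(self, target, records):
            self._by_target.setdefault(target, []).extend(records)

        @contextlib.contextmanager
        def load(self, target):
            # The real implementation returns ApplyHistoryBest over these
            # records, so compilation inside the `with` block picks the best
            # configs for the given target.
            yield self._by_target.get(target, [])
    ```

    Keying by target means a single library file can serve several backends without the caller having to manage separate log files per target.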
    
   I've updated my PR (https://github.com/apache/incubator-tvm/pull/4151) accordingly. Note that the PR does not include every feature discussed here but is intended as initial infrastructure on top of which more advanced features can be developed.
   

----------------------------------------------------------------
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
users@infra.apache.org


With regards,
Apache Git Services