Posted to discuss-archive@tvm.apache.org by Jeremiah Morrill via TVM Discuss <no...@discuss.tvm.ai> on 2019/11/12 19:26:48 UTC

[TVM Discuss] [Questions] XGBoost's cuda acceleration


It seems [XGBoost supports GPU acceleration](http://tracking.discuss.tvm.ai/tracking/click?d=16FMB7EwcxvJDCA2R-NliyJmm7vGiGsXVdu32-HbyXzgzHrax6cTTZF8vPk3tcPUdOYhHHQGI8McfylgvP47UvwPiIJFsNZq28iWJAHqZiWQAUNj2QyjvxwXmLYmOoAbUc_Qx_XPrgLlsOX54dR0pLn7p1ZfVXr664BMoqjouLIT0) via CUDA (9?) by setting the `tree_method` parameter in `xgb_params` to `gpu_hist`.

In xgboost_cost_model.py I added `'tree_method': 'gpu_hist'` to the parameters and ran a few tests (16-core CPU, GTX 1080):
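For reference, the change amounts to something like the sketch below. The dict contents here are illustrative, not copied from xgboost_cost_model.py; only the added `tree_method` entry is the change being described.

```python
# Illustrative sketch of the change described above: the surrounding
# parameter names/values are placeholders, not the actual contents of
# TVM's xgboost_cost_model.py.
xgb_params = {
    "max_depth": 3,
    "eta": 0.3,
    "objective": "reg:linear",
}

# Opt into XGBoost's CUDA-accelerated histogram algorithm.
# Requires an XGBoost build with GPU support and an available CUDA device.
xgb_params["tree_method"] = "gpu_hist"

print(xgb_params["tree_method"])
```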

**WITH '`gpu_hist`'**

**First run:**

`[Task  1/42]  Current/Best:  178.70/2169.96 GFLOPS | Progress: (256/256) | 901.07 s Done.`

**Second run:**

`[Task  1/42]  Current/Best: 1669.95/1804.79 GFLOPS | Progress: (256/256) | 904.57 s Done.`

**WITHOUT '`gpu_hist`'**

**First run:**

`[Task  1/42]  Current/Best:   48.44/1714.60 GFLOPS | Progress: (256/256) | 980.04 s Done.`

**Second Run:**

`[Task  1/42]  Current/Best:  113.77/1672.49 GFLOPS | Progress: (256/256) | 1038.44 s Done.`

Even though I only ran each test twice, you can see that '`gpu_hist`' does complete a bit faster.
I also saw CUDA utilization on my GPU sit at about 2-4% while the xgboost cost model was running.
Is this something that should be exposed in the public API or was there a reason why it was excluded?

Can someone else verify the benefit?




