
Commit 0a2aae2

Add GIF animations of Dashboard
1 parent bc6b5c1

File tree

1 file changed: README.md (+10 -7 lines)
@@ -9,12 +9,6 @@ Distributed hyperparameter optimization framework, inspired by [Optuna](https://
 This library is particularly designed for machine learning, but everything will be able to optimize if you can define the objective function
 (e.g. Optimizing the number of goroutines of your server and the memory buffer size of the caching systems).
 
-**Key features:**
-
-| State-of-the-art algorithms | Optuna compatible RDB backend |
-| --------------------------- | ----------------------------- |
-| <img width="750" alt="state-of-the-art-algorithms" src="https://user-images.githubusercontent.com/5564044/88860180-2d66cc00-d236-11ea-9a2f-de731c54a870.png"> | <img width="750" alt="optuna-compatibility" src="https://user-images.githubusercontent.com/5564044/88843168-a3aa0500-d21b-11ea-8fc1-d1cdca890a3f.png"> |
-
 **Supported algorithms:**
 
 Goptuna supports various state-of-the-art Bayesian optimization, Evolution strategy and Multi-armed bandit algorithms.
@@ -28,6 +22,12 @@ These algorithms are implemented in pure Go and continuously benchmarked on GitH
 * Median Stopping Rule [6]
 * ASHA: Asynchronous Successive Halving Algorithm (Optuna flavored version) [1,7,8]
 
+**Built-in dashboard:**
+
+| Manage optimization results | Interactive live-updating graphs |
+| --------------------------- | -------------------------------- |
+| <img width="750" alt="state-of-the-art-algorithms" src="https://user-images.githubusercontent.com/5564044/97099702-4107be80-16cf-11eb-9d97-f5ceec98ce52.gif"> | <img width="750" alt="visualization" src="https://user-images.githubusercontent.com/5564044/97099797-66e19300-16d0-11eb-826c-6977e3941fb0.gif"> |
+
 **Projects using Goptuna:**
 
 * [Kubeflow/Katib: Kubernetes-based system for hyperparameter tuning and neural architecture search.](https://github.com/kubeflow/katib)
@@ -96,7 +96,6 @@ Furthermore, I recommend you to use RDB storage backend for following purposes.
 * Continue from where we stopped in the previous optimizations.
 * Scale studies to tens of workers that connecting to the same RDB storage.
 * Check optimization results via built-in dashboard.
-* Visualize parameters on Jupyter notebook using Optuna.
 
 ### Advanced usage
 
@@ -205,6 +204,10 @@ References:
 * [8] [Liam Li, Kevin Jamieson, Afshin Rostamizadeh, Ekaterina Gonina, Moritz Hardt, Benjamin Recht, and Ameet Talwalkar. Massively parallel hyperparameter tuning. arXiv preprint arXiv:1810.05934, 2018.](https://arxiv.org/abs/1810.05934)
 * [9] [J. Snoek, H. Larochelle, and R. Adams. Practical Bayesian optimization of machine learning algorithms. In Advances in Neural Information Processing Systems 25, pages 2960–2968, 2012.](https://arxiv.org/abs/1206.2944)
 
+Presentations:
+
+* :jp: [Goptuna Distributed Bayesian Optimization Framework at Go Conference 2019 Autumn](https://www.slideshare.net/c-bata/goptuna-distributed-bayesian-optimization-framework-at-go-conference-2019-autumn-187538495)
+
 Blog posts:
 
 * [Practical bayesian optimization using Goptuna](https://medium.com/@c_bata_/practical-bayesian-optimization-in-go-using-goptuna-edf97195fcb5).
