hyperparameter-optimization
Here are 338 public repositories matching this topic...
What would you like to be added: As title
Why is this needed: All pruning schedules except AGPPruner support only level, L1, and L2, while FPGM, APoZ, MeanActivation, and Taylor are also available. It would be much better if any pruner could be chosen with any pruning schedule.
Without this feature, how does current nni work?
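As a sketch of the requested behavior, the snippet below combines the AGP schedule with the FPGM criterion. It assumes NNI's v1.x compression API, and the `pruning_algorithm` argument is the knob being asked for; read it as a proposal, not a confirmed interface.

```python
import torch
from nni.compression.torch import AGPPruner  # NNI v1.x model-compression API

# A toy model with prunable Conv2d layers.
model = torch.nn.Sequential(
    torch.nn.Conv2d(3, 16, 3),
    torch.nn.ReLU(),
    torch.nn.Conv2d(16, 32, 3),
)
optimizer = torch.optim.SGD(model.parameters(), lr=0.01)

# AGP-style schedule: sparsity ramps from 0% to 80% over epochs 0-10.
config_list = [{
    "initial_sparsity": 0.0,
    "final_sparsity": 0.8,
    "start_epoch": 0,
    "end_epoch": 10,
    "frequency": 1,
    "op_types": ["Conv2d"],
}]

# The feature request: choose the pruning criterion independently of the
# schedule. pruning_algorithm="fpgm" is the proposed knob, not a
# guaranteed part of the current nni API.
pruner = AGPPruner(model, config_list, optimizer, pruning_algorithm="fpgm")
model = pruner.compress()
```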
Motivation
Judging from the following resources, the search space in examples/xgboost_simple.py does not seem practical.
- https://www.analyticsvidhya.com/blog/2016/03/complete-guide-parameter-tuning-xgboost-with-codes-python/
- https://www.youtube.com/watch?v=VC8Jc9_lNoY
- https://www.amazon.co.jp/dp/B07YTDBC3Z/
Description
Improve the search space of examples/xgboost_simple.py.
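A minimal sketch of what a more practical space could look like, using Optuna's standard suggest API with XGBoost's native training loop; the parameters and ranges below are illustrative assumptions, not the ones the issue settles on.

```python
import optuna
import sklearn.datasets
import sklearn.metrics
import xgboost as xgb
from sklearn.model_selection import train_test_split


def objective(trial):
    data, target = sklearn.datasets.load_breast_cancer(return_X_y=True)
    train_x, valid_x, train_y, valid_y = train_test_split(data, target, test_size=0.25)
    dtrain = xgb.DMatrix(train_x, label=train_y)
    dvalid = xgb.DMatrix(valid_x, label=valid_y)

    # A broader space than booster/lambda/alpha alone: learning rate,
    # tree shape, and row/column subsampling usually matter most.
    params = {
        "objective": "binary:logistic",
        "eval_metric": "auc",
        "eta": trial.suggest_loguniform("eta", 1e-3, 0.3),
        "max_depth": trial.suggest_int("max_depth", 3, 10),
        "min_child_weight": trial.suggest_loguniform("min_child_weight", 0.1, 10.0),
        "subsample": trial.suggest_uniform("subsample", 0.5, 1.0),
        "colsample_bytree": trial.suggest_uniform("colsample_bytree", 0.5, 1.0),
        "lambda": trial.suggest_loguniform("lambda", 1e-8, 1.0),
        "alpha": trial.suggest_loguniform("alpha", 1e-8, 1.0),
    }
    booster = xgb.train(params, dtrain, num_boost_round=100)
    preds = booster.predict(dvalid)
    return sklearn.metrics.roc_auc_score(valid_y, preds)


study = optuna.create_study(direction="maximize")
study.optimize(objective, n_trials=50)
```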
If enter_data() is called with the same train_path twice in a row and the data itself hasn't changed, a new Dataset does not need to be created.
We should add a column which stores some kind of hash of the actual data. When a Dataset would be created, if the metadata and data hash are exactly the same as an existing Dataset, nothing should be added to the ModelHub database and the existing Dataset should be reused.
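A minimal sketch of the dedup check, assuming hypothetical helpers (find_dataset and create_dataset stand in for whatever lookup the ModelHub database layer actually exposes):

```python
import hashlib


def file_hash(path, chunk_size=1 << 20):
    """SHA-256 of the raw bytes of the training data file."""
    digest = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(chunk_size), b""):
            digest.update(chunk)
    return digest.hexdigest()


def get_or_create_dataset(db, train_path, metadata):
    """Reuse an existing Dataset when metadata and data hash both match."""
    data_hash = file_hash(train_path)
    existing = db.find_dataset(metadata=metadata, data_hash=data_hash)  # hypothetical helper
    if existing is not None:
        return existing  # same bytes, same metadata: add nothing to ModelHub
    return db.create_dataset(  # hypothetical helper
        metadata=metadata, data_hash=data_hash, train_path=train_path
    )
```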
Describe your feature request
Currently, if you run ray.wait([pg.ready()]), it prints a warning message that the ready task is infeasible. This is expected behavior, so we should not print the warning.
Impl Idea
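For context, a minimal sketch that reproduces the warning rather than the fix, assuming a local single-CPU cluster so the placement group request is unsatisfiable:

```python
import ray
from ray.util.placement_group import placement_group

ray.init(num_cpus=1)

# Request more CPUs than the cluster has, so the group cannot be placed.
pg = placement_group([{"CPU": 2}])

# pg.ready() returns an ObjectRef that resolves once the group is placed.
# Waiting on it while the request is unsatisfiable is what currently
# triggers the "infeasible task" warning the issue wants silenced.
ready, not_ready = ray.wait([pg.ready()], timeout=5)
```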