# hyperparameter-optimization

Here are 559 public repositories matching this topic...

nni

nzw0301 commented Jan 24, 2022

What is an issue?

Optuna's CIs have started to use Python 3.8 by default (optuna/optuna#3026). However, Optuna still supports older Python versions, specifically 3.6 and 3.7, so users may still develop Optuna with one of those versions. In some settings, the procedure recommended in https://github.com/optuna/optuna/blob/master/CONTRIBUTING.md#documentation might not work accordingly.

mljar-supervised

sonichi commented Jan 25, 2022

In principle, getting the parameters from FLAML into C# LightGBM seems to work, but I don't have any metrics yet. The parameter names are slightly different, but the documentation is adequate to match them. Microsoft.ML seems to have version 2.3.1 of LightGBM.

Another approach that might be useful, especially for anyone working with .NET, would be having some samples of conversion to ONNX.
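A minimal sketch of the first approach, assuming the standard FLAML AutoML API; the Microsoft.ML option names in the comments are illustrative guesses, not a verified mapping:

```python
from flaml import AutoML
from sklearn.datasets import load_breast_cancer

# Restrict FLAML to LightGBM so the tuned hyperparameters can be
# carried over to another LightGBM binding such as Microsoft.ML.
X, y = load_breast_cancer(return_X_y=True)
automl = AutoML()
automl.fit(X_train=X, y_train=y, task="classification",
           estimator_list=["lgbm"], time_budget=60)

# best_config holds the tuned LightGBM hyperparameters, e.g.
# {'n_estimators': ..., 'num_leaves': ..., 'learning_rate': ...}.
# In Microsoft.ML's LightGbm trainer these appear under slightly
# different names (e.g. num_leaves -> NumberOfLeaves), so match
# them against both libraries' documentation.
print(automl.best_config)
```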

Notes, programming assignments and quizzes from all courses within the Coursera Deep Learning specialization offered by deeplearning.ai: (i) Neural Networks and Deep Learning; (ii) Improving Deep Neural Networks: Hyperparameter tuning, Regularization and Optimization; (iii) Structuring Machine Learning Projects; (iv) Convolutional Neural Networks; (v) Sequence Models

  • Updated Feb 7, 2022
  • Jupyter Notebook
Gradient-Free-Optimizers

A list of high-quality (newest) AutoML works and lightweight models including 1.) Neural Architecture Search, 2.) Lightweight Structures, 3.) Model Compression, Quantization and Acceleration, 4.) Hyperparameter Optimization, 5.) Automated Feature Engineering.

  • Updated Jun 19, 2021
bcyphers commented Jan 31, 2018

If enter_data() is called with the same train_path twice in a row and the data itself hasn't changed, a new Dataset does not need to be created.

We should add a column that stores some kind of hash of the actual data. When a Dataset would be created, if the metadata and data hash exactly match an existing Dataset, nothing should be added to the ModelHub database and the existing Dataset should be reused.
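A minimal sketch of that deduplication check, assuming hypothetical `find_dataset`/`create_dataset` helpers; ATM's actual schema and enter_data() internals may differ:

```python
import hashlib

def file_sha256(path: str) -> str:
    # Hash the raw training data so identical files can be detected.
    h = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(8192), b""):
            h.update(chunk)
    return h.hexdigest()

def get_or_create_dataset(db, train_path: str, metadata: dict):
    # Reuse an existing Dataset when both the metadata and the data
    # hash match; otherwise create and store a new one.
    data_hash = file_sha256(train_path)
    existing = db.find_dataset(metadata=metadata, data_hash=data_hash)  # hypothetical lookup
    if existing is not None:
        return existing  # nothing new is added to the ModelHub database
    return db.create_dataset(metadata=metadata, data_hash=data_hash,
                             train_path=train_path)  # hypothetical insert
```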

Neuraxle
