
hyperparameter-optimization

Here are 339 public repositories matching this topic...

edoakes commented Sep 10, 2020

I often run into issues by accidentally starting a new cluster when one is already running. Then, this later causes problems when I try to connect and there are two running clusters. I'm then forced to ray stop both clusters and ray start my new one again.

My workflow would be improved if I just got an error when trying to start the second cluster and knew to immediately tear down the existing one.
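One way to get the error described above is to probe the head node's port before launching. This is only a sketch, not part of Ray itself: the helper name is hypothetical, and port 6379 is assumed to be the head node's default port, which you would adjust to match your `ray start` configuration.

```python
import socket

def cluster_port_open(host="127.0.0.1", port=6379, timeout=1.0):
    """Return True if something is listening on host:port.

    Port 6379 is an assumption (Ray's default head-node port);
    change it if your cluster uses a different one.
    """
    try:
        with socket.create_connection((host, port), timeout=timeout):
            return True
    except OSError:
        return False

def ensure_no_running_cluster(host="127.0.0.1", port=6379):
    """Hypothetical guard: fail fast instead of silently starting
    a second cluster next to an existing one."""
    if cluster_port_open(host, port):
        raise RuntimeError(
            "A cluster appears to be running already; "
            "run `ray stop` before starting a new one."
        )
```

Calling `ensure_no_running_cluster()` before the start command would produce exactly the early error the comment asks for, instead of two half-working clusters.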

nni

A list of high-quality (newest) AutoML works and lightweight models including 1.) Neural Architecture Search, 2.) Lightweight Structures, 3.) Model Compression, Quantization and Acceleration, 4.) Hyperparameter Optimization, 5.) Automated Feature Engineering.

  • Updated Sep 3, 2020
mljar-supervised
pplonski commented Sep 11, 2020

There can be a situation where all features are dropped during feature selection. This case needs to be handled, perhaps by throwing an exception or raising a warning.

Code to reproduce:

import numpy as np
from supervised import AutoML

X = np.random.uniform(size=(1000, 31))
y = np.random.randint(0, 2, size=(1000,))

automl = AutoML(
    algorithms=["CatBoost", "Xgboost", "LightGBM"],
    model_t
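A minimal guard for the failure mode described above might look like the following. This is a sketch, not mljar-supervised's actual implementation: the function name is hypothetical, and a simple variance-based selector stands in for whatever selection step the library runs.

```python
import numpy as np

def check_selected_features(selected_columns):
    """Hypothetical guard: if feature selection dropped every column,
    fail loudly instead of training on an empty feature set."""
    if len(selected_columns) == 0:
        raise ValueError(
            "Feature selection removed all features; "
            "relax the selection threshold or disable selection."
        )
    return selected_columns

# Stand-in selection step that can drop everything: all-constant
# features have zero variance, so none survive the filter.
X = np.zeros((1000, 31))
variances = X.var(axis=0)
selected = [i for i, v in enumerate(variances) if v > 0.0]

# `selected` is empty here, so the guard raises rather than letting
# an empty feature set flow into model training.
try:
    check_selected_features(selected)
except ValueError as e:
    print(e)
```

Whether to raise or merely warn is a design choice; raising is safer, since training on zero features can only produce a degenerate model.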
bcyphers commented Jan 31, 2018

If enter_data() is called with the same train_path twice in a row and the data itself hasn't changed, a new Dataset does not need to be created.

We should add a column which stores some kind of hash of the actual data. When a Dataset would be created, if the metadata and data hash are exactly the same as an existing Dataset, nothing should be added to the ModelHub database and the existing Dataset should be reused.
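The hash-and-reuse idea above can be sketched as follows. This is not ATM's actual schema: `registry` is a stand-in for the ModelHub lookup (here just a dict keyed by digest), and SHA-256 is one reasonable choice of content hash.

```python
import hashlib

def file_digest(path, chunk_size=1 << 20):
    """Stream a file in chunks and return its SHA-256 hex digest,
    so large training files never need to fit in memory."""
    h = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(chunk_size), b""):
            h.update(chunk)
    return h.hexdigest()

def get_or_create_dataset(train_path, registry):
    """Hypothetical sketch of the proposed behavior: if a Dataset
    with the same data hash already exists, reuse it instead of
    inserting a duplicate record."""
    digest = file_digest(train_path)
    if digest in registry:
        return registry[digest]  # same data: reuse the existing Dataset
    registry[digest] = {"path": train_path, "hash": digest}
    return registry[digest]
```

With this in place, calling `get_or_create_dataset()` twice with the same `train_path` and unchanged data returns the same record, while any change to the file's contents produces a new digest and hence a new Dataset.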
