
So I'm currently fine-tuning a pretrained model on 35k images across 5 classes. There is very high class imbalance, with one class making up 73% of the distribution.

I handled this by using a weighted cross-entropy loss, with each class weighted by total_samples / (num_classes * class_sample_count).
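
For reference, here's a minimal sketch of that weighting scheme in PyTorch (the per-class counts below are hypothetical placeholders, not my actual distribution):

```python
import torch
import torch.nn as nn

# Hypothetical per-class counts; the majority class holds ~73% of the 35k images.
class_counts = torch.tensor([25550., 3000., 2800., 2100., 1550.])
total = class_counts.sum()
num_classes = len(class_counts)

# weight_c = total / (num_classes * count_c), so rarer classes get larger weights
weights = total / (num_classes * class_counts)

criterion = nn.CrossEntropyLoss(weight=weights)
```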

Then, instead of tuning on the entire dataset, I created a 20% subset of the full dataset and trained for 30 epochs, unfreezing 2 layers at epoch 6 and scheduling the learning rate down to 1e-5 at epoch 8. The aim, however, is to increase the subset size while keeping the hyperparameters fixed, until I reach diminishing returns.
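
Roughly, the training schedule looks like this (a sketch; the ResNet-style layer names and the optimizer settings are stand-ins, not my exact setup):

```python
import torch
import torch.nn as nn
from torchvision import models

model = models.resnet50(weights="IMAGENET1K_V2")  # stand-in backbone
model.fc = nn.Linear(model.fc.in_features, 5)

# Freeze the backbone; only the new head trains at first.
for p in model.parameters():
    p.requires_grad = False
for p in model.fc.parameters():
    p.requires_grad = True

optimizer = torch.optim.Adam(
    [p for p in model.parameters() if p.requires_grad], lr=1e-3
)

for epoch in range(30):
    if epoch == 6:
        # Unfreeze the last two blocks and hand them to the optimizer.
        unfrozen = list(model.layer3.parameters()) + list(model.layer4.parameters())
        for p in unfrozen:
            p.requires_grad = True
        optimizer.add_param_group({"params": unfrozen})
    if epoch == 8:
        # Decay the learning rate to 1e-5 for every parameter group.
        for g in optimizer.param_groups:
            g["lr"] = 1e-5
    # ... train + validate for this epoch, with early stopping on val loss ...
```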

Now the issues are:

  1. If 30 epochs on 20% of the data is already overfitting heavily and triggering early stopping by epoch 20, could I cap training at 20 epochs and tune the hyperparameters, only moving on once I see less overfitting and a better trend in loss and accuracy? With that approach, after seeing promising results at epoch 20, should the next step be to increase the subset size or the number of epochs?

  2. If the 20% subset is not giving promising results, then scaling up in size is trivial, so what would be the best way to handle these sorts of problems?

I'm using a 20% subset just for faster iteration, but will eventually train the final model on the entire dataset.
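
To build the subset I take a stratified sample so the class proportions are preserved (a sketch; `full_dataset` and its per-image label list `targets` are assumed):

```python
from sklearn.model_selection import train_test_split
from torch.utils.data import Subset

# Stratified 20% sample so the 73% majority class keeps the same share.
indices = list(range(len(full_dataset)))  # full_dataset: the 35k-image dataset
subset_idx, _ = train_test_split(
    indices,
    train_size=0.20,
    stratify=targets,  # per-image class labels, assumed available
    random_state=42,
)
small_dataset = Subset(full_dataset, subset_idx)
```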

  • If you provide a code snippet of the model, or at least say what kind of model it is, it is easier for people to help you with your question.
    – pyrochlor
    Commented 2 days ago
