Overfitting occurs when a model learns too much from the training set, including its noise and outliers. The resulting model becomes overly complex, too closely conditioned on that particular training sample, and performs poorly on unseen data. In terms of the bias-variance tradeoff, overfitting corresponds to low bias but high variance.
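A minimal sketch of this behavior using NumPy. The data, polynomial degrees, and noise level below are illustrative assumptions, not from the original text: a degree-1 fit matches the true linear trend, while a degree-9 fit passes through every noisy training point, so its training error collapses but its error on held-out points grows.

```python
import warnings
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical toy data: a linear trend y = 2x + 1 plus Gaussian noise.
x_train = np.linspace(0.0, 1.0, 10)
y_train = 2.0 * x_train + 1.0 + rng.normal(0.0, 0.3, size=x_train.shape)

# Held-out points between the training points, drawn from the same trend.
x_test = (x_train[:-1] + x_train[1:]) / 2.0
y_test = 2.0 * x_test + 1.0 + rng.normal(0.0, 0.3, size=x_test.shape)

def mse(coeffs, x, y):
    """Mean squared error of a fitted polynomial on (x, y)."""
    return float(np.mean((np.polyval(coeffs, x) - y) ** 2))

with warnings.catch_warnings():
    warnings.simplefilter("ignore")             # degree-9 fit may be ill-conditioned
    simple = np.polyfit(x_train, y_train, 1)    # matches the true model's complexity
    complex_ = np.polyfit(x_train, y_train, 9)  # enough parameters to memorize noise

# The complex model drives training error toward zero by fitting the noise,
# but oscillates between training points, so it generalizes worse.
print(f"train MSE: simple={mse(simple, x_train, y_train):.3f} "
      f"complex={mse(complex_, x_train, y_train):.3f}")
print(f"test  MSE: simple={mse(simple, x_test, y_test):.3f} "
      f"complex={mse(complex_, x_test, y_test):.3f}")
```

The gap between training and test error is the practical symptom: the complex model "wins" on the data it has seen and loses everywhere else, which is exactly the high-variance regime described above.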

