Bagging
Bagging is short for bootstrap aggregating, and it is one of the most popular ensemble learning approaches. The main idea of bagging is to sample subsets from the training set … Read More
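As a minimal sketch of the bootstrap-aggregating idea, the toy example below draws bootstrap samples (sampling with replacement), fits a deliberately weak base learner to each (here just a majority-label predictor, a stand-in for a real model), and aggregates their predictions by voting. The data and the base learner are made up for illustration.

```python
import random

def bootstrap_sample(data, rng):
    # Draw len(data) points with replacement: one bootstrap sample
    return [rng.choice(data) for _ in data]

def fit_stump(sample):
    # Hypothetical weak base learner: predict the sample's majority label
    labels = [y for _, y in sample]
    return max(set(labels), key=labels.count)

def bagging_predict(models):
    # Aggregate the ensemble's individual predictions by majority vote
    return max(set(models), key=models.count)

rng = random.Random(0)
data = [(x, int(x > 5)) for x in range(10)]  # toy labeled points
models = [fit_stump(bootstrap_sample(data, rng)) for _ in range(25)]
print(bagging_predict(models))  # → 0, the majority class of the toy data
```

In practice the base learner would be something like a decision tree, and each tree would see a different bootstrap sample, which is what reduces the ensemble's variance.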
The difference between simple linear regression and multiple linear regression is that simple linear regression has only one independent variable, whereas multiple linear regression has more than one. … Read More
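A quick sketch of that distinction, on made-up data: the simple case fits a slope and intercept from one variable in closed form, while the multiple case solves a least-squares problem over a design matrix with several variables. The data and true coefficients here are chosen so the fits can be checked by eye.

```python
import numpy as np

# Simple linear regression: one independent variable x
x = np.array([1.0, 2.0, 3.0, 4.0])
y = 2.0 * x + 1.0  # points on an exact line, so the fit recovers it
slope = np.cov(x, y, bias=True)[0, 1] / np.var(x)
intercept = y.mean() - slope * x.mean()

# Multiple linear regression: two independent variables, solved by
# least squares over the design matrix [1, x1, x2]
X = np.array([[1, 1, 2], [1, 2, 1], [1, 3, 3], [1, 4, 2]], dtype=float)
t = X @ np.array([1.0, 2.0, 3.0])  # targets from known true coefficients
coef, *_ = np.linalg.lstsq(X, t, rcond=None)

print(slope, intercept)   # ≈ 2.0, 1.0
print(np.round(coef, 3))  # ≈ [1. 2. 3.]
```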
One of the common problems in deep learning is the vanishing or exploding gradients problem, which makes it hard for the lower layers to train. In this post, we will look … Read More
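The vanishing side of the problem can be seen with a little arithmetic: the sigmoid derivative is at most 0.25, so a backpropagated gradient passing through a chain of sigmoid layers shrinks by at least a factor of four per layer. A toy ten-layer chain (using the most favorable pre-activation of zero) illustrates it:

```python
import math

def sigmoid(z):
    return 1.0 / (1.0 + math.exp(-z))

# Multiply a unit gradient by sigma'(z) = sigma(z) * (1 - sigma(z))
# once per layer; z = 0 gives the largest possible derivative, 0.25.
grad = 1.0
for layer in range(10):
    s = sigmoid(0.0)
    grad *= s * (1.0 - s)

print(grad)  # 0.25 ** 10 ≈ 9.5e-07: almost nothing reaches the lowest layer
```

With any other pre-activations the shrinkage is even worse, which is one reason ReLU-family activations and careful initialization are popular.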
In a previous post, we looked at Logistic Regression for binary classification. In this post, we will see how logistic regression can be generalized to multiple classes. This … Read More
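The usual multiclass generalization replaces the sigmoid with the softmax function, which turns a vector of class scores into a probability distribution. A minimal sketch, with made-up logits:

```python
import math

def softmax(logits):
    # Subtract the max for numerical stability, then normalize exponentials
    m = max(logits)
    exps = [math.exp(z - m) for z in logits]
    total = sum(exps)
    return [e / total for e in exps]

probs = softmax([2.0, 1.0, 0.1])  # hypothetical scores for three classes
print(probs)  # per-class probabilities; they sum to 1
```

The predicted class is simply the index of the largest probability, and larger logits always map to larger probabilities.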
You might guess from its name that this algorithm is only for regression problems. However, despite the name, Logistic Regression is used for classification problems. Logistic Regression estimates the probability … Read More
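That probability estimate can be sketched in a few lines: the model passes a weighted sum of the inputs through the sigmoid to get P(y=1 | x), and thresholds it at 0.5 for the class decision. The weights below are hypothetical, chosen only to illustrate the computation.

```python
import math

def predict_proba(x, w, b):
    # Logistic regression: P(y=1 | x) = sigmoid(w . x + b)
    z = sum(wi * xi for wi, xi in zip(w, x)) + b
    return 1.0 / (1.0 + math.exp(-z))

# hypothetical weights and input, for illustration only
p = predict_proba([1.0, 2.0], w=[0.5, -0.25], b=0.0)
label = int(p >= 0.5)  # threshold the probability for the class decision
print(p, label)  # 0.5 1 — a score of exactly zero sits on the boundary
```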
Regularization is a good way to reduce overfitting. A linear model can be regularized by penalizing the weights of the model. Simply put, regularization prevents the … Read More
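A minimal sketch of weight penalization, on a toy one-dimensional least-squares problem: adding an L2 penalty term lambda * w² to the loss adds 2 * lambda * w to the gradient, which pulls the fitted weight toward zero. The data and hyperparameters are made up for illustration.

```python
def fit(xs, ys, lam, steps=2000, lr=0.01):
    # Gradient descent on mean squared error plus an L2 penalty lam * w**2
    w = 0.0
    n = len(xs)
    for _ in range(steps):
        grad = sum(2 * (w * x - y) * x for x, y in zip(xs, ys)) / n
        grad += 2 * lam * w  # derivative of the penalty term
        w -= lr * grad
    return w

xs, ys = [1.0, 2.0, 3.0], [2.0, 4.0, 6.0]  # the true slope is 2
print(fit(xs, ys, lam=0.0))  # ≈ 2.0: the unregularized least-squares fit
print(fit(xs, ys, lam=1.0))  # ≈ 1.65: shrunk toward zero by the penalty
```

The regularized solution deliberately underfits the training data a little in exchange for smaller weights, which is the overfitting trade-off the post describes.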
In machine learning, the L1 and L2 techniques are widely used as cost functions and regularization terms. It is worth knowing the key differences between L1 and L2 to build a better model. … Read More
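One difference can be shown numerically: L1 sums absolute values while L2 sums squares, so L2 punishes large weights disproportionately, while L1 charges every weight at the same rate regardless of size, which is what pushes small weights toward exactly zero (sparsity). The weight vector below is made up for illustration.

```python
def l1(ws):
    # L1 norm: sum of absolute values
    return sum(abs(w) for w in ws)

def l2(ws):
    # Squared L2 norm: sum of squares
    return sum(w * w for w in ws)

weights = [0.5, -0.1, 2.0]  # hypothetical model weights
print(l1(weights), l2(weights))  # ≈ 2.6 vs 4.26
# The lone large weight contributes 2.0 to L1 but 4.0 to L2, while the
# small weight 0.1 contributes almost nothing to L2 (0.01) but a fixed
# 0.1 to L1 — hence L1's tendency to zero out small weights entirely.
```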
Have you seen this problem in your dataset? You might run into the imbalanced classes problem when dealing with classification tasks. It is very common. Imagine that you create a … Read More
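A small made-up example shows why imbalance is dangerous: on a 95/5 split, a useless classifier that always predicts the majority class still scores 95% accuracy while never finding a single positive case, which is why metrics like recall matter here.

```python
# Toy imbalanced dataset: 95% negatives, 5% positives
labels = [0] * 95 + [1] * 5
preds = [0] * 100  # "always predict the majority class" baseline

accuracy = sum(p == y for p, y in zip(preds, labels)) / len(labels)
recall = sum(p == y == 1 for p, y in zip(preds, labels)) / labels.count(1)
print(accuracy, recall)  # 0.95 accuracy, but 0.0 recall on the rare class
```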