Gradient Descent
Gradient Descent is an optimization algorithm. The general idea is to tweak the parameters $\theta$ iteratively in order to minimize a cost function by measuring the gradient … Read More
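The excerpt is cut off, but the update it describes can be written as a single step (my notation; $\eta$ is the learning rate and $J(\theta)$ the cost function):

$\theta \leftarrow \theta - \eta \, \nabla_{\theta} J(\theta)$

Each step moves $\theta$ a little in the direction opposite to the gradient, which is the direction of steepest descent of the cost.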
Machine learning algorithms can be classified as batch or online methods according to whether they can learn incrementally as new data arrive. Batch Learning: Batch learning methods are … Read More
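As a rough sketch of the distinction (the estimator, data, and chunking are my own placeholders, not from the post), scikit-learn exposes incremental learning through `partial_fit`, while `fit` retrains on the full batch:

```python
import numpy as np
from sklearn.linear_model import SGDRegressor

rng = np.random.default_rng(0)
X = rng.standard_normal((1000, 3))
y = X @ np.array([1.5, -2.0, 0.5]) + rng.standard_normal(1000) * 0.1

# Batch learning: the model sees the whole training set at once.
batch_model = SGDRegressor(max_iter=1000, tol=1e-3, random_state=0)
batch_model.fit(X, y)

# Online (incremental) learning: the model is updated chunk by chunk,
# as new data arrive, without retraining from scratch.
online_model = SGDRegressor(random_state=0)
for X_chunk, y_chunk in zip(np.array_split(X, 10), np.array_split(y, 10)):
    online_model.partial_fit(X_chunk, y_chunk)

print(batch_model.coef_, online_model.coef_)
```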
Semantic information can be extracted by public collaborative efforts, such as Wikipedia, or curated by closed groups, and this information can be managed in a knowledge graph (KG). The knowledge … Read More
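To make the "managed in a knowledge graph" part concrete, here is a toy sketch of my own (not from the post): a KG is commonly stored as (subject, relation, object) triples that can then be queried.

```python
# Toy knowledge graph as a list of (subject, relation, object) triples.
triples = [
    ("Barack Obama", "born_in", "Honolulu"),
    ("Honolulu", "located_in", "Hawaii"),
    ("Barack Obama", "profession", "Politician"),
]

# Simple query: everything the graph knows about "Barack Obama".
for s, r, o in triples:
    if s == "Barack Obama":
        print(f"{s} --{r}--> {o}")
```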
Recently, embedding models have been gaining considerable attention in areas such as natural language processing (NLP), recommender systems, and knowledge graph completion. This is due to their high accuracy … Read More
(Image from https://medium.com/@ilango100/batch-normalization-speed-up-neural-network-training-245e39a62f85) In 2015, a very effective approach called Batch Normalization was proposed to address the vanishing/exploding gradients problem. It learns two parameters to find the optimal scale … Read More
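The two learned parameters the excerpt refers to are usually called $\gamma$ (scale) and $\beta$ (shift). A minimal NumPy sketch of the training-time forward pass (the function name, epsilon, and data are my own illustrative choices):

```python
import numpy as np

def batch_norm_forward(X, gamma, beta, eps=1e-5):
    """Normalize each feature over the mini-batch, then rescale and shift
    with the learned parameters gamma (scale) and beta (offset)."""
    mu = X.mean(axis=0)                   # per-feature mean over the batch
    var = X.var(axis=0)                   # per-feature variance over the batch
    X_hat = (X - mu) / np.sqrt(var + eps)
    return gamma * X_hat + beta

# Toy usage: a mini-batch of 4 samples with 3 features.
X = np.random.randn(4, 3) * 10 + 5
gamma = np.ones(3)   # learned via backprop in practice
beta = np.zeros(3)   # learned via backprop in practice
print(batch_norm_forward(X, gamma, beta))
```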
Gradient boosting is a popular boosting technique, especially with decision trees, for regression and classification problems. Like other boosting methods, gradient boosting adds predictors sequentially to an ensemble that … Read More
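To make "adds predictors sequentially" concrete, here is a minimal sketch of gradient boosting for regression with squared error, where each new tree is fit to the residuals of the current ensemble (the tree depth, learning rate, and toy data are my own assumptions):

```python
import numpy as np
from sklearn.tree import DecisionTreeRegressor

rng = np.random.default_rng(0)
X = rng.uniform(-3, 3, size=(200, 1))
y = np.sin(X).ravel() + rng.normal(scale=0.1, size=200)

learning_rate = 0.1
trees = []
prediction = np.zeros_like(y)

for _ in range(100):
    residuals = y - prediction                      # negative gradient of squared error
    tree = DecisionTreeRegressor(max_depth=2)
    tree.fit(X, residuals)                          # new predictor fits what is still unexplained
    prediction += learning_rate * tree.predict(X)   # add it to the ensemble
    trees.append(tree)

print(np.mean((y - prediction) ** 2))               # training MSE after boosting
```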
When we train a linear regression model (i.e., $\hat{y} = \theta^T \cdot \textbf{x}$), we might consider two different ways to do it: using a “closed-form” solution that directly computes … Read More
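A sketch of the “closed-form” route the excerpt mentions is the Normal Equation, $\hat{\theta} = (\textbf{X}^T \textbf{X})^{-1} \textbf{X}^T \textbf{y}$; the data below is made up for illustration:

```python
import numpy as np

rng = np.random.default_rng(1)
X = rng.standard_normal((100, 1))
y = 4 + 3 * X[:, 0] + rng.standard_normal(100)

X_b = np.c_[np.ones((100, 1)), X]                      # add x0 = 1 for the bias term
theta_best = np.linalg.inv(X_b.T @ X_b) @ X_b.T @ y    # Normal Equation
# The pseudoinverse is numerically more stable than an explicit inverse:
theta_pinv = np.linalg.pinv(X_b) @ y

print(theta_best, theta_pinv)                          # both close to [4, 3]
```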
In this post, we will look at a common way to deal with categorical values (e.g., small, medium, large). Most machine learning algorithms work with numerical values, so we need … Read More
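The teaser is cut off, but one common way to turn such categories into numbers is one-hot encoding; a minimal pandas sketch (the column name and values are my own placeholders, and the post may well settle on a different encoding):

```python
import pandas as pd

df = pd.DataFrame({"size": ["small", "medium", "large", "medium"]})

# One-hot encoding: one binary column per category.
encoded = pd.get_dummies(df, columns=["size"])
print(encoded)
```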
One common data cleaning process is dealing with missing values. It is very common to find missing values in your datasets. To train your model better, you need to … Read More
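One standard way to handle them (not necessarily the one this post describes) is imputation; a minimal sketch with scikit-learn's SimpleImputer, using a median strategy and made-up data:

```python
import numpy as np
from sklearn.impute import SimpleImputer

X = np.array([[1.0, 2.0],
              [np.nan, 3.0],
              [7.0, np.nan]])

# Replace each missing value with the median of its column.
imputer = SimpleImputer(strategy="median")
X_filled = imputer.fit_transform(X)
print(X_filled)
```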
We can determine whether a model performs poorly by looking at its prediction errors on the training set and the evaluation set. The model might be too simple … Read More
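A minimal sketch of that diagnosis (the model, data, and split below are placeholders): compare the training error with the error on a held-out evaluation set; high error on both suggests the model is too simple, while a large gap suggests overfitting.

```python
import numpy as np
from sklearn.model_selection import train_test_split
from sklearn.linear_model import LinearRegression
from sklearn.metrics import mean_squared_error

rng = np.random.default_rng(0)
X = rng.uniform(-3, 3, size=(300, 1))
y = np.sin(X).ravel() + rng.normal(scale=0.1, size=300)

X_train, X_val, y_train, y_val = train_test_split(X, y, test_size=0.2, random_state=0)

model = LinearRegression().fit(X_train, y_train)  # deliberately too simple for sine-shaped data
train_err = mean_squared_error(y_train, model.predict(X_train))
val_err = mean_squared_error(y_val, model.predict(X_val))

# High error on both sets -> likely underfitting; low training error but
# much higher validation error -> likely overfitting.
print(f"train MSE: {train_err:.3f}, validation MSE: {val_err:.3f}")
```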