# How Bayes’ Theorem Connects to Machine Learning

Just why is Bayes so naive?

In this blog post, we’ll look at how to apply machine learning techniques that draw on Bayes’ theorem and the foundational principles of Bayesian statistics. Classification problems are a natural application of Bayes’ theorem when you’re trying to predict a…
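As a minimal sketch of how Bayes’ theorem drives a classification decision, here is the formula applied by hand to a made-up spam-filtering scenario (all probabilities below are illustrative assumptions, not data from the post):

```python
# Bayes' theorem: P(H | E) = P(E | H) * P(H) / P(E)
def posterior(prior, likelihood, evidence):
    return likelihood * prior / evidence

# Hypothetical numbers: 1% of email is spam, 90% of spam contains the
# word "offer", and 5% of legitimate email does.
p_spam = 0.01
p_word_given_spam = 0.90
p_word_given_ham = 0.05

# Law of total probability: chance of seeing the word at all.
p_word = p_word_given_spam * p_spam + p_word_given_ham * (1 - p_spam)

p_spam_given_word = posterior(p_spam, p_word_given_spam, p_word)
print(round(p_spam_given_word, 3))  # → 0.154
```

Even a very telling word only raises the posterior to about 15% here, because the prior probability of spam was so low — the core Bayesian intuition behind naive Bayes classifiers.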

# Going Over Using K-Nearest Neighbors

A quick primer on a really cool supervised learning algorithm

K-Nearest Neighbors, or KNN, is a supervised learning algorithm that can be applied to both classification and regression problems. KNN is a distance-based classifier, which means it assumes that the closer two points are, the more similar they are. Euclidean…
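The distance-based idea above can be sketched in a few lines: measure Euclidean distance to every training point, take the k nearest, and vote. The training points below are made up for illustration:

```python
import math
from collections import Counter

def euclidean(a, b):
    # Straight-line distance between two feature vectors.
    return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))

def knn_predict(train, query, k=3):
    # Sort training points by distance to the query, keep the k nearest,
    # and return the majority label among them.
    nearest = sorted(train, key=lambda point: euclidean(point[0], query))[:k]
    labels = [label for _, label in nearest]
    return Counter(labels).most_common(1)[0][0]

train = [((1, 1), "A"), ((1, 2), "A"), ((2, 1), "A"),
         ((6, 6), "B"), ((7, 7), "B")]
print(knn_predict(train, (1.5, 1.5)))  # → A
print(knn_predict(train, (6.5, 6.5)))  # → B
```

Note that for regression the same neighbors would be averaged rather than voted on.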

# Accuracy and F1 Score — The Better Choices for Evaluating Model Success

Discussing two of the most useful metrics for describing a model’s performance

In my most recent blog post, I went over two of the simpler and more common metrics used to evaluate model performance in machine learning: precision and recall. …
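Accuracy and F1 can both be computed from a confusion matrix. Here is a small sketch using made-up confusion counts (the F1 score is the harmonic mean of precision and recall):

```python
def accuracy(tp, tn, fp, fn):
    # Fraction of all predictions that were correct.
    return (tp + tn) / (tp + tn + fp + fn)

def f1_score(precision, recall):
    # Harmonic mean of precision and recall.
    if precision + recall == 0:
        return 0.0
    return 2 * precision * recall / (precision + recall)

# Hypothetical confusion-matrix counts for illustration.
tp, tn, fp, fn = 40, 45, 5, 10
prec = tp / (tp + fp)  # ~0.889
rec = tp / (tp + fn)   # 0.8
print(round(accuracy(tp, tn, fp, fn), 2), round(f1_score(prec, rec), 2))  # → 0.85 0.84
```

Because the harmonic mean punishes imbalance, F1 stays low if either precision or recall is poor, which is what makes it informative on imbalanced classes where raw accuracy can mislead.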

# Precision and Recall — What Are the Differences?

Precision and recall are two of the most fundamental evaluation metrics we have at our disposal.

When working on classification tasks, it’s imperative to compare your models to one another and pick the best-fitting one. When you are estimating values in regression, it makes sense to speak about…
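The two metrics fall out of counting prediction outcomes: precision asks “of everything I flagged positive, how much really was?”, recall asks “of everything that really was positive, how much did I flag?”. A minimal sketch, with made-up labels:

```python
def precision_recall(y_true, y_pred, positive=1):
    pairs = list(zip(y_true, y_pred))
    tp = sum(1 for t, p in pairs if t == positive and p == positive)
    fp = sum(1 for t, p in pairs if t != positive and p == positive)
    fn = sum(1 for t, p in pairs if t == positive and p != positive)
    precision = tp / (tp + fp) if tp + fp else 0.0  # TP / (TP + FP)
    recall = tp / (tp + fn) if tp + fn else 0.0     # TP / (TP + FN)
    return precision, recall

# Hypothetical ground truth and predictions.
y_true = [1, 1, 1, 0, 0, 0, 1, 0]
y_pred = [1, 0, 1, 0, 1, 0, 1, 0]
print(precision_recall(y_true, y_pred))  # → (0.75, 0.75)
```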

# Different Methods of Feature Selection in Machine Learning

How to automate the process of selecting features

In data science, there are many different approaches to building features that model complicated relationships, though this can sometimes be troublesome. …
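One of the simplest families of automated feature selection is a filter method. As a sketch (just one of the several methods a post like this would cover, with made-up data), here is a variance threshold that drops features carrying no information:

```python
def variance(values):
    mean = sum(values) / len(values)
    return sum((v - mean) ** 2 for v in values) / len(values)

def select_features(columns, threshold=0.01):
    # Keep only the columns whose variance exceeds the threshold;
    # a near-constant column cannot help distinguish observations.
    return {name: vals for name, vals in columns.items()
            if variance(vals) > threshold}

# Hypothetical feature columns for illustration.
data = {
    "sqft":      [1200, 1500, 900, 2000],
    "bathrooms": [1, 2, 1, 3],
    "constant":  [7, 7, 7, 7],  # zero variance: carries no information
}
print(sorted(select_features(data)))  # → ['bathrooms', 'sqft']
```

Wrapper methods (e.g. recursive feature elimination) and embedded methods (e.g. lasso regularization) take this idea further by using a model’s own performance to score feature subsets.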

# Derivatives are Confusing — So Here’s a 3 Minute Explanation

Hooray for calculus!

If you have studied linear regression, you have probably come across the central principle of mathematical functions. You can illustrate it with the following example: assume that you have used the number of bathrooms in a house as a predictor and the house rental price as…
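In that spirit, the derivative is simply the slope of the function relating predictor to output. Here is a sketch that estimates it numerically for a hypothetical linear rent function (the coefficients are invented for illustration):

```python
def numerical_derivative(f, x, h=1e-6):
    # Central-difference approximation of f'(x).
    return (f(x + h) - f(x - h)) / (2 * h)

# Assume (purely for illustration) rent rises linearly with bathrooms:
# a base rent of 500 plus 300 per bathroom.
def rent(bathrooms):
    return 500 + 300 * bathrooms

# For a linear function, the slope is the same everywhere.
print(round(numerical_derivative(rent, 2)))  # → 300
```

For a linear model the derivative is just the regression coefficient; the same central-difference trick also works on curved functions, where the slope varies from point to point.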

# Regression, Big O Notation, and Gradient Descent — How Are They All Related?

A quick discussion on computational complexity

In this blog post, I will introduce you to some of the computational complexity involved in OLS regression. You will read about this concept and see that OLS might not be the most efficient algorithm for estimating regression parameters while regression…
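Gradient descent is the standard iterative alternative to the closed-form OLS solution, trading an exact matrix computation for cheap repeated updates. A minimal sketch for one-variable linear regression, fit to made-up data that lies exactly on y = 2x + 1:

```python
def gradient_descent(xs, ys, lr=0.01, steps=5000):
    m, b = 0.0, 0.0
    n = len(xs)
    for _ in range(steps):
        # Gradients of the mean squared error with respect to m and b.
        grad_m = (-2 / n) * sum(x * (y - (m * x + b)) for x, y in zip(xs, ys))
        grad_b = (-2 / n) * sum(y - (m * x + b) for x, y in zip(xs, ys))
        # Step downhill, scaled by the learning rate.
        m -= lr * grad_m
        b -= lr * grad_b
    return m, b

xs = [1, 2, 3, 4]
ys = [3, 5, 7, 9]  # exactly y = 2x + 1
m, b = gradient_descent(xs, ys)
print(round(m, 2), round(b, 2))  # → 2.0 1.0
```

Each step costs O(n) in the number of observations, which is why this approach scales to datasets where inverting the design matrix for the closed-form solution becomes expensive.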

# 4 Ways to Normalize and Scale Features for Regression Modeling

Normal features, meaning features that are as normally distributed as possible, lead to better modeling outcomes. This is what makes scaling and normalization of features so significant in regression modeling. There are a number of ways to scale your features, and…
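Two of the most common scaling approaches can be sketched by hand: min-max scaling squeezes values into [0, 1], while standardization recenters them as z-scores. The square-footage numbers below are made up for illustration:

```python
def min_max_scale(values):
    # Map the smallest value to 0 and the largest to 1.
    lo, hi = min(values), max(values)
    return [(v - lo) / (hi - lo) for v in values]

def standardize(values):
    # Subtract the mean and divide by the (population) standard deviation.
    mean = sum(values) / len(values)
    std = (sum((v - mean) ** 2 for v in values) / len(values)) ** 0.5
    return [(v - mean) / std for v in values]

sqft = [800, 1200, 1600, 2000]
print([round(v, 2) for v in min_max_scale(sqft)])  # → [0.0, 0.33, 0.67, 1.0]
print([round(z, 2) for z in standardize(sqft)])    # → [-1.34, -0.45, 0.45, 1.34]
```

Min-max scaling is sensitive to outliers, since a single extreme value stretches the range; standardization is usually the safer default for regression features.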

# Dealing with Multicollinearity in Multiple Linear Regression

The biggest issue hindering quality results in regression modeling

If you are familiar with data science, especially with regression modeling, then you are probably familiar with the concepts of covariance and correlation. …
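A first pass at spotting multicollinearity is to compute the Pearson correlation between pairs of candidate predictors; highly correlated pairs (say |r| > 0.8) are candidates for removal or combination. A sketch with made-up predictors:

```python
def pearson(xs, ys):
    # Pearson correlation: covariance divided by the product of
    # the two standard deviations.
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys)) / n
    sx = (sum((x - mx) ** 2 for x in xs) / n) ** 0.5
    sy = (sum((y - my) ** 2 for y in ys) / n) ** 0.5
    return cov / (sx * sy)

# Hypothetical predictors: bedrooms tracks square footage perfectly here,
# so including both would tell the model the same thing twice.
sqft = [800, 1200, 1600, 2000]
bedrooms = [1, 2, 3, 4]
print(round(pearson(sqft, bedrooms), 2))  # → 1.0
```

Pairwise correlation only catches two-variable redundancy; variance inflation factors extend the same idea to combinations of several predictors.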

# Clarifying Regression Diagnostics for Linear Regression

How to test the required assumptions