ml-505: Simplified Cost Function and Gradient Descent for Logistic Regression

February 18, 2014

Hello, welcome back! In the last post we defined the cost function for our logistic regression implementation. Today we will simplify it, and then move on to the next stage, viz. minimization of the cost function.

As a short recap, in the last post we defined the cost function as follows.
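For quick reference, here is the standard per-example form of that cost, restated under the usual conventions (the exact notation in the full post may differ slightly):

\mathrm{Cost}(h_\theta(x), y) =
\begin{cases}
-\log(h_\theta(x)) & \text{if } y = 1 \\
-\log(1 - h_\theta(x)) & \text{if } y = 0
\end{cases}

The standard simplified form combines the two cases into a single expression over the m training examples,

J(\theta) = -\frac{1}{m} \sum_{i=1}^{m} \left[ y^{(i)} \log h_\theta(x^{(i)}) + \left(1 - y^{(i)}\right) \log\left(1 - h_\theta(x^{(i)})\right) \right]

and gradient descent then minimizes J(\theta) using the familiar update \theta_j := \theta_j - \alpha \frac{1}{m} \sum_{i=1}^{m} \left( h_\theta(x^{(i)}) - y^{(i)} \right) x_j^{(i)}, with the sigmoid-based hypothesis substituted for h_\theta.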

Continue reading →

ml-504: Cost Function and Logistic Regression

February 17, 2014

Hello. In the previous few posts (here, here, and here) we learned about logistic regression and formulated its hypothesis function using the sigmoid function, so that the hypothesis always outputs a value between 0 and 1, which we can then threshold to predict the output variable "y" as either 0 or 1. Today we will move on to the next stage and learn about the cost function used in the logistic regression ML problem.

In those posts we defined our hypothesis function as follows.
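Restated here for reference in the standard notation (the symbols in the full post may differ slightly):

h_\theta(x) = g(\theta^T x) = \frac{1}{1 + e^{-\theta^T x}}

where g(z) = \frac{1}{1 + e^{-z}} is the sigmoid (logistic) function, so that 0 < h_\theta(x) < 1 and h_\theta(x) can be read as the estimated probability that y = 1 for the input x.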

Continue reading →

ml-503: Non Linear Decision Boundary

February 10, 2014

Hi there! Welcome back to this series of posts on ML algorithms. In the last post we looked at the hypothesis representation and decision boundary concepts in the logistic regression ML algorithm. Today we will look at some more examples of decision boundaries and develop a deeper understanding of them.

But first, let's do a quick review of what we learned last time so that we can continue from there. We defined our hypothesis function as follows:
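For reference, in the usual notation (the exact symbols in the full post may differ slightly):

h_\theta(x) = g(\theta^T x), \qquad g(z) = \frac{1}{1 + e^{-z}}

As one illustrative example of a non-linear decision boundary (the parameter values below are chosen purely for illustration and are not necessarily the ones used in the post), suppose we add polynomial features and set

h_\theta(x) = g\left(\theta_0 + \theta_1 x_1 + \theta_2 x_2 + \theta_3 x_1^2 + \theta_4 x_2^2\right), \qquad \theta = (-1, 0, 0, 1, 1)^T

Since g(z) \ge 0.5 exactly when z \ge 0, this hypothesis predicts y = 1 whenever x_1^2 + x_2^2 \ge 1, i.e. outside the unit circle, so the decision boundary is a circle rather than a straight line.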

Continue reading →

ml-502: Hypothesis Representation and Decision Boundary

February 9, 2014

Hello! In the last post we took an initial swing at a new type of ML problem, viz. classification. We also looked at how the linear regression ML algorithm fails to work for this type of problem, and saw that there is another ML algorithm, logistic regression, that is used to solve such problems. Today we will do a deeper dive into various concepts surrounding the logistic regression ML algorithm.
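Since this post is about the hypothesis representation and the decision boundary, here is a minimal sketch of both ideas in code. It is purely illustrative and assumes the usual conventions (a sigmoid hypothesis, a 0.5 threshold, and parameter values picked only for this example); it is not code taken from the post itself.

import numpy as np

def sigmoid(z):
    # Logistic (sigmoid) function: g(z) = 1 / (1 + e^(-z))
    return 1.0 / (1.0 + np.exp(-z))

def hypothesis(theta, x):
    # h_theta(x) = g(theta^T x), a value strictly between 0 and 1
    return sigmoid(np.dot(theta, x))

def predict(theta, x):
    # The decision boundary sits at h_theta(x) = 0.5, i.e. theta^T x = 0:
    # predict class 1 on one side of it and class 0 on the other.
    return 1 if hypothesis(theta, x) >= 0.5 else 0

# Illustrative values only: theta = [-3, 1, 1] places the boundary on the line x1 + x2 = 3.
theta = np.array([-3.0, 1.0, 1.0])
x = np.array([1.0, 2.0, 2.0])   # x[0] = 1 is the intercept term, then x1 = 2, x2 = 2
print(hypothesis(theta, x))     # about 0.73
print(predict(theta, x))        # 1, because 2 + 2 > 3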

Continue reading →

ml-501: Logistic Regression

February 8, 2014

Hello :) I have good news today! In case you have not noticed, we have moved from the 40x series into the 50x series of ML posts. Yay! This means that we will start learning a new section, or type, of ML algorithms. But wait, isn't that logistic "regression" there in the title? Haven't we already been learning regression algorithms in the 40x series here, here, here, here, here and here? The answer is that in this 50x series we will look at a different type of ML problem, viz. "classification", but for historical reasons the algorithm is still called logistic regression. Note the "logistic" part, which refers to the logistic (sigmoid) function on which the algorithm is built. Ready to start? Here we go!

Today we will be learning about the "classification" type of ML problem, where the output variable "y" takes values from a set of "discrete" values. If you remember, we had given a brief introduction to these problems in this 10x introductory post. But now, let's take a deeper look using some examples:
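To give one typical instance of such a problem (the examples discussed in the full post may differ), consider deciding whether an email is spam: here y \in \{0, 1\}, with 0 denoting the negative class ("not spam") and 1 the positive class ("spam"), and the input features x describing the email (word counts, sender information, and so on). The job of the classifier is then to predict one of these discrete values for a new email.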

Continue reading →