
OpenR: R & Statistics /
Decision boundary in logistic regression


Yilin Li
Posts: 2

23 March 2022, 8:36 PM

I’m Yilin Li, the presenter of this week’s oral presentation on logistic regression. In my part I may have focused too much on the Python code, so some of the basic ideas behind the “decision boundary” were not explained clearly, and covering both in detail would have pushed the presentation over the time limit. Thanks to Mr Peng, who pointed this out to me. So here I’d like to explain what the decision boundary is, and I also have a question I’d like your help with.

Decision boundary:

  1. You can think of it as the regression part of logistic regression (similar to the log(odds) in the video we watched).
  2. If we look more closely at the sigmoid function, you can see that it maps any negative number to a value below 0.5 and any positive number to a value above 0.5. As we know, we classify the two types according to whether the output is greater than 0.5 or not. So, in my understanding, the regression part of logistic regression is about finding a suitable decision boundary so that each input gets mapped to a positive or negative number depending on its class. If you still remember the graph in my ppt, one class is above the line and the other is below it.
  3. So how exactly does this regression find that boundary? The likelihood function helps us find the most suitable parameter (theta in my presentation). To be precise, we usually minimise the negative log-likelihood (the cost function): it punishes the model when it is not good, meaning it gives a higher score when the model predicts the wrong class, so we look for the theta with the lowest cost, which is the same as the theta with the highest likelihood. (There is a small code sketch after this list.)
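
To make points 2 and 3 more concrete, here is a minimal sketch in Python/NumPy (my own toy example with made-up data, not the exact code from my slides): it fits theta for a logistic regression with 2 independent variables by gradient descent on the negative log-likelihood (the cost). The decision boundary is then the set of points where theta^T x = 0, i.e. where the sigmoid outputs exactly 0.5.

import numpy as np

def sigmoid(z):
    # negative z -> value below 0.5, positive z -> value above 0.5
    return 1.0 / (1.0 + np.exp(-z))

# toy data (made up): two clouds of points, one per class
rng = np.random.default_rng(0)
X0 = rng.normal(loc=[-2.0, -2.0], scale=1.0, size=(50, 2))   # class 0
X1 = rng.normal(loc=[2.0, 2.0], scale=1.0, size=(50, 2))     # class 1
X = np.vstack([X0, X1])
y = np.array([0] * 50 + [1] * 50)

# add a column of ones for the intercept, so theta has 3 entries
Xb = np.hstack([np.ones((len(X), 1)), X])
theta = np.zeros(Xb.shape[1])

# gradient descent on the negative log-likelihood (the cost)
lr = 0.1
for _ in range(500):
    p = sigmoid(Xb @ theta)             # predicted probabilities
    grad = Xb.T @ (p - y) / len(y)      # gradient of the cost w.r.t. theta
    theta -= lr * grad

p = np.clip(sigmoid(Xb @ theta), 1e-9, 1 - 1e-9)
cost = -np.mean(y * np.log(p) + (1 - y) * np.log(1 - p))
print("theta:", theta)
print("cost (negative log-likelihood):", cost)

# the decision boundary is where theta[0] + theta[1]*x1 + theta[2]*x2 = 0,
# i.e. exactly where the sigmoid output is 0.5
preds = (p > 0.5).astype(int)
print("training accuracy:", (preds == y).mean())

With two independent variables, theta describes a straight line in the plane, which is the kind of line you saw in the graph in my ppt.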

A question I’d like your help with:

In my part, I depicted the decision boundary with 1 and 2 independent variables, and I think 3 independent variables would also work. But what about more than 3 independent variables? I have tried to find something about this, but I couldn’t find any useful information. If you know anything related, could you comment on this post?

 

If I have said anything wrong, please let me know. And if you still can’t follow it, or want to know more about the likelihood function (cost function), you can watch the video I posted here. It explains how the likelihood function works and covers some decision boundary knowledge. I hope this helps.

https://www.bilibili.com/video/BV1As411j7zw?spm_id_from=333.880.my_history.page.click

Rong Zhu
Posts: 5

24 March 2022, 2:27 PM

Giving you a thumbs up!
