Transcript:

All right, so we continue from where we stopped in the first tutorial. We are trying to perform linear regression on a dataset that has the columns TV, radio, newspaper, and sales. The idea is to see how TV, newspaper, and radio advertising affect sales. Basically, we have a company that places adverts in different channels, like TV, newspaper, and radio, and we want to see how that affects sales. Let me restate the problem briefly. All right, so we succeeded in doing the pair plots, and that is what is shown here, so we completed step 5, in case you don't remember. In step 6 we now want to extract the features, and what that means is that we want to separate them out. The features are these three columns: TV, radio, and newspaper. They are the independent variables, while sales is the dependent variable. In machine-learning language, instead of saying independent variables we say features, and sales is called the output variable, the target variable, the result, or the response. All right, so what are we going to do next? Let's see: we have done the pair plot, so now we extract the features. Let's see how to do this. [MUSIC] We start by specifying the feature columns, so feature_cols will be TV, radio, and newspaper. Each time you are placing items in a collection, always put them in square brackets, so we have "TV", underneath it "radio", and the next one, as well, is "newspaper". That makes up the feature columns. I'm going to click Run. Okay, everything went fine. Nice. Now I'm going to say that X will be the data from those columns. I hope you understand: we are defining that these columns of data are the feature columns we want to use to form the new data, X. So in the second line, we are saying that X is a subset of our data.
Remember, data contains everything, but now we are taking just these three columns from it. I could actually have done it in one single line, but I'm going to do it in two lines. So we have an error on X = data[...]: it complains about the indexing (a boolean or integer indexer), and you can see it mentions newspaper. I'm going to fix it and run again. All right. If I also say X = data[["TV", "radio", "newspaper"]], putting everything on one line with one more pair of brackets around it, then this single line is exactly the same as those two lines, and that is fine. All right, so now let's see what is contained in X. I told you about a function, X.head(), which displays the first five rows of the dataset. So let's see what is in X: X.head(). I could actually just say X and run. Oh, remember, we need to put data here; that is where the problem is coming from. So I'm going to run this, and run this again. X contains only the features, or the independent variables: TV, radio, newspaper. And that is just fine. All right, so let's see where we are in the outline: extract the features, we've done it, and also examine X, yes, we've examined X, so we've also done this. The next thing I'm going to do is extract the output variable y and also examine it. I trust that you can do this yourself. What I could simply do is copy this and reuse it, but let's write it down, because the more we write, the more it sticks. So what I'm going to do this time is simply say y = data["sales"], extracting just the sales column, and that is fine. If I run it, it's fine, and if I say y and click Run, it displays only the y values. Right, okay, so fine: we have X, and we have y. Let's see where we are. We are almost done: we extracted sales into y, and we have X as well, so we are done with these.
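The feature and target extraction described above can be sketched as follows. Note that the small DataFrame here is a made-up stand-in for the Advertising data used in the video (in the tutorial, `data` is loaded from a CSV in an earlier step), so the numbers are illustrative only:

```python
import pandas as pd

# Hypothetical stand-in for the dataset loaded earlier in the tutorial;
# the real data has 200 rows with these same four columns.
data = pd.DataFrame({
    "TV":        [230.1, 44.5, 17.2, 151.5, 180.8],
    "radio":     [37.8, 39.3, 45.9, 41.3, 10.8],
    "newspaper": [69.2, 45.1, 69.3, 58.5, 58.4],
    "sales":     [22.1, 10.4,  9.3, 18.5, 12.9],
})

# Step 6: extract the features (independent variables) into X.
feature_cols = ["TV", "radio", "newspaper"]
X = data[feature_cols]              # same as data[["TV", "radio", "newspaper"]]

# Extract the output/target variable (dependent variable) into y.
y = data["sales"]

print(X.head())                     # first five rows of the features
print(y.head())                     # first five sales values
```

Note the double brackets in the one-line form: the outer pair indexes the DataFrame, the inner pair is the list of column names, which is exactly the error the video runs into before adding them.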
We examined y to see how it looks, so we are now going to import the library we will use to split, because we actually want to split the data into a test dataset and a training dataset. To do that, I'll say: from sklearn.model_selection import train_test_split. sklearn (scikit-learn) is a machine-learning library in Python that, among other things, you can use for splitting your data into test data and training data. Right, so this is what we are going to use. Now I'm going to split the data and allow the function to decide how to split it: how much data goes to the training set and how much goes to the test set. Let's see how it does it. Normally, it is not going to divide it equally. We write X_train, X_test, y_train, y_test: X_train is the training features, y_train is the training targets, and X_test and y_test are the test data. We are going to split by calling train_test_split, passing X and y, and we are going to let it shuffle randomly, with random_state=1 so the split is reproducible. Run. So let's see where we are: we have imported the library for splitting, and we have split, so let's now see how the splitting went. If I say, permit me, print(X_train.shape), let's see what the X training dataset looks like: it says 150 rows and 3 columns. Now I'm going to look at the y training dataset, so I'm going to print y_train.shape and see how it looks. I'm going to run, and it says 150 as well for the y training dataset. At this point I'm going to look at the X test data to see how it looks: X_test, and, permit me to say, .shape.
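The splitting step can be sketched like this. The 200-row DataFrame below is synthetic, standing in for the Advertising data; what matters is that `train_test_split` with its default `test_size` of 0.25 reproduces the 150/50 split seen in the video, and `random_state=1` matches the seed used there:

```python
import numpy as np
import pandas as pd
from sklearn.model_selection import train_test_split

# Synthetic stand-in with 200 rows, like the Advertising dataset.
rng = np.random.default_rng(0)
data = pd.DataFrame({
    "TV":        rng.uniform(0, 300, 200),
    "radio":     rng.uniform(0, 50, 200),
    "newspaper": rng.uniform(0, 100, 200),
})
data["sales"] = 2.9 + 0.046 * data["TV"] + 0.18 * data["radio"]

X = data[["TV", "radio", "newspaper"]]
y = data["sales"]

# random_state=1 fixes the shuffle so the split is reproducible;
# the default test_size of 0.25 gives 150 training rows and 50 test rows.
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=1)

print(X_train.shape)   # (150, 3)
print(y_train.shape)   # (150,)
print(X_test.shape)    # (50, 3)
```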
I expect it to be only 50, and indeed we have 50 rows and 3 columns. Right, so now I'm going to perform linear regression, so I'm going to also import the library for linear regression. Let's move down to this cell. Okay, so to do that: the class for linear regression is simply LinearRegression, and it's available from sklearn.linear_model, so: from sklearn.linear_model import LinearRegression. Right, so now we are going to perform linear regression, and to do that we simply define a linear regression variable. I'm going to call it linreg = LinearRegression(). Remember, the problem of linear regression is to find the coefficients. So this is fine, and I'm going to say linreg.fit(X_train, y_train). We are fitting the X training data against the y training data, fitting a regression line through them. So I'm going to run. I'm going to move down. I'm going to run it. Okay, so everything went fine. Keep in mind that the focus is to find the intercept and the slopes, so we are actually looking for beta 1, beta 2, beta 3: the coefficients of x1, x2, and x3, that is, the coefficients of TV, radio, and newspaper. So I'm going to say print, and let's see what it has for the intercept. By the way, if you want this whole notebook, I can send it across to you; just let me know in the comment box below. So this is how to get the intercept displayed: linreg.intersect... no, I think it's actually linreg.intercept_. So this is the intercept: 2.8787966. I can also just say, for the next one, print(linreg.coef_), and these are the coefficients. Permit me to get back to PowerPoint, and I'm going to write the equation of this line for you so that you can see how it actually works. Let's write the equation now, and you'll appreciate exactly what is happening. I'm going to take a writing tool; I can get it from WordPad here. Yeah, good, so let me use this. So we have a slope. Remember, we say:
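The fitting step can be sketched as below. The training data is synthetic, generated from known coefficients chosen to be close to the ones read off in the video (2.88, 0.046, 0.18, 0.003), so that `fit` recovers them; the real notebook fits on the X_train and y_train from the split instead:

```python
import numpy as np
from sklearn.linear_model import LinearRegression

# Synthetic training data standing in for X_train / y_train from the split.
rng = np.random.default_rng(1)
X_train = rng.uniform(0, 300, size=(150, 3))          # columns: TV, radio, newspaper
true_betas = np.array([0.046, 0.18, 0.003])
y_train = 2.88 + X_train @ true_betas                 # noiseless, so fit is exact

linreg = LinearRegression()
linreg.fit(X_train, y_train)    # estimates the intercept and the three slopes

print(linreg.intercept_)        # beta_0 (approx. 2.88 here)
print(linreg.coef_)             # beta_1, beta_2, beta_3 for TV, radio, newspaper
```

Note the trailing underscores on `intercept_` and `coef_`: scikit-learn uses that convention for attributes that only exist after `fit` has been called, which is the naming stumble the video hits.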
y is equal to beta 0 plus beta 1 x1 plus beta 2 x2 plus beta 3 x3. Let me take a lighter pen. So the equation, based on what we have, is: y equals beta 0, which is the slope... sorry, the intercept, as we mentioned. The intercept is 2.87..., let's call it 2.88, plus beta 1, which in our data corresponds to TV (x1 corresponds to TV), and beta 1 came out as 0.046, let's call it 0.05, so 0.05 times TV; plus the next one, 0.18, so 0.18 times radio; plus 0.0023, let's call it 0.003, and that multiplies newspaper. So I want you to see that this is how linear regression works in the end. I'm going to stop here, but we are going to continue so that we can take some final notes and do a little clarification on this. I'd like to thank you for viewing. Remember to click subscribe if you're not subscribed, and leave a comment if you have any challenges following this. Blessings.
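The fitted line written out above can be turned into a tiny prediction function. The rounded coefficients below are the ones read off in the video; `predict_sales` is a hypothetical helper name introduced here just for illustration:

```python
# Rounded coefficients from the fitted model in the video:
# sales ~ 2.88 + 0.046*TV + 0.18*radio + 0.003*newspaper
beta0, beta_tv, beta_radio, beta_news = 2.88, 0.046, 0.18, 0.003

def predict_sales(tv, radio, newspaper):
    """Plug advertising spend into the fitted regression line."""
    return beta0 + beta_tv * tv + beta_radio * radio + beta_news * newspaper

# E.g. spend 100 on TV, 20 on radio, 30 on newspaper:
# 2.88 + 4.6 + 3.6 + 0.09 = 11.17 predicted sales.
print(predict_sales(100, 20, 30))
```

This hand-rolled version gives the same answer as `linreg.predict([[100, 20, 30]])` would (up to the rounding of the coefficients), which is the whole point of writing the equation out.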