
# Tensor Outer Product | Inner & Outer Products | Lecture 5 | Matrix Algebra For Engineers


Transcript:

So, in this video, I want to talk about inner and outer products. It's a good time to introduce these, because now we know about the transpose operator. Let's start with the inner product between two vectors. We'll use column vectors here, which is standard in matrix algebra. Call the first vector u; it has three rows and one column, so it's 3-by-1, with components u1, u2, u3. The second vector we'll call v, with components v1, v2, v3. The inner product is the same thing as the dot product, if you've learned the dot product before: the inner product of these two vectors is u1 v1 + u2 v2 + u3 v3. How do we do that in matrix algebra? Well, we can use the transpose operator. If we write u transpose times v, then since the transpose of a column vector is a row vector, u transpose is the row [u1 u2 u3], and we multiply it by v, the column with entries v1, v2, v3. So we have one row and three columns, 1-by-3, times 3-by-1, and we end up with a 1-by-1, which is what we call a scalar. The multiplication is straightforward: u1 v1 + u2 v2 + u3 v3. Okay, now some definitions that are useful. If this turns out to be zero, so if u transpose v equals zero, then we say that u and v are orthogonal. That's the word we use in matrix algebra, orthogonal; it has the same meaning as perpendicular for the dot product. Okay, another definition: we talk about the norm of a vector. The norm of u is written with double bars, ‖u‖, and you can think of it as the length of u. In terms of matrices it's u transpose times u, which is a scalar, raised to the one-half power, which means taking the square root. So the norm is (u1² + u2² + u3²) to the one-half power, that is, the square root of the sum of the squared components.
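The transpose trick above is easy to check numerically. Here's a minimal sketch in NumPy, where vectors are stored as 3-by-1 arrays exactly as in the lecture; the particular component values are just illustrative:

```python
import numpy as np

# Column vectors as 3x1 arrays, matching the lecture's setup
u = np.array([[1.0], [2.0], [3.0]])
v = np.array([[4.0], [5.0], [6.0]])

# Inner product: u^T v is 1x3 times 3x1, a 1x1 matrix; .item() pulls out the scalar
inner = (u.T @ v).item()          # 1*4 + 2*5 + 3*6 = 32

# Norm: (u^T u)^(1/2), the square root of the sum of squared components
norm_u = (u.T @ u).item() ** 0.5  # sqrt(1 + 4 + 9) = sqrt(14)

# Orthogonality: u^T v = 0 means the vectors are orthogonal
e1 = np.array([[1.0], [0.0], [0.0]])
e2 = np.array([[0.0], [1.0], [0.0]])
ortho = (e1.T @ e2).item()        # 0.0, so e1 and e2 are orthogonal
```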
Okay, one more word we use: we say that u is normalized if the norm of u is equal to 1. So vectors are normalized if their norms are one. Usually you normalize a vector by dividing it by its norm, and then the vector is normalized. Okay, then there's another piece of terminology. Suppose two vectors are orthogonal plus normalized: we have two vectors u and v, they're orthogonal, so u transpose v equals zero, and they're normalized, so the norm of u is 1 and the norm of v is 1. Then we say that the vectors are orthonormal. That's a word that is used very frequently in matrix algebra. Okay, so that's all about inner products. Now, what is the outer product? The inner product is used all the time. The outer product is not really used that often, but there are some numerical methods, some techniques, that make use of the outer product, so as a student of matrix algebra you should know what an outer product is. The inner product between two column vectors was u transpose v, and that gave us a scalar. But what would happen if we did u v transpose? What would u v transpose look like? Now u is a column vector, u1, u2, u3, and v transpose is a row vector, [v1 v2 v3]. So it's this funny multiplication: u is 3-by-1, three rows and one column, and v transpose is 1-by-3, one row and three columns. And 3-by-1 times 1-by-3 is 3-by-3, so this is a 3-by-3 matrix. Okay, rather odd-looking, but we can calculate it: go across the first row, down the first column. Kind of boring, right? There's only one element in the first row and one element in the first column, so the first row is u1 v1, u1 v2, u1 v3. Then the second row: u2 v1, u2 v2, u2 v3. And finally the third row: u3 v1, u3 v2, u3 v3. Okay, a 3-by-3 matrix. It's a rather strange matrix!
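Normalizing by dividing by the norm can be sketched the same way; the vector (3, 4, 0) is chosen only because its norm, 5, is easy to check by hand:

```python
import numpy as np

# A vector whose norm is 5, so the arithmetic is easy to verify
u = np.array([[3.0], [4.0], [0.0]])
norm_u = (u.T @ u).item() ** 0.5  # sqrt(9 + 16 + 0) = 5

# Normalize by dividing by the norm
u_hat = u / norm_u

# The normalized vector has norm 1, by construction
norm_u_hat = (u_hat.T @ u_hat).item() ** 0.5
```

Two vectors that are orthogonal and each normalized like this form an orthonormal pair, the definition given above.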
Every entry in the first row has u1, every entry in the second row has u2, and every entry in the third row has u3; the first column has v1, the second column has v2, and the third column has v3. We'll see that this type of matrix in some sense lives in a very low-dimensional space, but we'll talk about that more when we get to more advanced topics in matrix algebra. Okay, so let's recap. We're talking about inner and outer products here. The inner product between two column vectors is u transpose v. That gives us a scalar, and it's equivalent to the dot product in vector calculus. If u transpose v equals zero, then we say the two vectors are orthogonal. The norm of a vector, which is equivalent to the length of the vector, is u transpose u raised to the 1/2 power, so the square root of the sum of the squares of the components. We say u is normalized if its norm is equal to 1, and if we have a set of vectors that are mutually orthogonal and all normalized to a norm of 1, then we say the set of vectors is orthonormal. Okay, so that's the important word: orthonormal. We also talked about the outer product, u v transpose, which is a funny matrix: for column vectors that are 3-by-1, it works out to a 3-by-3 matrix. Okay, this has somewhat less use, but it's still an interesting vector product. I'm Jeff Chasnov. Thanks for watching, and I'll see you in the next video.
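The outer product, and the sense in which the resulting matrix "lives in a low-dimensional space," can also be sketched in NumPy; the component values are again just illustrative:

```python
import numpy as np

u = np.array([[1.0], [2.0], [3.0]])  # 3x1 column vector
v = np.array([[4.0], [5.0], [6.0]])  # 3x1 column vector

# Outer product: 3x1 times 1x3 gives a 3x3 matrix
outer = u @ v.T

# Row i is u_i times [v1 v2 v3], so every row is a multiple of the same
# row vector v^T; that is why this matrix has rank one, the "low
# dimensional" property mentioned in the lecture.
rank = int(np.linalg.matrix_rank(outer))
```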
