
Dot Products And Duality | Chapter 9, Essence Of Linear Algebra

Transcript:

Traditionally, dot products are introduced very early on in a linear algebra course, usually right at the beginning. So it might seem strange that I've pushed them back this far in the series. I did this because there's a standard way to introduce the topic, which requires nothing more than a basic understanding of vectors, but a fuller understanding of the role dot products play in math can only really be found under the light of linear transformations. Before that, though, let me briefly cover the standard way dot products are introduced, which I'm assuming is at least partially review for a number of viewers. Numerically, if you have two vectors of the same dimension, two lists of numbers with the same length, taking their dot product means pairing up all of the coordinates, multiplying those pairs together, and adding the result. So the vector [1, 2] dotted with [3, 4] would be 1 × 3 + 2 × 4. The vector [6, 2, 8, 3] dotted with [1, 8, 5, 3] would be 6 × 1 + 2 × 8 + 8 × 5 + 3 × 3. Luckily, this computation has a really nice geometric interpretation. To think about the dot product between two vectors v and w, imagine projecting w onto the line that passes through the origin and the tip of v. Multiplying the length of this projection by the length of v, you get the dot product v · w. Except when this projection of w points in the opposite direction from v, that dot product will actually be negative. So when two vectors are generally pointing in the same direction, their dot product is positive. When they're perpendicular, meaning the projection of one onto the other is the zero vector, their dot product is zero. And if they point in generally opposite directions, their dot product is negative. Now, this interpretation is weirdly asymmetric; it treats the two vectors very differently, so when I first learned this, I was surprised that order doesn't matter. You could instead project v onto w.
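The coordinate-wise computation and its geometric reading can be checked in a few lines of Python. This is a minimal sketch to accompany the transcript, not code from the video; the 2D projection is computed via angles (`atan2`) so that the geometric side of the check is independent of the dot product itself:

```python
import math

def dot(v, w):
    # Pair up coordinates, multiply each pair, and add the results.
    return sum(a * b for a, b in zip(v, w))

# The two worked examples from the transcript:
assert dot([1, 2], [3, 4]) == 1*3 + 2*4                          # 11
assert dot([6, 2, 8, 3], [1, 8, 5, 3]) == 6*1 + 2*8 + 8*5 + 3*3  # 71

# Geometric check in 2D: project w onto the line through v using angles,
# then multiply the signed projection length by the length of v.
v, w = (1, 2), (3, 4)
angle_between = math.atan2(w[1], w[0]) - math.atan2(v[1], v[0])
proj_len = math.hypot(*w) * math.cos(angle_between)  # signed length of w on v
assert math.isclose(math.hypot(*v) * proj_len, dot(v, w))
```

The last assertion is exactly the claim in the transcript: (length of v) × (signed length of w's projection onto v) equals the coordinate-wise sum.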
Multiplying the length of the projected v by the length of w gives the same result. I mean, doesn't that feel like a completely different process? Here's the intuition for why order doesn't matter: if v and w happened to have the same length, we could leverage some symmetry, since projecting w onto v, then multiplying the length of that projection by the length of v, is a complete mirror image of projecting v onto w, then multiplying the length of that projection by the length of w. Now, if you scale one of them, say v, by some constant like 2, so that they don't have equal length, the symmetry is broken. But let's think through how to interpret the dot product between this new vector 2v and w. If you think of w as getting projected onto v, then the dot product 2v · w will be exactly twice the dot product v · w. This is because when you scale v by 2, the length of the projection of w doesn't change, but the length of the vector you're projecting onto is doubled. But on the other hand, let's say you were thinking of v getting projected onto w. Well, in that case, the length of the projection is the thing that gets scaled when we multiply v by 2, while the length of the vector you're projecting onto stays constant. So the overall effect is still to just double the dot product. So even though symmetry is broken in this case, the effect that this scaling has on the value of the dot product is the same under both interpretations. There's also one other big question that confused me when I first learned this stuff: why on earth does this numerical process of matching coordinates, multiplying pairs, and adding them together have anything to do with projection?
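Both the symmetry and the scaling argument above are easy to confirm numerically (an illustrative sketch, not from the video):

```python
def dot(v, w):
    return sum(a * b for a, b in zip(v, w))

def scale(c, v):
    return [c * x for x in v]

v, w = [1, 2], [3, 4]

# Order doesn't matter:
assert dot(v, w) == dot(w, v)

# Scaling one vector by 2 doubles the dot product, regardless of which
# interpretation (projecting w onto 2v, or 2v onto w) you have in mind:
assert dot(scale(2, v), w) == 2 * dot(v, w)
```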
Well, to give a satisfactory answer, and also to do full justice to the significance of the dot product, we need to unearth something a little deeper going on here, which often goes by the name "duality". But before getting into that, I need to spend some time talking about linear transformations from multiple dimensions to one dimension, which is just the number line. These are functions that take in a 2D vector and spit out some number. But linear transformations are, of course, much more restrictive than your run-of-the-mill function with a 2D input and a 1D output. As with transformations in higher dimensions, like the ones I talked about in Chapter 3, there are some formal properties that make these functions linear. But I'm going to purposefully ignore those here so as to not distract from our end goal, and instead focus on a certain visual property that's equivalent to all the formal stuff. If you take a line of evenly spaced dots and apply a transformation, a linear transformation will keep those dots evenly spaced once they land in the output space, which is the number line. Otherwise, if there's some line of dots that gets unevenly spaced, then your transformation is not linear. As with the cases we've seen before, one of these linear transformations is completely determined by where it takes i-hat and j-hat, but this time, each of those basis vectors just lands on a number. So when we record where they land as the columns of a matrix, each of those columns has just a single number. This is a 1×2 matrix. Let's walk through an example of what it means to apply one of these transformations to a vector. Say you have a linear transformation that takes i-hat to 1 and j-hat to -2. To follow where a vector with coordinates, say, [4, 3] ends up, think of breaking this vector apart as 4 times i-hat plus 3 times j-hat. A consequence of linearity is that after the transformation, the vector will be 4 times the place where i-hat lands, 1, plus 3 times the place where j-hat lands, -2.
Which in this case implies that it lands on -2. When you do this calculation purely numerically, it's matrix-vector multiplication. Now, this numerical operation of multiplying a 1×2 matrix by a vector feels just like taking the dot product of two vectors. Doesn't that 1×2 matrix look just like a vector that we tipped on its side? In fact, we can say right now that there's a nice association between 1×2 matrices and 2D vectors, defined by tilting the numerical representation of a vector on its side to get the associated matrix, or tipping the matrix back up to get the associated vector. Since we're just looking at numerical expressions right now, going back and forth between vectors and 1×2 matrices might feel like a silly thing to do. But this suggests something that's truly awesome from the geometric view: there's some kind of connection between linear transformations that take vectors to numbers and vectors themselves. Let me show an example that clarifies the significance, and which just so happens to also answer the dot product puzzle from earlier. Unlearn what you have learned, and imagine that you don't already know that the dot product relates to projection. What I'm going to do here is take a copy of the number line and place it diagonally in space somehow, with the number 0 sitting at the origin. Now think of the two-dimensional unit vector whose tip sits where the number 1 on the line is. I want to give that guy a name: u-hat. This little guy plays an important role in what's about to happen, so just keep him in the back of your mind. If we project 2D vectors straight onto this diagonal number line, in effect, we've just defined a function that takes 2D vectors to numbers. What's more, this function is actually linear, since it passes our visual test that any line of evenly spaced dots stays evenly spaced once it lands on the number line.
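The worked example above, a transformation taking i-hat to 1 and j-hat to -2, applied to [4, 3], can be traced numerically, along with the "tip the matrix on its side" observation. This is a small sketch to accompany the transcript:

```python
def apply_1x2(matrix, vector):
    # A 1x2 matrix [[a, b]] applied to [x, y] gives a*x + b*y:
    # x copies of where i-hat lands plus y copies of where j-hat lands.
    (a, b), (x, y) = matrix[0], vector
    return a * x + b * y

def dot(v, w):
    return sum(p * q for p, q in zip(v, w))

M = [[1, -2]]   # i-hat lands on 1, j-hat lands on -2
v = [4, 3]

# Linearity: 4 * (where i-hat lands) + 3 * (where j-hat lands) = -2
assert apply_1x2(M, v) == 4 * 1 + 3 * (-2)

# Tipping the matrix on its side gives the associated vector [1, -2],
# and applying the matrix is the same as dotting with that vector.
assert apply_1x2(M, v) == dot([1, -2], v)
```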
Just to be clear, even though I've embedded the number line in 2D space like this, the outputs of the function are numbers, not 2D vectors. You should think of a function that takes in two coordinates and outputs a single coordinate. But that u-hat vector is a two-dimensional vector living in the input space. It's just situated in such a way that it overlaps with the embedding of the number line. With this projection, we've just defined a linear transformation from 2D vectors to numbers, so we're going to be able to find some kind of 1×2 matrix that describes that transformation. To find that 1×2 matrix, let's zoom in on this diagonal number line setup and think about where i-hat and j-hat each land, since those landing spots are going to be the columns of the matrix. This part's super cool; we can reason about it with a really elegant piece of symmetry: since i-hat and u-hat are both unit vectors, projecting i-hat onto the line passing through u-hat looks totally symmetric to projecting u-hat onto the x-axis. So when we ask what number i-hat lands on when it gets projected, the answer will be the same as whatever u-hat lands on when it's projected onto the x-axis. But projecting u-hat onto the x-axis just means taking the x-coordinate of u-hat. So, by symmetry, the number where i-hat lands when it's projected onto that diagonal number line will be the x-coordinate of u-hat. Isn't that cool? The reasoning is almost identical for the j-hat case. Think about it for a moment. For all the same reasons, the y-coordinate of u-hat gives us the number where j-hat lands when it's projected onto that number line copy. Pause and ponder that for a moment. I just think that's really cool. So the entries of the 1×2 matrix describing the projection transformation will be the coordinates of u-hat. And computing this projection transformation for arbitrary vectors in space, which requires multiplying that matrix by those vectors, is computationally identical to taking a dot product with u-hat.
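The conclusion above, that projecting onto the diagonal number line is the same as dotting with u-hat, can be verified numerically. The angle of the diagonal line (30°) is an assumed example, not from the video, and the projection is computed purely geometrically via angles so the check doesn't secretly use a dot product:

```python
import math

def dot(v, w):
    return sum(a * b for a, b in zip(v, w))

# A diagonal number line in some assumed example direction:
theta = math.radians(30)
u_hat = (math.cos(theta), math.sin(theta))   # unit vector along the line

def project_onto_line(v):
    # Signed coordinate of v on the diagonal number line, computed
    # geometrically (length times cosine of the angle to the line).
    angle_v = math.atan2(v[1], v[0])
    return math.hypot(*v) * math.cos(angle_v - theta)

# Projecting any vector equals dotting it with u-hat; in particular,
# i-hat lands on u-hat's x-coordinate and j-hat on its y-coordinate.
for v in [(1, 0), (0, 1), (4, 3), (-2, 5)]:
    assert math.isclose(project_onto_line(v), dot(u_hat, v))
```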
That is why taking the dot product with a unit vector can be interpreted as projecting a vector onto the span of that unit vector and taking the length. So what about non-unit vectors? For example, let's say we take that unit vector u-hat, but we scale it up by a factor of 3. Numerically, each of its components gets multiplied by 3, so looking at the matrix associated with that vector, it takes i-hat and j-hat to 3 times the values where they landed before. Since this is all linear, it implies more generally that the new matrix can be interpreted as projecting any vector onto the number line copy and multiplying where it lands by 3. This is why the dot product with a non-unit vector can be interpreted as first projecting onto that vector, then scaling up the length of that projection by the length of the vector. Take a moment to think about what happened here. We had a linear transformation from 2D space to the number line, which was not defined in terms of numerical vectors or numerical dot products; it was defined only by projecting space onto a diagonal copy of the number line. But because the transformation is linear, it was necessarily described by some 1×2 matrix, and since multiplying a 1×2 matrix by a 2D vector is the same as turning that matrix on its side and taking a dot product, this transformation was, inescapably, related to some 2D vector. The lesson here is that any time you have one of these linear transformations whose output space is the number line, no matter how it was defined, there will be some unique vector v corresponding to that transformation, in the sense that applying the transformation is the same thing as taking a dot product with that vector. To me, this is utterly beautiful. It's an example of something in math called "duality". Duality shows up in many different ways and forms throughout math, and it's super tricky to actually define.
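The non-unit-vector case above can be checked the same way: dotting with a vector of length 3 is projecting onto its span, then multiplying where you land by 3. As before, the 30° direction is an assumed example:

```python
import math

def dot(v, w):
    return sum(a * b for a, b in zip(v, w))

theta = math.radians(30)                      # assumed example direction
u_hat = (math.cos(theta), math.sin(theta))
scaled = (3 * u_hat[0], 3 * u_hat[1])         # non-unit vector, length 3

def project_onto_line(v):
    # Signed coordinate of v on the diagonal number line, via angles.
    angle_v = math.atan2(v[1], v[0])
    return math.hypot(*v) * math.cos(angle_v - theta)

# Dot with the scaled vector = project onto the line, then scale the
# result by 3, the length of the non-unit vector:
for v in [(4, 3), (-2, 5)]:
    assert math.isclose(dot(scaled, v), 3 * project_onto_line(v))
```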
Loosely speaking, it refers to situations where you have a natural but surprising correspondence between two types of mathematical thing. For the case of linear algebra that you just learned about, you'd say that the dual of a vector is the linear transformation that it encodes, and the dual of a linear transformation from some space to one dimension is a certain vector in that space. So, to sum up: on the surface, the dot product is a very useful geometric tool for understanding projections and for testing whether or not vectors tend to point in the same direction. That's probably the most important thing to remember about the dot product, but at a deeper level, dotting two vectors together is a way to translate one of them into the world of transformations. Again, numerically, this might feel like a silly point to emphasize; it's just two computations that happen to look similar. But the reason I find this so important is that throughout math, when you're dealing with a vector, once you really get to know its personality, you sometimes realize that it's easier to understand it not as an arrow in space, but as the physical embodiment of a linear transformation. It's as if the vector is really just a conceptual shorthand for some transformation, since it's easier for us to think about arrows in space than about moving all of that space onto the number line. In the next video, you'll see another really cool example of this duality in action, as I talk about the cross product.
