
# PyTorch Transpose | PyTorch Tutorial 11: Transpose of a Tensor in PyTorch


## PyTorch Tutorial 11: Transpose of a Tensor in PyTorch

Transcript:

In this video, we will learn about transposing a tensor. Let's import torch and make a tensor: import torch. Let's make a variable x and store a tensor in it with torch.tensor, using the float values 1.0, 2.0, 3.0, 4.0, 5.0, 6.0. Let's print our variable x. We have a tensor with three rows and two columns. We can check the shape: x.size() tells us we have a size of three by two. We can also use the dim method: x.dim() tells us that we have two dimensions, rows and columns.

Now we will transpose our variable x. We have three rows and two columns; after transposing, we will have two rows and three columns. Let's transpose and store the result in a new variable. We will make a new variable x_t and assign it x.t(). Let's print our new variable x_t: we can see that we now have two rows and three columns. Let's check the size of our new variable: x_t.size() shows a size of two by three, while our original variable was of size three by two. So when we transpose a tensor, the dimensions swap: rows become columns and columns become rows.

Let's verify whether the two tensors x and x_t share the same storage, that is, whether they refer to the same memory, and for that we will use id: we compare id(x.storage()) with id(x_t.storage()). We get the output True, so they share the same storage and differ only in shape. We have seen that our tensor x has a shape of three by two, while x_t, the transpose of x, has a shape of two by three; their shapes are different, but they are backed by the same memory.

We will also look at transposing in higher dimensions. We can transpose a multi-dimensional tensor by specifying the two dimensions along which we are transposing.
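The steps above can be sketched as a short PyTorch snippet. One caveat: comparing id(x.storage()) can be unreliable in newer PyTorch releases, since each storage() call may return a fresh wrapper object, so this sketch compares the tensors' data pointers instead, which checks the same thing (shared underlying memory):

```python
import torch

# Build the 3x2 float tensor from the walkthrough.
x = torch.tensor([[1.0, 2.0],
                  [3.0, 4.0],
                  [5.0, 6.0]])
print(x.size())  # torch.Size([3, 2])
print(x.dim())   # 2

# Transpose: rows become columns and columns become rows.
x_t = x.t()
print(x_t.size())  # torch.Size([2, 3])

# Both tensors are views of the same underlying memory;
# comparing data pointers is a robust way to confirm that.
print(x.data_ptr() == x_t.data_ptr())  # True
```

Because the storage is shared, modifying an element through x_t would also change x; a transpose in PyTorch is a view, not a copy.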
Let's make a new variable and store a multi-dimensional tensor: a variable a holding a tensor of ones, and this time our dimensions are going to be 3, 4, and 5. Let's print a; these are the elements of our tensor a. We are going to transpose this multi-dimensional tensor, and we just have to specify the two dimensions along which to transpose. Let's make a new variable transpose_a and store the result of the transpose in it: a.transpose(0, 1), that is, we transpose along dimensions 0 and 1. Let's print transpose_a: we can see in the output that this is a new tensor with those two dimensions swapped. This is how we can transpose a multi-dimensional tensor by specifying the two dimensions along which to transpose.

Let's also check the shape and the number of dimensions of our variable a and of transpose_a. First, calling a.size() shows that tensor a has a size of three by four by five, and a.dim() shows it has three dimensions, which is why it is called multi-dimensional. The transposed result, transpose_a.size(), has a size of four by three by five, and transpose_a.dim() is also three, because it is still a multi-dimensional tensor.

This video was about transposing a tensor. I hope you enjoyed this video. I'll see you in the next video. Please subscribe to my channel. Thank you for watching.
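The higher-dimensional example can be sketched the same way; torch.transpose swaps exactly the two dimensions you name and leaves the rest in place:

```python
import torch

# A 3x4x5 tensor filled with ones.
a = torch.ones(3, 4, 5)
print(a.size())  # torch.Size([3, 4, 5])
print(a.dim())   # 3

# Swap dimensions 0 and 1; dimension 2 is untouched.
transpose_a = a.transpose(0, 1)
print(transpose_a.size())  # torch.Size([4, 3, 5])
print(transpose_a.dim())   # 3
```

Note that .t() only works on tensors with at most two dimensions; for three or more dimensions you must use .transpose(dim0, dim1) as shown here (or .permute to reorder all dimensions at once).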
