Hello guys, in this video I will demonstrate how to perform multivariate time series prediction with LSTM for predicting future Google stock prices. The main idea is that the open stock price is the result of multiple different features that can impact our predictions. The problem we will solve is to predict future stock prices based on historical data. Prediction will start on the day after the historical data set ends, and we will predict the future stock price for a defined period, let's say for the next 20 days. So we have three parts: part number one is data preprocessing, part number two is create a model and train it, and part number three is make future predictions. And please remember, many different features have an impact on the actual price. Let's take a quick look at the raw data. I have downloaded a data set with Google stock prices from the official Yahoo Finance website for this demonstration. As usual when analyzing stock prices, we have the default features: open price, high price, low price, close price, adjusted close price and volume. The value we want to predict is the open price for the next day. This value can be impacted by the rest of the features, right? Let's make this assumption for this tutorial. Please note that the implementation demonstrated in this tutorial can be adapted to any real-life multivariate time series problem using LSTM. So let's come back to the scope and download the data we will work with in this tutorial. My special message to you: with this video tutorial I want to demonstrate how to perform multivariate time series prediction with LSTM. I will not go into the mathematical details behind it, just show how to implement the idea in Python. I'm not a finance expert, so the demonstrated model may not be very accurate. This is just a demonstration of how to do it using deep learning.
TensorFlow is 1.15.0 and Keras is 2.3.1. I assume that you have experience in Python programming and some practical experience in machine learning and deep learning. This video has three parts: part number one is data preprocessing, part number two is create a model and train it, and part number three is make future predictions. Part number one consists of three steps. Step number zero is fire the system, where we import all basic Python libraries for working with the data; step number one is read the data; step number two is data preprocessing, which is the shaping and transformation of the data. In part number two we have step number three and step number four: step number three is building up the LSTM-based neural network and step number four is start training. Part number three has step number five and step number six: step number five is to make predictions for future dates and step number six is to visualize the predictions. So welcome to part number one. What we will do: we will fire the system, we will import all required libraries, we will read the original data (it is a stock prices CSV file), and we will do the data shaping and transformations needed to work with LSTM in part number two. So let's do it. We begin our project with step number zero, fire the system. In this step we import all basic modules and packages that we will use in our project: NumPy for numerical calculations, Matplotlib for data visualization, rcParams for controlling our plots in the notebook, datetime for timestamp manipulations, pandas for data structuring, and also we import Keras callbacks for controlling our training procedure. When we have imported all necessary libraries into our notebook, we have to load the data. In this step we load the data into our notebook (it is the Google stock price CSV file), and then we select the features that should be involved in our training and prediction procedures.
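As a rough sketch, the step-zero imports and the data loading described above might look like this. The Keras callback imports come later with the model code, and the tiny inline CSV is a stand-in of my own for the real Yahoo Finance file, whose actual name and contents I am only assuming here:

```python
from io import StringIO
import datetime as dt            # timestamp manipulations
import numpy as np               # numerical calculations
import pandas as pd              # data structuring
import matplotlib.pyplot as plt  # data visualization
from matplotlib import rcParams  # controlling plot settings

rcParams['figure.figsize'] = 14, 5   # default plot size for the notebook

# Tiny stand-in for the real CSV (the real data set has 4006 rows);
# in the tutorial this would be something like pd.read_csv('GOOG.csv').
csv_text = """Date,Open,High,Low,Close,Adj Close,Volume
2019-01-02,1016.57,1052.32,1015.71,1045.85,1045.85,1532600
2019-01-03,1041.00,1056.98,1014.07,1016.06,1016.06,1841100
"""
dataset_train = pd.read_csv(StringIO(csv_text))

# Select columns 1..6 as the features for training and prediction.
cols = list(dataset_train)[1:7]

# Extract the dates as real datetime objects, not plain strings.
datelist_train = [dt.datetime.strptime(d, '%Y-%m-%d').date()
                  for d in dataset_train['Date']]
```

The date conversion matters because Matplotlib plots datetime objects on a proper time axis, while plain strings would be treated as unordered categories.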
These are the columns from one to six. Then we extract the dates that we will use in visualization. The idea here is that Python should understand that we have timestamps for visualization, not just simple strings, and for this we are using the datetime library. At the end of this step we output some basic information about our data: we have four thousand and six records in our data set and seven columns, let's say these are our features, and we have the features selected for our modeling: open, high, low, close price and adjusted close price. In step number two we perform data preprocessing: we remove all commas and convert the data to a matrix-shaped format. Before we begin, we should be sure that our column names have string format. Then we have to remove all commas from the values; for this we iterate through each column to remove the commas. After this, we have to be sure that our data set has float format; it is necessary for numerical computation, training, transforming and all other numerical functions. Then we reshape our data set as a matrix, which means that our data set has columns (the features, or predictors) and rows: it's a matrix. Then we print some information, and we see that we have four thousand and six rows and five features in our data set. And we have the situation, as we always do, that different features or predictors have different scales. That means one feature can vary from 0 to, let's say, 100, and a second feature can vary from minus 50 to minus 25. We need to bring all the features to one single scale.
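A minimal sketch of the cleaning just described, under the assumption that values arrive as strings with thousands separators (as in some Yahoo Finance exports); the numbers here are made up for illustration:

```python
import numpy as np
import pandas as pd

# Hypothetical raw values with comma thousands separators.
df = pd.DataFrame({
    'Open':      ['1,016.57', '1,041.00', '1,050.10'],
    'High':      ['1,052.32', '1,056.98', '1,066.26'],
    'Low':       ['1,015.71', '1,014.07', '1,041.25'],
    'Close':     ['1,045.85', '1,016.06', '1,070.71'],
    'Adj Close': ['1,045.85', '1,016.06', '1,070.71'],
})

# Make sure the column names are strings, then strip commas from every value.
df.columns = [str(c) for c in df.columns]
for col in df.columns:
    df[col] = df[col].astype(str).str.replace(',', '')

# Convert to float and reshape into a plain matrix:
# rows = days, columns = features (predictors).
training_set = df.astype(float).values
print(training_set.shape)   # (3, 5) here; (4006, 5) in the tutorial
```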
We should use a standard scaler, and we apply this scaling to all the features that will be used as predictors and to the open prices that we want to predict: one StandardScaler for all independent variables and one StandardScaler for the dependent variable, the open prices. The last action in this step is creating a data structure with time steps and one output. At the very beginning of this action we define the X_train and y_train lists where we store our data for training. We also define the number of future days that we want to predict and the number of past days that our neural network will take into consideration to make a prediction. For the future, let's say it will be 60: we want to make a prediction for the next 60 days. And I think for this prediction 90 days of historical data should be enough. I don't know, but let's try it. So past is 90, and then we append the values to X_train and y_train. Then we convert the data to NumPy arrays, so X_train and y_train are separate NumPy arrays with the following shapes: X_train is 3857 records by 90 days of historical data by 4 features (our predictors), and y_train is 3857 rows by only one feature, the open price, which is the value we want to predict. We have just finished part number one, so welcome to part number two. In this part we will create an LSTM-based neural network and train it with the training data set generated in the previous part, based on defined hyperparameters. Firstly, we import all the Keras modules required to build the neural network we want: Sequential, Dense, LSTM, Dropout, and one of the optimizer packages should be added.
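Before moving on to the model itself, here is a minimal sketch of the scaling and time-step windowing from part one. The tutorial uses two scikit-learn StandardScaler objects; to stay self-contained, standardization is done by hand here (it is the same arithmetic: subtract the mean, divide by the standard deviation), and the 200-day random array is a stand-in for the real 4006-day data set:

```python
import numpy as np

# Stand-in for the cleaned data: 200 days x 5 features
# (Open, High, Low, Close, Adj Close); the tutorial has 4006 days.
rng = np.random.default_rng(0)
training_set = rng.normal(loc=100.0, scale=10.0, size=(200, 5))

# Standardize each column; column 0 (Open) plays the role of the
# separately scaled dependent variable.
scaled = (training_set - training_set.mean(axis=0)) / training_set.std(axis=0)

n_past = 90    # days of history fed into the network
n_future = 60  # how many days ahead we want to predict

X_train, y_train = [], []
for i in range(n_past, len(scaled) - n_future + 1):
    X_train.append(scaled[i - n_past:i, 0:4])                 # 90 days, 4 predictors
    y_train.append(scaled[i + n_future - 1:i + n_future, 0])  # Open, 60 days ahead

X_train, y_train = np.array(X_train), np.array(y_train)
# Here: (51, 90, 4) and (51, 1); with the full data set the same loop
# yields (3857, 90, 4) and (3857, 1), matching the shapes quoted above.
print(X_train.shape, y_train.shape)
```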
I decided to choose Adam as the optimizer, just to try it. Now we are ready to build a simple neural network with LSTM. As always, we start building a new neural network by defining it as Sequential. I decided to choose 64 nodes in the first LSTM layer; take a look at the input shape parameter, where we get the shape values from our data set. This is important if you have multiple features for prediction, not only a single one. The second LSTM layer will consist of only 10 nodes; let's do it like this and see what happens. In order to avoid overfitting, let's include a dropout layer at the end with a rate of 0.25. The last layer, as always, is Dense with only one output unit, because we want to predict only one value: the open stock price. Now the structure of our network is done. I don't know how accurately it will work, but let's see. So we are ready to compile our neural network with the Adam optimizer. I decided to set the learning rate to 0.01 and the loss function to mean squared error. In step number four we are ready to start training. When we have defined the structure of our neural network, we are ready to start training; I think it will take a couple of minutes for the amount of data we have. In order not to waste time training while the accuracy is not improving, we define an early stopping callback. Here we have parameters such as monitor, the quantity to be monitored; min_delta, the minimum change in the monitored quantity to qualify as an improvement; and patience, the number of epochs with no improvement after which training will be stopped. Let's also set a TensorBoard logs parameter; maybe we will use it. And lastly, the moment we have been waiting for: we start training on our training data set, X_train and y_train. You will see the real-time result for each epoch below. For this training I use only 10 epochs, and the batch size is 256; I chose it randomly. So take a look at the training.
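The model-building and training steps might be sketched like this. I am assuming the standalone Keras import style of version 2.3.1 (with tensorflow.keras the imports just gain a `tensorflow.` prefix), and the random arrays at the bottom are stand-ins for the real X_train and y_train so the snippet runs end to end:

```python
import numpy as np
from keras.models import Sequential
from keras.layers import LSTM, Dense, Dropout
from keras.optimizers import Adam
from keras.callbacks import EarlyStopping

# Architecture as described: 64-unit LSTM, 10-unit LSTM, dropout 0.25,
# one Dense output unit for the predicted open price.
model = Sequential()
model.add(LSTM(64, return_sequences=True, input_shape=(90, 4)))  # (n_past, n_features)
model.add(LSTM(10, return_sequences=False))
model.add(Dropout(0.25))
model.add(Dense(1))

model.compile(optimizer=Adam(learning_rate=0.01), loss='mean_squared_error')

# Stop early when the monitored loss no longer improves.
es = EarlyStopping(monitor='loss', min_delta=1e-10, patience=10, verbose=1)

# Toy stand-in data; in the tutorial X_train is (3857, 90, 4),
# y_train is (3857, 1), and epochs=10 with batch_size=256.
X_train = np.random.rand(32, 90, 4)
y_train = np.random.rand(32, 1)

history = model.fit(X_train, y_train, epochs=2, batch_size=256,
                    callbacks=[es], verbose=0)
```

Note that `return_sequences=True` on the first LSTM layer is what lets the second LSTM layer receive the full 90-step sequence rather than only the last hidden state.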
We have just finished the most important part of this tutorial, and welcome to the last part: make future predictions. In this part we will make predictions for future dates (I think you have been waiting for this part) and we will visualize the predictions to be sure they are in context, based on the scope. So let's do it. We start with step number five, where we make predictions for future dates. First of all, we define a list of dates representing the period for which we will make predictions; this is the date list variable for the future. Remember that we have the date list variable for training from the very beginning, and we are using it here. Then we simply convert pandas timestamps to datetime objects; this lets us visualize the predictions correctly on the same timeline as the historical data. Now the fun part: let's make the future predictions we have been talking about from the very beginning. It is just model.predict, and that's it. Note that we have been working with scaled training data, so our predictions are also made on scaled data. We want to invert the predictions back to the original units as provided in the original data set, and we do it using scikit-learn's inverse transform method. In order to have a nice and structured results table, we build a simple pandas DataFrame with our predictions. Please pay attention to the DataFrame index: it is recommended that it be a timestamp if you are going to work with several DataFrames on the same timeline. Let's look at the result. It seems to look good, and we are ready to visualize the data now. Finally, we have step number six, visualize the predictions. This is the last part of our tutorial; we will plot our predictions as well as the historical data on the same timeline. Firstly, let's define the size of the plot we are going to generate. Secondly, let's define the date at which our graph will start.
It's like the starting position of the plotting. Finally, we use standard Matplotlib methods to plot our results. Pay special attention when defining the period of plotting; for this we are using a start-date-for-plotting variable. As I mentioned before, for the X-axis (the timeline) we use the DataFrame index field, and for the Y-axis our predictions, the open column from our DataFrame. We draw a vertical line to better mark the starting date of the predictions, put a gray grid on top to make the visualization more readable, and add a legend, a label for the X-axis, a label for the Y-axis, and a style for the ticks on the horizontal axis. And that's it; that's almost all for this tutorial. I just want to tell you that some parts of my code can be improved and the full project can be organized in a better manner, but my goal was to introduce the way I implement multivariate time series prediction with LSTM, in a very clean manner without any unnecessary off-topic material. Additional features will be added to the project in my next videos; please subscribe to get them first and write your ideas in the video comments. Okay, looking good, and now I want to add one more line to our visualization: the predicted values for the training set, not for the future. Why? I want to understand how my model performed during training. I will make an extra prediction for the training period in the same way I did for the future prediction, using the model's predict method and structuring the result into a pandas DataFrame, and finally I will plot the values on the same graph as our historical data and future predictions. Be sure the timestamp values are formatted correctly in the pandas DataFrame and, as recommended, set the time values on the DataFrame index field. So those are our results; feel free to improve the model as you want.
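Putting steps five and six together, here is a minimal self-contained sketch. Everything in it (the scaled predictions, the scaler statistics, the historical series and the dates) is synthetic stand-in data of my own; in the tutorial the scaled predictions come from model.predict and the inversion from the target scaler's inverse_transform:

```python
import matplotlib
matplotlib.use('Agg')  # render off-screen; drop this line in a notebook
import matplotlib.pyplot as plt
import numpy as np
import pandas as pd

# Stand-ins for the tutorial's objects: scaled model output for 60 future
# days, plus the target scaler's statistics (mean/std of the Open column).
rng = np.random.default_rng(1)
predictions_scaled = rng.normal(size=(60, 1))
open_mean, open_std = 1050.0, 80.0   # hypothetical scaler statistics

# Invert the scaling back to original units
# (sc_predict.inverse_transform does the same arithmetic).
predictions = predictions_scaled * open_std + open_mean

# Dates for the prediction period, used as the DataFrame index so that
# several frames share one timeline.
datelist_future = pd.date_range(start='2019-01-02', periods=60, freq='D')
df_future = pd.DataFrame(predictions, index=datelist_future, columns=['Open'])

# Hypothetical historical series for the plot.
hist_index = pd.date_range(end='2019-01-01', periods=200, freq='D')
df_hist = pd.DataFrame(1050 + rng.normal(size=200).cumsum(),
                       index=hist_index, columns=['Open'])

fig, ax = plt.subplots(figsize=(14, 5))
start_plot = hist_index[100]         # start date for plotting
ax.plot(df_hist.loc[start_plot:].index, df_hist.loc[start_plot:, 'Open'],
        label='Historical open price')
ax.plot(df_future.index, df_future['Open'], label='Predicted open price')
ax.axvline(datelist_future[0], color='gray', linestyle='--')  # prediction start
ax.grid(color='lightgray')
ax.legend()
ax.set_xlabel('Date')
ax.set_ylabel('Open price')
fig.savefig('predictions.png')
```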
I think a good idea within this scope would be to add more features from other stocks, for example Tesla, Yahoo, etc., and combine all those features into one model to make the original data more complex and robust and to get more accurate results. In our tutorial we get a result whose accuracy is more or less good, but toward the end of the prediction period it gets worse, because the standard deviation in the original data set is increasing. That's all I wanted to show you today. Thank you for watching. Write your ideas below about what you want to see related to this project or to any other topic in data science. See you in the next videos. Never stop learning. Bye bye.