COVID Impact on Stock Market 

by Gu Lingan, Jan 8, 2021
  1. Import Raw Packages
  2. I. Import Data
      1. API and DateTime package
      2. Daily returns
      3. Daily cumulative return
      4. Data imported check
  3. II. Generate calculated fields
  4. III. Plot data

Import Raw Packages

```python
# Install pandas-datareader for data import
!pip install pandas_datareader==0.9.0

# Import pandas, NumPy, and Plotly for data manipulation and visualization
import numpy as np
import pandas as pd
import plotly.graph_objs as go
```

I. Import Data

API and DateTime package

```python
# Import the pandas-datareader API to pull data from Yahoo Finance.
# The datetime package keeps the dates in the right format.
from pandas_datareader import data
from datetime import date
```

Daily returns

Pandas' pct_change() method computes the day-over-day percentage return directly (column names such as 'closing_price' follow the convention used below):

```python
df['pct_daily_return'] = df['closing_price'].pct_change(1)
```

which is equivalent to dividing each close by the previous day's close and subtracting one:

```python
df['pct_daily_return'] = (df['closing_price'] / df['closing_price'].shift(1)) - 1
```
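As a quick sanity check, the two formulas agree on a small hand-made series (the prices below are illustrative, not real market data):

```python
import pandas as pd

# Toy closing prices (illustrative values only)
df = pd.DataFrame({'closing_price': [100.0, 102.0, 99.0, 105.0]})

# Both approaches compute the same day-over-day percentage return;
# the first value is NaN because there is no prior day.
via_pct_change = df['closing_price'].pct_change(1)
via_shift = (df['closing_price'] / df['closing_price'].shift(1)) - 1

print(via_pct_change.tolist())
```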

Daily cumulative return

The formula for the daily cumulative return is the following:

$ i_t = (1 + r_t) \cdot i_{t-1} $

In other words, we multiply the previous investment value $i_{t-1}$ by one plus the period's percentage return. Pandas simplifies this calculation with its cumprod() method, using the following command:

```python
df['daily_cumulative_return'] = (1 + df['pct_daily_return']).cumprod()
```
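A minimal sketch of the cumulative product on hand-made returns (values invented for illustration): a +2% day followed by a -1% day gives 1.02 × 0.99 ≈ 1.0098 after two days.

```python
import pandas as pd

# Illustrative daily percentage returns: +2%, -1%, +3%
returns = pd.Series([0.02, -0.01, 0.03])

# Each entry is the growth of 1 unit invested at the start
cumulative = (1 + returns).cumprod()
print(cumulative.tolist())
```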
```python
# Define the start and end dates for the 2019 (pre-COVID) window
start = pd.to_datetime('2019-02-01')
end = pd.to_datetime('2019-12-22')

# Set your tickers
tickers = ['SPY']

# Iterate over the tickers
length = len(tickers)
i = 0
while i < length:
    print(tickers[i] + " is uploading data")
    # Store each dataframe in its own variable
    locals()[str(tickers[i]) + "_data_2019"] = data.DataReader(tickers[i], 'yahoo', start=start, end=end)
    # Save the dataframe to a CSV file to keep track of changes over time
    locals()[str(tickers[i]) + "_data_2019"].to_csv(str(tickers[i]) + "_data_2019.csv")
    i += 1
```
```python
# Define the start and end dates for the 2020 (COVID) window
start = pd.to_datetime('2020-02-01')
end = pd.to_datetime('2020-12-22')

# Set your tickers
tickers = ['SPY']

# Iterate over the tickers
length = len(tickers)
i = 0
while i < length:
    print(tickers[i] + " is uploading data")
    # Store each dataframe in its own variable
    locals()[str(tickers[i]) + "_data_2020"] = data.DataReader(tickers[i], 'yahoo', start=start, end=end)
    # Save the dataframe to a CSV file to keep track of changes over time
    locals()[str(tickers[i]) + "_data_2020"].to_csv(str(tickers[i]) + "_data_2020.csv")
    i += 1
```
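As a side note, storing dataframes through locals() works in a notebook but is fragile; a dictionary keyed by ticker and year does the same job and is easier to iterate over later. The sketch below is a refactor under assumptions: fetch_prices is a stand-in stub (not a real API) that replaces the data.DataReader network call so the snippet runs offline, and the toy prices it returns are invented.

```python
import pandas as pd

def fetch_prices(ticker, start, end):
    # Stub standing in for data.DataReader(ticker, 'yahoo', start=start, end=end);
    # returns a small illustrative frame instead of real Yahoo Finance data.
    dates = pd.date_range(start, periods=3, freq='D')
    return pd.DataFrame({'Close': [100.0, 101.0, 99.5]}, index=dates)

# One dict replaces the dynamically named variables
frames = {}
windows = {'2019': ('2019-02-01', '2019-12-22'),
           '2020': ('2020-02-01', '2020-12-22')}
for year, (start, end) in windows.items():
    for ticker in ['SPY']:
        key = f"{ticker}_data_{year}"
        frames[key] = fetch_prices(ticker, start, end)

print(sorted(frames))
```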

Data imported check

```python
SPY_data_2020
SPY_data_2019
```

II. Generate calculated fields

```python
# Group the dataframes that will receive the calculated fields
df_list = [SPY_data_2020, SPY_data_2019]
tickers = ['CY', 'PY']  # CY = current year (2020), PY = prior year (2019)

length = len(df_list)
i = 0
while i < length:
    print("Dataframe " + str(i) + " is calculating the daily return")
    # Generate a "Returns" column using the pct_change methodology
    df_list[i]['Returns'] = df_list[i]['Close'].pct_change(1)
    # Generate a "Cumulative Return" column using the cumprod methodology
    df_list[i]['Cumulative Return'] = (1 + df_list[i]['Returns']).cumprod()
    print(tickers[i] + " is setting Returns and Cumulative Returns variables")
    # Store each series in its own variable
    locals()[str(tickers[i]) + "_Returns"] = df_list[i]['Returns']
    locals()[str(tickers[i]) + "_Cum_Returns"] = df_list[i]['Cumulative Return']
    i += 1
```
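On a toy Close column (values invented for illustration), the two calculated fields behave as expected: the first return is NaN, and the last cumulative return equals final close divided by first close.

```python
import pandas as pd

df = pd.DataFrame({'Close': [100.0, 110.0, 104.5]})
df['Returns'] = df['Close'].pct_change(1)
df['Cumulative Return'] = (1 + df['Returns']).cumprod()

# The last cumulative return equals the total growth: 104.5 / 100 = 1.045
print(df['Cumulative Return'].iloc[-1])
```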

III. Plot data

```python
import plotly.figure_factory as ff

# Add histogram data, filling missing first-day returns with 0
x1 = CY_Returns.fillna(0)
x2 = PY_Returns.fillna(0)

# Group data together
hist_data = [x1, x2]
group_labels = ['Covid19', 'PY']

# Create a distplot with a custom bin size (0.01)
fig = ff.create_distplot(hist_data, group_labels, bin_size=.01)
fig.show()
```
