Neural Networks with TensorFlow for stock trend & price prediction: Microsoft stock
Loading in the data
Date         Open           High
1986-03-13   0.05589847997  0.06411891883
1986-03-14   0.06137834941  0.06466690297
1986-03-17   0.06357029244  0.06521488422
1986-03-18   0.06466690755  0.0652148948
1986-03-19   0.06302232466  0.06357031199
Data visualisation
Demand and traded volume of stock
No artists with labels found to put in legend. Note that artists whose label start with an underscore are ignored when legend() is called with no argument.
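The legend warning above appears when `plt.legend()` is called while no plotted artist carries a label; passing `label=` to each plot call avoids it. A minimal sketch with synthetic data (the notebook's actual plotting code is not recoverable):

```python
import matplotlib
matplotlib.use("Agg")  # non-interactive backend for scripted use
import matplotlib.pyplot as plt
import numpy as np
import pandas as pd

# Synthetic stand-in for the daily traded-volume series
idx = pd.date_range("1986-03-13", periods=100, freq="B")
volume = pd.Series(np.random.randint(10_000_000, 100_000_000, size=100), index=idx)

fig, ax = plt.subplots()
ax.plot(volume.index, volume.values, label="Volume")  # label= silences the warning
ax.legend()
```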
Correlations
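A pairwise-correlation pass over the price columns typically looks like this — a sketch with a tiny illustrative frame, since the notebook's exact code is not recoverable:

```python
import pandas as pd

# Tiny illustrative frame; in the notebook this would be the MSFT OHLCV data
df = pd.DataFrame({"Open":   [1.0, 2.0, 3.0, 4.0],
                   "Close":  [1.1, 2.1, 3.2, 4.0],
                   "Volume": [9.0, 7.0, 8.0, 5.0]})
corr = df.corr()  # Pearson correlation matrix
print(corr.round(2))
```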
Simple moving averages and relative strength index
['ma5', 'rsi5', 'ma30', 'rsi30', 'ma600', 'rsi600']
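The feature names above suggest 5-, 30- and 600-day simple moving averages and RSIs. A sketch of the two indicators (the exact normalisation the notebook applied to the `ma` columns is not recoverable, so a plain rolling mean is shown):

```python
import pandas as pd

def sma(close: pd.Series, window: int) -> pd.Series:
    """Simple moving average over the trailing `window` days."""
    return close.rolling(window).mean()

def rsi(close: pd.Series, window: int) -> pd.Series:
    """Relative strength index from simple rolling average gains/losses."""
    delta = close.diff()
    gain = delta.clip(lower=0).rolling(window).mean()
    loss = (-delta.clip(upper=0)).rolling(window).mean()
    rs = gain / loss
    return 100 - 100 / (1 + rs)

close = pd.Series([1.0, 1.1, 1.2, 1.3, 1.4, 1.5, 1.6, 1.7])
print(sma(close, 5).iloc[-1])  # 1.5: mean of the last five closes
print(rsi(close, 5).iloc[-1])  # 100.0: the series only ever went up
```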
Date        Open      High      Low       Close     Volume    Dividends  Stock Splits  ma5       rsi5       ma30      rsi30      ma600     rsi600
1988-07-27  0.255378  0.258666  0.252090  0.252638  68140800  0.0        0.0           1.037309  11.932208  1.128632  42.231798  0.704880  53.836012
1988-07-28  0.252090  0.256474  0.249898  0.255378  57945600  0.0        0.0           1.016738  21.358043  1.113233  43.423957  0.698569  53.886779
1988-07-29  0.256474  0.263051  0.255378  0.260858  59385600  0.0        0.0           0.987816  37.958436  1.087606  45.740639  0.685148  53.988146
1988-08-01  0.261955  0.263051  0.256474  0.256474  51825600  0.0        0.0           0.997863  31.342546  1.103632  44.241417  0.698116  53.893215
1988-08-02  0.256474  0.257570  0.248802  0.252090  65476800  0.0        0.0           1.013478  25.735278  1.119782  42.790421  0.711514  53.798453
Data preparation
Date         Open           High
1988-07-27   0.2553783789   0.258666292
1988-07-28   0.2520904632   0.2564743602
1988-07-29   0.2564743411   0.2630508258
1988-08-01   0.2619549689   0.263050953
1988-08-02   0.2564742599   0.2575702244
2020-04-27T00:00:00.000000000
Date        Open       High       Low        Close      Volume     ma5        rsi5       ma30       rsi30      ma600      rsi600     target
1988-08-11  -0.290613  0.223522   -0.052958  1.165131   0.001706   -2.071299  7.648493   -1.578646  1.729813   -1.122789  0.597843   1
1988-08-12  0.674153   0.222185   1.555990   0.187037   -1.064693  -0.580705  0.236418   -0.651099  0.297733   -0.169886  0.116998   0
1988-08-15  0.187684   -0.055767  -1.615036  -1.230156  -0.662325  0.791665   -0.730474  0.738206   -0.941768  1.275995   -0.684264  0
1988-08-16  -1.471498  0.497480   0.215174   1.642649   1.952657   -1.665053  2.415575   -2.124848  2.249828   -1.573517  0.837182   0
1988-08-17  1.900861   1.177400   1.147787   -0.050111  -0.757229  0.412697   -0.122226  -0.459207  -0.022959  0.066867   -0.002761  0
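The frames above hold z-scored features: each column is centred and scaled to unit variance. A sketch of the per-column standardisation, fitting the statistics on the training split only to avoid look-ahead leakage (whether the notebook used this or e.g. sklearn's StandardScaler is an assumption):

```python
import pandas as pd

def standardise(train: pd.DataFrame, test: pd.DataFrame):
    """Z-score both splits using statistics from the training split only."""
    mu, sigma = train.mean(), train.std()
    return (train - mu) / sigma, (test - mu) / sigma

train = pd.DataFrame({"Close": [1.0, 2.0, 3.0, 4.0]})
test = pd.DataFrame({"Close": [5.0, 6.0]})
train_z, test_z = standardise(train, test)
```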
Columns: Open, High, Low, Close, Volume, ma5, rsi5, ma30, rsi30, ma600, rsi600, target
Date        Open       High       Low        Close      Volume     ma5        rsi5       ma30       rsi30      ma600      rsi600     target
2020-05-12  1.189940   -0.318500  -0.335002  -1.559113  -0.001050  1.781732   -1.156650  1.763636   -0.901935  1.565795   -1.589587  1
2020-05-13  -1.590933  -1.344873  -2.360783  -1.076244  1.201682   0.867295   -0.797105  1.251851   -0.601271  1.062539   -1.041049  1
2020-05-14  -1.896721  -1.517743  -0.467064  0.171837   -0.338378  -0.550759  0.163241   0.053414   0.123306   -0.196233  0.222292   1
2020-05-15  0.449274   2.556725   0.410557   0.827829   0.254043   -1.144557  0.690647   -0.606888  0.458481   -0.836530  0.785288   1
2020-05-18  2.325778   -0.477710  2.664517   0.506303   -0.942130  -0.820968  0.235902   -0.251979  0.292109   -0.522079  0.513778   0
(7160, 60, 11)
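The shape (7160, 60, 11) indicates sliding windows of 60 trading days over the 11 feature columns, each labelled with the trend target of the following day. A sketch of the windowing step (the function name is mine):

```python
import numpy as np

def make_windows(features: np.ndarray, targets: np.ndarray, seq_len: int = 60):
    """Stack overlapping windows of `seq_len` rows; each window is labelled
    with the target of the day that follows it."""
    X = np.stack([features[i:i + seq_len] for i in range(len(features) - seq_len)])
    y = targets[seq_len:]
    return X, y

features = np.random.rand(7220, 11)            # 11 engineered columns
targets = np.random.randint(0, 2, size=7220)   # binary up/down trend
X, y = make_windows(features, targets)
print(X.shape)  # (7160, 60, 11)
```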
Creating a neural network
Collecting tensorflow
  Downloading tensorflow-2.7.0-cp39-cp39-manylinux2010_x86_64.whl (489.7 MB)
  ... (dependency download and resolution log truncated) ...
Successfully installed absl-py-1.0.0 astunparse-1.6.3 flatbuffers-2.0 gast-0.4.0 google-auth-oauthlib-0.4.6 google-pasta-0.2.0 h5py-3.6.0 importlib-metadata-4.10.0 keras-preprocessing-1.1.2 libclang-12.0.0 markdown-3.3.6 oauthlib-3.1.1 opt-einsum-3.3.0 requests-oauthlib-1.3.0 tensorboard-2.7.0 tensorboard-data-server-0.6.1 tensorboard-plugin-wit-1.8.0 tensorflow-2.7.0 tensorflow-estimator-2.7.0 tensorflow-io-gcs-filesystem-0.23.1 termcolor-1.1.0 werkzeug-2.0.2 wrapt-1.13.3 zipp-3.6.0
Model: "sequential"
_________________________________________________________________
 Layer (type)                                Output Shape       Param #
=================================================================
 lstm (LSTM)                                 (None, 60, 128)    71680
 dropout (Dropout)                           (None, 60, 128)    0
 batch_normalization (BatchNormalization)    (None, 60, 128)    512
 lstm_1 (LSTM)                               (None, 60, 128)    131584
 dropout_1 (Dropout)                         (None, 60, 128)    0
 batch_normalization_1 (BatchNormalization)  (None, 60, 128)    512
 lstm_2 (LSTM)                               (None, 64)         49408
 dropout_2 (Dropout)                         (None, 64)         0
 batch_normalization_2 (BatchNormalization)  (None, 64)         256
 dense (Dense)                               (None, 32)         2080
 dropout_3 (Dropout)                         (None, 32)         0
 dense_1 (Dense)                             (None, 1)          33
=================================================================
Total params: 256,065
Trainable params: 255,425
Non-trainable params: 640
_________________________________________________________________
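The summary above pins the architecture down exactly (the per-layer parameter counts match), but the activations, dropout rates, and loss are not recoverable; the training losses near 0.25 below are consistent with mean squared error on a sigmoid output, and the deprecation warning shows SGD was configured via the old `lr` keyword. A reconstruction under those assumptions:

```python
import tensorflow as tf
from tensorflow.keras import layers

model = tf.keras.Sequential([
    tf.keras.Input(shape=(60, 11)),            # 60-day windows, 11 features
    layers.LSTM(128, return_sequences=True),
    layers.Dropout(0.2),                        # dropout rate is an assumption
    layers.BatchNormalization(),
    layers.LSTM(128, return_sequences=True),
    layers.Dropout(0.2),
    layers.BatchNormalization(),
    layers.LSTM(64),
    layers.Dropout(0.2),
    layers.BatchNormalization(),
    layers.Dense(32, activation="relu"),        # activation assumed
    layers.Dropout(0.2),
    layers.Dense(1, activation="sigmoid"),      # trend probability
])
model.compile(optimizer=tf.keras.optimizers.SGD(learning_rate=0.01),  # rate assumed
              loss="mse", metrics=["accuracy"])
print(model.count_params())  # 256065, matching the summary
```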
/shared-libs/python3.9/py/lib/python3.9/site-packages/keras/optimizer_v2/gradient_descent.py:102: UserWarning: The `lr` argument is deprecated, use `learning_rate` instead.
super(SGD, self).__init__(name, **kwargs)
Epoch 1/50
112/112 [==============================] - 97s 774ms/step - loss: 1.0756 - accuracy: 0.5078 - val_loss: 0.6790 - val_accuracy: 0.5000
Epoch 2/50
112/112 [==============================] - 86s 770ms/step - loss: 0.7150 - accuracy: 0.5214 - val_loss: 0.4817 - val_accuracy: 0.5000
Epoch 3/50
112/112 [==============================] - 85s 755ms/step - loss: 0.5983 - accuracy: 0.5311 - val_loss: 0.3047 - val_accuracy: 0.4933
Epoch 4/50
112/112 [==============================] - 84s 752ms/step - loss: 0.5392 - accuracy: 0.5359 - val_loss: 0.2857 - val_accuracy: 0.5034
Epoch 5/50
112/112 [==============================] - 85s 762ms/step - loss: 0.4967 - accuracy: 0.5275 - val_loss: 0.2833 - val_accuracy: 0.5235
Epoch 6/50
112/112 [==============================] - 86s 765ms/step - loss: 0.4598 - accuracy: 0.5447 - val_loss: 0.2847 - val_accuracy: 0.4966
Epoch 7/50
112/112 [==============================] - 87s 777ms/step - loss: 0.4300 - accuracy: 0.5419 - val_loss: 0.2794 - val_accuracy: 0.5201
Epoch 8/50
112/112 [==============================] - 82s 736ms/step - loss: 0.4089 - accuracy: 0.5271 - val_loss: 0.2995 - val_accuracy: 0.4899
Epoch 9/50
112/112 [==============================] - 135s 1s/step - loss: 0.3895 - accuracy: 0.5404 - val_loss: 0.2775 - val_accuracy: 0.4899
Epoch 10/50
112/112 [==============================] - 155s 1s/step - loss: 0.3723 - accuracy: 0.5334 - val_loss: 0.2826 - val_accuracy: 0.4933
Epoch 11/50
112/112 [==============================] - 89s 788ms/step - loss: 0.3595 - accuracy: 0.5457 - val_loss: 0.2813 - val_accuracy: 0.4866
Epoch 12/50
112/112 [==============================] - 87s 775ms/step - loss: 0.3527 - accuracy: 0.5487 - val_loss: 0.2775 - val_accuracy: 0.4631
Epoch 13/50
112/112 [==============================] - 85s 760ms/step - loss: 0.3299 - accuracy: 0.5520 - val_loss: 0.2807 - val_accuracy: 0.4765
Epoch 14/50
112/112 [==============================] - 85s 762ms/step - loss: 0.3278 - accuracy: 0.5507 - val_loss: 0.2762 - val_accuracy: 0.4664
Epoch 15/50
112/112 [==============================] - 85s 761ms/step - loss: 0.3188 - accuracy: 0.5553 - val_loss: 0.2739 - val_accuracy: 0.4832
Epoch 16/50
112/112 [==============================] - 84s 746ms/step - loss: 0.3240 - accuracy: 0.5541 - val_loss: 0.2736 - val_accuracy: 0.4765
Epoch 17/50
112/112 [==============================] - 85s 758ms/step - loss: 0.3114 - accuracy: 0.5598 - val_loss: 0.2696 - val_accuracy: 0.4866
Epoch 18/50
112/112 [==============================] - 83s 742ms/step - loss: 0.2937 - accuracy: 0.5620 - val_loss: 0.2706 - val_accuracy: 0.5067
Epoch 19/50
112/112 [==============================] - 85s 757ms/step - loss: 0.3041 - accuracy: 0.5571 - val_loss: 0.2714 - val_accuracy: 0.4698
Epoch 20/50
112/112 [==============================] - 86s 773ms/step - loss: 0.2900 - accuracy: 0.5595 - val_loss: 0.2708 - val_accuracy: 0.4765
Epoch 21/50
112/112 [==============================] - 82s 730ms/step - loss: 0.2802 - accuracy: 0.5714 - val_loss: 0.2655 - val_accuracy: 0.4832
Epoch 22/50
112/112 [==============================] - 83s 739ms/step - loss: 0.2777 - accuracy: 0.5656 - val_loss: 0.2645 - val_accuracy: 0.4966
Epoch 23/50
112/112 [==============================] - 84s 753ms/step - loss: 0.2803 - accuracy: 0.5626 - val_loss: 0.2628 - val_accuracy: 0.4933
Epoch 24/50
112/112 [==============================] - 85s 756ms/step - loss: 0.2833 - accuracy: 0.5682 - val_loss: 0.2639 - val_accuracy: 0.4765
Epoch 25/50
112/112 [==============================] - 131s 1s/step - loss: 0.2766 - accuracy: 0.5761 - val_loss: 0.2670 - val_accuracy: 0.4765
Epoch 26/50
112/112 [==============================] - 144s 1s/step - loss: 0.2718 - accuracy: 0.5723 - val_loss: 0.2721 - val_accuracy: 0.4497
Epoch 27/50
112/112 [==============================] - 83s 743ms/step - loss: 0.2710 - accuracy: 0.5744 - val_loss: 0.2680 - val_accuracy: 0.4933
Epoch 28/50
112/112 [==============================] - 84s 751ms/step - loss: 0.2600 - accuracy: 0.5784 - val_loss: 0.2657 - val_accuracy: 0.4765
Epoch 29/50
112/112 [==============================] - 88s 785ms/step - loss: 0.2684 - accuracy: 0.5719 - val_loss: 0.2700 - val_accuracy: 0.4631
Epoch 30/50
112/112 [==============================] - 85s 757ms/step - loss: 0.2692 - accuracy: 0.5732 - val_loss: 0.2725 - val_accuracy: 0.4463
Epoch 31/50
112/112 [==============================] - 83s 738ms/step - loss: 0.2656 - accuracy: 0.5722 - val_loss: 0.2713 - val_accuracy: 0.4597
Epoch 32/50
112/112 [==============================] - 84s 754ms/step - loss: 0.2585 - accuracy: 0.5821 - val_loss: 0.2658 - val_accuracy: 0.5034
Epoch 33/50
112/112 [==============================] - 83s 740ms/step - loss: 0.2534 - accuracy: 0.5880 - val_loss: 0.2683 - val_accuracy: 0.4530
Epoch 34/50
112/112 [==============================] - 82s 732ms/step - loss: 0.2588 - accuracy: 0.5652 - val_loss: 0.2656 - val_accuracy: 0.4866
Epoch 35/50
112/112 [==============================] - 84s 746ms/step - loss: 0.2566 - accuracy: 0.5869 - val_loss: 0.2641 - val_accuracy: 0.4899
Epoch 36/50
112/112 [==============================] - 85s 758ms/step - loss: 0.2571 - accuracy: 0.5827 - val_loss: 0.2660 - val_accuracy: 0.4765
Epoch 37/50
112/112 [==============================] - 82s 728ms/step - loss: 0.2537 - accuracy: 0.5811 - val_loss: 0.2639 - val_accuracy: 0.4832
Epoch 38/50
112/112 [==============================] - 83s 744ms/step - loss: 0.2523 - accuracy: 0.5803 - val_loss: 0.2638 - val_accuracy: 0.4765
Epoch 39/50
112/112 [==============================] - 84s 753ms/step - loss: 0.2529 - accuracy: 0.5913 - val_loss: 0.2631 - val_accuracy: 0.4631
Epoch 40/50
112/112 [==============================] - 86s 771ms/step - loss: 0.2499 - accuracy: 0.5897 - val_loss: 0.2611 - val_accuracy: 0.4866
Epoch 41/50
112/112 [==============================] - 82s 728ms/step - loss: 0.2540 - accuracy: 0.5844 - val_loss: 0.2616 - val_accuracy: 0.4832
Epoch 42/50
112/112 [==============================] - 81s 725ms/step - loss: 0.2515 - accuracy: 0.5746 - val_loss: 0.2599 - val_accuracy: 0.4799
Epoch 43/50
112/112 [==============================] - 81s 727ms/step - loss: 0.2447 - accuracy: 0.5922 - val_loss: 0.2609 - val_accuracy: 0.4899
Epoch 44/50
112/112 [==============================] - 70s 623ms/step - loss: 0.2499 - accuracy: 0.5827 - val_loss: 0.2595 - val_accuracy: 0.4732
Epoch 45/50
112/112 [==============================] - 51s 456ms/step - loss: 0.2470 - accuracy: 0.5932 - val_loss: 0.2603 - val_accuracy: 0.4799
Epoch 46/50
112/112 [==============================] - 51s 455ms/step - loss: 0.2452 - accuracy: 0.5932 - val_loss: 0.2602 - val_accuracy: 0.4832
Epoch 47/50
112/112 [==============================] - 51s 458ms/step - loss: 0.2466 - accuracy: 0.5968 - val_loss: 0.2591 - val_accuracy: 0.5034
Epoch 48/50
112/112 [==============================] - 52s 468ms/step - loss: 0.2428 - accuracy: 0.5996 - val_loss: 0.2590 - val_accuracy: 0.4966
Epoch 49/50
112/112 [==============================] - 52s 462ms/step - loss: 0.2432 - accuracy: 0.5989 - val_loss: 0.2593 - val_accuracy: 0.4799
Epoch 50/50
112/112 [==============================] - 51s 458ms/step - loss: 0.2402 - accuracy: 0.5971 - val_loss: 0.2603 - val_accuracy: 0.4933
WARNING:absl:Found untraced functions such as lstm_cell_layer_call_fn, lstm_cell_layer_call_and_return_conditional_losses, lstm_cell_1_layer_call_fn, lstm_cell_1_layer_call_and_return_conditional_losses, lstm_cell_2_layer_call_fn while saving (showing 5 of 15). These functions will not be directly callable after loading.
INFO:tensorflow:Assets written to: Microsoft neural network model on <built-in function time>/assets
WARNING:absl:<keras.layers.recurrent.LSTMCell object at 0x7fbb7b5310a0> has the same name 'LSTMCell' as a built-in Keras object. Consider renaming <class 'keras.layers.recurrent.LSTMCell'> to avoid naming conflicts when loading with `tf.keras.models.load_model`. If renaming is not possible, pass the object in the `custom_objects` parameter of the load function.
Looking at the results and predicting the price
The loss of the model is: 0.26. The accuracy of the model is: 0.493
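The probabilities below come from the sigmoid output layer; turning them into up/down trend calls is a matter of thresholding (the 0.5 cut-off is an assumption):

```python
import numpy as np

def to_labels(probs: np.ndarray, threshold: float = 0.5) -> np.ndarray:
    """Map predicted probabilities to binary trend labels."""
    return (probs >= threshold).astype(int)

# Two of the predicted values from the output below
probs = np.array([[0.29366136], [0.54054993]])
print(to_labels(probs).ravel())  # [0 1]
```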
Predicted trend probabilities for the test windows (output truncated):
[[0.29366136]
 [0.34784186]
 [0.42723927]
 [0.3876993 ]
 [0.54054993]
 ...
 [0.6127415 ]
 [0.24802227]]
Comparing the model to a baseline
Slope: 0.21056389835111408
Intercept: 68.56017803005875
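A straight-line baseline of this kind can be fitted with `np.polyfit`; a sketch with synthetic data (the notebook's regressor and inputs are not recoverable — a closing price regressed on a day index is assumed):

```python
import numpy as np

# Synthetic example: fit price = slope * day + intercept
days = np.arange(100, dtype=float)
price = 0.21 * days + 68.56          # exactly linear for illustration
slope, intercept = np.polyfit(days, price, deg=1)
print(slope, intercept)
```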