Understanding LSTMs – Part 7: LSTM in Action with Real Data
Source: Dev.to
Introduction
In the previous article, we completed all three stages of the LSTM: the Forget Gate, Input Gate, and Output Gate.
Now, let us use the LSTM with real data.
Data Overview
- Companies
- Company A
- Company B
- Axes
- Y‑axis: stock value
- X‑axis: day the value was recorded
When the data for both companies are overlapped, the only differences occur on Day 1 and Day 5; the values for Days 2 through 4 are identical. This means the LSTM must carry the Day 1 information all the way through the sequence to predict Day 5 correctly.
LSTM Processing for Company A
We will sequentially pass the data from Day 1 through Day 4 into an unrolled LSTM and see whether it can correctly predict the value for Day 5.
- Initialize the long‑term and short‑term memories to zero.
- Refer to the simplified LSTM diagram periodically to keep a clear overview of the process.
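The three gates described in the previous article can be sketched as a single step function. The weights below are hypothetical placeholders chosen for illustration; they are not the trained values behind the article's diagrams, so the numbers they produce will only roughly resemble the ones shown here.

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

# Hypothetical gate weights, for illustration only -- NOT the
# trained values used in the article's diagrams.
W = dict(wf_x=1.6, wf_h=2.7, bf=1.6,   # forget gate
         wi_x=1.6, wi_h=2.0, bi=0.6,   # input gate
         wg_x=0.9, wg_h=1.4, bg=-0.3,  # candidate memory
         wo_x=4.4, wo_h=-0.2, bo=0.6)  # output gate

def lstm_step(x, h_prev, c_prev, w=W):
    """One unrolled LSTM step: update long-term (c) and short-term (h) memory."""
    f = sigmoid(w["wf_x"] * x + w["wf_h"] * h_prev + w["bf"])  # forget gate: % of long-term memory to keep
    i = sigmoid(w["wi_x"] * x + w["wi_h"] * h_prev + w["bi"])  # input gate: % of candidate to add
    g = np.tanh(w["wg_x"] * x + w["wg_h"] * h_prev + w["bg"])  # candidate long-term memory
    c = f * c_prev + i * g                                     # new long-term memory
    o = sigmoid(w["wo_x"] * x + w["wo_h"] * h_prev + w["bo"])  # output gate: % of memory to reveal
    h = o * np.tanh(c)                                         # new short-term memory (the output)
    return h, c

# Day 1 for Company A: input 0, both memories initialized to 0
h1, c1 = lstm_step(0.0, 0.0, 0.0)
```

Each day's output memories are simply fed back in as the next day's starting memories.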
Day 1
- Input value: 0
- After the calculations:
  - New long-term memory: C_1 = -0.20
  - New short-term memory: h_1 = -0.13
Day 2, Day 3, Day 4
- Plug in the value for each day and repeat the same process.
After processing Day 4, the final short-term memory is h_4 = 0. Since the short-term memory is the LSTM's output, the unrolled LSTM correctly predicts Company A's value for Day 5, which is 0.
LSTM Processing for Company B
We repeat the exact same process using Company B’s values.
After sequentially passing Days 1 through 4 into the LSTM, the final short-term memory is h_4 = 1, which correctly predicts Company B's value for Day 5, which is 1.
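Unrolling the same cell over both sequences shows why the long-term memory matters: the inputs are identical after Day 1, yet the Day 1 difference survives to the final output. Both the gate weights and the Day 2–4 input values below are illustrative assumptions, not the article's trained weights or exact data, so the final memories will differ from the clean 0 and 1 shown above.

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

# Hypothetical gate weights (illustration only, not the article's trained values).
W = dict(wf_x=1.6, wf_h=2.7, bf=1.6, wi_x=1.6, wi_h=2.0, bi=0.6,
         wg_x=0.9, wg_h=1.4, bg=-0.3, wo_x=4.4, wo_h=-0.2, bo=0.6)

def lstm_step(x, h, c, w=W):
    f = sigmoid(w["wf_x"] * x + w["wf_h"] * h + w["bf"])   # forget gate
    i = sigmoid(w["wi_x"] * x + w["wi_h"] * h + w["bi"])   # input gate
    g = np.tanh(w["wg_x"] * x + w["wg_h"] * h + w["bg"])   # candidate memory
    o = sigmoid(w["wo_x"] * x + w["wo_h"] * h + w["bo"])   # output gate
    c = f * c + i * g                                      # new long-term memory
    return o * np.tanh(c), c                               # new short-term memory, new long-term memory

def predict_day5(days_1_to_4):
    h, c = 0.0, 0.0                  # both memories initialized to zero
    for x in days_1_to_4:            # unroll the same cell over Days 1-4
        h, c = lstm_step(x, h, c)
    return h                         # final short-term memory = the prediction

# Illustrative stock values that differ only on Day 1
company_a = [0.0, 0.5, 0.25, 1.0]
company_b = [1.0, 0.5, 0.25, 1.0]

h_a = predict_day5(company_a)
h_b = predict_day5(company_b)
```

Even with these made-up weights, the two final short-term memories come out clearly different: the Day 1 value was stored in the long-term memory and carried through three identical inputs.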
Conclusion
This concludes our discussion of LSTMs. In upcoming articles we will move on to Word Embeddings and Word2Vec.