Insights Related To Stock Price Drop To Make Stock Purchase Decisions
An investment advisory firm with AUM of $140 million
Our client, a registered investment advisory firm with $140 million in assets under management, was manually calculating the Stock Price Drop value and percentage for each stock symbol on the fly. For any set of stock symbols, the calculation had to be performed for a given date range and redone whenever the date range or a symbol changed.
There was also no way to assess the stock price drop collectively across symbols over a desired period. Existing tools in the market did not have this computation built in, although it could be derived indirectly through a series of workarounds. The result was lost productivity from repeated calculations and recalculations, and a long delay whenever a large set of stock symbols had to be evaluated across different date frames.
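The case study does not disclose the client's proprietary formula, so as a minimal sketch one can read "Stock Price Drop" as the largest peak-to-trough decline in closing prices over the date range (a maximum-drawdown style measure). The function name and the sample prices below are illustrative:

```python
def price_drop(closes):
    """Return (drop_value, drop_pct) for a sequence of closing prices.

    drop_value is the largest decline from a running peak to a later
    price; drop_pct expresses that decline relative to the peak.
    This is an assumed interpretation, not the client's actual logic.
    """
    peak = closes[0]
    best_value, best_pct = 0.0, 0.0
    for price in closes[1:]:
        if price > peak:
            peak = price  # new high: reset the reference peak
        drop = peak - price
        if drop > best_value:
            best_value = drop
            best_pct = drop / peak * 100
    return best_value, best_pct

# Peak of 110 followed by a trough of 88: a drop of 22 points, 20%.
value, pct = price_drop([100.0, 110.0, 95.0, 102.0, 88.0])
print(value, pct)
```

Done by hand per symbol and per date range, this kind of scan is tedious; automated, it runs in milliseconds even over thousands of symbols, which is what made the sub-2-second turnaround possible.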
The time to analyze the desired stock tickers and their price drop across various date frames came down from 4 hours to under 2 seconds.
By automating the Stock Price Drop calculations, the client could implement custom and proprietary logic used to make key purchasing decisions.
Menerva developed a Stock Price Drop Monitoring Analytics Portal that implemented:
A web-based portal deployed on AWS that let the firm’s users configure the symbols they were interested in, along with notes and warning thresholds.
The backend engine performed an automated daily download of stock ticker data from Quandl for all configured symbols.
The firm’s users could then view the Stock Price Drop values and associated statistics within seconds to inform their stock purchase decisions.
At a later stage, the same functionality was extended from stocks to ETFs and mutual funds.
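The warning thresholds mentioned above could be applied as a simple screen over the daily results. The sketch below is hypothetical: the symbols, field names, and threshold values are illustrative, not the client's actual configuration:

```python
def symbols_breaching(watchlist, drop_pcts):
    """Return symbols whose observed drop percentage meets or exceeds
    the user-configured warning threshold for that symbol."""
    alerts = []
    for symbol, config in watchlist.items():
        pct = drop_pcts.get(symbol)
        if pct is not None and pct >= config["warn_pct"]:
            alerts.append(symbol)
    return sorted(alerts)

# Illustrative watchlist with per-symbol notes and thresholds.
watchlist = {
    "AAPL": {"note": "core holding", "warn_pct": 10.0},
    "MSFT": {"note": "watch only", "warn_pct": 5.0},
}
drop_pcts = {"AAPL": 8.2, "MSFT": 6.5}  # today's computed drops
print(symbols_breaching(watchlist, drop_pcts))  # ['MSFT']
```

Running a check like this after each daily data download is one straightforward way to surface symbols that have crossed their configured warning level.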
Technically, the web-based portal was built with AngularJS and Node.js, with the D3.js data visualization library used to present the relevant insights. The backend was implemented in Python, using its data processing libraries for the functional logic and Apache Spark to scale the data engineering pipelines.
From 4 hours to under 2 seconds