There is no doubt that financial service companies succeed or fail based on their ability to understand, analyze, and react to data. Massive amounts of money are spent creating and managing data delivery infrastructures, and fierce competition exists between firms trying to hire the top quantitative analysts. This year's Academy Award-nominated film, "The Big Short," shows how analyzing CMO and CDO data enabled a handful of analysts to grow rich beyond their wildest dreams by shorting the housing market. It's no wonder that "data mining" has become a popular buzzword; recent history shows that there is gold to be extracted from this data.
While the importance of data is evident, over the years I have been amazed at how little attention is paid to operational data. Every market data delivery platform generates information that provides a real-time view of the system's operational integrity and the quality of the data being delivered. But more often than not, this data is stored and ignored, or reviewed only in post-mortem reports created in response to a system failure. I only wish the "lessons learned" section of these reports would include statements like: "The firm will gain a better understanding of the meaning of its operational data and schedule periodic reviews to ensure system health."
Delivery of market data in today's computing environment is a white-knuckle affair, where Federal Reserve announcements and global events cause market data volumes to spike and open the door to profit-killing data latency. Yes, firms are capturing operational data, but this data will not become information until it has been reviewed and acted upon. There is no need to wait for a system failure; the data, and the opportunity to avoid such problems, is here right now.
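To make the point concrete, reviewing operational data can be as simple as watching feed latency against its own recent baseline and alerting when it spikes. The sketch below is a hypothetical illustration, not the method of any particular platform; the window size and three-sigma threshold are assumptions chosen for the example.

```python
from collections import deque
from statistics import mean, stdev


class LatencyMonitor:
    """Rolling-window latency monitor (illustrative sketch).

    Flags a sample as a spike when it exceeds the recent mean by
    more than `sigma` standard deviations. The window and sigma
    values here are assumptions for the example, not settings from
    any real market data platform.
    """

    def __init__(self, window=60, sigma=3.0):
        self.samples = deque(maxlen=window)  # recent latency samples (ms)
        self.sigma = sigma

    def record(self, latency_ms):
        """Record one latency sample; return True if it is a spike."""
        spike = False
        if len(self.samples) >= 2:  # need a baseline before judging
            mu = mean(self.samples)
            sd = stdev(self.samples)
            if sd > 0 and latency_ms > mu + self.sigma * sd:
                spike = True
        self.samples.append(latency_ms)
        return spike


# Quiet period: latency hovers around 5 ms, no alerts fire.
monitor = LatencyMonitor(window=10, sigma=3.0)
baseline = [5.0, 5.2, 4.9, 5.1, 5.0, 4.8, 5.3, 5.1]
alerts = [monitor.record(x) for x in baseline]

# A burst (e.g., during a Fed announcement) pushes latency to 50 ms.
spike = monitor.record(50.0)
```

The design choice matters: comparing each sample to a rolling baseline, rather than a fixed limit, lets the same alert logic work across quiet sessions and busy ones, turning stored operational data into a live signal rather than post-mortem evidence.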