West Highland Support Services
https://www.westhighland.net

Is your capacity monitoring taking you down the wrong path?
Tue, 19 May 2020
https://www.westhighland.net/2020/05/19/is-your-capacity-monitoring-taking-you-down-the-wrong-path/

As firms engage in initiatives to drive down cost across all business units, senior management is once again targeting additional savings through infrastructure reductions. This directive has pushed many firms to shrink their infrastructure based on “benchmarking and monitoring” performed with rudimentary, inappropriately configured monitoring tools.

Problem Statement

Most monitoring utilities can only show “high level” metrics of a market data infrastructure, yet they are marketed as “detailed” visualization tools. Decisions based on this type of data rest on a false and dangerous perception that spare capacity exists and infrastructure can be reduced. Without a complete, holistic, detailed view of your market data infrastructure and an accurate view of your capacity, you are headed for catastrophic failures that take your firm out of the market.

Example of Inadequate Monitoring Tools

Network Bandwidth Example

(All chart samples below were taken over the same time period.)

Network bandwidth measured at 10-second intervals looks minimal; the peaks identified in the chart below reach only 148 Mb per second. A chart like this could suggest that reducing infrastructure capacity is an appropriate way to cut cost: lower-class hardware, a smaller hardware footprint, or virtualization.

[Chart: network bandwidth sampled at 10-second intervals]

Network bandwidth measured at 1-second intervals still looks modest. Spikes reach 710 Mb per second, which on their own would not justify upgrading server hardware, moving network hardware and interfaces to 10G, or replacing switches. The inadequate collection shown below still implies a relatively “safe” environment.

[Chart: network bandwidth sampled at 1-second intervals]

When market data network bandwidth is properly monitored at 100 ms intervals, accurate information finally emerges: a significant amount of bandwidth is in use, enough to impact an incorrectly sized or configured production environment. Spikes reach 2.2 Gb per second, well within the range of concern for a strained infrastructure. Inferior or inappropriate monitoring tools never surface this level of detail, yet it is exactly the detail that warrants an upgrade to a 10G infrastructure.
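
To make the sampling effect concrete, here is a minimal, self-contained sketch using synthetic traffic numbers chosen to echo the charts above. It shows how microbursts that a 100 ms poller catches are averaged away at 1-second and 10-second collection intervals.

```python
# Synthetic illustration only: coarser polling intervals average away
# the microbursts that 100 ms sampling reveals.
import numpy as np

rng = np.random.default_rng(seed=7)

# 60 seconds of traffic sampled every 100 ms: a quiet ~100 Mb/s
# baseline with a handful of short microbursts near 2.2 Gb/s.
samples = rng.normal(100.0, 20.0, 600).clip(min=0.0)     # Mb/s
bursts = rng.choice(600, size=6, replace=False)
samples[bursts] = rng.uniform(1800.0, 2200.0, size=6)    # Mb/s

def downsampled_peak(series, factor):
    """Peak after averaging consecutive samples, as a coarse poller does."""
    usable = series[: len(series) // factor * factor]
    return usable.reshape(-1, factor).mean(axis=1).max()

print(f"peak at 100 ms: {samples.max():6.0f} Mb/s")       # near 2.2 Gb/s
print(f"peak at 1 s:    {downsampled_peak(samples, 10):6.0f} Mb/s")
print(f"peak at 10 s:   {downsampled_peak(samples, 100):6.0f} Mb/s")
```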

SOLUTION – How West Highland Addresses this Problem

Keep your infrastructure protected. Deploy not only the best monitoring utilities, ones that can provide sub-second statistics, but also have the data verified by true experts who understand all the moving parts and how they affect each other.

Anything less than sub-second monitoring will keep you blind to what is going on in the market and its impact on your infrastructure. West Highland’s tools and Data Science team can help you deploy the best tools to monitor the critical metrics in your market data infrastructure, and can provide a low-cost, 24-hour enhanced monitoring solution that fits your business.
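
As a rough illustration of what sub-second collection involves, the sketch below polls a Linux interface’s receive-byte counter every 100 ms via /proc/net/dev. The interface name is a placeholder and the timing is approximate; production tools use far more precise collection paths and write to a time-series store rather than printing.

```python
# Minimal 100 ms bandwidth sampler for Linux (illustrative only).
import time

def read_rx_bytes(iface="eth0"):
    """Return the cumulative received-byte counter for one interface."""
    with open("/proc/net/dev") as f:
        for line in f:
            if line.strip().startswith(iface + ":"):
                return int(line.split()[1])   # field 1 = rx bytes
    raise ValueError(f"interface {iface!r} not found")

prev = read_rx_bytes()
for _ in range(100):                          # ~10 seconds of samples
    time.sleep(0.1)                           # 100 ms interval (approx.)
    cur = read_rx_bytes()
    mbps = (cur - prev) * 8 / 0.1 / 1e6       # bytes -> megabits/second
    print(f"{mbps:8.1f} Mb/s")
    prev = cur
```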


Please contact sales@westhighland.net to schedule a call with our team to discuss our findings in more detail.

COVID 3:50P
Mon, 27 Apr 2020
https://www.westhighland.net/2020/04/27/covid-350p/

How Trading Floors Being Closed Impacts the Market Close
Background

Prior to COVID-19, trading floors at the exchanges had very effective processes for opening and closing the markets: brokers on the floor understood their firm’s positions and would buy or sell stocks to remove any large risk to the firm. These processes had been in place for years and reduced end-of-day spikes, resulting in a smooth, on-time close.

Problem Statement

Today, exchange trading floors remain closed and unmanned to prevent the spread of COVID-19, and brokers are working remotely. This makes it more difficult for firms to change their positions, buying and selling stocks to protect themselves from losses. In addition, algorithmic trading identifies this as an opportunity and ramps up at light speed, creating significantly more activity in the market. What was typically a process that started at 2 PM with face-to-face transactions and ran until 4 PM now commences electronically at 3:50 PM against the same 4 PM close deadline. The combination of broker and electronic transactions, with the closing process squeezed from 2 hours to 10 minutes, creates large spikes in market trading volumes.

What this means to the financial industry

Our industry relies on fast, consistent delivery of market data from the exchanges to our clients. Firms incur large expenses to make sure they have a robust delivery of this data. The chart (figure 1) below depicts a typical day of trading volume since COVID-19.

Figure 1 – The green line represents bandwidth in GB to the feed; the light purple line represents trading volume.

What is observed in the data collected is that volume (light purple) typically averages around 1-1.5 million shares traded at any given time. The bandwidth to the vendor feed (green) mirrors the behavior of market volume: when there is an uptick in volume there is a matching uptick in bandwidth consumption. From time to time there are spikes in volume related to company and market events. Our clients’ market data infrastructures easily support this type of activity and throughput by design.

With the pandemic compressing a 2-hour closing process into 10 minutes, trading activity and volumes have increased significantly, creating tremendous volatility and sustained load during that window, with downstream effects on market data platforms. The chart (figure 2) below shows the impact on volume at 3:50 PM that has been seen consistently since COVID-19.

Figure 2 – Zoom-in from 3:50 PM EST to the 4:00 PM EST market close. The green line represents bandwidth in GB to the feed; the light purple line represents the volume of shares traded.

What is consistent here is the impact of market volume on the bandwidth consumed by a typical market data backbone. Exactly at 3:50 PM, volume surges from under 1 million shares to 4.8 million (and up to 8 million at the close), roughly 3 to 4 times typical trading volume. Bandwidth consumption surges in step, with 5 to 7 times more bandwidth being consumed. Many environments fail at this point, disconnecting consumers from market data and updates, which negates the firm’s objective of risk mitigation during this 10-minute window.
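
The multiples follow directly from the figures cited above; a quick back-of-the-envelope check (share counts in millions) looks like this:

```python
# Back-of-the-envelope check of the surge multiples described above.
typical_low, typical_high = 1.0, 1.5   # typical volume, millions of shares
surge_350pm = 4.8                      # observed at 3:50 PM
surge_close = 8.0                      # observed at the 4:00 PM close

print(f"3:50 PM: {surge_350pm / typical_high:.1f}x - "
      f"{surge_350pm / typical_low:.1f}x typical volume")   # ~3.2x - 4.8x
print(f"close:   {surge_close / typical_high:.1f}x - "
      f"{surge_close / typical_low:.1f}x typical volume")   # ~5.3x - 8.0x
# Bandwidth over the same window rises 5-7x, per the capture above.
```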

How West Highland helped identify this issue and its impact on our clients and the industry

Leveraging our A.L.I.V.E. (Application Latency Indicator for Vendors and Exchanges) solution and our Data Science practice, we capture our client’s throughput each day. A.L.I.V.E. programmatically overlays these findings with the market volumes we capture in real time.

This information revealed a clear pattern confirming that market conditions at 3:50 PM EST, driven by the new exchange behavior, had a direct impact on the client’s infrastructure and trading environment.
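
A greatly simplified sketch of this kind of overlay follows: align captured client bandwidth with market volume and flag intervals where both spike above their rolling baselines together. The column names, window size, and z-score threshold are illustrative assumptions, not a description of A.L.I.V.E. internals.

```python
# Flag intervals where bandwidth and market volume spike together
# (simplified illustration of overlaying the two captures).
import pandas as pd

def correlated_spikes(df, window=600, z_threshold=3.0):
    """df: DatetimeIndex with per-second 'bandwidth' and 'volume' columns."""
    z = (df - df.rolling(window).mean()) / df.rolling(window).std()
    both = (z["bandwidth"] > z_threshold) & (z["volume"] > z_threshold)
    return df[both]

# Usage (hypothetical capture file): a daily cluster of flagged rows
# starting at 15:50 EST confirms the pattern shown in figure 2.
# spikes = correlated_spikes(
#     pd.read_csv("capture.csv", index_col=0, parse_dates=True))
```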

West Highland has a unique ability to look across all components of the client’s market data delivery system from the time it enters the building to when it reaches its final destination, the applications that drive the business. West Highland has been collecting this data and correlating its impact to client environments for over 20 years.

We identified five areas of impact that could cause an outage, putting the firm at a disadvantage in the market.

These areas were (a minimal check for the first is sketched after the list):
  1. Saturated client circuits
  2. Saturated client / market data network
  3. Resource issues on market data servers
  4. Software delivering or consuming market data
  5. Not having the best tools / monitoring to capture these events
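
As referenced above, here is a minimal sketch of a check for the first area, saturated client circuits: compare sampled throughput against the circuit’s provisioned capacity. The capacity figure and alert threshold are placeholders that would normally come from an inventory system.

```python
# Flag samples approaching or exceeding a circuit's provisioned capacity.
CAPACITY_MBPS = 1000.0     # e.g. a 1G client circuit (placeholder)
ALERT_UTILIZATION = 0.80   # warn at 80% of capacity (placeholder)

def check_saturation(samples_mbps):
    """samples_mbps: per-interval throughput readings for one circuit."""
    for i, mbps in enumerate(samples_mbps):
        util = mbps / CAPACITY_MBPS
        if util >= ALERT_UTILIZATION:
            print(f"sample {i}: {mbps:.0f} Mb/s ({util:.0%}): saturation risk")

check_saturation([120, 450, 940, 2200])   # last two readings would alert
```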

If your market data environment is experiencing these or other issues, we can help. For more information or to schedule a call, please contact us at sales@westhighland.net.

The Relevance of Operational Data
Mon, 27 Apr 2020
https://www.westhighland.net/2020/04/27/the-relevance-of-operational-data/

There is no doubt that financial services companies succeed or fail based on their ability to understand, analyze and react to data. Massive amounts of money are spent creating and managing data delivery infrastructures, and fierce competition exists between firms trying to hire the top quantitative analysts. The Academy Award-nominated film “The Big Short” shows how analyzing CMO and CDO data enabled a handful of analysts to grow rich beyond their wildest dreams by shorting the housing market. It’s no wonder that “data mining” has become a popular buzzword; recent history shows that there is gold to be extracted from this data.

While the importance of data is evident, over the years I have been amazed at how little attention is paid to operational data. Every market data delivery platform requires mining its information for a real-time view of the system’s operational integrity and the quality of the data being delivered. But more often than not this data is stored and ignored, or reviewed only in “post mortem” reports created in response to a system failure. I only wish the “lessons learned” section of these reports would include statements like: “The firm will gain a better understanding of the meaning of its operational data and schedule periodic reviews to ensure system health.”

Delivery of market data in today’s computing environment is a white-knuckle affair, where Federal Reserve announcements and global events cause market data rates to spike and open the door to profit-killing data latency. Yes, firms are capturing operational data, but this data will not become information until it has been reviewed and acted upon. There is no need to wait for a system failure; the data, and the opportunity to avoid such problems, is here right now.

The Cost of “Change”
Mon, 27 Apr 2020
https://www.westhighland.net/2020/04/27/the-cost-of-change/

So you want to reduce your overall market data spend… tired of paying exorbitant fees? You are not alone. There has been large demand from virtually all clients, on both the buy-side and the sell-side, to reduce overall spend on market data. This trend has been cyclical over the years but is currently front and center for all firms and business lines. If done right, you win big. If not… good luck.

The most effective and common way to approach this is to contract an outside party to assess and identify the opportunities where savings can be achieved. Let the buyer beware, folks. Firms may approach you with a “free” service where they are compensated based upon a percentage of savings. NOTHING IS FREE – YOU WILL SPEND MONEY TO SAVE MONEY. In most cases these engagements involve displacing users from one vendor to another, and in many cases the displaced users eventually return to the desktops they previously had. This is typical when trying to move users from Bloomberg terminals to Thomson Reuters Eikon or vice versa.

Another approach is to take a deep look, user by user and application by application, at what data content is used, and then try to substitute a more cost-effective data set or application. This will take a very, very long time (depending upon your size) to establish typical profiles, identify data sets and then prepare a mitigation strategy. It typically gets the most push-back from the end-user / client community.

Finally, we have the “Enterprise Assessment”. This is the most effective and logical approach, provided you have found a qualified professional services firm. It takes a comprehensive look at business lines, data content, applications, platforms and “fit for purpose” technology, and it looks for overlapping data sets within your organization in order to remove unnecessary redundancies. Each approach has its pros and cons, and each takes a different amount of time and resources to complete. Some of the optimization initiatives we have done also evaluated the resources in a firm, assessing the team’s abilities, size, strengths, weaknesses, etc. All will take the cooperation of the end-user / client community, as well as a strict mandate from senior management, in order to be successful.

So you may wonder: how did we get into this mess? There are many reasons, but the one common factor all firms share is time. This challenge did not occur overnight – it evolved – and it is most likely something you inherited. If you are truly serious about doing this, then remember these key points:

  • You must select a good, trusted vendor to assist with the initiative
  • Nothing is “FREE” – low-hanging fruit exercises do not bear fruit
  • It is more than just data (real time and referential) spend that impacts your bottom line
  • Know and manage your inventory daily – you must have an inventory management system
  • Know your contract terms – you can always renegotiate regardless of where you are in your contract as long as you have options / an alternative vendor to go to
  • Know and assess your team
  • Understand your invoices
  • Do not overlook the cost of operational support or infrastructure/network/technology, etc.
  • Put a plan in place so you are not in this situation again
  • Finally, it takes time – the exercise is very much a forensic one

Good luck and remember – stay thirsty, my friends!
