Where no man has gone before, boldly...

Data: the final frontier. These are the voyages of Enterprise data. Its five-year mission: to explore strange new datasets and vendors; to seek out new insight and new signals; to boldly go where no Market Data manager has gone before!

Cue the music... ah, the good old days. Just like the series, every mission resolved a challenge of some sort, only to be greeted in the next episode by a slightly different twist on the same problem. It's an imperfect metaphor, but working out how an enterprise gains access to data in the most efficient way possible is something of a rite of passage, and one whose joy and heartache some, even today, have yet to experience.

Market Data comes in all shapes and sizes. Realtime/streaming aside, if you're just looking for start/end-of-day or intraday pricing, security master, or regulatory-type data to support your workflow, then you have a number of considerations (there's a sketch of how these might be captured after the list):

  • Mode of Access (Web Services (SOAP/XML), REST, SFTP, Browser, API, etc)

  • Symbology

  • Field Behaviour/Coverage

  • Asset Class Coverage (eg, listed vs OTC)

  • Timeliness/Quality

  • Vendor(s)

  • Formats (CSV, JSON, XML, Proprietary, ...)

  • Abstraction/Normalisation
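
To make those dimensions concrete, here's a minimal sketch of how a vendor feed's profile might be captured for side-by-side comparison. Everything in it (VendorFeedProfile, the example vendors, the field choices) is illustrative rather than any real vendor's API:

```python
from dataclasses import dataclass
from enum import Enum


class AccessMode(Enum):
    REST = "rest"
    SOAP = "soap"
    SFTP = "sftp"
    BROWSER = "browser"
    NATIVE_API = "native_api"


@dataclass
class VendorFeedProfile:
    """Captures the considerations above for one vendor feed, so feeds
    can be compared side by side before committing to an approach."""
    vendor: str
    access_modes: list          # AccessMode members
    symbology: str              # e.g. "RIC", "FIGI", "ISIN"
    formats: list               # e.g. ["CSV", "JSON"]
    asset_classes: list         # e.g. ["listed equity", "OTC derivatives"]
    intraday: bool              # timeliness: intraday vs end-of-day only
    normalised: bool = False    # is the feed mapped to a common model?


# Two hypothetical feeds, side by side:
feeds = [
    VendorFeedProfile("VendorA", [AccessMode.REST], "FIGI", ["JSON"],
                      ["listed equity"], intraday=True),
    VendorFeedProfile("VendorB", [AccessMode.SFTP], "ISIN", ["CSV"],
                      ["OTC derivatives"], intraday=False, normalised=True),
]
```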

Even if you had nominally settled on one vendor, organic growth in your enterprise may have pulled enough levers from the list above to complicate your approach to the point where you've backed yourself into a corner. Where growth has been organic, we'd probably observe a flavour of this:

Spider Pattern

Some of us, probably the more technically tuned, will want to re-architect this and introduce some layers. But in doing that we've fallen into the trap of working on a solution without understanding the problem, or even whether there is a problem. Therein lies the challenge of insight.

Does the above represent good value to the business? Is this the optimum model? Could it be more efficient? Are there choices?

I've worked on a few engagements in the past year helping organisations prepare for change. One organisation in particular thought briefly about buy vs build and chose the build route. They recognised that the answer to their access chaos was to consolidate it in one place by building an enterprise data store. At the time, the (over-simplified) state of play was:

Partial Pattern

Applications were slowly novated over to a single API into the Data Store, such that any intraday requests, where applicable (eg, Security Master), could be satisfied efficiently from the cache/store, with anything else routed out to vendors. The data model was largely built around one key vendor and its behaviour; sadly, the other vendors have their own methods of madness when representing data, with symbology nomenclature and fields likely among them. Nonetheless, the architecture above settled on a single mode of access, though applications still had to be vendor-cognisant. As applications migrated from direct access to Data Store access, it became apparent that what was missing was any form of metering, whether before, during, or after. Technically, though, it felt good, even if the benefit couldn't be measured.
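
That cache-or-route behaviour is simple enough to sketch. A minimal, purely illustrative Python version might look like this (vendor_client and its fetch method are assumptions, standing in for whatever vendor adapter sits behind the store):

```python
import time


class DataStoreRouter:
    """Sketch of the cache-or-route pattern: intraday-stable datasets
    (e.g. security master) are served from the store; everything else
    is routed out to the vendor. All names here are illustrative."""

    CACHEABLE = {"security_master", "reference_data"}

    def __init__(self, vendor_client, ttl_seconds=3600):
        self.vendor = vendor_client   # assumed adapter with a fetch() method
        self.ttl = ttl_seconds
        self.store = {}               # (dataset, key) -> (timestamp, payload)

    def get(self, dataset, key):
        now = time.time()
        cached = self.store.get((dataset, key))
        if dataset in self.CACHEABLE and cached and now - cached[0] < self.ttl:
            return cached[1]                       # satisfied from the store
        payload = self.vendor.fetch(dataset, key)  # routed out to the vendor
        self.store[(dataset, key)] = (now, payload)
        return payload
```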

Ultimately, most strive for this nirvana:

Enterprise Data Management

an agnostic, multiplexed layer between a myriad of applications and vendors. On the buy approach, a variety of vendors play in this space, with solutions to suit all wallet sizes. If you've opted for build, make sure that metering is in place, and that you provide the levers for the control aspect.
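
What "metering in place" might mean, at its crudest: count every request by application and dataset so consumption can be reported and capped. A hypothetical sketch, wrapping the router above; a real implementation would persist the counters and reconcile them with the vendor's billing cycle:

```python
from collections import Counter


class MeteredRouter:
    """Wraps a router and counts every request by (application, dataset),
    giving you both a usage report and a control lever (a cap).
    Purely illustrative; real metering would persist these counters."""

    def __init__(self, router, monthly_cap=None):
        self.router = router
        self.cap = monthly_cap
        self.usage = Counter()

    def get(self, app, dataset, key):
        bucket = (app, dataset)
        if self.cap is not None and self.usage[bucket] >= self.cap:
            raise RuntimeError(f"{app} has exceeded its cap for {dataset}")
        self.usage[bucket] += 1                 # the metering itself
        return self.router.get(dataset, key)

    def report(self):
        # usage by application/dataset, to reconcile against the invoice
        return dict(self.usage)
```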

However, if you're contemplating starting out, or taking stock of your journey, ensure that you have some insight before, during, and after any change. That insight mostly comes in the form of usage reports, which are largely vendor-specific but can provide a plethora of information to help profile your behaviour.

How efficient is your caching? How consistent are you with symbology? How repetitive are you with requests?

Usage data is largely available on a monthly basis, to coincide with billing/invoicing cycles, and for anyone but the smallest consumers can run to a million records and upwards. If you've tried loading these into Excel, it's deeply frustrating: with Excel's limit of roughly a million rows, you're unlikely to make progress. Power BI, however, is great for ingesting the data, and most Microsoft 365 clients will have licences (the personal edition is free); it provides all sorts of mechanisms for slicing and dicing. There are some open-source tools as well, although with those you're largely left to your own devices. There are vendors who provide capabilities (services) that will ingest usage reports, link them to rate cards, and produce cost analyses, what-ifs, and so on; and finally there are services-based companies who will work alongside your teams to help provide the business analysis.
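
If you'd rather script it than dashboard it, a few lines of pandas can take a first pass at the three questions above. The column names here are assumptions (usage report layouts are vendor-specific), so adjust them to your actual report:

```python
import pandas as pd

# Hypothetical column names; real usage reports are vendor-specific.
usage = pd.read_csv("usage_report.csv",
                    usecols=["date", "application", "identifier", "field"])

# How repetitive are requests? Identical (identifier, field) pairs
# requested many times in a month point at a caching opportunity.
repeats = (usage.groupby(["identifier", "field"]).size()
                .sort_values(ascending=False))
print(repeats.head(20))

# How consistent is your symbology? A crude classification of
# identifiers by shape shows how many schemes are in play.
def scheme(ident: str) -> str:
    if len(ident) == 12 and ident[:2].isalpha():
        return "ISIN-like"
    if "." in ident:
        return "RIC-like"
    return "other"

print(usage["identifier"].map(scheme).value_counts())

# Which applications drive the volume?
print(usage["application"].value_counts().head(10))
```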

Dashboard

However you gain your insight, ensure that you understand and can measure 1) your current implementation and 2) the benefits of a new architecture/approach. Then check whether the difference between the two makes sense. Avoid looking for solutions, only to then have to go looking for the problem to solve.

At Sherpa Consultancy LTD we're experienced in the Bloomberg domain, so we can offer expertise across Data License or B-PIPE usage. If you're struggling with some of the report artefacts, we're happy to chat and talk you through the process.

We'd love to hear about your journey... what were the biggest obstacles you had to overcome? Did you buy or build? How many vendors do you use? Did you never leave the spider's web pattern because it works just great?

Finally, the greatest Star Trek film ever was Wrath of Khan. Are you an old- or new-generation Trekkie? And what's with this Star Wars thing?
