Three considerations when reviewing your data operating model
All financial services professionals know that simply having access to market and reference data is not sufficient. The data must be sourced, cleansed and enriched through a process that is efficient, flexible and reliable. That is the foundation; the differentiator, the mark of success, is knowing how to integrate this data efficiently into the wider business workflow, ensuring both resilience and agility.
As a result of volatile markets, buy-side firms are now seeing increasing outflows. Many industry professionals have never experienced a true bear market and are now looking for ways to optimize their data operations and the associated costs.
It is now more important than ever for buy-side firms to adapt to changing circumstances and, in doing so, maintain customer confidence during challenging times. Key to this is the ability to draw insights from multiple channels, giving buy-side investors opportunities for better decision making.
If these observations ring true, then it is time to rethink your current data operating model.
Things to consider when reviewing your data operating model
1. Do data quality and agility affect your ability to attract and retain customers?
Highly distributed, non-integrated data sources produce data defects that must be corrected multiple times in multiple systems. Because each correction is not synchronized across the wider organization, further inaccuracies surface.
To address this issue, the operating model needs to change. There is a range of approaches, but the most effective is to create a single reference instance of critical data artefacts, so that each correction is made once and consumed everywhere.
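To make the idea concrete: a single reference instance (often called a "golden copy" in data management) holds one master record per instrument, and a correction applied there is immediately visible to every consumer. The sketch below is purely illustrative; the field names, the source-priority rule and the class itself are hypothetical assumptions, not any vendor's API.

```python
from dataclasses import dataclass

# Hypothetical trust ranking of vendor feeds (lower = more trusted).
SOURCE_PRIORITY = {"vendor_a": 1, "vendor_b": 2}

@dataclass
class SecurityRecord:
    isin: str
    name: str
    currency: str
    source: str

class GoldenCopy:
    """A single reference instance of security master data."""

    def __init__(self) -> None:
        self._master: dict[str, SecurityRecord] = {}

    def ingest(self, record: SecurityRecord) -> None:
        # Keep only the record from the most trusted source per ISIN,
        # instead of letting each system hold its own variant.
        current = self._master.get(record.isin)
        if current is None or (
            SOURCE_PRIORITY[record.source] < SOURCE_PRIORITY[current.source]
        ):
            self._master[record.isin] = record

    def correct(self, isin: str, **fields) -> None:
        # A defect is fixed once, here, rather than patched separately
        # in every downstream system.
        record = self._master[isin]
        for key, value in fields.items():
            setattr(record, key, value)

    def get(self, isin: str) -> SecurityRecord:
        return self._master[isin]
```

The point of the design is that downstream systems read from `get()` rather than holding copies, so a correction never has to be re-applied or re-synchronized.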
In times of high pressure, organizations are asking themselves whether this is a process they want to run in-house, or an opportunity to gain operational efficiencies by engaging specialist third-party organizations to perform these functions.
2. Will you struggle to scale your team to respond to these challenges?
Operational teams are inherently resourced to support “normal” workloads. However, business does not remain static: teams have to contend with dynamic workloads while managing periodic reporting requirements, vendor-enforced changes and ever-changing business strategy demands driven by market pressures. This makes it almost impossible to retain the optimal resources and associated skillsets to meet business demands.
Managed service providers can offer flexible resource models to scale up when required and enable access to a depth of expertise that is not easily available in the market. This approach offers a level of operational agility that cannot be realized in traditional data management operating models.
3. Do you struggle to optimize your usage of commercial data?
To fully optimize usage of market and reference data, the four dimensions of people, process, systems and data all need to be considered as part of a target operating model.
Historically, legacy data management platforms have been provided as toolkits, requiring investment over the course of their lifecycle. However, the people required to operate these platforms are scarce and hard to retain.
Effective data governance must remain part of the core systems and evolve as business requirements change; this is extremely challenging to achieve without a process-driven operating model.
An optimal solution provides enough transparency that you understand how each data service is used and can operate data at differing quality levels. If a business unit requires only raw vendor data, this can be provided alongside high-quality, fully managed data, depending on the business use case. This capability is critical for any organization looking to optimize its consumption of commercial data services.
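One way to picture serving the same data at differing quality levels is a service that routes each request to a tier and records who consumed what, giving the transparency described above. This is a minimal sketch under assumed names; the tier labels, feed shapes and usage log are hypothetical, not a description of any real product.

```python
from collections import Counter

RAW, MANAGED = "raw", "managed"  # hypothetical quality tiers

class DataService:
    """Serve the same instrument at different quality tiers."""

    def __init__(self, raw_feed: dict, managed_feed: dict) -> None:
        # raw_feed: vendor data as delivered; managed_feed: the same
        # data after cleansing and enrichment.
        self._feeds = {RAW: raw_feed, MANAGED: managed_feed}
        # Usage log per (business unit, tier): the transparency that
        # lets you see which units need which quality level.
        self.usage: Counter = Counter()

    def get_price(self, isin: str, tier: str, business_unit: str) -> float:
        self.usage[(business_unit, tier)] += 1
        return self._feeds[tier][isin]
```

A business unit that only needs raw vendor data calls the raw tier and never pays the cost of the fully managed one, while the usage log shows exactly where each tier is being consumed.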
If this resonates with you, then you could benefit from an in-depth assessment of your current data operating model. To start that journey, you can reach out to SimCorp and ask for a bespoke assessment of your existing data model to identify what’s working and what can be improved.
1 December 2022