Executive Summary
This document describes Master Data and Reference Data Management and their position with regard to Hedge Funds. It covers what Master Data and Reference Data are and their goals, as well as their life cycle and trends in their implementation.
What is Master Data Management?
Master data is core data needed to uniquely define objects like parties (customers, vendors, suppliers, trading partners or employees), places (locations or geographies) and things (products, services or accounts). It does not change as frequently as transactional data and is referenced by business processes and other applications. This data is usually located in multiple applications and is often out of synchronization, without a true “golden” source.
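As a hypothetical illustration (all system names and field values below are invented), here is how the same counterparty record can drift out of sync across applications when no golden source exists:

```python
# Hypothetical illustration: the same counterparty stored in three
# applications, each slightly out of sync -- no single "golden" record.
crm_record = {"id": "C-1001", "name": "Acme Capital LLC",  "country": "US"}
oms_record = {"id": "C-1001", "name": "ACME Capital",      "country": "USA"}
accounting = {"id": "C-1001", "name": "Acme Capital, LLC", "country": "US"}

# Without MDM, each system believes its own copy; a simple field-by-field
# comparison surfaces the discrepancies a golden source would resolve.
for field in ("name", "country"):
    values = {crm_record[field], oms_record[field], accounting[field]}
    if len(values) > 1:
        print(f"{field!r} is out of sync across systems: {values}")
```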
Master Data Management vs. Reference Data
Isn’t Master Data the same as Reference Data?
Master Data is the non-transactional data that is meaningful to the business. It is the single source of business data used across all systems. While master data may include reference data, the important difference is that reference data is the basic business data used within a single system.
Master data is key business information such as Products, Customers and Employees. It supports transactional processes as well as analytics and reporting.
Master data is stored and used by various systems within an organization, so there is always the possibility of discrepancies in master data.
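A minimal sketch of the distinction, using invented examples: reference data is a simple lookup set local to one system, while master data is a shared business entity that many systems must agree on.

```python
# Reference data: basic codes local to, say, the settlement system.
CURRENCY_CODES = {"USD", "EUR", "GBP", "JPY"}

# Master data: a customer entity that CRM, order management and
# accounting all need to agree on. Field names are illustrative.
customer = {
    "customer_id": "C-1001",
    "legal_name": "Acme Capital LLC",
    "settlement_currency": "USD",  # validated against the reference data
}

assert customer["settlement_currency"] in CURRENCY_CODES
```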
What are CDI, PIM and Enterprise Data Management?
CDI (Customer Data Integration), PIM (Product Information Management) and Enterprise Data Management are domain-specific forms of MDM. They are used by:
- Investment managers for managing products and customers
- Asset and Wealth managers for managing customers and prices
- Financial services companies for managing risks, compliance and customers
- Hedge Fund managers for managing compliance, risk and prices
- Insurance companies for managing customers and products
- Banks for managing customers, counterparties, securities and brokers
Why do I need Master Data Management?
I already have a data warehouse, why do I need MDM?
If you already have a data warehouse, you are in a very good place to start.
MDM is the tool to ensure that an organization has only one version of the truth. It provides the 360-degree view of the data that is of interest to the business.
Most data warehouse ecosystems have attempted to manage master data within the data warehouse architecture, but they have typically focused on mastering data after transactions occur. This approach does little to improve data quality because data is corrected after the fact. The best way to improve data quality is to move the process upstream of the data warehouse, before transactions are executed.
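A minimal sketch of such an upstream validation gate, assuming illustrative field names and rules, that rejects bad master data before a transaction is accepted:

```python
# Hypothetical pre-transaction validation: fix data at the source
# instead of correcting it in the warehouse after the fact.
REQUIRED_FIELDS = ("customer_id", "legal_name", "country")

def validate_before_trade(record: dict) -> list[str]:
    """Return data-quality errors; an empty list means the record
    may flow into transactional systems."""
    errors = [f"missing {f}" for f in REQUIRED_FIELDS if not record.get(f)]
    if record.get("country") and len(record["country"]) != 2:
        errors.append("country must be an ISO 3166-1 alpha-2 code")
    return errors

errors = validate_before_trade(
    {"customer_id": "C-1001", "legal_name": "Acme Capital LLC", "country": "USA"}
)
if errors:
    print("rejected upstream:", errors)  # never reaches the warehouse
```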
MDM can be used to make your organization smarter and more flexible. By having accurate data for your most important information, you can be sure that your models, projections, and predictions are as accurate as possible. By starting with valid data, you are much more likely to produce results that your organization can depend on.
How is MDM useful for Funds?
Automating Trade Executions: Fast trade execution is important to brokers because it allows them to capitalize on rapidly changing market opportunities.
Supporting Compliance: Compliance officers need to report on existing regulations and want the ability to respond dynamically to new and evolving regulations at the national and international level, such as Basel II. To achieve this, they need desktop access to reconciled and related data within and across the customer, counterparty and financial instrument data domains.
Managing Risk: Having a clear picture of your institution’s holdings is essential for accurately assessing and adjusting risk levels. To achieve this, bankers require desktop access to reconciled and related data within and across the customer, counterparty and financial instrument data domains.
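As an illustration of why reconciled data matters here: once positions reference common counterparty and instrument identifiers, assessing exposure reduces to a simple aggregation. All identifiers and notionals below are invented.

```python
# Hypothetical positions keyed by reconciled counterparty and
# instrument (ISIN) identifiers from the mastered data domains.
positions = [
    {"counterparty": "CP-7", "instrument": "US0378331005", "notional": 5_000_000},
    {"counterparty": "CP-7", "instrument": "US5949181045", "notional": 2_500_000},
    {"counterparty": "CP-9", "instrument": "US0378331005", "notional": 1_000_000},
]

# Aggregate exposure per counterparty -- trivial once identifiers agree.
exposure_by_counterparty: dict[str, int] = {}
for p in positions:
    exposure_by_counterparty[p["counterparty"]] = (
        exposure_by_counterparty.get(p["counterparty"], 0) + p["notional"]
    )
print(exposure_by_counterparty)  # {'CP-7': 7500000, 'CP-9': 1000000}
```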
MDM Quick Implementation Checklist:
- Identify candidates for MDM data:
  - The first and most important step is to identify what master data you would like to manage through MDM.
- Identify the producers and consumers of the MDM data:
  - Next, identify which applications create and modify the data, and who consumes it.
- Define the owners of the MDM data:
  - Here is another critical step: who owns the data?
- Appoint Data Stewards and a Data Governance Council for MDM data:
  - This group must have the knowledge and authority to make decisions on how the master data is maintained, what it contains, how long it is kept, and how changes are authorized and audited. Data Stewards are responsible for resolving conflicts; they create the policies for resolving conflicts when multiple versions or sources of data are available (see the sketch after this checklist).
- Create the MDM Data Model, choose an MDM Tool, Build the Infrastructure, Test the Master Data:
  - Each of these steps requires careful consideration.
- Implement the maintenance process:
  - It must incorporate the tools, processes, and people needed to maintain the quality of the data.
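A minimal sketch of a survivorship policy a Data Governance Council might define, assuming an invented source-priority ordering; it is one possible conflict-resolution approach, not a prescribed one:

```python
# Steward-defined survivorship: when multiple sources supply the same
# field, the highest-priority source wins. Source names and priorities
# are assumptions for illustration only.
SOURCE_PRIORITY = {"bloomberg": 1, "crm": 2, "legacy": 3}  # 1 = most trusted

def build_golden_record(candidates: dict[str, dict]) -> dict:
    """candidates maps source name -> record; returns the merged record."""
    golden: dict = {}
    for source in sorted(candidates, key=SOURCE_PRIORITY.get):
        for field, value in candidates[source].items():
            golden.setdefault(field, value)  # keep the first (best) value
    return golden

golden = build_golden_record({
    "legacy":    {"name": "ACME Cap.",        "country": "USA"},
    "bloomberg": {"name": "Acme Capital LLC"},
    "crm":       {"name": "Acme Capital",     "country": "US"},
})
print(golden)  # {'name': 'Acme Capital LLC', 'country': 'US'}
```

In practice, stewards may refine such a policy with field-level rules, for example trusting a market data vendor for prices but a registry for legal names.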
Challenges in Implementing MDM
Getting business involvement:
MDM has to be driven by business needs, otherwise it could turn out to be just another database that needs to be synchronized with all other ones, making it more of a liability than an asset.
The difference between MDM success and failure depends greatly on an organization’s ability to determine its own definition of what constitutes a quality, trustworthy piece of data. Most Hedge Funds already have structures such as a Pricing Community, which can work as Data Stewards acting as liaisons between business and IT, facilitating discussions about data and determining MDM requirements.
Needs a big vision but requires baby steps:
Consider the ultimate goal, but limit the scope of the initial deployment. Once master data management is working in one place, it can be extended to other domains. An important differentiator is that each MDM implementation should include domains that work together, such as Customers and Products.
What is Reference Data?
Reference data is all relevant information pertaining to an instrument that is required to support trading, settlement, accounting, performance, recordkeeping, risk management and regulatory reporting.
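A hypothetical security master record illustrating these attributes; the field names below are for illustration only, not a standard schema:

```python
from dataclasses import dataclass

@dataclass
class InstrumentReference:
    """Illustrative instrument attributes that trading, settlement,
    accounting and risk all draw on."""
    isin: str              # primary identifier
    cusip: str             # alternate identifier for US securities
    description: str
    asset_class: str       # e.g. "equity", "corporate_bond"
    currency: str
    country_of_issue: str
    settlement_cycle: str  # e.g. "T+1"

aapl = InstrumentReference(
    isin="US0378331005", cusip="037833100",
    description="Apple Inc. Common Stock",
    asset_class="equity", currency="USD",
    country_of_issue="US", settlement_cycle="T+1",
)
```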
Goals of a Data Management Strategy
The goal is to develop the operational platform to support a world-class Hedge Fund, minimize operational inefficiencies, errors and costs, and reduce legal, regulatory and operational risk.
Reference Data Life-Cycle
- Acquire:
  - Source data from various internal and external sources
  - Extract, transform and load
- Cleanse:
  - Business rules
  - Golden Copy, if centralized design
  - Consolidations and validations
- Maintain:
  - Data setups (e.g. security, issuer, etc.)
  - Maintain data quality
  - Review governance and other data policies
- Distribute:
  - SLAs
  - Publish data
  - Workflow and rules
  - Real-time delivery to key systems (see the pipeline sketch below)
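A minimal end-to-end sketch of this life-cycle, with placeholder feeds, rules and consumers rather than real vendor interfaces:

```python
# Acquire -> Cleanse -> Consolidate (golden copy) -> Distribute.
# All feeds, rules and consumers here are placeholders.
def acquire(feeds: list[list[dict]]) -> list[dict]:
    """ETL step: pull records from each source feed into one staging list."""
    return [record for feed in feeds for record in feed]

def cleanse(records: list[dict]) -> list[dict]:
    """Apply business rules; drop records that fail validation."""
    return [r for r in records if r.get("isin") and r.get("currency")]

def consolidate(records: list[dict]) -> dict[str, dict]:
    """Build the golden copy, keyed by ISIN (first valid record wins)."""
    golden: dict[str, dict] = {}
    for r in records:
        golden.setdefault(r["isin"], r)
    return golden

def distribute(golden: dict[str, dict], consumers: list) -> None:
    """Publish the approved data set to each consuming system."""
    for consumer in consumers:
        consumer(golden)

feeds = [[{"isin": "US0378331005", "currency": "USD"}],
         [{"isin": "US0378331005"}]]  # second record fails cleansing
distribute(consolidate(cleanse(acquire(feeds))), [print])
```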
Trends in Reference Data Implementations
- Option 1: Buy/build solution with 3rd party feed handling and transformation, and custom consolidation, maintenance and distribution using a common API
- Option 2: Adopt a 3rd party product to handle all functions up to distribution, with a custom translation to a common API
- Option 3: Adopt a 3rd party product, which includes all functions including distribution
- Option 4: Outsource some/all functions (e.g. allow 3rd party to conduct multiple-source consolidation)
Reference Data Common Trends
Common architectural trends include:
- A drive towards multi-asset trading platforms with consolidated infrastructure
- Real-time data distribution mechanisms for trading, risk management and compliance
- Automatic sourcing of public data from multiple sources
- Cleansing and validation of data through workflow-driven business rules
- Decoupling of consuming systems from the approved data source (see the sketch below)
- Elimination of duplicated process or infrastructure
- Migration over time to a service-oriented architecture
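A minimal sketch of the decoupling trend: consuming systems program against a small service interface rather than a vendor feed or database schema. All names below are illustrative.

```python
from typing import Protocol

class ReferenceDataService(Protocol):
    """The only contract consumers depend on."""
    def get_instrument(self, isin: str) -> dict: ...

class GoldenCopyService:
    """One possible implementation, backed by the centralized golden copy."""
    def __init__(self, golden: dict[str, dict]):
        self._golden = golden
    def get_instrument(self, isin: str) -> dict:
        return self._golden[isin]

def price_position(svc: ReferenceDataService, isin: str) -> None:
    # The consuming system never sees where the data came from.
    instrument = svc.get_instrument(isin)
    print("pricing", instrument["description"], "in", instrument["currency"])

svc = GoldenCopyService(
    {"US0378331005": {"description": "Apple Inc.", "currency": "USD"}}
)
price_position(svc, "US0378331005")
```

The design point is that the approved data source can be replaced, outsourced, or re-platformed without touching the consuming systems.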
Common organizational and operational trends include:
- Centralization of data service provisioning and governance
- Creation of a single point of compliance and audit for enterprise data
- Measurement of data quality for the initial load and for ongoing data maintenance
- Establishment of data stewardship roles across the business, operational and technology organizations
- Reference data outsourcing
Reference Data Architecture Design
Centralized architecture design includes business rules, identifier assignment, common data standards, service level agreements and data residence.
Distributed data architecture design includes data capture/maintenance, ownership of data, enrichment (prioritization and consolidation), quality assurance and acceptance.
Centralized Reference Data Value Proposition
Better data quality results from enterprise-wide SMF (Security Master File) definitions that improve interoperability across the portfolio management cycle, an SMF Golden Copy with a centralized validation engine, and a best-of-breed composite record.
The cost of doing business is lower as a result of fewer touch-points, a more efficient maintenance process, a reduction in Bloomberg terminals and a more efficient data acquisition process.
Trade processing and trade execution are more efficient due to less manual review of new security set-ups (improving accuracy) and lower failed-trade rates.
The key to success when implementing an enterprise reference data solution is to balance business needs with operational efficiency.