Reconciliations: The forefront of regulatory compliance procedures

Data reconciliations play a fundamental role in the success of any business operation. Whether it be for money management, stock verification or safeguarding data integrity, there is a noticeable focus being placed on the operational, technical and financial aspects of reconciliation services.


Establishing a control framework to protect the integrity of the data used to surveil the financial services market is a top priority for regulatory reporting entities. Demonstrating adequate operational procedures around the management of data reporting, alongside an efficient and robust technical infrastructure, is a key component of proving compliance with regulatory reporting obligations around the world. Accuracy, completeness and timeliness of data submissions allow regulators to monitor the industry effectively for malpractice, and form the basis for investigations, fines and enforcement actions. As regulation continues to evolve, requirements become more complicated, deadlines become tighter and regulators become less tolerant, a spotlight is shone on data quality.

How can firms ensure that quality data is being captured and reported? How can firms reduce the risk of regulatory scrutiny arising from erroneous data? Identifying a framework to safeguard the integrity of data throughout an organisation, and ensuring that the end-to-end workflow of data transformation is validated through standardised reconciliation processes, is an activity high on the agenda within the industry.


‘Adequate systems and controls’ is a term often used in deliberations describing a well-structured regulatory reporting framework. Among the numerous components that exist to define this structure lie data reconciliations, with the assigned responsibility of ensuring the quality, accuracy and completeness of data across an organisation.


Firms within financial services have seen a shift in attitude towards reconciliation procedures over the last few years. The numerous regulatory reporting regimes imposed on the industry have given rise to the current level of observation and scrutiny levied on institutions across the globe. The requirement to report trade information to a competent authority on a daily, quarterly or annual basis is now part of the business-as-usual (BAU) process for many companies, and the ability to demonstrate data integrity across an organisation has become a principal component of most business models. Why? Because the information provided to the regulator can form the underlying evidence needed to find a firm guilty of unethical market practices.

Regulatory reporting obligations are considered an immense driving force in the operation, governance and budget of reconciliation processes within an organisation. Not only do firms have to establish a robust, timely and efficient reporting procedure, they need to encompass a range of governance components that give them greater visibility and transparency over the end-to-end operation.

There is an additional requirement for firms to demonstrate that adequate systems and control mechanisms are in place, in line with any regulatory guidance or mandates that safeguard the integrity of data being submitted to the authorities. The information presented to a regulatory body needs to truthfully reflect a firm’s position in the market, and reconciliation procedures play a significant role in the protection of data integrity across all trade lifecycle stages. This action is now on the agenda of Chief Compliance Officer (CCOs), heads of Information Technology, and Risk and Compliance Officers, as ultimately they are the ones who will be held accountable should an issue arise.

Regulators use the reported data to monitor the market and keep it under surveillance for manipulation, fraud and systemic risk, with the intent of reducing the risk of market corruption or collapse. It is vital that the information being reported is complete, retains its integrity, and provides a true reflection of the firm’s status. T+1 and real-time reporting obligations have allowed regulatory bodies to proactively observe and inspect financial institutions, and in many cases their findings have resulted in enforcement action being taken against a large number of firms.


After the 2008 financial crisis, regulatory bodies did not have a firm grip on how best to keep the industry under surveillance and protect it. Financial regulation did exist, but it was not geared to capture the scale of trading activity and the scope of financial products that were being bought and sold. There was no real way to monitor the risk these securities posed either to the economy or to the customers entering into contracts. The ultimate near demise of the financial system led to a number of initiatives being introduced to prevent subsequent failures occurring, and a more intrusive set of regulatory regimes was established.

Transactions, positions, quotes, portfolios, accounts, customer information and trader details were to be made visible to regulatory bodies around the globe via a number of regulatory reporting rules. The revamp of the reporting framework was designed to give regulators greater transparency of the activities taking place on their markets in order for them to monitor damaging market practices. A penal system was established to discipline any institution or individual found guilty of these activities, and over the years there have been many high-profile cases that have led to fines, business closures and, in some cases, imprisonment. It is not, however, enough simply to submit data to the authorities; it needs to be accurate, of good quality, complete, and submitted in a timely fashion.


Regulation has brought the fear factor to the financial industry, as it is one sure way for regulators to see exactly what a firm is doing on a daily basis, and to use the analysis performed on the collected data as a way into a firm for further scrutiny of its practices.

The implications of a breach in regulatory reporting have changed the mind-set of how firms run their businesses. There is far less tolerance of inaccurate data capture and reporting, and far more consequence for failures in this area. Not only will firms suffer financial damage from a fine, there is also the cost of remediation, and often firms will be put on a set programme or be forced to bring in consultants (of the regulator’s choice). The charges for re-reporting all of the erroneous data, which could go back many years, can run into the millions when you consider the task of collecting the data across a number of years, reviewing and updating the incorrect records, and then re-reporting the information to the competent authority. This is a large internal project that takes a considerable amount of time and resources, distracting from the BAU process. Finally, there is the reputational damage and cost to the business when publicly exposed. Studies1 have shown that the stock price impact on a firm fined for regulatory failures is ten times the value of the imposed fine. It really is not worth cutting corners and falling short on compliance.

In a review of the last few years of enforcement action, the words ‘adequate’, ‘systems’ and ‘controls’ are often listed in the transcripts as a cause for prosecution. In Europe, regulators have not been shy of issuing fines for breaches of transaction-reporting obligations. Under Markets in Financial Instruments Directive (MiFID) transaction reporting, the Financial Conduct Authority (FCA) has collected a total of US$43.4m since 2009, with its last penalty totalling nearly US$23.7m before the 30 per cent discount applied for early settlement. The FCA’s stated rationale for the size of the fine was that ‘past fines have not been high enough to achieve credible deterrence’.2 It is clear that the regulators ‘expect to be heard and understood across the industry’, and the fines mostly come down to breaches of FCA transaction reporting rules and the ‘requirements for firms to have adequate management and controls’.3

A 2013 fine highlighted the FCA’s lack of tolerance for poor regulatory reporting: ‘given the considerable resources available to [the firm], it should have been able to overcome these challenges and ensure adequate systems and controls were in place’. In the USA in 2014, the Securities and Exchange Commission (SEC) announced a total of US$2.4m of charges4 and financial penalties across 33 individuals and companies for violating federal securities laws that require timely reporting of information under Form 4 and Schedules 13D and 13G. A US$2.5m civil monetary penalty was issued to a registered swap dealer for failing to report properly, failing to diligently address and correct the reporting errors, and failing to have an ‘adequate swaps supervisory system governing its swaps reporting requirements’.

In FY2015, the Commodity Futures Trading Commission (CFTC) ramped up its surveillance efforts, taking action against 69 firms for reporting failures. It also issued its first enforcement actions based on the Dodd–Frank Act large trader reporting requirements. Targeting banks and exchanges, the CFTC made it clear that ‘all persons must be held accountable to meeting their regulatory responsibilities’.5 The failure to submit daily large trader reports, and the filing of inaccurate Large Trader Reports (LTRs), saw one firm issued a US$150,000 civil monetary penalty. There was also the cost of undergoing a re-reporting and back-reporting exercise to correct the erroneous and missing data.

In total, over US$6.45m of the US$3.144bn in civil monetary penalties collected by the CFTC in 2015 was due to regulatory reporting failures.


This is an entirely avoidable situation for a firm. Until recently, regulators issued only guidance on acceptable processes and procedures surrounding regulatory reporting obligations. As regulation evolves, these recommendations are turning into mandates and finding their way into written legislation. Regulatory reconciliations have their own spotlight, and at a minimum, competent authorities expect to see a three-way reconciliation between a firm’s internal source data, the data submitted to its chosen regulatory reporting solution, and the data consumed by the regulator.
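At its core, the three-way check is a keyed comparison across the three copies of the data. The following Python sketch is purely illustrative: the trade IDs, field names and record layout are assumptions, not a prescribed format, and a production reconciliation would run against the firm’s actual data model.

```python
# Hypothetical trade records keyed by trade ID; field names are illustrative.
source = {"T1": {"qty": 100, "price": 101.5}, "T2": {"qty": 50, "price": 99.0}}
submitted = {"T1": {"qty": 100, "price": 101.5}, "T3": {"qty": 25, "price": 40.0}}
regulator = {"T1": {"qty": 100, "price": 101.5}}

def three_way_breaks(source, submitted, regulator):
    """Compare the three legs of the reconciliation and list the breaks."""
    breaks = []
    for tid in sorted(set(source) | set(submitted) | set(regulator)):
        legs = {"source": source.get(tid),
                "submitted": submitted.get(tid),
                "regulator": regulator.get(tid)}
        # A trade absent from any leg is a completeness break ...
        missing = [name for name, rec in legs.items() if rec is None]
        if missing:
            breaks.append((tid, "missing in " + ", ".join(missing)))
        # ... a trade present everywhere but with differing fields is a mismatch.
        elif len({tuple(sorted(rec.items())) for rec in legs.values()}) > 1:
            breaks.append((tid, "field mismatch"))
    return breaks
```

Each break carries both the trade ID and the leg(s) at fault, which is the minimum an operations team needs to direct an investigation back to the right system.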

Clearly, there are many other risk points in the workflow of the processed data, including data translation procedures, reference and static data cross-referencing, and enrichment and consolidation processes, all of which increase the probability of data misinterpretation or loss. Ideally, an integrity check should be performed at every phase of data interception. This is not always operationally or financially feasible, but is the cost of implementing it more than the potential fine that could be issued for failings in this area?
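One lightweight way to check integrity at each interception point is to compare a record count and an order-insensitive content hash before and after every handoff. The Python sketch below is an assumption-laden illustration: the record layout and canonicalisation scheme are invented for the example, not taken from any particular platform.

```python
import hashlib

def checkpoint(records):
    """Summarise a batch at one hop: (record count, order-insensitive hash)."""
    digest = 0
    for rec in records:
        # Canonicalise each record, hash it, and XOR the hashes together so
        # the summary does not depend on the order in which records arrive.
        h = hashlib.sha256(repr(sorted(rec.items())).encode()).digest()
        digest ^= int.from_bytes(h[:8], "big")
    return len(records), digest

# Comparing the summary before and after a handoff flags loss or mutation
# without shipping the full data set to the comparison point.
batch = [{"id": "T1", "qty": 100}, {"id": "T2", "qty": 50}]
```

Because only the count and digest travel between hops, this kind of check stays cheap enough to run at every stage, reserving the full field-by-field reconciliation for the points where the summaries disagree.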

Figure 1 Reconciliation challenges, 2016


It is important for firms to run ad hoc reconciliations and to consider the implementation of daily, intraday and sometimes real-time processes. A recent poll6 asked a number of reporting firms what their biggest challenge was for reconciling data in 2016 – the results are shown in Figure 1.

Inefficient processes

Often, legacy systems impede the reconciliation process, as integrating old applications and data sets is not a straightforward exercise. It is often the case that the introduction or overhaul of a reconciliation system comes with a complete infrastructure upgrade across an organisation. Typically, the roadmap for system upgrades would see reconciliation applications phased towards the latter part of the process, but regulation has moved reconciliations towards the forefront of this exercise.

A lack of transparency and knowledge of the flow of data also adds to the complexities of establishing a robust reconciliation process. How reliable can the reconciliation be if there is not a clear understanding of the information being managed? A significant amount of education is often needed to coincide with the management of this process.

As firms desperately try to find a workaround to these issues, the act of reconciliation almost becomes obsolete and loses its true value because the structure of data flow within the organisation is just not as well established as it should be.

Introduction of new regulation

New regulations are going to be an ongoing cycle of events for firms. Organisations often contemplate new business strategies to avoid regulatory reporting regimes, but as reporting becomes more of the norm around the globe, there will be fewer places for firms to hide.

Markets in Financial Instruments Regulation (MiFIR) in the EU explicitly defines reconciliations as a mandatory requirement for remaining compliant with the regulation. Basel Committee on Banking Supervision (BCBS) 239 is based exclusively on a set of principles firms must follow, each dedicated to data quality and to defining the technology requirements for managing risks associated with data integrity. Dual-sided reporting regimes such as the European Market Infrastructure Regulation (EMIR) impose a mandatory pre-reporting reconciliation obligation to ensure that what the reporting firm submits to a trade repository matches the report of its counterparty. With the EMIR inter-Trade Repository (TR) reconciliation process still dealing with a backlog of data, it is even more imperative that the pre-reporting reconciliation task is undertaken.

Due to the nature of regulation, each of these reconciliations will have a different mandate, different data specifications and different reconciliation frequencies, and managing this compliance exercise is operationally time-consuming, technically demanding and, if not performed in an efficient manner, can be extremely expensive. There has been much frustration within the industry regarding the lack of harmonisation among these regulations, but the agendas of politicians and regulators are fundamentally what drive the data requirements. Monitoring systemic risk and identifying market abuse are not overlapping activities, and the complexities and variances in the products now available on the market are the main reason why multiple approaches to regulatory reporting have been implemented. As a consequence, the reconciliation process has become difficult to centralise. The fields reportable under these regulations span a wide range of data sets, and where there is overlap in name there are discrepancies in standards and formats. The requirement to report counterparty details, buyer/seller details, decision-maker and trader information comes in many different forms.

A Legal Entity Identifier (LEI), Society for Worldwide Interbank Financial Telecommunication (SWIFT) Bank Identifier Code (BIC), FCA Reference Number (FRN), national identifier and passport number are among the different formats required to identify the same entity or individual. Trade economics, instrument details and transaction details require, for example, the use of International Securities Identification Numbers (ISINs), Classification of Financial Instruments (CFI) codes, Market Identifier Codes (MICs) and asset class identifiers. Not only do firms need to reconcile the basic trade details, they also have to integrate various reference data feeds and databases (e.g. the Global Legal Entity Identifier Foundation (GLEIF) database, Human Resources (HR) applications, valuation modelling and collateral management applications). This creates an intense web of system integration, and the ability to prove the accuracy of what is being reported becomes a daily challenge.

There is also the challenge of cross-regional operations; for example, a derivatives house may be captured by reporting obligations under EMIR, Hong Kong Monetary Authority (HKMA) and Monetary Authority of Singapore (MAS) rules and Dodd–Frank, while a commodities house may have to report under the Regulation on Wholesale Energy Market Integrity and Transparency (REMIT), EMIR and MiFIR. Often the same information is being reported cross-border, which introduces the further challenge of reporting consistently across multiple jurisdictions. The inevitably iterative nature of regulation should also be taken into account: for example, the upcoming EMIR RTS mandates a further 40 fields in its input specification. Regulations do not evolve in parallel with one another, so firms may find that, even if they are currently in a stable position, another implementation may be due, often at a tangent. They are not yet a requirement, but who can say that cross-border reporting reconciliation requirements are not on the agenda of regulators? Consolidating this effort into a single cost-effective framework seems a near-impossible ask.
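As a concrete illustration of the format differences involved, some of these identifiers carry their own internal consistency checks that a reconciliation layer can exploit: an ISIN, for instance, embeds a check digit, so malformed identifiers can be rejected before any matching is attempted. A minimal sketch of the standard ISIN (ISO 6166) Luhn check, assuming upper-case input:

```python
def isin_check_digit_ok(isin: str) -> bool:
    """Validate the ISO 6166 Luhn check digit on a 12-character ISIN."""
    if len(isin) != 12 or not isin.isalnum() or not isin[-1].isdigit():
        return False
    # Expand letters to two digits (A=10 ... Z=35), then run a Luhn sum
    # over the expanded body, doubling every other digit from the right.
    expanded = "".join(str(int(c, 36)) for c in isin[:11])
    total = 0
    for i, ch in enumerate(reversed(expanded)):
        d = int(ch)
        if i % 2 == 0:
            d = d * 2
            if d > 9:
                d -= 9
        total += d
    return (10 - total % 10) % 10 == int(isin[-1])
```

Equivalent structural checks exist for other identifiers (the LEI, for example, uses an ISO 7064 mod 97-10 checksum), so a pre-matching validation pass can catch a class of data-entry and translation errors cheaply.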

Linked with regulation, there are firms working with a number of reporting models, which adds further complexity to the reconciliation process. Alongside direct reporting, firms can opt into delegated reporting models, third-party reporting via a vendor, or a hybrid model. The obligation for reporting still falls on the reporting firm, but the submitting firm can ease the burden by offering a delegated reporting model. These services do not, however, always completely support the reporting firm in managing compliance. The reporting firm should periodically reconcile the information reported on their behalf against their version of the trade, because they never lose the regulatory obligation and responsibility for their data, regardless of who is reporting it. Setting up the reconciliation process is one task, but getting access to the data can prove to be a difficult operation. A lack of reporting and the inability to provide access to the reported data often leaves reporting firms vulnerable to the risks of inaccurate data being submitted on their behalf with no real way of validating and correcting the information.

Lack of automation

There are still many manual processes being performed by firms as part of BAU. Low trading volumes, a lack of resources or reduced budgets tend to be blamed for the manual intervention in a number of operational workflows. This lack of an automated flow of information makes it very difficult to embed a reconciliation procedure, so many reconciliations are still being carried out today on spreadsheets using macros and Excel programming, which is a huge operational and compliance risk. The increase in regulatory scope makes this a non-scalable operation, and firms are starting to see the problems and costs involved in unravelling this process.

Lack of manpower

Reconciliations are both a compliance and an operations consideration spanning the entire business, and having adequate resources to operate and manage the process is often a losing battle for firms. Historically, reconciliations have been a cost to the business, generating no revenue and therefore receiving little to no investment. Scaling up this operation to cater for a regulatory reporting obligation is helping to justify the investment in resources, but it remains a restrictive component when building a regulatory reporting framework that incorporates the verification and validation of reported data. Often, firms turn to outsourcing or utility models where they can utilise external resources, but this comes with added challenges in terms of understanding the data, knowledge of the regulation and reconciliation requirements, and the ability to adapt in line with regulatory and business updates. Where this is no longer controlled internally, it becomes an additional liability to a firm.

All of these concerns lead to tactical solution builds, with firms effectively plugging a gap to meet a requirement on the surface; but on drilling down to the lower-level detail, you unearth key missing components such as audit, accountability, investigation workflow, management information and reporting. In such situations, information is not easily accessible or readily available to support the reconciliation procedure, there is no proof that a sufficient data validation process exists, and firms ultimately fall short of fully satisfying their regulatory obligations.

The word ‘reconciliation’ can morph into a more all-encompassing statement combining ‘data quality and verification’, which a firm could quite easily lose sight of among the prevailing issues surrounding the process.


Data takes a long journey through an organisation before it resides with a regulator. Typically, it travels from an Order Management System (OMS) or trade capture system to a middleware solution for data normalisation and regulatory filtering, then to an external regulatory reporting platform, before being submitted to a competent authority; a lot of air miles are collected on the way to its final destination. Along this journey there are lifecycle events, exception management procedures and reference data alignments that amend the data to meet the reporting specification. It is easy for misinterpretation and data loss to occur during this process, and an effective reconciliation tool would identify these discrepancies before they become visible to the regulators. Ultimately, a firm needs to ensure the following:

  • What they trade has been accurately reported to the regulator.
  • They are using an effective process to filter non-reportable trades and do not over- or under-report.
  • They have an efficient two-directional flow of data streams, so exceptions managed outside the organisation can be accurately reintegrated back into the internal source systems.
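The first two points amount to a completeness check between what the firm booked and what it reported. A minimal Python sketch, in which the eligibility rule, venue codes and trade IDs are hypothetical stand-ins for a firm’s real reportability logic:

```python
def reporting_gaps(booked, reported_ids, is_reportable):
    """Split completeness breaks into under- and over-reported trade IDs."""
    reportable = {tid for tid, trade in booked.items() if is_reportable(trade)}
    under = sorted(reportable - set(reported_ids))   # in scope, never reported
    over = sorted(set(reported_ids) - reportable)    # reported but out of scope
    return under, over

# Hypothetical eligibility rule: anything not booked as OTC is reportable.
booked = {"T1": {"venue": "XLON"}, "T2": {"venue": "OTC"}, "T3": {"venue": "XPAR"}}
under, over = reporting_gaps(booked, ["T1", "T9"], lambda t: t["venue"] != "OTC")
```

Running the real eligibility rules through a check like this surfaces both failure modes the text describes: reportable trades that never reached the regulator, and submissions that should have been filtered out.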

As well as having the right people and procedures, the correct technology must be in place to manage this process. Given the changing nature of regulation, and the flexibility and adaptability it demands, firms need to consider establishing partnerships on regulatory reporting engagements, not just hiring a standard technology vendor. Traditional reconciliation vendors lack the agility and subject-matter expertise to truly partner with a firm on regulatory reconciliations; with a long line of implementations for the vendor to deliver, firms often lose the attention they require and deserve on such an important project. Other vendors have begun implementing a self-service style of approach, with wizards giving the firm more control over the development of the reconciliation tool. Although this is quick to deploy, it is operationally restrictive, and the vendor still lacks the capabilities and knowledge to support the regulations associated with the reconciliation.

Agility in responding to change and the ability to mould to varying business models often push firms down a bespoke implementation route, as standard out-of-the-box reconciliations do not always fulfil a firm’s requirements. Firms want the ability to recognise abnormal behaviour specific to their business and to generate appropriate alerts and notifications. They want the ability to add bespoke validation rules, workflow and business logic. A standard implementation may not always be extendable, and when a regulation changes, the reconciliation tool must change in parallel. The impact of these changes will vary from firm to firm, so a standard configuration does not always suffice, and it is becoming more and more common for organisations to consider bespoke reconciliation builds.

Full automation is an overriding goal for most firms. Not only will this generate a more efficient and better-performing working model, it removes the risks associated with manual intervention and human error. It allows for more analysis of workflow elements such as data mismatching, configuration of tolerances and pattern recognition. It supports a level of consistency across the reconciliation process, and firms can integrate auto-assignment of breaks, resolution codes and the removal of false positives, making the reconciliation more operationally efficient.
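Configured tolerances and auto-assignment of breaks can be expressed as a simple rules layer over the matching step. The following Python sketch is illustrative only: the field names, tolerance values and break format are assumptions chosen for the example.

```python
def match_with_tolerances(internal, external, tolerances):
    """Compare shared trade IDs field by field, allowing configured tolerances.

    Returns (matched, breaks); each break names the fields that differ, so it
    can be auto-assigned a resolution code rather than triaged by hand.
    """
    matched, breaks = [], []
    for tid in sorted(set(internal) & set(external)):
        diffs = [field for field, tol in tolerances.items()
                 if abs(internal[tid][field] - external[tid][field]) > tol]
        if diffs:
            breaks.append((tid, diffs))
        else:
            matched.append(tid)
    return matched, breaks

# Hypothetical records: a tiny price tolerance absorbs rounding noise, while
# a genuine price discrepancy on T2 is raised as a break.
internal = {"T1": {"price": 101.50, "qty": 100}, "T2": {"price": 99.00, "qty": 50}}
external = {"T1": {"price": 101.5001, "qty": 100}, "T2": {"price": 98.50, "qty": 50}}
matched, breaks = match_with_tolerances(internal, external,
                                        {"price": 0.001, "qty": 0})
```

Because each break already identifies the offending fields, downstream rules can route it to the right queue or resolution code automatically, which is precisely the consistency gain the paragraph above describes.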

It is important to get the knowledge, guidance and added value a partner can offer to a business in order to remain compliant with regulation. Regulation is a grey area, so any vendor offering services in this space should inspire confidence in the relationship. From a technical perspective, visibility is key. Effective workflow management, audit capabilities, management information and reporting play fundamental roles in attesting to the reconciliation process. Operationally, documented procedures, adequate training and staff awareness are required for knowledge transfer and a high-level understanding of a firm’s reporting responsibility.

Regulatory mandates are becoming more complicated, and the need to abide by these regulations drives the conversation around implementing defence mechanisms such as reconciliation platforms, surveillance tools, and so on, to protect firms from regulatory exposure.

Regulations change, new regulations go live, and with firms focusing on meeting reporting obligations, other internal technology tools end up losing the investment and/or attention required to support the firm’s technical infrastructure. Traditional reconciliation tools can fall victim to these pressures, but regulation, with its systems and controls focus, is shifting this mind-set. The ultimate goals for any financial institution are to safeguard data integrity across the organisation and to be able to demonstrate actively to a regulator best practice in establishing a robust regulatory reporting framework.