Food for Thought: Are your audit teams going the distance? Expert panelists suggest we can do better
"Food for Thought" is a series showcasing insights and best practices from Exiger's roundtables, where senior financial services professionals and industry thought leaders come together to discuss the latest industry developments over a bite to eat.
Financial crime compliance (“FCC”) internal audit teams continuously face shifting challenges as banks’ third line of defense. Regulatory expectations are higher than ever – in part because the regulatory toolkit for testing FCC controls is itself increasingly sophisticated.
Today, it’s par for the course for a financial regulator to directly request large swathes of a bank’s payment data, and then use its own in-house data analytics to select a risk-based sample of wire payments to test for payment transparency compliance or unusual activity. U.S. branches of foreign banks are under even more pressure. They not only face off with an array of federal and state agencies, but also home office regulators that are growing more engaged by the day.
The evolving role of FCC internal auditing
The upshot is that the role of FCC internal audit is evolving. To deploy controls-based testing effectively, internal audit needs to be at the forefront of new technological and analytical approaches.
This requires that the third line go deeper than, for example, rote checking of a bank’s customer information program. It instead means developing and implementing far more dynamic review capabilities, such as bespoke inputs to a bank’s “data lake” from which to extract, analyze, and test transactional data. It also means moving closer to a model of continuous or near real-time monitoring, and growing from a siloed controls tester into an independent yet strategic risk consultant to the business.
U.S. financial regulators and top audit practitioners recently shared their thoughts at a series of intimate and informative panels at the Institute of International Bankers’ (“IIB”) Internal Audit Seminar on FCC this June. Panelists included senior examiners from the Federal Reserve Bank of New York and the Office of the Comptroller of the Currency (“OCC”), seasoned internal auditors at IIB member banks, an expert from Milbank LLP, and Exiger’s Erika Peters and Kurt Drozd. The attendees shared the following perspectives on audit best practices, the risk-based approach, and the demand for data analytics – and data aptitude – in the third line of defense:
Regulatory expectations – “must haves” for FCC internal audit
FCC internal auditors should keep in mind certain basics. Arguably the most fundamental is to document decision-making.
A regulatory exam nearly always begins with an assessment of internal audit’s workpapers for the area under review, be it anti-money laundering (“AML”), sanctions, anti-bribery and corruption (“ABC”), or fraud. It is especially critical for audit to memorialize decisions to waive exceptions identified in workpapers. Regulators expect a cogent and contemporaneous record of why audit did not report the exception, capturing the auditor’s thought process and basis for exercising this judgment. When workpapers lack this transparency, internal audit instantly loses credibility.
Risk assessments are the beating heart of financial crime risk management and third line oversight. A risk assessment should drive virtually all audit prioritization, and regulators often rely on the quantitative data in a bank’s risk assessment as a jumping-off point for their own exams. A mature internal audit function will perform an independent risk assessment – separate from the organization’s enterprise-wide assessment – to drive its test plan.
To test FCC controls effectively, the third line must consider new risks, including those arising from changes to an existing product or the makeup of the customer base. A copy-and-paste job of findings from an old audit report – despite an intervening and material change in the bank’s risk profile – is a serious regulatory red flag.
The third line can also preempt regulatory action with certain best practices around issue articulation, management, and validation. When identifying an issue, internal audit shouldn’t stop at merely stating the exception, but rather dig down to identify the bona fide root cause and assess the potential regulatory impact. As to the validation of supervisory issues, regulators expect not only that internal audit will test the design of an action plan, but also its operational effectiveness.
The timing of validation can also be important. Depending on the control, internal audit should wait a reasonable length of time before validation to ensure appropriate embedding of the action plan. In other words, consider issue closure sustainability. There is also a strong regulatory interest in both the robustness of internal audit’s quality assurance process and an accurate, well-maintained tool for tracking issues. Ultimately, banks can save themselves a lot of money – and even more angst – by having a strong third line of defense that can spot FCC issues before the regulator.
The intersection of internal audit with data analytics, model validation, and data governance
An effective internal audit function needs an analytical and strategic understanding of risk – and nowhere is this clearer than in the context of data analytics. Only a few years ago, internal audit was confined to traditional sampling techniques to test customer due diligence (“CDD”) files or transaction monitoring alerts – methods that, although satisfying statistical sampling requirements, did not allow the third line to scale testing to the magnitude and complexity of a modern bank’s financial crime risk environment and control framework.
Internal audit realized it had to ingest a far larger population of data, and then apply reasoned analysis to that data, to support richer sampling and greater visibility of the nature and quantity of a bank’s risk.
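The move from flat random sampling to risk-weighted selection can be sketched in a few lines. The sketch below is illustrative only: the field names, the composite risk score, and the strata cut-offs are all invented, and a real program would derive them from the bank’s own risk assessment.

```python
import random

random.seed(42)  # fixed seed so the illustrative run is reproducible

# Hypothetical transaction population; in practice this would be
# extracted from the bank's data lake. "risk_score" stands in for a
# composite score (customer risk, geography, product, amount, etc.).
transactions = [
    {"id": i, "risk_score": random.random()} for i in range(10_000)
]

def stratified_sample(population, strata, key="risk_score"):
    """Draw a fixed sample size per risk stratum, so high-risk
    transactions are tested far more heavily than low-risk ones.

    strata: list of (lower_bound, upper_bound, sample_size) tuples.
    Sampling within each stratum is without replacement.
    """
    sample = []
    for lo, hi, k in strata:
        bucket = [t for t in population if lo <= t[key] < hi]
        sample.extend(random.sample(bucket, min(k, len(bucket))))
    return sample

# Hypothetical strata: half the sample from the riskiest ~20% of the book.
sample = stratified_sample(
    transactions,
    strata=[(0.8, 1.01, 50),   # high risk: 50 items
            (0.5, 0.8, 30),    # medium risk: 30 items
            (0.0, 0.5, 20)],   # low risk: 20 items
)
```

The design choice worth noting is that sample sizes are fixed per stratum rather than proportional to stratum size, which is what lets the third line concentrate testing where the risk sits instead of where the volume sits.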
The need for data and technical know-how in the third line of defense goes beyond just selecting samples for audit testing. Internal auditors must increasingly understand the models that underpin some of their bank’s bedrock FCC controls. In the risk and compliance context, a model is generally any quantitative or statistically based process that can yield predictive outcomes. FCC models include, for example, the predictive engine supporting a scenario-based automated transaction monitoring tool, or a multifactor customer risk assessment methodology. Top banks have a well-defined model risk-management framework that prescriptively sets forth what should be done to create and maintain each model, when, and how.
To conduct model validation, internal auditors must understand how an FCC model works – to peer inside the so-called “black box” – in order to ask the right questions and, by extension, check and challenge model design and operational effectiveness. The third line’s role in model validation thus begins by asking: What risk(s) is the model designed to address? Who defined the model purpose? What is the associated governance? Is the model continuously tested by an unbiased party from an analytics, IT, and user interface perspective?
This means not only checking the tuning of model rules for automated transaction monitoring and sanctions screening, but also asking whether the rules themselves are the right ones to begin with. For vendor-supplied systems, a simple but often overlooked fact is that the vendor can provide paperwork that will shed light on what’s in the black box. Although the third line is expected to challenge outcomes in the model space, most internal audit departments are doing only light-touch, document-driven model validations, and too often deferring to IT. If an internal audit team lacks the subject matter expertise to check and challenge model owners, banks should consider tapping external advisory consultants.
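The difference between tuning a rule and challenging its design can be made concrete with a toy example. Everything below is hypothetical – the threshold, the rule, and the payments – but it shows how a single above-threshold alert rule can look well tuned while still missing payments deliberately structured to stay under the line, the kind of gap a design-level challenge should surface.

```python
THRESHOLD = 10_000  # hypothetical single-payment alert threshold

def above_threshold_alerts(payments):
    """The rule as deployed: alert on any single payment over the threshold."""
    return [p for p in payments if p["amount"] > THRESHOLD]

def aggregate_alerts(payments, window_total=THRESHOLD):
    """A challenger rule: alert when a customer's payments in the review
    window sum past the threshold, catching structured activity."""
    totals = {}
    for p in payments:
        totals[p["customer"]] = totals.get(p["customer"], 0) + p["amount"]
    return [c for c, total in totals.items() if total > window_total]

# Hypothetical population: customer B structures three payments of
# 4,000 each, each individually under the single-payment threshold.
payments = [
    {"customer": "A", "amount": 12_000},
    {"customer": "B", "amount": 4_000},
    {"customer": "B", "amount": 4_000},
    {"customer": "B", "amount": 4_000},
]

print([p["customer"] for p in above_threshold_alerts(payments)])  # ['A']
print(aggregate_alerts(payments))  # ['A', 'B'] -- the deployed rule misses B
```

Tuning the threshold up or down would never catch customer B; only questioning whether a per-payment rule is the right rule does.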
Finally, internal audit should test the data flow running among the bank’s models and systems. This starts with understanding who owns the data – which is frequently anything but simple in a multinational bank with multiple systems and data elements coming from many disparate sources. In terms of data quality, common “landmines” include stripping, missing data, and truncation. A data field in an upstream system, for example, may be capped at a maximum number of characters; when the data “hops” to a downstream system whose field needs more, completeness is lost from one system to the next.
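A completeness check of that kind can be sketched simply. The record keys, field name, and 20-character cap below are invented; the point is just to reconcile one field record-by-record across two systems and flag values that arrived shortened or not at all.

```python
def find_truncations(upstream, downstream):
    """Compare a field across two systems and flag records where the
    downstream value is missing or is a shortened prefix of the
    upstream value -- the classic truncation 'landmine'."""
    issues = []
    for key, up_value in upstream.items():
        down_value = downstream.get(key)
        if down_value is None:
            issues.append((key, "missing downstream"))
        elif down_value != up_value and up_value.startswith(down_value):
            issues.append((key, f"truncated to {len(down_value)} chars"))
    return issues

# Hypothetical payment records; in this example the downstream
# system's beneficiary-name field is capped at 20 characters.
upstream = {"PMT-1": "GLOBAL TRADING COMPANY LIMITED", "PMT-2": "ACME GMBH"}
downstream = {"PMT-1": "GLOBAL TRADING COMPA", "PMT-2": "ACME GMBH"}

print(find_truncations(upstream, downstream))
# [('PMT-1', 'truncated to 20 chars')]
```

Run at each “hop” in the data lineage, a reconciliation like this localizes exactly where completeness is lost, rather than discovering it only at the ultimate data consumer’s dashboard.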
A mature data governance framework will track data quality from the source to the dashboard of the ultimate data consumer, which allows for a comparison to validate the data flow. Sound data lineage also requires a handshake between upstream and downstream data owners, coupled with firm-wide communication about any changes to the data. For instance, the whole bank needs to be aware of new products to evaluate their potential downstream effects on data. At the end of the day, to stay relevant and effective, the third line needs to understand how data and technology support the products and services the bank brings to market, and how they align with regulatory requirements.
Global authority on FCC internal audit
Exiger is a global authority on FCC internal audit. Our practitioners have come from audit departments at some of the largest and most complex financial institutions. We can advise firms on how best to manage regulatory scrutiny of an audit program and FCC systems, assist with audit test plans or reporting, and help with remediating and validating issues.
Our experts also specialize in training to ensure internal auditors apply best practices and are up to speed on the latest trends and audit approaches in the constantly evolving world of FCC.
In addition, Exiger is a global leader in data analytics as well as the systems and models used to sustainably manage financial crime risk.
Stay ahead of evolving regulatory demands. Schedule a meeting with our Audit Professionals.