How to Ensure Your ERM Platform Is Examiner-Ready
Enterprise risk management platforms promise centralized risk oversight. But when examiners arrive and ask to see how your institution manages risk, many ERM platforms fail a basic test: they can't produce dated, versioned evidence of who did what, when, and whether it was reviewed and approved. The platform tracks risks. It doesn't prove work happened.
The gap between "we use an ERM platform" and "our ERM platform is examiner-ready" comes down to four capabilities: evidence capture, approval workflows, version control, and reporting packs. Without all four, your ERM platform is a risk register - not a compliance management tool.
Key Takeaways:
- Examiners don't evaluate whether you have an ERM platform - they evaluate whether it produces verifiable evidence of risk management
- Risk assessments without approval trails, version histories, and completion evidence are examination liabilities
- Reporting packs should be exportable on demand, not assembled manually before each exam
- The OCC (OCC 2014-16), FDIC (RMS Manual Section 7.1), and Federal Reserve (SR 16-11) all set expectations for enterprise risk management documentation
What Examiners Actually Look for in ERM Systems
Examiners don't audit your technology stack. They audit your risk management process and use your platform as evidence of how that process operates. The distinction matters: a spreadsheet-based ERM program with complete documentation can receive a better examination outcome than a sophisticated platform with poor evidence hygiene.
Under the FFIEC framework and agency-specific guidance, examiners evaluate:
Risk identification and assessment. Does the institution identify, measure, and monitor risks across all categories - credit, operational, compliance, market, liquidity, strategic, and reputational? The OCC's Heightened Standards (12 CFR Part 30, Appendix D) require institutions above $50 billion to maintain a comprehensive risk governance framework, but examiners apply similar expectations proportionally to community banks through OCC Bulletin 2014-16 and the FDIC's Risk Management Manual.
Control environment. Are risks mitigated by controls, and are those controls tested? Your ERM platform should document which controls map to which risks, when controls were last tested, and what the test results showed.
Risk appetite alignment. Can the institution demonstrate that its actual risk profile aligns with its stated risk appetite? This requires quantified risk metrics, threshold monitoring, and evidence of board-level review when metrics approach or breach thresholds.
Board and management oversight. Do the board and risk committee receive regular, substantive risk reports? Do they act on them? Examiners review committee minutes for evidence of engagement - not just receipt of reports, but documented discussion and decision-making.
The Four Pillars of Examiner-Ready ERM
1. Evidence Capture: Proving Work Was Done
Every risk management activity in your ERM platform should generate a timestamped, attributable record. Activities that require evidence:
| Activity | Required Evidence |
|---|---|
| Risk assessment completion | Assessor name, completion date, methodology, risk ratings with rationale |
| Control testing | Tester, test date, sample methodology, pass/fail results, exceptions found |
| Risk acceptance | Accepting authority, date, documented rationale, conditions, expiration |
| Mitigation plan completion | Owner, action taken, date, supporting documentation |
| Threshold breach | Automated timestamp, metric value, escalation record, management response |
| Periodic risk review | Reviewer, review date, changes from prior assessment, justification for changes |
The standard most examiners apply: if it isn't documented, it didn't happen. Your ERM platform must capture this evidence automatically as part of the workflow - not rely on users to manually log what they did after the fact.
Manual evidence logging introduces two risks. First, it doesn't happen consistently. Second, examiners question the reliability of evidence that was documented days or weeks after the activity occurred. Automated capture at the point of completion eliminates both concerns.
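What does capture at the point of completion look like in practice? The sketch below is illustrative, not any vendor's actual schema - the record type, field names, and `complete_control_test` helper are all hypothetical. The point it demonstrates: the timestamp and actor are stamped by the system the moment the work finishes, never typed in by a user afterward.

```python
from dataclasses import dataclass, field
from datetime import datetime, timezone

@dataclass(frozen=True)
class EvidenceRecord:
    """Immutable record created at the moment an activity completes."""
    activity: str          # e.g. "control_test", "risk_assessment"
    actor: str             # the authenticated user, not free text
    details: dict          # methodology, ratings, rationale, results
    captured_at: datetime = field(
        default_factory=lambda: datetime.now(timezone.utc)
    )

def complete_control_test(tester: str, control_id: str, passed: bool,
                          exceptions: list[str]) -> EvidenceRecord:
    # Evidence is generated as a side effect of finishing the work,
    # not logged manually after the fact.
    return EvidenceRecord(
        activity="control_test",
        actor=tester,
        details={"control_id": control_id, "passed": passed,
                 "exceptions": exceptions},
    )

record = complete_control_test("jsmith", "CTRL-114", False,
                               ["2 of 25 samples missing sign-off"])
print(record.captured_at.isoformat(), record.details["passed"])
```

Making the record immutable (`frozen=True`) mirrors the examination expectation: evidence, once captured, cannot be silently edited.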
2. Approval Workflows: Who Authorized What
Risk management decisions require authorization. Risk assessments need approval from appropriate management levels. Risk acceptances require sign-off from officers with delegated authority. Policy exceptions need documented approval chains. Control deficiencies need acknowledged remediation owners.
Your ERM platform must capture not just the final decision, but the approval chain:
- Who submitted the item for approval?
- Who was it routed to?
- Did they approve, reject, or request changes?
- When did each step occur?
- Were there conditions attached to the approval?
For risk acceptances specifically, examiners reference the OCC's Risk Governance guidance (OCC 2014-16), which requires that risk acceptance decisions be made by individuals with appropriate authority and that these decisions be documented, time-limited, and subject to periodic review.
An ERM platform that shows a risk assessment was "approved" but can't show who approved it, when, or whether they had the authority to do so is a finding waiting to happen.
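As a concrete illustration, here is one way an approval chain with delegated-authority checks might be modeled. The role names and the authority table are hypothetical assumptions for the example, not drawn from any regulation or product:

```python
from dataclasses import dataclass, field
from datetime import datetime, timezone
from enum import Enum

class Decision(Enum):
    APPROVED = "approved"
    REJECTED = "rejected"
    CHANGES_REQUESTED = "changes_requested"

# Illustrative delegated-authority table: which roles may approve
# risk acceptances at each residual-risk level.
AUTHORITY = {
    "low": {"risk_officer", "cro", "board"},
    "moderate": {"cro", "board"},
    "high": {"board"},
}

@dataclass
class ApprovalStep:
    approver: str
    role: str
    decision: Decision
    conditions: list[str]
    decided_at: datetime = field(
        default_factory=lambda: datetime.now(timezone.utc)
    )

def record_approval(chain: list[ApprovalStep], approver: str, role: str,
                    risk_level: str, decision: Decision,
                    conditions: list[str] | None = None) -> None:
    # Reject the write outright if the approver lacks authority -
    # an approval below the required level is itself a finding.
    if role not in AUTHORITY[risk_level]:
        raise PermissionError(
            f"{role} cannot approve {risk_level}-risk acceptances")
    chain.append(ApprovalStep(approver, role, decision, conditions or []))

chain: list[ApprovalStep] = []
record_approval(chain, "mlee", "cro", "moderate", Decision.APPROVED,
                ["expires 2026-06-30", "quarterly review"])
```

Blocking the unauthorized approval at write time, rather than recording it and flagging it later, prevents exactly the gap described above.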
3. Version Control: The Audit Trail
Risk assessments evolve. Risk ratings change as conditions change. Controls are modified. Policies are updated. Your ERM platform must maintain a complete version history - sketched in code after this list - that answers:
- What was the prior state of this risk assessment?
- What changed?
- Who made the change?
- When was it changed?
- Why was it changed (documented rationale)?
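A minimal sketch of such a history, assuming an append-only design in which prior versions are never edited or deleted (all names here are illustrative):

```python
from dataclasses import dataclass, field
from datetime import datetime, timezone

@dataclass(frozen=True)
class AssessmentVersion:
    fields: dict           # full snapshot of the assessment at this version
    changed_by: str
    rationale: str         # the "why" is mandatory, not optional
    changed_at: datetime = field(
        default_factory=lambda: datetime.now(timezone.utc)
    )

class VersionedAssessment:
    """Append-only history: every change adds a version, nothing is overwritten."""
    def __init__(self, initial: dict, author: str, rationale: str):
        self.history = [AssessmentVersion(dict(initial), author, rationale)]

    def update(self, changes: dict, changed_by: str, rationale: str) -> None:
        merged = {**self.history[-1].fields, **changes}
        self.history.append(AssessmentVersion(merged, changed_by, rationale))

    def diff_latest(self) -> dict:
        # Answers "what changed?" between the two most recent versions.
        prev, curr = self.history[-2].fields, self.history[-1].fields
        return {k: (prev.get(k), v) for k, v in curr.items()
                if prev.get(k) != v}

a = VersionedAssessment({"risk": "vendor concentration", "rating": "low"},
                        "jsmith", "initial assessment")
a.update({"rating": "high"}, "mlee", "primary vendor missed two SLAs in Q3")
print(a.diff_latest())   # {'rating': ('low', 'high')}
```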
Version control matters for three examination scenarios:
Trend analysis. Examiners want to see how risk ratings have moved over time. A risk that was rated "low" three quarters ago and is now rated "high" should have documented rationale for each change. A risk that has been rated "moderate" for years without reassessment suggests the assessment is stale.
Post-event review. If a risk event occurs (a loss, an incident, a finding), examiners will look at the risk assessment that existed at the time of the event. If the assessment rated the risk low, they'll ask why. If the assessment was accurate but controls failed, they'll examine the control testing evidence. Version history makes this review possible.
Regulatory change impact. When regulations change, examiners expect corresponding updates to affected risk assessments. Version history demonstrates that your institution evaluated the impact of regulatory changes on its risk profile within a reasonable timeframe.
Without version control, your ERM platform only shows the current state - and examiners have no way to evaluate how you got there.
4. Reporting Packs: Board-Ready Output on Demand
Examiners review the reports that go to your board and risk committee. They're evaluating two things: whether the reports contain sufficient information for effective oversight, and whether the board is actually engaging with the content.
Your ERM platform should produce reporting packs that include:
Risk dashboard. Current risk profile by category, with visual indicators for risks approaching or exceeding appetite thresholds. Include trend arrows showing directional movement since the prior reporting period.
Key risk indicators (KRIs). Quantified metrics with current values, threshold levels, and breach history. Examples: past-due loan ratios, BSA alert volumes, complaint rates by product, vendor risk ratings, cybersecurity incident counts.
Emerging risks. New or evolving risks identified since the last report, with preliminary assessment and proposed response. Examiners pay close attention to whether your institution identifies emerging risks proactively or only after they materialize.
Control effectiveness summary. Aggregate results from control testing, highlighting any controls that failed testing or have not been tested within the required cycle.
Open items. Risk acceptances approaching expiration, overdue mitigation actions, unresolved control deficiencies, and open examination findings with remediation status.
Trend analysis. Quarter-over-quarter and year-over-year views of your risk profile by category. Are risks increasing? Are new controls keeping pace?
The reporting pack should be exportable on demand. When examiners request "the last four quarterly risk reports to the board," the answer should be four clicks - not four days of compiling data from multiple sources.
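As an illustration of on-demand generation, a reporting pack built from live data at request time might look like the sketch below. The file names, KRI structure, and the 90%-of-threshold "approaching" rule are assumptions for the example, not a prescribed format:

```python
import csv
import json
from datetime import date
from pathlib import Path

def export_reporting_pack(risks: list[dict], kris: list[dict],
                          out_dir: Path, as_of: date) -> None:
    """Build the board pack from live data at request time - nothing pre-assembled."""
    out_dir.mkdir(parents=True, exist_ok=True)
    # Dashboard summary: flag KRIs breaching, or within 90% of, thresholds.
    summary = {
        "as_of": as_of.isoformat(),
        "breaches": [k["name"] for k in kris
                     if k["value"] >= k["threshold"]],
        "approaching": [k["name"] for k in kris
                        if k["threshold"] * 0.9 <= k["value"] < k["threshold"]],
    }
    (out_dir / "dashboard.json").write_text(json.dumps(summary, indent=2))
    # Structured export examiners can review off-site, independent of the platform.
    with open(out_dir / "risk_register.csv", "w", newline="") as f:
        writer = csv.DictWriter(f, fieldnames=risks[0].keys())
        writer.writeheader()
        writer.writerows(risks)

export_reporting_pack(
    risks=[{"risk": "BSA alert backlog", "category": "compliance",
            "rating": "moderate"}],
    kris=[{"name": "past_due_ratio", "value": 0.031, "threshold": 0.030}],
    out_dir=Path("pack_2025Q4"), as_of=date(2025, 12, 31),
)
```

Because the export reads the same data that drives daily risk management, "the last four quarterly risk reports" is a query, not a project.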
Common ERM Platform Gaps Examiners Find
Based on published examination guidance and enforcement actions, these gaps appear repeatedly:
Risk assessments without supporting rationale. A risk rated "moderate" with no explanation of why it isn't "high" or "low." The FDIC's Risk Management Manual (Section 7.1) states that risk ratings should be "supported by analysis and documentation." A number in a field is not an assessment.
Stale assessments. Risk assessments that haven't been updated in 12+ months despite changes in the business environment, regulatory landscape, or institution's operations. The OCC expects risk assessments to be updated "at least annually, or more frequently when risk factors change" (OCC Bulletin 2014-16).
Missing control linkages. Risks without mapped controls, or controls that exist in the platform but have never been tested. The COSO (Committee of Sponsoring Organizations of the Treadway Commission) internal control framework - widely referenced by examiners - requires that controls be designed to address identified risks and tested for operating effectiveness.
Approval gaps. Risk acceptances without documented authorization, or approvals made by individuals below the required authority level. The Federal Reserve's SR 16-11 guidance on risk management explicitly requires that risk acceptance decisions be "made by individuals with appropriate authority."
No export capability. All the evidence exists in the platform, but it can't be extracted into a format examiners can review independently. Examiners need to take evidence with them - they can't sit at your workstation reviewing screens. If your platform can't produce PDFs, spreadsheets, or structured exports of evidence, it's not examiner-ready.
Building Examiner Readiness Into Your ERM Process
The fix isn't a technology migration - it's building four behaviors into your existing ERM process:
- Capture evidence at the point of completion. Every risk assessment, control test, approval, and review generates a dated, attributed record automatically.
- Route decisions through documented approval chains. No risk acceptance, policy exception, or assessment change occurs without a recorded approval from an authorized individual.
- Maintain version history. Every change to a risk assessment, control, or rating is preserved with who, when, and why.
- Produce reports, don't assemble them. Board-ready reporting packs are generated from the same data that drives daily risk management, exportable at any time.
Canarie applies these four principles to compliance execution. Regulatory requirements map to workflows with evidence capture at every step. Approvals route through documented chains. Changes are versioned. And exam preparation reduces to exporting evidence packages that already exist - because they were captured during normal operations, not assembled for the exam.
See how compliance teams stay audit-ready year-round →
Frequently Asked Questions
Do examiners expect a specific ERM platform or framework?
No. Examiners evaluate outcomes, not tools. They reference frameworks like COSO ERM and agency-specific guidance (OCC 2014-16, FDIC RMS Manual, SR 16-11) to set expectations, but they don't mandate specific technology platforms. A well-documented spreadsheet-based program with complete evidence can satisfy examination requirements. The platform matters less than the process it supports.
How often should risk assessments be updated in the ERM platform?
At minimum annually, and more frequently when risk factors change materially. The OCC expects updates at least annually, or more often when warranted by changes in the bank's risk profile (OCC 2014-16). Events that should trigger interim updates include: new products or services, significant operational changes, regulatory changes, risk events or losses, and material changes in the business environment.
What evidence format do examiners prefer?
Examiners typically request PDF documents for policies and narratives, and spreadsheet formats (CSV/Excel) for data and logs. The key requirement is that evidence be self-contained and reviewable without access to your systems. Examiners take documents with them for off-site analysis. Screenshots of platform screens are generally insufficient - examiners want the underlying data, not a picture of it.
How do we handle ERM evidence for multi-entity or multi-charter institutions?
Each charter is examined separately, so evidence must be clearly attributed to the specific entity being examined. If your ERM platform serves multiple entities, ensure that reports and evidence exports can be filtered by entity. Shared services (like IT risk management or vendor oversight) should have documented allocation of responsibilities and entity-specific risk assessments.
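For illustration, entity filtering can be as simple as requiring an explicit charter attribution on every evidence record and excluding anything unattributed. A hypothetical sketch (the field names and entities are invented for the example):

```python
def evidence_for_entity(records: list[dict], charter: str) -> list[dict]:
    # Every record carries an explicit entity attribution; evidence that
    # can't be tied to the examined charter is excluded, never guessed at.
    return [r for r in records if r.get("entity") == charter]

records = [
    {"entity": "First Bank NA", "activity": "control_test", "result": "pass"},
    {"entity": "First Trust Co", "activity": "control_test", "result": "pass"},
    {"activity": "risk_assessment"},  # unattributed: excluded from any exam export
]
print(evidence_for_entity(records, "First Bank NA"))
```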
Should our ERM platform integrate with our compliance management system?
Integration between ERM and compliance management reduces duplication and improves examiner confidence. Compliance findings, control test results, and regulatory change impacts should flow between systems so that the risk profile reflects actual compliance posture. Examiners are increasingly looking for this connection - an ERM program that shows "compliance risk: low" while the compliance management system has 15 open MRAs creates obvious credibility issues.