How to Track and Remediate Compliance Exam Findings
An exam finding doesn't close when you write a corrective action plan. It closes when you can prove - with documented evidence - that the root cause was fixed, the fix was tested, and recurrence is being monitored. Examiners verify this at every subsequent examination. Institutions that track findings informally (in spreadsheets, email threads, or someone's memory) consistently produce the worst outcome in compliance: the repeat finding.
Repeat findings signal management weakness. They tell examiners that the board and senior management either don't take findings seriously, don't have effective processes for remediation, or both. Under the FFIEC Compliance Management System framework, persistent unresolved findings can escalate from MRAs to enforcement actions.
Key Takeaways:
- Repeat findings are the single most damaging pattern examiners look for
- Every finding needs a documented lifecycle: identification, root cause, corrective action, implementation, validation, and monitoring
- "We updated the policy" is not sufficient remediation without evidence of implementation and testing
- Examiner expectations have shifted from "did you fix it?" to "prove it's fixed and won't recur"
The Finding Lifecycle: From Identification to Closure
Regulatory examination findings - whether classified as Matters Requiring Attention (MRAs), Matters Requiring Immediate Attention (MRIAs), violations, or observations - follow a predictable lifecycle. Institutions that formalize this lifecycle close findings faster and avoid repeats.
1. Finding Receipt and Classification
When examination results arrive, each finding needs classification by:
- Severity - Violation of law/regulation, MRA, MRIA, or observation/recommendation
- Regulatory source - Which law, regulation, or guidance is implicated (e.g., 12 CFR § 1024.35, OCC Bulletin 2013-29)
- Affected area - Which business line, product, or process
- Prior history - Has this finding or a substantially similar one appeared before?
Classification drives response urgency. MRIAs require immediate corrective action, often within 30-60 days. Standard MRAs typically allow 90-180 days. Violations of law require documented corrective action regardless of timeline.
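As an illustration, classification can be captured as structured data the moment a finding is logged, so urgency and repeat status are visible immediately. The sketch below is a minimal Python example; the class names, field names, and the sample finding ID are hypothetical, and the day ranges simply restate the typical windows described above, not any agency-mandated deadlines.

```python
from dataclasses import dataclass
from enum import Enum
from typing import Optional


class Severity(Enum):
    VIOLATION = "violation of law/regulation"
    MRIA = "matter requiring immediate attention"
    MRA = "matter requiring attention"
    OBSERVATION = "observation/recommendation"


# Typical response windows (in days) per the timelines above; actual deadlines
# come from the exam report and your approved corrective action plan.
# Violations require documented corrective action regardless of timeline.
RESPONSE_WINDOW_DAYS = {
    Severity.MRIA: (30, 60),
    Severity.MRA: (90, 180),
}


@dataclass
class FindingClassification:
    finding_id: str
    severity: Severity
    regulatory_source: str            # e.g., "12 CFR § 1024.35"
    affected_area: str                # business line, product, or process
    repeat_of: Optional[str] = None   # prior finding ID if substantially similar

    def is_repeat(self) -> bool:
        return self.repeat_of is not None


finding = FindingClassification(
    finding_id="2024-EX-014",
    severity=Severity.MRA,
    regulatory_source="12 CFR § 1024.35",
    affected_area="Mortgage servicing error resolution",
)
print(RESPONSE_WINDOW_DAYS[finding.severity])   # (90, 180)
```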
2. Root Cause Analysis
This is where most remediation efforts fail. The stated root cause is usually a symptom, not the actual cause.
Common symptom-level root causes (insufficient):
- "Staff didn't follow the procedure"
- "We missed the deadline"
- "The system wasn't configured correctly"
Actual root causes (actionable):
- The procedure doesn't match the actual workflow, so staff developed workarounds
- No automated escalation exists for approaching deadlines
- System configuration changes aren't validated against regulatory requirements before deployment
The difference matters. If the root cause is "staff didn't follow procedure," your corrective action is retraining. If the root cause is "the procedure doesn't match reality," your corrective action is redesigning the workflow - and retraining alone will produce the same finding next exam.
Document the root cause analysis. Examiners will ask how you determined what caused the issue.
3. Corrective Action Plan
An effective corrective action plan contains five elements:
- Specific actions - Not "improve our process" but "implement automated 7-day advance alerts for upcoming review deadlines in the BSA monitoring workflow"
- Responsible owners - Named individuals, not departments
- Target completion dates - Realistic dates with milestones for multi-step remediations
- Success criteria - How you'll determine the fix worked
- Board/committee reporting - When and how progress will be reported to oversight
For complex findings, break the corrective action into phases. Phase 1 might be an interim manual control. Phase 2 is the permanent systemic fix. Phase 3 is validation testing. Each phase has its own completion date and evidence requirement.
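To make the five elements concrete, here is a minimal sketch of how a phased corrective action plan might be represented. The field names, owners, dates, and phase descriptions are illustrative placeholders, not a prescribed format.

```python
from dataclasses import dataclass, field
from datetime import date
from typing import List, Optional


@dataclass
class Phase:
    description: str
    owner: str                       # a named individual, not a department
    target_date: date
    success_criteria: str            # how you'll determine the fix worked
    evidence_required: str           # what artifact proves completion
    completed_on: Optional[date] = None


@dataclass
class CorrectiveActionPlan:
    finding_id: str
    board_reporting: str             # when/how progress is reported to oversight
    phases: List[Phase] = field(default_factory=list)

    def overdue_phases(self, as_of: date) -> List[Phase]:
        """Phases past their target date with no completion evidence."""
        return [p for p in self.phases
                if p.completed_on is None and p.target_date < as_of]


plan = CorrectiveActionPlan(
    finding_id="2024-EX-014",
    board_reporting="Monthly to the Compliance Committee",
    phases=[
        Phase("Interim manual control: daily deadline review", "J. Alvarez",
              date(2024, 7, 15), "No missed deadlines during interim period",
              "Signed daily review log"),
        Phase("Permanent fix: automated 7-day advance deadline alerts", "J. Alvarez",
              date(2024, 9, 30), "Alerts fire on test deadlines",
              "Change management ticket with before/after configuration"),
        Phase("Validation testing on post-implementation sample", "R. Chen",
              date(2024, 10, 31), "100% of sampled items meet the deadline",
              "Test workpaper with methodology, sample, and results"),
    ],
)
print(len(plan.overdue_phases(date(2024, 8, 1))))   # 1 overdue phase as of Aug 1
```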
4. Implementation and Evidence Capture
Every corrective action step must produce evidence of completion:
| Action | Evidence |
|---|---|
| Policy updated | Approved policy with version date and approver signature |
| Staff retrained | Training completion records with content summary and attendance |
| System reconfigured | Change management documentation with before/after settings |
| New control implemented | Control documentation, first execution evidence, responsible party acknowledgment |
| Process redesigned | Updated procedure document, staff communication, first-cycle completion records |
"We did it" is not evidence. "Here is the dated, signed document showing we did it" is evidence.
5. Validation Testing
After implementation, test whether the fix actually works. This step is frequently skipped, and examiners notice.
Pull a sample of transactions or activities that occurred after the corrective action was implemented. Apply the same test criteria that would have caught the original finding. If the sample passes, document the test methodology, sample selection, and results. If it fails, the remediation isn't complete.
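A minimal sketch of that validation step, assuming you can export post-remediation items and re-apply the original test criterion programmatically. The item fields and the pass/fail rule below (deadline met or missed) stand in for whatever the original finding actually tested.

```python
import random
from datetime import date
from typing import List, NamedTuple


class Item(NamedTuple):
    item_id: str
    completed: date
    deadline: date


def validate_remediation(population: List[Item], sample_size: int,
                         seed: int = 2024) -> dict:
    """Sample post-remediation activity and re-apply the original test criterion."""
    rng = random.Random(seed)                      # fixed seed keeps the sample reproducible
    sample = rng.sample(population, min(sample_size, len(population)))
    failures = [i for i in sample if i.completed > i.deadline]   # original test: was the deadline met?
    return {
        "sample_size": len(sample),
        "exceptions": [i.item_id for i in failures],
        "passed": not failures,                    # any exception means remediation isn't complete
    }
```

Whatever form the test takes, document the methodology, the seed or selection logic, and the results so the workpaper can be handed to examiners as-is.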
For FDIC-supervised institutions, examiners reference the Risk Management Manual of Examination Policies (Section 5.1) when evaluating remediation adequacy. OCC-supervised institutions should align with the OCC's Policies and Procedures Manual (PPM 5000-7) on supervisory follow-up.
6. Ongoing Monitoring
Closure isn't the end. Define how you'll monitor for recurrence:
- What metrics or reports will detect if the issue resurfaces?
- How frequently will monitoring occur?
- Who reviews the monitoring results?
- What triggers escalation?
Document this monitoring commitment. At the next exam, you'll be asked: "How do you know this finding hasn't recurred?"
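As one illustration, a recurrence-monitoring commitment can be reduced to a metric, a frequency, a reviewer, and an escalation rule. The names and the threshold below are placeholders for whatever your monitoring plan actually commits to.

```python
from dataclasses import dataclass


@dataclass
class MonitoringCheck:
    finding_id: str
    metric: str                   # report or metric that would detect recurrence
    frequency: str                # e.g., "monthly"
    reviewer: str                 # who reviews the results
    escalation_threshold: int     # number of exceptions that triggers escalation

    def evaluate(self, exceptions_this_period: int) -> str:
        if exceptions_this_period == 0:
            return "no recurrence detected"
        if exceptions_this_period >= self.escalation_threshold:
            return "escalate: finding may need to be reopened"
        return "exceptions noted; document review and continue monitoring"


check = MonitoringCheck("2024-EX-014", "Missed-deadline exception report",
                        "monthly", "Compliance Officer", escalation_threshold=1)
print(check.evaluate(exceptions_this_period=0))
```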
Building a Finding Tracking System That Examiners Trust
Examiners have seen every variation of finding tracking: sticky notes, Word documents, email chains, and elaborate spreadsheets that no one updates. What they want to see is a system that demonstrates active management oversight.
Required Tracking Fields
At minimum, track these attributes for each finding (a minimal schema sketch follows the list):
- Finding ID (unique identifier)
- Examination date and agency
- Finding text (verbatim from exam report)
- Severity classification
- Regulatory citation
- Root cause
- Corrective action plan (with phases if applicable)
- Owner
- Target completion date
- Actual completion date
- Evidence of completion (linked documents)
- Validation test results
- Board/committee reporting dates
- Current status (open, in progress, implemented, validated, closed)
- Monitoring plan post-closure
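Here is that record expressed as a single structure. Every field simply restates the list above; the names and defaults are illustrative, not a required format.

```python
from dataclasses import dataclass, field
from datetime import date
from typing import List, Optional


@dataclass
class FindingRecord:
    finding_id: str                      # unique identifier
    exam_date: date
    agency: str
    finding_text: str                    # verbatim from the exam report
    severity: str                        # violation, MRIA, MRA, or observation
    regulatory_citation: str
    root_cause: str
    corrective_action_plan: str          # or a link to the phased plan
    owner: str
    target_completion: date
    status: str = "open"                 # open / in progress / implemented / validated / closed / reopened
    actual_completion: Optional[date] = None
    evidence_links: List[str] = field(default_factory=list)
    validation_results: Optional[str] = None
    board_reporting_dates: List[date] = field(default_factory=list)
    monitoring_plan: Optional[str] = None

    def is_overdue(self, as_of: date) -> bool:
        return self.status not in ("validated", "closed") and self.target_completion < as_of
```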
Status Definitions That Mean Something
Avoid vague statuses. Define them precisely (a small state-machine sketch follows these definitions):
- Open - Finding received, corrective action plan not yet approved
- In Progress - Corrective action plan approved, implementation underway
- Implemented - All corrective actions completed, awaiting validation
- Validated - Testing confirms the fix works
- Closed - Validated, monitoring plan in place, reported to board/committee
- Reopened - Monitoring detected recurrence or validation failed
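These definitions behave like a small state machine. The sketch below encodes the allowed transitions so a tracker can reject status changes that skip validation; the transition map is an illustration of the definitions above, not an agency requirement.

```python
from enum import Enum


class Status(Enum):
    OPEN = "open"
    IN_PROGRESS = "in progress"
    IMPLEMENTED = "implemented"
    VALIDATED = "validated"
    CLOSED = "closed"
    REOPENED = "reopened"


# Allowed moves between statuses, mirroring the definitions above.
ALLOWED_TRANSITIONS = {
    Status.OPEN: {Status.IN_PROGRESS},
    Status.IN_PROGRESS: {Status.IMPLEMENTED},
    Status.IMPLEMENTED: {Status.VALIDATED, Status.REOPENED},   # validation failed -> reopened
    Status.VALIDATED: {Status.CLOSED},
    Status.CLOSED: {Status.REOPENED},                          # monitoring detected recurrence
    Status.REOPENED: {Status.IN_PROGRESS},
}


def change_status(current: Status, new: Status) -> Status:
    if new not in ALLOWED_TRANSITIONS[current]:
        raise ValueError(f"cannot move from {current.value} to {new.value}")
    return new
```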
Board and Committee Reporting
Examiners review board and committee minutes specifically for evidence of finding oversight. Your reporting should include:
- Total open findings by severity and age
- Newly identified findings
- Findings closed since last report
- Overdue findings with explanation
- Trend analysis (are findings increasing or decreasing by area?)
Report this at least quarterly. Monthly is better for institutions with active remediation efforts. The board should be asking questions about overdue items - and those questions should be documented in the minutes.
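A sketch of how those reporting metrics can be pulled from the same tracking data. It assumes records shaped like the tracking fields listed earlier and is only an example of the calculation, not a reporting template.

```python
from collections import Counter
from datetime import date


def board_report(findings: list[dict], as_of: date, last_report: date) -> dict:
    """Summarize finding status for board/committee reporting."""
    open_statuses = {"open", "in progress", "implemented", "reopened"}
    open_findings = [f for f in findings if f["status"] in open_statuses]
    return {
        "open_by_severity": Counter(f["severity"] for f in open_findings),
        "aging_days": {f["finding_id"]: (as_of - f["exam_date"]).days
                       for f in open_findings},
        "new_since_last_report": [f["finding_id"] for f in findings
                                  if f["exam_date"] > last_report],
        "closed_since_last_report": [f["finding_id"] for f in findings
                                     if f["status"] == "closed"
                                     and f.get("actual_completion")
                                     and f["actual_completion"] > last_report],
        "overdue": [f["finding_id"] for f in open_findings
                    if f["target_completion"] < as_of],
    }
```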
The Repeat Finding Problem
A finding that appears in consecutive examinations is the clearest signal of a compliance management system failure. Examiners interpret repeat findings as evidence that:
- Management didn't prioritize the original finding
- The corrective action was superficial (addressed the symptom, not the cause)
- Oversight mechanisms - board reporting, compliance monitoring, internal audit - failed to catch the gap
Repeat findings frequently trigger supervisory escalation. An MRA that becomes a repeat MRA may become a formal enforcement action. The OCC, FDIC, and Federal Reserve all have guidance indicating that unresolved or recurring examination findings are factors in determining whether to pursue enforcement (see OCC PPM 5000-7, Section III; FDIC Statement of Policy on Enforcement Actions).
Preventing repeats requires honest root cause analysis. If your corrective action for a training-related finding was "retrained all staff" and the finding recurs, the root cause wasn't knowledge - it was process, systems, or accountability. The next corrective action needs to address the actual cause.
Proving Closure: What Examiners Want to See
When examiners review prior findings, they walk through a specific validation:
- Was the corrective action plan adequate? Did it address root cause or just symptoms?
- Was it implemented on time? Missed deadlines without documented extensions signal low priority.
- Is there evidence of implementation? Not assertions - actual documents, records, and artifacts.
- Was it tested? Can you show post-remediation test results?
- Is monitoring in place? How will you detect recurrence?
Prepare a finding remediation package for each prior finding. Include: the original finding, root cause analysis, corrective action plan, implementation evidence, validation results, and monitoring plan. Hand this to examiners proactively on day one of the examination.
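If findings are tracked in a structure like the schema sketched earlier, the package is essentially a projection of that record. The section labels below are illustrative, not an examiner-specified format.

```python
def remediation_package(finding: dict) -> dict:
    """Assemble the per-finding package examiners walk through."""
    return {
        "original_finding": finding["finding_text"],
        "root_cause_analysis": finding["root_cause"],
        "corrective_action_plan": finding["corrective_action_plan"],
        "implementation_evidence": finding["evidence_links"],
        "validation_results": finding["validation_results"],
        "monitoring_plan": finding["monitoring_plan"],
    }
```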
Institutions that produce organized remediation packages set a tone for the entire exam. They demonstrate that findings are managed systematically, not reactively.
How Teams Close Findings Faster
The common thread among institutions that resolve findings without repeats: the remediation process is tracked the same way compliance work is tracked - with assigned owners, deadlines, escalation on overdue items, and evidence captured at every step.
Canarie maps examination findings into the same workflow engine that handles ongoing compliance execution. Each finding becomes a tracked workflow with phases, owners, deadlines, and evidence requirements. Board reporting pulls from the same system. When the next exam arrives, the remediation package is already assembled.
See how compliance teams track findings from identification to validated closure →
Frequently Asked Questions
What's the difference between an MRA and a violation?
An MRA (Matter Requiring Attention) identifies a practice that could lead to violations or unsafe conditions. It requires a corrective action plan but doesn't necessarily mean you've broken a law. A violation means a specific law, regulation, or requirement was breached. Both require remediation, but violations carry greater regulatory and legal risk. Some agencies also use MRIAs (Matters Requiring Immediate Attention) for urgent issues requiring action within 30-60 days.
How long do institutions typically have to remediate findings?
Timelines vary by severity and agency. MRIAs typically require action within 30-60 days. Standard MRAs usually allow 90-180 days, depending on complexity. There's no universal deadline - your corrective action plan proposes the timeline, and examiners either agree or request acceleration. Missing your own proposed deadlines is worse than proposing a longer timeline upfront.
Do findings ever "expire" or age off?
No. Findings remain open until they're remediated and validated, regardless of how much time passes. Older open findings draw more scrutiny, not less. An MRA that's been open for two exam cycles is far more problematic than a new finding.
Should we share our finding tracker with examiners?
Yes. Proactively providing your finding tracking system - showing statuses, evidence, and timeline compliance - demonstrates active management. Examiners will request this information regardless. Providing it upfront, organized and complete, sets a positive tone.
How do we handle findings we disagree with?
Respond through the formal examination report response process. Provide factual evidence supporting your position. Focus on documented facts, not interpretive disagreements. If the finding is factually incorrect and you can prove it, examiners may modify or withdraw it. For interpretive disagreements, you can respond in writing and escalate through the agency's appeals process if necessary.