Saturday, December 13, 2025
10 Compliance dashboard risk metrics your board actually cares about


Boards do not want 60-page compliance packs. They want a crisp view of exposure, whether the company is within appetite, and how quickly risks are being reduced. For mid-sized enterprises, the difference between a dashboard the board skims past and one it actually uses comes down to choosing a handful of outcome-focused metrics, measured consistently and tied to core frameworks like Loi Sapin II, ISO 37001, UNE 19601 and 19603, AML rules, and the EU AI Act.

What great board dashboards have in common
- Oriented to outcomes, not activities. Show risk reduction, velocity, and breaches, not only how many trainings or policies you produced.
- Anchored in risk appetite. Every metric should display a target or threshold so red, amber, green status is unambiguous.
- Trend based. Show at least four quarters, plus a 90‑day forecast where possible.
- Framework aware. Map each metric to key obligations, for example Sapin II’s eight pillars, ISO 37001 clauses, UNE 19601 and 19603 requirements, AML monitoring, and AI Act readiness.
- Decision ready. One line of narrative per metric, and one action owner for the next step.
The 10 board-level compliance risk metrics
1 Residual risk index and trend
What it shows: Your current exposure after controls, by top risk themes like corruption, antitrust, AML, criminal liability, and high-risk AI. Boards use it to see if exposure is going down and where concentration sits.
How to measure: For each risk, compute Residual Risk = Inherent Risk score multiplied by (1 minus Control Effectiveness). Create a weighted average or sum for a portfolio index. Trend quarterly.
Why it matters: Required by Sapin II risk mapping and core to ISO 37001 and UNE 19601 methodologies. It is the simplest way to answer the question: are we safer than last quarter?
Data sources: Risk assessment records, control test results.
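To make the arithmetic concrete, here is a minimal Python sketch of the residual risk calculation and a weighted portfolio index. The field names, weights, and the 1 to 5 inherent scoring scale are illustrative assumptions, not a prescribed data model.

```python
# Minimal sketch of a residual risk index, assuming a risk register exported
# as a list of dicts with illustrative field names and a 1-5 inherent scale.
risks = [
    {"theme": "corruption",   "inherent": 4.5, "control_effectiveness": 0.70, "weight": 0.30},
    {"theme": "AML",          "inherent": 4.0, "control_effectiveness": 0.60, "weight": 0.25},
    {"theme": "antitrust",    "inherent": 3.5, "control_effectiveness": 0.80, "weight": 0.20},
    {"theme": "high-risk AI", "inherent": 3.0, "control_effectiveness": 0.50, "weight": 0.25},
]

def residual(risk):
    # Residual risk = inherent score x (1 - control effectiveness)
    return risk["inherent"] * (1 - risk["control_effectiveness"])

# Weighted portfolio index, trended quarter over quarter
portfolio_index = sum(residual(r) * r["weight"] for r in risks)

for r in risks:
    print(f'{r["theme"]}: residual {residual(r):.2f}')
print(f"Portfolio residual risk index: {portfolio_index:.2f}")
```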
2 Appetite breaches and days out of appetite
What it shows: Where the company exceeded board-approved thresholds, for example residual risk over 3.0 or control effectiveness under 80 percent, and how long those breaches lasted.
How to measure: Count total breaches in the quarter, plus cumulative days out of appetite for each. Display a bar by risk theme.
Why it matters: Links the dashboard directly to board policy. Turning red areas to green becomes the quarterly priority list.
Data sources: Risk register, approved appetite statements, KRIs.
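A small sketch of how breaches and days out of appetite could be tallied from a risk register extract. The thresholds mirror the examples above; the breach records and field names are hypothetical.

```python
from datetime import date

# Illustrative sketch: counting appetite breaches and cumulative days out of
# appetite per risk theme. Thresholds and records are hypothetical examples.
APPETITE = {"residual_risk_max": 3.0, "control_effectiveness_min": 0.80}

breaches = [
    # (theme, breached metric, breach start, breach end or None if still open)
    ("corruption", "residual_risk", date(2025, 7, 14), date(2025, 9, 2)),
    ("AML", "control_effectiveness", date(2025, 8, 1), None),
]

as_of = date(2025, 9, 30)
days_out = {}
for theme, metric, start, end in breaches:
    closed = end or as_of
    days_out[theme] = days_out.get(theme, 0) + (closed - start).days

print(f"Breaches this quarter: {len(breaches)}")
for theme, days in days_out.items():
    print(f"{theme}: {days} days out of appetite")
```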
3 Control effectiveness rate
What it shows: The proportion of tested key controls rated effective in design and operation. Break out critical anti-bribery, AML, antitrust, and criminal compliance controls.
How to measure: Effective controls divided by total controls tested in the period. Include a separate rate for repeat failures.
Why it matters: ISO 37001 and UNE 19601 emphasize control design and operating effectiveness. A falling rate signals higher residual risk and potential regulator scrutiny.
Data sources: Internal control testing, internal audit results.
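A minimal sketch of the effectiveness and repeat failure rates, assuming control test results are exported with a simple result code and a repeat failure flag; both fields are illustrative.

```python
# Sketch of a control effectiveness rate from test results. The result codes
# and the "repeat_failure" flag are assumptions about the testing export.
tests = [
    {"control": "ABC-01", "domain": "anti-bribery", "result": "effective",   "repeat_failure": False},
    {"control": "AML-07", "domain": "AML",          "result": "ineffective", "repeat_failure": True},
    {"control": "ANT-03", "domain": "antitrust",    "result": "effective",   "repeat_failure": False},
    {"control": "CRM-02", "domain": "criminal",     "result": "effective",   "repeat_failure": False},
]

tested = len(tests)
effective = sum(t["result"] == "effective" for t in tests)
repeat_failures = sum(t["repeat_failure"] for t in tests)

print(f"Control effectiveness rate: {effective / tested:.0%}")
print(f"Repeat failure rate: {repeat_failures / tested:.0%}")
```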
4 Third‑party risk coverage and spend at risk
What it shows: The percentage of in-scope high‑risk intermediaries and suppliers with current due diligence, plus the portion of spend flowing to parties without up‑to‑date clearance.
How to measure: Coverage = high‑risk third parties with valid due diligence divided by total high‑risk third parties. Spend at risk = annualized spend with parties missing due diligence divided by total third‑party spend.
Why it matters: Central to Sapin II, ISO 37001, and AML. Also relevant to antitrust in distribution networks under UNE 19603.
Data sources: Procurement and AP data, third‑party due diligence system, contract repository.
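The two ratios could be computed along these lines. The field names and spend figures are invented; in practice the inputs would come from TPRM and accounts payable extracts.

```python
# Sketch of third-party coverage and spend at risk with illustrative records.
third_parties = [
    {"name": "Agent A",    "high_risk": True,  "dd_current": True,  "annual_spend": 1_200_000},
    {"name": "Reseller B", "high_risk": True,  "dd_current": False, "annual_spend": 800_000},
    {"name": "Supplier C", "high_risk": False, "dd_current": True,  "annual_spend": 2_500_000},
]

# Coverage = high-risk third parties with valid due diligence / total high-risk third parties
high_risk = [t for t in third_parties if t["high_risk"]]
coverage = sum(t["dd_current"] for t in high_risk) / len(high_risk)

# Spend at risk = spend with parties missing due diligence / total third-party spend
total_spend = sum(t["annual_spend"] for t in third_parties)
spend_at_risk = sum(t["annual_spend"] for t in third_parties if not t["dd_current"]) / total_spend

print(f"High-risk due diligence coverage: {coverage:.0%}")
print(f"Spend at risk: {spend_at_risk:.0%}")
```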
5 Speak‑up culture health
What it shows: Whether your whistleblowing and investigations process functions as an early warning system.
How to measure: Report rate per 100 employees, substantiation rate, and median days to close cases. Segment by country or business unit to spot anomalies.
Why it matters: Sapin II and UNE 19601 require effective reporting channels. Healthy rates with timely, fair investigations correlate with lower incident severity.
Data sources: Hotline and case management tools, HRIS for denominator.
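A short sketch of the three speak-up indicators, assuming a case management export and an HRIS headcount for the denominator; the records shown are illustrative.

```python
from statistics import median

# Sketch of speak-up health indicators from hypothetical case records.
headcount = 1_850  # from HRIS
cases = [
    {"unit": "FR sales", "substantiated": True,  "days_to_close": 34},
    {"unit": "ES ops",   "substantiated": False, "days_to_close": 21},
    {"unit": "FR sales", "substantiated": True,  "days_to_close": 58},
]

report_rate = len(cases) / headcount * 100  # reports per 100 employees
substantiation_rate = sum(c["substantiated"] for c in cases) / len(cases)
median_days_to_close = median(c["days_to_close"] for c in cases)

print(f"Report rate per 100 employees: {report_rate:.2f}")
print(f"Substantiation rate: {substantiation_rate:.0%}")
print(f"Median days to close: {median_days_to_close}")
```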
6 Remediation velocity and backlog aging
What it shows: How fast material issues are being fixed and whether overdue actions are accumulating.
How to measure: Percent of high‑severity actions closed on time, median days overdue for open actions, and count of items aged over 90 days.
Why it matters: Regulators look at whether issues are fixed promptly. Boards use this to hold owners accountable and allocate resources.
Data sources: Corrective action plans, workflow trackers, audit follow‑up logs.
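A sketch of remediation velocity and backlog aging from a corrective action extract. The field names and example records are assumptions, while the 90-day aging threshold follows the definition above.

```python
from datetime import date
from statistics import median

# Sketch of remediation velocity and backlog aging from a CAPA-style extract.
as_of = date(2025, 9, 30)
actions = [
    {"severity": "high", "due": date(2025, 8, 15), "closed": date(2025, 8, 10)},
    {"severity": "high", "due": date(2025, 7, 1),  "closed": None},
    {"severity": "high", "due": date(2025, 5, 20), "closed": None},
]

high = [a for a in actions if a["severity"] == "high"]
closed_on_time = sum(1 for a in high if a["closed"] and a["closed"] <= a["due"])
on_time_rate = closed_on_time / len(high)

# Open, overdue items: days past due and count aged over 90 days
open_overdue = [(as_of - a["due"]).days for a in high if a["closed"] is None and a["due"] < as_of]
aged_over_90 = sum(1 for d in open_overdue if d > 90)

print(f"High-severity actions closed on time: {on_time_rate:.0%}")
print(f"Median days overdue (open items): {median(open_overdue) if open_overdue else 0}")
print(f"Items aged over 90 days: {aged_over_90}")
```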
7 Training and policy attestation in high‑risk roles
What it shows: Coverage and timeliness for the roles that move your risk needle, for example sales, procurement, finance, market access, and data science teams.
How to measure: On‑time completion rate for mandatory modules and policy attestations in high‑risk roles, not a companywide average. Include a short post‑assessment improvement delta to indicate effectiveness.
Why it matters: Sapin II, ISO 37001, UNE 19601 and 19603 all require risk‑based training. Boards want assurance that critical populations got what they need, on time.
Data sources: LMS, HRIS, policy management system.
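A minimal sketch of on-time completion and the post-assessment improvement delta for high-risk roles, assuming an LMS export joined to HRIS role data; the role list and scores are illustrative.

```python
# Sketch of high-risk-role training coverage from a hypothetical LMS/HRIS join.
HIGH_RISK_ROLES = {"sales", "procurement", "finance", "market access", "data science"}

learners = [
    {"role": "sales",       "completed_on_time": True,  "pre_score": 62, "post_score": 88},
    {"role": "procurement", "completed_on_time": False, "pre_score": 55, "post_score": 70},
    {"role": "finance",     "completed_on_time": True,  "pre_score": 71, "post_score": 90},
]

# Scope to high-risk roles rather than a companywide average
in_scope = [l for l in learners if l["role"] in HIGH_RISK_ROLES]
on_time_rate = sum(l["completed_on_time"] for l in in_scope) / len(in_scope)
avg_improvement = sum(l["post_score"] - l["pre_score"] for l in in_scope) / len(in_scope)

print(f"On-time completion, high-risk roles: {on_time_rate:.0%}")
print(f"Average post-assessment improvement: {avg_improvement:.1f} points")
```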
8 Monitoring and screening quality
What it shows: The productivity and quality of your compliance monitoring. It should prove you are catching what matters without drowning the team.
How to measure: Alerts closed within SLA, true positive rate, and median time to escalate hits that meet reporting thresholds. For AML, show suspicious activity report timeliness.
Why it matters: An efficient, responsive monitoring program is a core AML expectation and supports antitrust early detection under UNE 19603.
Data sources: Monitoring tools, case management, regulatory reporting logs.
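The monitoring quality indicators could be computed as follows; the 10-day SLA and the alert records are illustrative assumptions.

```python
from statistics import median

# Sketch of monitoring quality KPIs from a hypothetical alert queue export.
SLA_DAYS = 10
alerts = [
    {"days_to_close": 4,  "true_positive": False, "escalated_days": None},
    {"days_to_close": 12, "true_positive": True,  "escalated_days": 3},
    {"days_to_close": 7,  "true_positive": True,  "escalated_days": 5},
]

within_sla = sum(a["days_to_close"] <= SLA_DAYS for a in alerts) / len(alerts)
true_positive_rate = sum(a["true_positive"] for a in alerts) / len(alerts)
escalation_times = [a["escalated_days"] for a in alerts if a["escalated_days"] is not None]

print(f"Alerts closed within SLA: {within_sla:.0%}")
print(f"True positive rate: {true_positive_rate:.0%}")
print(f"Median days to escalate reportable hits: {median(escalation_times)}")
```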
9 Regulatory readiness score by framework
What it shows: A one‑glance view of where you stand against key frameworks, for example Sapin II, ISO 37001, UNE 19601 and 19603, AML, and the EU AI Act.
How to measure: Score each framework on required components, for example eight pillars under Sapin II or high‑risk AI obligations, then compute percent complete and maturity level. Display the two lowest‑scoring elements per framework with owner and next action.
Why it matters: Boards need to see exposure to enforcement in specific regimes, not just generic readiness. The AI Act’s phased obligations from 2025 onward make this especially timely.
Data sources: Program documentation, controls inventory, AI system register, legal gap assessments.
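A sketch of a per-framework readiness score. The Sapin II components follow the eight pillars named above; the AI Act components and all completion values are illustrative placeholders.

```python
# Sketch of a per-framework readiness score. Component completion values
# (0 to 1) are placeholders, not a full obligations inventory.
frameworks = {
    "Sapin II": {"risk mapping": 1.0, "code of conduct": 1.0, "third-party DD": 0.6,
                 "training": 0.8, "whistleblowing": 1.0, "accounting controls": 0.7,
                 "sanctions regime": 0.9, "monitoring": 0.5},
    "EU AI Act": {"AI system register": 0.4, "high-risk classification": 0.3,
                  "transparency notices": 0.6, "human oversight": 0.5},
}

for name, components in frameworks.items():
    percent_complete = sum(components.values()) / len(components)
    # Surface the two weakest components so an owner and next action can be attached
    weakest = sorted(components, key=components.get)[:2]
    print(f"{name}: {percent_complete:.0%} complete, weakest: {', '.join(weakest)}")
```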
10 Incident impact and self‑disclosure posture
What it shows: The severity and cost of recent compliance incidents and whether voluntary self‑disclosure is reducing penalty exposure.
How to measure: Rolling 12‑month incident count by severity, cumulative financial impact, and number of self‑disclosures or regulator interactions. Include a narrative on outcomes and lessons learned.
Why it matters: Boards must understand downside realized and how proactive cooperation shapes enforcement outcomes, especially in AML and anti‑corruption cases.
Data sources: Incident registers, legal and finance records, regulator correspondence.
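A small sketch of the rolling 12-month roll-up; the incident records, severity labels, and costs are invented for illustration.

```python
from datetime import date, timedelta
from collections import Counter

# Sketch of rolling 12-month incident impact figures from invented records.
as_of = date(2025, 9, 30)
window_start = as_of - timedelta(days=365)

incidents = [
    {"date": date(2025, 2, 10), "severity": "high",   "cost": 250_000, "self_disclosed": True},
    {"date": date(2025, 6, 3),  "severity": "medium", "cost": 40_000,  "self_disclosed": False},
    {"date": date(2024, 8, 1),  "severity": "low",    "cost": 5_000,   "self_disclosed": False},
]

in_window = [i for i in incidents if window_start <= i["date"] <= as_of]
by_severity = Counter(i["severity"] for i in in_window)
total_cost = sum(i["cost"] for i in in_window)
disclosures = sum(i["self_disclosed"] for i in in_window)

print(f"Rolling 12-month incidents by severity: {dict(by_severity)}")
print(f"Cumulative financial impact: {total_cost:,}")
print(f"Self-disclosures or regulator interactions: {disclosures}")
```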
Quick reference: the 10 metrics at a glance
| Metric | What it tells the board | Primary data sources |
|---|---|---|
| Residual risk index | Exposure after controls, trend | Risk register, control testing |
| Appetite breaches | Governance discipline vs thresholds | Appetite statements, KRIs |
| Control effectiveness rate | Design and operating strength | ICFR and compliance tests, audit |
| Third‑party coverage, spend at risk | Exposure via intermediaries and suppliers | Procurement, TPRM, contracts |
| Speak‑up health | Early warning and culture | Hotline, investigations, HRIS |
| Remediation velocity | Pace of risk reduction | CAPA trackers, audit follow‑up |
| High‑risk role training and attestations | Whether critical staff are equipped | LMS, HRIS, policy system |
| Monitoring quality | Detection efficiency and timeliness | AML and screening systems |
| Regulatory readiness score | Gap to obligations by regime | Program docs, AI inventory |
| Incident impact and disclosures | Realized downside and cooperation | Incident and legal records |
Presenting the metrics in a board‑ready way
- One slide per metric, with trend, threshold, and a single line that explains the variance or risk driver.
- A short “so what” and a named owner with a 90‑day action. Limit text to two lines under each chart.
- Use consistent scales and a simple red, amber, green legend tied to appetite, as sketched after this list. Do not change targets quarter to quarter without board approval.
- Segment only when it helps a decision, for example third‑party coverage by region or business unit.
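As referenced above, here is a minimal sketch of a red, amber, green status function tied to board-approved appetite thresholds. The 10 percent amber buffer is an illustrative convention to adapt to your own appetite statement.

```python
# Minimal sketch of a RAG status tied to appetite thresholds.
# The 10 percent amber buffer around the threshold is an illustrative choice.
def rag_status(value: float, threshold: float, higher_is_better: bool) -> str:
    buffer = 0.10 * threshold
    if higher_is_better:
        if value >= threshold:
            return "green"
        return "amber" if value >= threshold - buffer else "red"
    if value <= threshold:
        return "green"
    return "amber" if value <= threshold + buffer else "red"

print(rag_status(0.78, 0.80, higher_is_better=True))   # control effectiveness vs an 80% floor
print(rag_status(3.4, 3.0, higher_is_better=False))    # residual risk vs a 3.0 ceiling
```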
Implementation tips for mid‑sized enterprises
- Start where risk is concentrated. If most exposure is third‑party and sales driven, prioritize metrics 1, 2, 4, and 7, then layer the rest.
- Stabilize definitions. Lock formulas and data dictionaries so results are comparable quarter to quarter. Document scope and denominators.
- Automate inputs early. Pull headcount and role data from HRIS, third‑party data from procurement, alerts from monitoring tools, and case data from your hotline to reduce spreadsheet risk.
- Tie metrics to incentives. Make remediation velocity and appetite breaches visible in performance reviews for accountable leaders.
- Review quarterly, deep dive annually. Run a quarterly board update and an annual recalibration of appetite and risk weights.
How Naltilia helps
Naltilia’s AI platform streamlines the collection and maintenance of the inputs behind these metrics, for example risk assessments and control testing outcomes. With automated data collection and compliance workflow automation, you can reduce manual work, keep remediation actions moving, and present consistent, audit‑ready numbers to your board. If you want to accelerate how these metrics are produced and maintained, book a demo.
By focusing on these ten metrics, aligning them with risk appetite, and automating the underlying data flows, your board will gain a clear, decision‑ready view of compliance risk, and your team will gain time back to reduce it.