
Producing DORA-Compliant Audit Evidence for Identity Controls

DORA supervisors are clear on one point: policy documents are not evidence that controls are operating. Financial entities in their first DORA supervisory cycle are discovering that what they submitted as evidence and what supervisors expected to receive are two different things. This article sets out the distinction — and the five specific artifacts that supervisors request for identity controls.

24 February 2026    Mark Ahearne, Founder & Director    7 min read

Policy Statements Versus Demonstrable Controls

DORA Article 9 requires financial entities to "develop, document, and implement" access management policies. The word "implement" is doing significant work in that sentence. Supervisors are not evaluating whether policies have been developed and documented. They are evaluating whether they have been implemented — meaning whether the controls described in the policy are actually operating in the production environment.

This is a distinction that organisations frequently miss in their first examination preparation. A policy document that says "privileged access is reviewed quarterly" is a statement of intent. A set of completed quarterly review records showing which accounts were reviewed, what was found, and what was done about it is evidence of implementation.

The RTS on ICT risk management — published by the Joint Committee of the ESAs and legally binding alongside DORA — makes the evidence requirement explicit. Article 13 of the RTS requires that entities "maintain records of all management actions taken" with respect to access controls, and that those records be "available for review by competent authorities." That is an evidence retention obligation, not a policy drafting obligation.

The supervisory rejection pattern: Based on early DORA examinations, supervisors are declining to accept the following as evidence of compliance: policy documents (any version), internal audit reports that reference policies rather than testing controls, verbal assurances that controls are in place, and system configuration screenshots without associated operational records.

The Five Evidence Artifacts for Identity Controls

Across early DORA supervisory reviews — including published examination findings from the Dutch Authority for the Financial Markets (AFM) and guidance from the EBA's supervisory methodology documents — five specific categories of identity control evidence appear in requests to examined entities. These are what supervisors ask for. If you cannot produce them, that inability is itself the finding.

1. Access Control Policy with Version History and Approval Record

A policy document is necessary but not sufficient. What supervisors need to see is a policy document that:

  • Carries a version number and date
  • Was approved by a named management authority (not a generic "IT Security team")
  • Has a documented review history showing it has been reviewed within the last 12 months
  • Reflects the actual controls operating in the environment

The last point is frequently where policy documents fail. Policies are often written to describe an ideal state rather than the current state, or they were accurate when written but have not been updated as the environment changed. A policy that describes quarterly access reviews when your actual practice is annual creates a gap — the policy itself becomes evidence of non-compliance.

2. MFA Coverage Report

Supervisors request an export — directly from the identity platform — showing the MFA enforcement status of every account with privileged access. This is not a document your team writes. It is a system-generated report from Entra ID, Active Directory, or your PAM tooling that shows, for each account:

  • Account name and type (human or service account)
  • Role or privilege level
  • MFA enforcement status (enforced, enabled but not enforced, disabled)
  • Last authentication date

For any accounts without MFA enforced, supervisors expect to see a documented exception — specifying the business reason, the risk acceptance sign-off by a named manager, the compensating controls in place, and a remediation target date. An exception without documentation is indistinguishable from a gap.
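As an illustration, that exception check can be sketched as a small script run against a platform export. The column names, account names, and exception register below are assumptions for illustration only; real field names vary by platform and by how your exception register is kept.

```python
import csv
import io

# Hypothetical export columns; real field names differ across
# Entra ID, Active Directory, and PAM tooling.
EXPORT = """account,type,role,mfa_status,last_auth
admin-jsmith,human,Global Admin,enforced,2026-02-20
svc-backup,service,Backup Operator,disabled,2026-02-19
break-glass-01,human,Global Admin,disabled,2025-11-02
"""

# Approved exceptions would come from your exception register;
# this entry is illustrative only.
EXCEPTIONS = {"break-glass-01"}  # documented risk acceptance on file

def mfa_gaps(export_csv, exceptions):
    """Accounts lacking enforced MFA and lacking a documented exception."""
    gaps = []
    for row in csv.DictReader(io.StringIO(export_csv)):
        if row["mfa_status"] != "enforced" and row["account"] not in exceptions:
            gaps.append(row["account"])
    return gaps

print(mfa_gaps(EXPORT, EXCEPTIONS))  # svc-backup has neither MFA nor an exception
```

Anything this check surfaces is exactly what a supervisor would treat as an undocumented gap.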

3. Privileged Account Inventory

A complete, current inventory of all privileged accounts. Complete means every account with elevated permissions — including service accounts, emergency access accounts, and third-party accounts — not just the accounts your team thinks of as "privileged." Current means maintained on a defined cycle, not assembled for the examination.

The inventory should show, for each account: account identifier; account type; privilege level or role; named human owner; business purpose; date last reviewed; and any open findings from the last review. The existence of a maintained inventory register — as opposed to a spreadsheet produced ad hoc — is itself an evidence point about the maturity of your access governance.
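A minimal sketch of such a register entry, with a staleness check against a defined review cycle. The 90-day cycle, field names, and account details are illustrative assumptions, not regulatory figures.

```python
from dataclasses import dataclass
from datetime import date, timedelta

@dataclass
class PrivilegedAccount:
    # Fields mirror the inventory attributes described above.
    identifier: str
    account_type: str       # human, service, emergency, third-party
    role: str
    owner: str              # named human owner, not a team
    business_purpose: str
    last_reviewed: date
    open_findings: int = 0

def stale_entries(inventory, as_of, cycle_days=90):
    """Entries whose last review falls outside the defined cycle."""
    cutoff = as_of - timedelta(days=cycle_days)
    return [a.identifier for a in inventory if a.last_reviewed < cutoff]

inventory = [
    PrivilegedAccount("admin-jsmith", "human", "Global Admin", "J. Smith",
                      "Tenant administration", date(2026, 1, 15)),
    PrivilegedAccount("svc-backup", "service", "Backup Operator", "A. Jones",
                      "Nightly backups", date(2025, 9, 1)),
]
print(stale_entries(inventory, as_of=date(2026, 2, 24)))  # svc-backup overdue
```

Running a check like this on the defined cycle, rather than ahead of an examination, is what distinguishes a maintained register from an ad hoc spreadsheet.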

4. Access Review Records

Completed access review records for each review cycle that has occurred in the supervisory period. Each record should show:

  • The scope of the review (which accounts, which systems)
  • Who conducted the review (named individuals, not a team name)
  • The date the review was completed
  • Findings: accounts identified for removal, permission reduction, or exception
  • The remediation actions taken, with completion dates

The remediation tracking element is where many organisations fail. An access review that produces a list of accounts to remove is not complete until those accounts have been removed. Supervisors test this by comparing the findings list from a review against a current account export — and checking whether the accounts identified for removal in the review were actually removed. If they were not, the review process is non-functional regardless of what the policy says about it.
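That supervisory test is a straightforward set comparison: the review's removal list against the current account export. The account names below are illustrative.

```python
def unremediated(flagged_for_removal, current_accounts):
    """Accounts a review flagged for removal that still appear in a later export."""
    return sorted(set(flagged_for_removal) & set(current_accounts))

# Illustrative data: findings from a completed review versus a
# current account export taken after the remediation deadline.
review_findings = ["old-contractor", "svc-legacy", "temp-admin"]
current_export = ["admin-jsmith", "svc-backup", "svc-legacy"]

print(unremediated(review_findings, current_export))  # svc-legacy still present
```

Any account the check returns is evidence that the review process produced findings but did not close them.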

5. Incident Logs and Exception Records

Records of identity-related incidents and access exceptions within the supervisory period. This covers: access policy exceptions and the risk acceptance process followed; identity-related security incidents (account compromises, unusual access patterns investigated, credentials reset following suspected exposure); and changes to privileged account scope or permissions, with the approval records for those changes.

The absence of any exception or incident records is not a good sign — it typically indicates the logging process is not functioning rather than that the organisation has a perfect record. Supervisors expect to see a small number of managed exceptions and incidents, handled through documented processes. No records at all suggests the processes are not operating.

Common Evidence Failures

The gap between what organisations submit as evidence and what supervisors need is consistent enough across early examinations that the failure patterns are predictable.

  • Submitting policies instead of records: The most common failure. Policy documents describe what should happen. Supervisors need records of what did happen. Submitting a "Privileged Access Management Policy" in response to a request for access review records is not a misunderstanding — it is a signal that the records do not exist.
  • System-generated reports without operational context: An Active Directory export showing all accounts with privileged roles is useful as a starting point, but without accompanying review records showing that someone examined that list, made decisions about it, and took action, it is not evidence of a functioning access management process.
  • Access reviews completed but not documented: Teams that conduct access reviews verbally or in meetings without producing a written output leave themselves unable to evidence the review. The review happened — but without documentation, it did not happen as far as a supervisor is concerned.
  • Evidence assembled in response to the examination request: Producing a privileged account inventory the day after an examination request is issued creates two problems: it signals that no maintained inventory existed beforehand, and any data in it is current rather than representing the period under examination. Supervisors look at document metadata and are familiar with this pattern.
  • Incomplete scope: Submitting access review records that cover human privileged accounts but not service accounts, or that cover on-premises systems but not cloud environments. Supervisors are aware that hybrid environments have different control surfaces and will specifically ask about each.

Building an Evidence Pack Before a Supervisory Review

The principle is simple: your evidence pack should exist before a supervisory examination begins, not be created in response to one. The operational records that constitute DORA evidence are the by-products of a functioning access governance programme. If your programme is functioning, the evidence exists. If you need to create it for the examination, the programme is not functioning.

The practical steps to build a maintainable evidence base:

  • Establish a document management process for access governance outputs: Access review records, exception approvals, incident logs, and policy documents should be stored in a named location with a defined retention period. They should not live in email threads or personal drives.
  • Template your access review outputs: Create a standard template that captures all the fields a supervisor would expect to see — scope, reviewer, date, findings, actions, completion dates. Consistency makes it easier to demonstrate a repeatable process.
  • Schedule regular exports from your identity platform: MFA coverage reports and account inventories should be generated on a defined schedule — not just when someone requests them. Scheduled exports with date stamps demonstrate a maintained oversight process.
  • Track remediation actions to completion: Every finding from an access review should be tracked in a register until it is resolved. Incomplete remediation is the most common reason access review records fail supervisory scrutiny.
  • Review your evidence pack before the examination cycle reaches you: Three months before you expect to be in a supervisory examination, review the evidence you have against the five categories above. Identify gaps and close them — this is materially better than identifying them during an examination.
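The gap review in the final step can be as simple as confirming each of the five categories has an artifact on file. The category keys and file names below are illustrative assumptions about how a pack might be organised, not a prescribed structure.

```python
# One entry per evidence category from the article; None marks a gap.
REQUIRED = {
    "access_control_policy": "policy-v3.2-approved.pdf",
    "mfa_coverage_report": None,  # missing: a gap to close before examination
    "privileged_account_inventory": "inventory-2026-02.csv",
    "access_review_records": "q4-2025-review.xlsx",
    "exception_incident_log": "exceptions-register.csv",
}

def evidence_gaps(pack):
    """Evidence categories with no artifact on file."""
    return [category for category, artifact in pack.items() if artifact is None]

print(evidence_gaps(REQUIRED))  # the MFA coverage report is missing
```

A review this simple, run three months ahead of the examination cycle, is enough to turn a supervisory finding into an internal action item.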

The organisations that perform well in DORA supervisory examinations are not necessarily those with the most mature technical controls. They are the organisations whose controls, whatever their maturity level, are documented, evidenced, and consistently maintained. In a supervisory examination, a less sophisticated control that is fully evidenced will outperform a sophisticated control that exists only in policy.

DORA Audit Evidence Preparation from IdentityFirst

IdentityFirst's DORA Compliance and Audit service includes a structured evidence preparation engagement — reviewing your current identity control records against the five evidence categories that supervisors request, identifying gaps in your evidence base, and producing the missing artifacts where controls are operating but not documented.

The output is a complete DORA identity controls evidence pack: the policy with version history, the MFA coverage report, the privileged account inventory, the access review records, and the exception and incident logs — structured for rapid production in response to a supervisory request, and maintained on a cycle that keeps it current.
