DORA compliance showing up in DDQs
The EU's Digital Operational Resilience Act is appearing in a growing share of the DDQs we see from financial-services buyers. What the regulation asks for, what buyers actually want, and how to structure the response without inventing posture.
The EU’s Digital Operational Resilience Act — DORA — came into application in January 2025. Through most of 2025 we saw it quoted in a handful of DDQs from EU financial-services buyers. In the last quarter of 2025 and the first weeks of 2026, the frequency climbed sharply. Based on what we observe across the DDQs our customers put through the platform, DORA language now shows up in a meaningful share of them — and the share is higher in bids where the end customer is a bank, insurer, or payments processor regulated in an EU member state. That observation is directional and specific to the bids we see; it is not a market-wide measurement.
This is a procurement-side post. It is not a how-to-comply-with-DORA post — if you need that, go to the regulation and your compliance team. This is about how the regulation is showing up in vendor questionnaires, what kind of answer buyers are actually looking for, and how to structure that answer without inventing your compliance posture.
What DORA actually asks for, in one paragraph
DORA is a regulation on the digital operational resilience of EU financial institutions. Among other things, it requires those institutions to manage ICT risk from third-party providers — including software vendors — and to be able to demonstrate that management to supervisors. The regulation text is the primary source; the European Commission’s overview is the readable summary. EIOPA has published guidelines on how the third-party ICT-risk piece gets supervised in practice.
The downstream effect on vendors: if you sell into an EU financial institution, your customer’s compliance team has to demonstrate they have assessed your operational resilience and can oversee it. That assessment lives in the vendor questionnaire they send you.
What the DORA-influenced DDQ looks like
A DORA-influenced DDQ typically adds a cluster of questions on top of the standard security-questionnaire structure. The themes we see repeatedly:
- Concentration risk. Where else is your service hosted? If you are a SaaS vendor, what cloud provider, what region, what redundancy. What happens if your primary hosting region goes down.
- Sub-outsourcing. Who do you depend on. What fourth-party and fifth-party providers are part of your service delivery. How do you pass through resilience obligations.
- Incident notification. Within what timeframe do you notify customers of an ICT-related incident. What mechanism. What information gets provided.
- Exit and portability. If the customer terminates the contract, how does data come out, over what period, in what format. How is service continuity protected during migration.
- Audit and access rights. Can the customer’s regulator audit your service. Can the customer themselves audit. On what notice.
- Business-continuity testing. When did you last test your business-continuity plan. Under what scenarios. What did the test surface.
If these feel like the same questions that have always been in security questionnaires — you are half-right. They are security questionnaire themes with specific framing added. The framing comes from the regulation’s Article 28 obligations, and the buyer’s compliance team needs answers that map to those articles cleanly.
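One practical consequence of that framing: incoming DORA-influenced questions are worth tagging against the six themes above so they can be routed to the right KB section and the right owner. A minimal keyword-matching sketch, assuming a theme taxonomy of our own naming — the theme labels and keywords below are illustrative, not an official DORA taxonomy:

```python
# Hypothetical theme taxonomy for routing DDQ questions. The labels and
# keyword lists are illustrative only, not derived from the regulation.
THEME_KEYWORDS = {
    "concentration_risk": ["hosting region", "cloud provider", "redundancy"],
    "sub_outsourcing": ["fourth-party", "subcontractor", "sub-processor"],
    "incident_notification": ["notify", "incident", "timeframe"],
    "exit_portability": ["termination", "data export", "migration"],
    "audit_access": ["audit", "regulator", "access rights"],
    "bc_testing": ["business-continuity", "continuity test", "rto"],
}

def tag_question(question: str) -> list[str]:
    """Return every theme whose keywords appear in the question text."""
    q = question.lower()
    return [
        theme
        for theme, keywords in THEME_KEYWORDS.items()
        if any(kw in q for kw in keywords)
    ]
```

Real implementations usually grow past keyword matching, but even this level of routing is enough to make sure an incident-notification question never gets answered by a proposal writer working from a stale continuity block.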
What buyers actually want in the answer
Three things, in order:
1. Specificity. “Yes, we have a business-continuity plan” is not an answer. “Our business-continuity plan is tested quarterly, was last tested in October 2025, covers primary-region failure with a 4-hour RTO and 1-hour RPO, and the most recent test surfaced three findings which have been remediated” is an answer. The buyer’s compliance lead is reading for specifics they can reference in their own risk assessment. Vague attestations get flagged for follow-up and slow the procurement.
2. Document backing. Every substantive DORA answer should point to a document the buyer can review under NDA — the SOC 2 report, the ISO 27001 certificate, the DORA self-assessment artifact, the sub-processor list. Safe Security has documented that the questionnaire-and-document pattern is how mature security reviews actually work; DORA-influenced reviews double down on it because the customer needs the document in their regulatory file.
3. Honesty about gaps. If you are a Series-B SaaS company and you do not have a formal ICT third-party-risk program that maps to DORA’s Article 28, saying so — and describing what you do have — reads better than asserting a level of maturity you cannot back up with documents. Most buyers know what a Series-B looks like. They will either accept your posture or exclude you; they will not be fooled by confident-sounding boilerplate.
Where the proposal-side risk lives
This is the craft-essay part. There is a specific failure mode we see repeatedly on DORA-influenced DDQs, and it is not unique to DORA — it is just more dangerous here because the stakes are regulatory.
The failure mode: reusing a past DDQ answer without checking whether the answer is still true. A vendor answered “yes, we notify within 24 hours of a material incident” 18 months ago. The policy has since changed to 72 hours. The old answer gets pulled from the KB, reused, and submitted. The buyer’s compliance team takes the 24-hour commitment into the contract. The vendor is now contractually bound to something they cannot operationally meet.
This is the specific category of risk that AutogenAI has written about — not AI-invented answers, but human-blessed stale answers delivered with confidence. DORA answers are exactly the kind of answer that rots: the regulation is new, the policies behind the answers are being actively revised, and a 12-month-old KB block almost certainly needs re-verification before it ships.
The fix is what the SME collaboration piece named: freshness discipline. Every DORA-tagged KB block has a last-verified date. Anything older than six months triggers an explicit re-review by the compliance owner before it ships. Anything referring to a specific numeric commitment — hours-to-notification, RTO, RPO — triggers a re-review regardless of date if the underlying policy document has been revised. The retrieval layer surfaces the freshness score visibly so the drafter knows they are pulling current content.
What to do this quarter
If you bid into EU financial services and you have not done a pass on your KB for DORA-relevant questions, do it now. Identify every block that touches operational resilience, incident notification, sub-outsourcing, or continuity. Tag them. Set a freshness policy. Assign an owner — almost always someone from security, compliance, or legal, not proposal writers. Walk through the most recent DDQ you submitted and diff the answers against your current policies.
The goal is not to be “DORA-compliant in the DDQ” — compliance belongs to your compliance function, not your proposal function. The goal is to make sure the DDQ answer matches the compliance posture honestly. The gap between the two is where contractual and regulatory risk accumulates, and DORA is the regulation that is making that gap most visible right now.
I expect this post to age quickly. If DORA enforcement produces major test cases in 2026, or if the supervisory guidance evolves, the right answer to a DORA-influenced DDQ will change. The discipline — re-verify, document, be honest about gaps — will not.