Data flow, threat modelling and risk analysis: A practical guide for cybersecurity professionals

by David Soto Dalmau (ERNI Spain)

Cybersecurity today is as much about anticipation as it is about defence. Modern systems face complex architectures, regulatory pressure and increasingly sophisticated threats. For security professionals, tools like data flow diagrams, threat modelling and risk analysis are no longer optional checklists – they are practical methods to uncover weaknesses, prioritise risks and demonstrate compliance.

Why these techniques matter: From engineering to compliance

In modern system design, data flow diagrams (DFDs), threat modelling and risk analysis are not optional. These techniques help identify weak points before attackers do – but more than that, they are increasingly demanded by regulatory frameworks.

Key regulatory drivers

Cyber Resilience Act (CRA)

Mandates secure-by-design development and lifecycle risk management. Article 10 stresses understanding component interaction, and Article 15 requires risk-based assessment and residual risk evaluation.

ENS (Esquema Nacional de Seguridad – Spain)

Enforces risk analysis and threat identification in information systems classified as medium or high. It demands segmentation, access controls and traceability.

ISO/IEC 27001 & ISO/IEC 27005

Require data flow mapping (Annex A.8), identification of potential threats, and structured risk analysis. These are foundational for building and maintaining an Information Security Management System (ISMS).

The bottom line: these aren’t just good engineering practices – they’re foundational requirements to meet compliance, pass audits and protect critical assets.

Data flow diagram (DFD): Understand before you protect

A data flow diagram isn’t just a technical drawing – it’s your first line of defence. If your team doesn’t understand where sensitive data lives, how it travels or who has access to it, no control will be reliable.

Step-by-step approach:

  • External actors: These are entities outside your control, such as users, external systems, third-party APIs, or devices. Label them clearly.
  • Processes: Represent backend services, API gateways, cron jobs, identity management services – anything that “does something” with data.
  • Data stores: SQL databases, NoSQL stores, log aggregators, blob storage (e.g., S3) or file systems. Note the sensitivity of each.
  • Data flows: Every interaction or data exchange. HTTP(S) requests, message queues, database reads/writes, webhook calls, etc.
  • Trust boundaries: These are crucial. Define where control, responsibility or protection levels change – e.g., between browser and API, public vs internal zones, cloud vs on-prem, or inter-team ownership.

Example: Login flow in a web application

Consider the following architecture:

  • Frontend: React SPA served via CDN
  • Backend: Node.js Express API
  • Authentication: External OAuth2 provider
  • Data store: PostgreSQL database (sessions and user profiles)

This is what your DFD might look like:

  1. User submits credentials → SPA sends request to /auth endpoint
  2. Backend validates input and delegates to OAuth2 for token exchange
  3. OAuth2 responds with ID token → stored in session DB
  4. Session ID returned to frontend → used in future requests
  5. Trust boundaries exist between:
    • Browser and backend (public zone)
    • Backend and external OAuth provider (third-party integration)
    • Backend and internal database (protected zone)

This visual representation already highlights:

  • Surface exposed to users
  • External dependencies
  • Internal data flows and access control zones
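One way to make this baseline reviewable alongside the code is to capture the DFD as data rather than only as a drawing. The sketch below (Python, with hypothetical names; the article's stack is Node.js, so treat this as illustrative) models the login flows and flags every flow that crosses a trust boundary:

```python
from dataclasses import dataclass

# Hypothetical, minimal model of the login-flow DFD: each flow records its
# source, destination, and the trust zone each endpoint lives in.
@dataclass(frozen=True)
class Flow:
    source: str
    dest: str
    source_zone: str
    dest_zone: str

flows = [
    Flow("Browser (SPA)", "Auth API (/auth)", "public", "backend"),
    Flow("Auth API", "OAuth2 provider", "backend", "third-party"),
    Flow("Auth API", "Session DB (PostgreSQL)", "backend", "protected"),
    Flow("Auth API", "Browser (SPA)", "backend", "public"),
    Flow("Auth API", "Token cache", "backend", "backend"),  # stays in-zone
]

def crossing_flows(flows):
    """Flows whose endpoints sit in different zones cross a trust boundary
    and deserve extra scrutiny (authentication, encryption, validation)."""
    return [f for f in flows if f.source_zone != f.dest_zone]

for f in crossing_flows(flows):
    print(f"{f.source} -> {f.dest}: {f.source_zone} -> {f.dest_zone}")
```

Here four of the five flows cross a boundary; the in-zone cache access does not, which is exactly the kind of distinction that focuses review effort.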

Without this baseline, risk conversations are blind.

Sample DFD

Threat modelling: Simulate before you’re simulated

Once you know how the system works, you need to model how it could be abused. Threat modelling helps uncover the conditions under which security controls might fail – and how attackers might exploit that.

When and how to apply it:

  • Perform threat modelling during design, and revisit after major architectural changes.
  • Apply to each component and flow in the DFD.
  • Use a structured framework to avoid blind spots.

Recommended frameworks:

  • STRIDE:
    A simple but effective model to classify threats:
    • Spoofing (e.g., forging identities)
    • Tampering (modifying data or code)
    • Repudiation (actions can’t be tracked)
    • Information disclosure (data leaks)
    • Denial of service (system overload)
    • Elevation of privilege (gaining unauthorised access)
  • PASTA (Process for attack simulation and threat analysis):
    A 7-stage framework focused on business impact and risk. Strong for systems with regulatory exposure or complex attack surfaces. Steps include defining objectives, decomposing the application, identifying threats, simulating attacks and quantifying risks.
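Applying STRIDE "to each component and flow in the DFD" can be turned into a mechanical worksheet. The sketch below uses the classic element-type mapping (external entities attract spoofing and repudiation; processes attract all six categories; and so on) to generate a review checklist; names are hypothetical:

```python
# STRIDE-per-element: which threat categories typically apply to each
# DFD element type (based on the classic Microsoft mapping).
STRIDE_BY_ELEMENT = {
    "external_actor": {"Spoofing", "Repudiation"},
    "process": {"Spoofing", "Tampering", "Repudiation",
                "Information disclosure", "Denial of service",
                "Elevation of privilege"},
    "data_store": {"Tampering", "Repudiation",  # R applies to audit/log stores
                   "Information disclosure", "Denial of service"},
    "data_flow": {"Tampering", "Information disclosure",
                  "Denial of service"},
}

def checklist(elements):
    """Yield (element, threat) pairs to review — a starting worksheet,
    not a finished threat model."""
    for name, kind in elements:
        for threat in sorted(STRIDE_BY_ELEMENT[kind]):
            yield name, threat

dfd = [("User", "external_actor"), ("Auth API", "process"),
       ("Session DB", "data_store"), ("Login request", "data_flow")]
for element, threat in checklist(dfd):
    print(f"[ ] {element}: {threat}")
```

The point of the mapping is avoiding blind spots: every element gets its applicable categories, even the ones the team assumes are "obviously fine".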

Example: Using STRIDE on the Auth API

Let’s take the login flow DFD described earlier.

  • Spoofing: Could an attacker forge a login request?
    → Weak JWT validation or unsigned tokens allow impersonation.
  • Tampering: Could an attacker manipulate data in transit?
    → Using HTTP instead of HTTPS opens the door to MITM attacks.
  • Repudiation: Could a user deny having logged in?
    → If login events are not logged with user IP and timestamp, auditability is lost.
  • Information disclosure: Could data leak?
    → If the API returns error traces or user details without access checks.
  • DoS: Could the login endpoint be overwhelmed?
    → No rate limiting exposes the service to brute-force or resource exhaustion.
  • Elevation of privilege: Could a normal user act as admin?
    → JWTs with editable claims (e.g., “role”: “admin”) and missing signature verification.

Example: Using PASTA on the same system

  • Define objectives: Secure user authentication and session control
  • Decompose app: Identify exposed endpoints (/auth, /me, /logout), storage (sessions, profiles)
  • Enumerate threats: Session fixation, credential stuffing, token replay
  • Analyse vulnerabilities: No IP/session binding, long token lifetimes
  • Simulate attacks: Replay JWT from another device
  • Analyse impact: Unauthorised access to PII
  • Plan mitigation: Implement token expiration, device fingerprinting, IP change invalidation
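The mitigation step above can be sketched as code. The snippet below (Python, with hypothetical names and a hand-rolled in-memory store purely for illustration) binds each session to a device fingerprint and client IP at login, expires it after a short lifetime, and revokes it on a replay attempt from elsewhere:

```python
import time

TOKEN_TTL = 15 * 60  # token lifetime in seconds (hypothetical policy)

def issue(session_store, token, fingerprint, ip, now=None):
    """Record the device fingerprint and IP observed at login."""
    now = time.time() if now is None else now
    session_store[token] = {"fp": fingerprint, "ip": ip, "exp": now + TOKEN_TTL}

def validate(session_store, token, fingerprint, ip, now=None):
    """Accept the token only from the device and IP it was issued to."""
    now = time.time() if now is None else now
    s = session_store.get(token)
    if s is None or now > s["exp"]:
        return False              # unknown or expired token
    if s["fp"] != fingerprint or s["ip"] != ip:
        del session_store[token]  # replay from another device/IP: revoke
        return False
    return True

store = {}
issue(store, "t1", "fp-abc", "203.0.113.7", now=0)
assert validate(store, "t1", "fp-abc", "203.0.113.7", now=60)        # same device
assert not validate(store, "t1", "fp-xyz", "198.51.100.9", now=60)   # replay
assert "t1" not in store  # session revoked after the replay attempt
```

This directly counters the simulated attack (replaying a token from another device) and limits the blast radius of token theft through short lifetimes.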

PASTA forces consideration of real-world attacker behaviour and helps document justifications for risk decisions – ideal for regulated environments.

Risk analysis: Prioritise what matters

Threats without context mean little. A buffer overflow in an admin-only console reachable only over VPN is not the same as one in a public endpoint. That’s why you translate threats into risk and prioritise based on business impact.

Steps to perform risk analysis:

1. Evaluate likelihood:
  • High: Common attack vector, system is exposed, control missing
  • Medium: Feasible but needs skill or specific timing
  • Low: Unlikely, rare vector or strong mitigations in place
2. Evaluate impact:
  • Critical: Regulatory violation, major breach, loss of availability
  • High: User impersonation, unauthorised data access
  • Medium: Reduced functionality, minor leakage
  • Low: Low business value, no real effect
3. Assign risk level based on the combination:
  • Risk = Likelihood × Impact
  • Map into categories: Low, Moderate, High, Critical
  • Document mitigations and residual risks
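The Risk = Likelihood × Impact combination can be made explicit with ordinal scales. The sketch below (the numeric weights and band thresholds are hypothetical choices, not a standard) maps each threat onto the four categories:

```python
# Ordinal scales for the steps above (hypothetical numeric weights).
LIKELIHOOD = {"low": 1, "medium": 2, "high": 3}
IMPACT = {"low": 1, "medium": 2, "high": 3, "critical": 4}

def risk_level(likelihood: str, impact: str) -> str:
    """Map Likelihood × Impact onto Low/Moderate/High/Critical bands."""
    score = LIKELIHOOD[likelihood] * IMPACT[impact]
    if score >= 9:
        return "Critical"
    if score >= 6:
        return "High"
    if score >= 3:
        return "Moderate"
    return "Low"

threats = [
    ("Credential stuffing on /auth", "high", "high"),
    ("Token replay", "medium", "high"),
    ("Verbose error traces", "medium", "medium"),
]
for name, likelihood, impact in threats:
    print(f"{name}: {risk_level(likelihood, impact)}")
```

Scoring every identified threat this way turns the matrix into a sortable backlog, which is what makes the prioritisation defensible in an audit.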

Example: Risk matrix

What it gives you:

  • A rational basis for accepting, mitigating or transferring risk
  • Justification for backlog prioritisation
  • Compliance evidence for audit trails and regulators

Conclusion: Don’t wait for incidents to map reality

Security isn’t just firewalls and encryption. It’s understanding your system deeply enough to:

  • Map what exists (data flow)
  • Predict how it could be broken (threat modelling)
  • Act based on business reality (risk analysis)

Organisations that build this into their development and architecture process gain:

  • Better design decisions
  • Stronger compliance posture
  • Faster incident response
  • More efficient security investment

Build the habit. Document it. Iterate with each major change. Whether you’re securing a critical system or passing an audit, these three tools will be your foundation.

Do you want to dive deeper into cybersecurity best practices? Explore one of our previous articles, Defending the digital frontier: A call for cybersecurity readiness in an era of uncertainty.
