LINDDUN - Privacy Threats
Modeling threats to protect data from the start
Security vs. Privacy – to a software engineer, these two demands often seem to be in conflict, but they don’t have to be.
Many believe that securing a system against attacks automatically ensures privacy, but I don’t see it that way.
Privacy professionals are already familiar with concepts like Privacy by Design [1] and the Transparency and Consent Framework (TCF), while security professionals work closely with frameworks like Security by Design, NIST, and OWASP. Strengthening security and closing the doors to OWASP's most common threats is the bare minimum for any business operating in the digital space.
But what if there were a security methodology specifically for privacy?
That’s where LINDDUN [2] comes in! It’s a methodology designed specifically to model privacy threats and ensure that systems are built with risks related to personal data processing in mind.
In practice, I see it as an interesting mix of Ann Cavoukian's principles and NIST, updated to address today’s most common threats with a stronger focus on developers and infrastructure.
LINDDUN works as a framework to identify threats based on seven categories:
Linkability
Identifiability
Non-repudiation
Detectability
Disclosure of information
Unawareness
Non-compliance
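As a rough sketch of how these seven categories can drive a design review, here is a minimal checklist structure in Python. The category names come from the list above; the elicitation questions are my own illustrative examples, not taken from the official LINDDUN knowledge base.

```python
# The seven LINDDUN threat categories, each paired with an example
# elicitation question (the questions are illustrative, not official).
LINDDUN_CATEGORIES = {
    "Linkability": "Can two data items be tied to the same person?",
    "Identifiability": "Can a data subject be singled out from the data?",
    "Non-repudiation": "Can a user plausibly deny an action they took?",
    "Detectability": "Can an outsider infer that a data item exists at all?",
    "Disclosure of information": "Is personal data exposed beyond its purpose?",
    "Unawareness": "Does the data subject know what is collected and why?",
    "Non-compliance": "Does the processing follow policy and regulation?",
}

def elicit(component: str) -> list[str]:
    """Generate one review question per threat category for a system component."""
    return [f"[{cat}] {component}: {question}"
            for cat, question in LINDDUN_CATEGORIES.items()]

# Walk every category for a hypothetical component of your system.
for line in elicit("login service"):
    print(line)
```

The point of the structure is completeness: every component in the data flow diagram gets asked all seven questions, so no category is silently skipped.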
Unlike traditional security approaches that focus on protecting systems from external attackers, LINDDUN analyzes how the system's own design can create privacy risks. This means that, even before writing a single line of code, we can anticipate and mitigate potential privacy issues.
So, while LINDDUN has a similar purpose, it is not the same as Privacy by Design (PbD). The well-established Privacy by Design framework is a broader principle that advocates for privacy as a foundational pillar from the start of system development. LINDDUN, on the other hand, is a practical methodology that helps structure this approach by providing a detailed risk analysis and proposing mitigation strategies. To put it simply: Privacy by Design defines the philosophy, while LINDDUN provides the tools to implement it in a structured and technical way.
One depends on the other. I see LINDDUN as an extension of PbD that answers the question: "Great, but how do I actually do this?" Privacy and data protection regulations don’t tell companies how to handle every situation, and even broad concepts or philosophies lack the practical guidance needed for real-world implementation.
I used LINDDUN as an example, but there are other initiatives, such as Privacy Patterns [3], which provides a repository of patterns designed to integrate privacy into software development. This platform gathers practical strategies applicable to different contexts, from data minimization to granular control over data sharing and international transfers. Some examples include "Minimize Data Collection," which suggests capturing only the strictly necessary data for a specific purpose (and explains how to do it), and "Selective Disclosure," which allows users to reveal only the essential information needed to interact with a system.
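A minimal sketch of the "Minimize Data Collection" idea: a purpose-bound allowlist decides which fields may be stored, and everything else is dropped at the edge. The purposes and field names here are hypothetical, not taken from the Privacy Patterns catalog.

```python
# Each processing purpose maps to the only fields it is allowed to keep.
# Purposes and field names are made up for illustration.
ALLOWED_FIELDS = {
    "newsletter_signup": {"email"},
    "shipping": {"name", "street", "city", "postal_code"},
}

def minimize(payload: dict, purpose: str) -> dict:
    """Keep only the fields the stated purpose actually requires."""
    allowed = ALLOWED_FIELDS[purpose]
    return {field: value for field, value in payload.items() if field in allowed}

raw = {"email": "a@example.com", "name": "Ana", "birth_date": "1990-01-01"}
print(minimize(raw, "newsletter_signup"))  # only the email survives
```

Filtering at the point of collection, rather than at the database, means over-collected data never enters the system in the first place.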
Returning to LINDDUN, its case study page provides advanced real-world scenarios for modeling privacy threats, demonstrating how the methodology can be applied to complex systems.
Unlike basic examples, where the approach is more linear, these case studies involve multiple layers of interaction, more sophisticated data flows, and real-world challenges in system implementation. This highlights the importance of structuring a threat analysis that goes beyond obvious vulnerabilities, considering indirect impacts that could compromise data subjects' privacy.
Isn't that interesting?
The document also explores cases where privacy can be compromised in less obvious ways. For example, there are scenarios where combining different data sources can lead to user re-identification, even when individual pieces of information seem anonymized.
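To make the re-identification risk concrete, here is a toy sketch of such a linkage: a dataset with names removed is joined to a public register on quasi-identifiers (postal code, date of birth, sex). All records and field names are invented.

```python
# "Anonymized" dataset: names stripped, but quasi-identifiers remain.
anonymized_health = [
    {"zip": "02138", "dob": "1945-07-31", "sex": "F", "diagnosis": "flu"},
    {"zip": "02139", "dob": "1972-03-12", "sex": "M", "diagnosis": "asthma"},
]

# Public dataset containing names plus the same quasi-identifiers.
public_register = [
    {"name": "J. Doe", "zip": "02138", "dob": "1945-07-31", "sex": "F"},
]

QUASI_IDENTIFIERS = ("zip", "dob", "sex")

def link(records: list[dict], register: list[dict]) -> list[dict]:
    """Re-identify records whose quasi-identifiers match exactly one person."""
    reidentified = []
    for rec in records:
        key = tuple(rec[q] for q in QUASI_IDENTIFIERS)
        matches = [p for p in register
                   if tuple(p[q] for q in QUASI_IDENTIFIERS) == key]
        if len(matches) == 1:  # a unique match pins the record to a person
            reidentified.append({"name": matches[0]["name"], **rec})
    return reidentified

print(link(anonymized_health, public_register))
```

Neither dataset alone reveals who has which diagnosis; the join does, which is exactly the Linkability threat in LINDDUN's catalog.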
I once talked about this in the context of a known attack called "Linkage Attack", before I even knew there was a structured methodology with practical strategies to reduce this risk.
In addition, the document covers mitigation strategies, such as data minimization, adding noise to datasets, and enforcing strict access and sharing controls.
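As an illustration of the noise-addition mitigation, one widely used approach (from differential privacy, my choice of technique here, not necessarily the one the LINDDUN material prescribes) is the Laplace mechanism: a count query gets random noise scaled by sensitivity/epsilon, so the presence or absence of any single individual barely shifts the published result.

```python
import math
import random

def noisy_count(true_count: int, epsilon: float = 1.0, sensitivity: int = 1) -> float:
    """Return the count with Laplace noise of scale sensitivity/epsilon added.

    Smaller epsilon means more noise: stronger privacy, lower accuracy.
    """
    scale = sensitivity / epsilon
    u = random.random() - 0.5  # uniform in [-0.5, 0.5)
    # Inverse-CDF sample from Laplace(0, scale); clamp the log argument
    # to avoid log(0) at the distribution's edge.
    noise = -scale * math.copysign(1.0, u) * math.log(max(1 - 2 * abs(u), 1e-12))
    return true_count + noise

# Publishing the noisy value instead of the exact one hides any single person.
print(noisy_count(100, epsilon=0.5))
```

The noise averages out over many queries of the aggregate, but it masks the contribution of any one data subject, which is the privacy property the mitigation is after.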
Studying these more complex cases makes it clear that LINDDUN is not just a theoretical tool, but a practical framework that can be adapted to different domains and contexts. Whether in web applications, IoT devices, or machine learning systems, threat modeling helps anticipate risks and strengthen privacy from the earliest development stages.
For software engineers and system architects, understanding these examples would be a valuable differentiator, one that few truly master.
[1] A. Cavoukian, “PbD origin and evolution,” Privacy by Design, 2012; http://privacybydesign.ca/
[2] LINDDUN, https://linddun.org/
[3] Privacy Patterns, https://privacypatterns.org/