Why Security Does Not Equal Privacy


As marketers and publishers discuss their responsibilities concerning consumer data, they must avoid interchangeably using the terms ‘privacy’ and ‘security.’ Bosko Milekic, Chief Privacy Officer and Co-founder of Optable, sheds light on the need to differentiate between security and privacy and how organizations can better protect their data. 

As the privacy compliance burden on our industry spirals upward, marketers and publishers are naturally thinking harder about their responsibility to consumer data. 

But as the privacy discourse takes on more importance, we notice some in our industry making a persistent error in framing the discussion – conflating “privacy” and its enabling technologies with tried-and-true “security” methods developed over the last 40 or so years. You hear it off-handedly in phrases like “securing our customers’ data” and, more importantly, in who we look to for solutions. 

This is understandable – the two disciplines are related after all – but treating these two concepts interchangeably could nevertheless prove costly, leading publishers and advertisers to repeat the original sin of programmatic media – collecting too much data. 

Security and Privacy: What Is the Difference? 

“Security” and “privacy” are broad concepts. This article will focus on how the online ad industry applies these terms and ignore use-cases for verticals like finance and healthcare. 

First, let us define terms: security refers to methods used to safeguard personal data, and in particular, to protect against security breaches, which occur when a party gains unauthorized access to user data. Breaches happen relatively frequently and typically involve large numbers of users. In fact, in October 2021, Fortune reported 1,291 data breaches so far that year, 17% more than the total for all of 2020, and suggested 2021 could be an unprecedented year for data compromises.

On the other hand, privacy is about protecting information related to an individual. Privacy protections can live on user devices, restricting the personal information transferred to an app owner, publisher or other data collecting entity. Notably, privacy also comes into play when a company shares information and data on its customers with third parties. For example, Safari and Firefox have phased out third-party cookies to protect user privacy and Google Chrome, which controls over two-thirds of the global browser market, plans to follow suit by 2022. If this happens, third-party cookie tracking will be effectively eliminated. 

At a high level, security is vital both to the company collecting personal data and to the individual user (or data subject). Privacy is essential for the data subject first and foremost. You could argue that privacy is also crucial for the company collecting personal data, since it reduces the risks associated with collecting and sharing said data.

Why does the distinction matter? 

Security and privacy are both essential to digital advertising. There is always an obligation to understand and leverage each discipline when working with and sharing information about people. But the similarity ends there. 

Nothing needs to change about how online advertising works from a security standpoint, even as our data landscape becomes more complicated. Regardless of what data you’re dealing with, it’s your job to protect that data – meaning you have to encrypt it, comply with audits, and ensure that only those who have the right to access it can do so. 

On the other hand, privacy compliance requires a patchwork of technologies, processes, and methodologies that our existing security infrastructure was not designed to support. 

And these privacy tools are iterating quickly. The first such capabilities, our good old Transparency & Consent Frameworks (TCFs), have enabled companies to gather opt-ins from individual users for years. Over the last few years, TCFs have emerged as the bare-minimum capability required to comply with the GDPR and other regulations globally. Since then, publishers and advertisers have been introduced to a steady procession of privacy-enhancing technologies, including differential privacy, private set intersection and even a data cleanroom piloted by entertainment giant Disney.
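To make one of these technologies concrete, here is a minimal sketch of differential privacy applied to a simple audience count. The function names, the invented user records, and the choice of a count query are illustrative assumptions, not anything from a specific vendor or framework; real deployments rely on vetted libraries.

```python
import random

def laplace_noise(scale: float) -> float:
    # The difference of two exponential samples follows a
    # Laplace(0, scale) distribution.
    return random.expovariate(1.0 / scale) - random.expovariate(1.0 / scale)

def private_count(records, predicate, epsilon: float = 1.0) -> float:
    # A count query has sensitivity 1: adding or removing one user changes
    # the result by at most 1, so Laplace noise with scale 1/epsilon makes
    # the released count epsilon-differentially private.
    true_count = sum(1 for r in records if predicate(r))
    return true_count + laplace_noise(scale=1.0 / epsilon)

# Hypothetical first-party data set.
users = [{"id": i, "clicked": i % 3 == 0} for i in range(1000)]
noisy = private_count(users, lambda u: u["clicked"], epsilon=0.5)
```

The receiving party learns roughly how many users clicked, but no individual’s presence in the data set can be confidently inferred from the released number – a smaller epsilon means more noise and stronger privacy.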


Good Security Does Not Equal Compliance 

None of these newer privacy-enhancing technologies has much to do with security. Instead, their primary function is to enable publishers (and other data owners) to exchange users’ personal information with advertisers directly, a critical use case for digital advertising.  
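As a simplified illustration of what this direct exchange can look like when a privacy-enhancing technique is applied, consider a salted-hash audience overlap: each party blinds its identifiers with a shared secret before comparing, so raw emails never change hands. This is a toy sketch under assumed names and data – real private set intersection protocols use stronger cryptography than a shared salt.

```python
import hashlib

def blind(identifiers, salt: str) -> set:
    # Hash each identifier with a shared secret salt so the raw
    # values never leave either party's systems.
    return {hashlib.sha256((salt + i).encode()).hexdigest() for i in identifiers}

# Invented example data.
publisher_users = {"alice@example.com", "bob@example.com"}
advertiser_users = {"bob@example.com", "carol@example.com"}

SHARED_SALT = "agreed-secret"  # negotiated out of band

# Each side exchanges only blinded values; comparing them reveals
# the size of the overlap, not the underlying identities.
overlap = blind(publisher_users, SHARED_SALT) & blind(advertiser_users, SHARED_SALT)
```

Here the publisher and advertiser learn that exactly one user is shared between their lists without either side handing over its raw customer file.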

But any time data about an individual is exchanged, there is potential for privacy to be compromised. Take, for instance, the outright sharing of log files.  

Since log data tends to be device- or network-level, it contains a great deal of information about individuals. Even when log data is shared securely, with all parties using best practices such as end-to-end encryption, the amount of information transmitted can far exceed what the stated purpose requires. The party that receives the log files may gain access to information about individuals that it did not have previously – and in many cases did not even request. This practice is still surprisingly common, even though it arguably violates the GDPR’s “privacy by design” requirements. 

This type of failure often occurs when two companies share data without a specific purpose and specific controls to limit the information transmitted. Entering agreements without these guardrails has become far riskier than it used to be – legally and from the standpoint of customer goodwill. 
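One concrete guardrail is an allow-list that strips every log field not needed for the agreed purpose before anything leaves the building. The field names below are hypothetical, invented for illustration; the point is that this filter – not the encryption around it – is what limits exposure.

```python
# Hypothetical ad-server log record; field names are illustrative.
raw_log = {
    "timestamp": "2021-10-05T12:00:00Z",
    "ip_address": "203.0.113.7",   # personal data: should not be shared
    "device_id": "abc-123",        # personal data: should not be shared
    "page_url": "https://example.com/sports",
    "ad_slot": "banner_top",
}

# Fields actually required for the agreed purpose (campaign measurement).
MEASUREMENT_FIELDS = {"timestamp", "page_url", "ad_slot"}

def minimize(record: dict, allowed: set) -> dict:
    # Drop every field not explicitly required for the sharing purpose.
    return {k: v for k, v in record.items() if k in allowed}

shared_record = minimize(raw_log, MEASUREMENT_FIELDS)
```

The receiving party gets what it needs for measurement and nothing more – the IP address and device ID never cross the wire, regardless of how well the transfer itself is secured.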

A big reason we still encounter excessive sharing of data between parties has to do with those parties’ over-reliance on security measures that were never designed for nuanced data sharing. When Company A shares data with Company B knowingly and through a contractual agreement, these arrangements are often made securely. The data is encrypted, Company B has the keys, and everyone sleeps well at night. 

However, none of this negates the importance of security. Without suitable security protocols, a company risks leaking data to unauthorized parties, which in turn creates the potential for a follow-on privacy breach. But even with good security, Company A may well have divulged potentially private information about individuals that it should not have. Thus, while good privacy requires good security as a prerequisite, security best practices won’t do the job alone. 


The Future Is About “Minimum Data” 

As 2021 winds down, we know that privacy is a spectrum, not an on/off switch.

The more privacy someone desires and expresses through their device, browser and app settings, the less the companies that enable digital experiences are permitted to do with that person’s data – which is how it should have been from the beginning. 

But we haven’t yet achieved this end state, and security best practices won’t get us there. Those of us who work in the emerging privacy milieu need to ensure we expose no data beyond what is necessary and take deliberate steps to limit what is exposed. 

Where would you place your organization on the privacy spectrum? Share with us on LinkedIn, Twitter, or Facebook. We’re always keen to hear from you!