Murky Consent in Software: GDPR & CCPA


Murky consent is a term coined by law professor Daniel J. Solove to describe consent that is ambiguous, uncertain, or incomplete. It is a middle ground between full consent and no consent, and it is often difficult to determine whether murky consent is valid.

There are a number of factors that can contribute to murky consent, including:

  • Power imbalances: When one person has power over another, it can be difficult for the less powerful person to give truly free and informed consent. For example, a student may feel pressured to consent to a sexual advance from a professor, or an employee may feel pressured to consent to their boss’s demands.
  • Coercion and manipulation: Coercion and manipulation can also lead to murky consent. For example, someone might be threatened or forced into giving consent, or they might be manipulated into believing that they want something when they don’t.
  • Lack of information or understanding: If someone does not have all of the relevant information about a situation, or if they do not understand the implications of their consent, their consent may be murky. For example, someone might consent to a medical procedure without fully understanding the risks and benefits.


I was working in ad-tech when GDPR went live in 2018. It was a fever dream of sitting in stuffy meetings with product and leadership as they tried to decide what exactly counted as PII (Personally Identifiable Information), what did not, and how much of a pain it would be to completely remove that information if a user requested it.

Roll forward to the current timeline and every site you visit for the first time greets you with a banner giving you the option to “Accept All” cookies, or “Reject All” and work with a partially functional site. Some of the fancier banners let you fine-tune which cookies are used. If you don’t click anything and proceed with browsing, it’s treated the same as “Accept All.”
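The banner behavior described above can be sketched as a small piece of consent logic. This is a hypothetical illustration, not any real consent-management API; the names (`ConsentState`, `resolveConsent`, the cookie categories) are all assumptions for the sake of the example:

```typescript
// Hypothetical sketch of the consent logic many banners implement.
// All names here are illustrative, not a real consent-management API.

type Choice = "accept_all" | "reject_all" | "dismissed";

interface ConsentState {
  functional: boolean;  // "strictly necessary" cookies: always on
  analytics: boolean;
  advertising: boolean;
}

function resolveConsent(choice: Choice): ConsentState {
  switch (choice) {
    case "reject_all":
      // Only strictly necessary cookies; the site may be partially functional.
      return { functional: true, analytics: false, advertising: false };
    case "dismissed":
      // The pattern described above: no action is treated as acceptance.
    case "accept_all":
      return { functional: true, analytics: true, advertising: true };
  }
}
```

Note how "dismissed" falls through to the same branch as "accept_all" — which is exactly the default-to-consent behavior that makes the consent murky.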

I truly believe the transparency forced by GDPR and CCPA is good. But something about it never sat right with me – I finally found the term I was looking for: Murky Consent.

I recently came across this paper: “Murky Consent: An Approach to the Fictions of Consent in Privacy Law” by Daniel J. Solove, and it hit so many nails on the head. I highly recommend reading the summary at the very least.

By clicking “Accept All” (or dismissing the banner without making a selection) the user gives the site carte blanche to store and share any cookie it wants. In many cases, the user is led to believe their experience on the site will otherwise be significantly degraded, or simply lacks the context to understand the far-reaching implications of how their data is used (for example, tracking pixels that may cause their information to be shared with a variety of sites the user has never visited; see: TikTok tracking non-users). This is the murkiest scenario and is ripe for abuse.

The paper proposes a different mindset – recognize and treat user consent as “a fiction,” and stop granting data brokers the “moral magic” that the idea of binary consent confers.

Because the law pretends people are consenting, the law’s goal should be to ensure that what people are consenting to is good. Doing so promotes the integrity of the fictions of consent. I propose four duties to achieve this end: (1) duty to obtain consent appropriately; (2) duty to avoid thwarting reasonable expectations; (3) duty of loyalty; and (4) duty to avoid unreasonable risk.

Daniel J. Solove


In particular, I think tech companies are generally weak on (2) and (4). GDPR (and CCPA) is a good start at transparency, but how tech companies actually gather and think about user data should shift to reflect the growing complexity of what it means to be “someone online.”