It’s the day before a holiday - and, of course, I’m scrolling through my emails. And there I see it - the proposed New York Privacy Act (NYPA). My little regulatory compliance heart skips a beat. A new term: data fiduciary. This concept, if the bill passes, could change the way we look at privacy and corporate responsibility.
What is a “data fiduciary”?
According to the text, the NYPA defines a data fiduciary as:
Every legal entity, or any affiliate of such entity, and every controller and data broker, which collects, sells or licenses personal information of consumers, shall exercise the duty of care, loyalty and confidentiality expected of a fiduciary with respect to securing the personal data of a consumer against a privacy risk; and shall act in the best interests of the consumer, without regard to the interests of the entity, controller or data broker, in a manner expected by a reasonable consumer under the circumstances.
So, let’s look at this for a minute before I go deeper into how/why this could signal a major change in how we all approach cybersecurity.
I’m going to summarize the major tenets of this for everyone because, well, if you wanted to read it, you’d just go to the document. Section 1102, which defines data fiduciary, is fairly long and likely pretty boring to most people. While some of the requirements are exactly what we’re used to seeing, some of them are not.
On the “well, I already do that” list, organizations must:
- reasonably secure personal data from unauthorized access
- promptly inform the consumer of any breach
- take reasonable steps to monitor the data security of the person with whom information is shared
So far, so good. Old hat, as you might say. However, here’s where this starts to get interesting from the legal perspective:
- not use personal data in a way that is detrimental to an end user and will cause foreseeable and material physical or financial harm
- not disclose, sell, or share personal data unless the recipient enters into a contract creating the same duty of care as that of the discloser/seller/sharer
- defines “privacy risk” as:
- “psychological harm” including anxiety/embarrassment
- significant inconvenience or expenditure of time
- stigmatization or reputational harm
- disruption and intrusion from unwanted commercial communications or contracts
- effects on an individual that are not reasonably foreseeable, contemplated by, or expected by the individual to whom the personal data relates, that are nevertheless reasonably foreseeable, contemplated by, or expected by the controller assessing privacy risk, that: (A) alters that individual’s experiences; (B) limits that individual’s choices; (C) influences that individual’s responses; or (D) predetermines results
- other adverse consequences that affect an individual’s private life, including private family matters, actions and communications within an individual’s home or similar physical, online, or digital location, where an individual has a reasonable expectation that personal data will not be collected or used
- states that the fiduciary duty to the consumer supersedes any other duties owed
In short, the proposed law creates a strict liability-esque standard. As an industry, we’re used to this. What the proposed NYPA does, however, is extend the definition of harm and create a new type of responsibility that could change how we view data.
What is a fiduciary?
The first thing to understand here is the definition of “fiduciary.” According to Black’s Law Dictionary, a fiduciary is:
a person holding the character of a trustee, or a character analogous to that of a trustee, in respect to the trust and confidence involved in it and the scrupulous good faith and candor which it requires.
Basically, what this means is that anyone considered a fiduciary must act in scrupulous good faith - keeping confidences and faithfully fulfilling their responsibilities.
Most senior level executives are used to thinking of their “fiduciary duty” to shareholders. As Black’s defines fiduciary duty:
when one party must act for another. They are entrusted with the care of property or funds.
Normally, we think in terms of fiduciary duty as being a monetary promise made between executives or a Board of Directors and their shareholders. For example, senior executives and Boards are expected to refrain from conflicts of interest or secret profits from a business when fulfilling their fiduciary duty. Basically, don’t be an Enron.
How does applying fiduciary duty to personal data change the cybersecurity landscape?
First, let’s take a look at the concept of “entrusted with the care of property or funds.” The NYPA may be the first time that a law is aligning data with property/money.
Let’s just talk about that for a minute. We all know, theoretically, that data matters to people. However, no other law has specifically laid it out as a property that can be traded. This is a massive shift in how the law is seeking to view data. The concept of property ownership, especially in the United States, is one we hold dear to our hearts. So dear, in fact, that the 5th Amendment protects “life, liberty, and property” thus requiring due process should any of those be limited. In short, the shift from data as an abstract to being viewed as property establishes new legal rights to its protection.
Think about it this way: if a random real estate company tried to sell your house without your knowledge, you’d be able to sue. In fact, they’d probably never try because, well, it’s fundamentally wrong.
Now, apply that to data considered as property: if a random marketing company sold your data without your knowledge, you’d be able to sue. In fact, the whole concept of personal data as property makes such a sale just as fundamentally wrong.
What about this definition of “privacy risk”?
The definition of privacy risk also expands the legal definitions.
For example, the GDPR is silent on the definition of “damages,” stating only:
- Any person who has suffered material or non-material damage as a result of an infringement of this Regulation shall have the right to receive compensation from the controller or processor for the damage suffered.
That tells us a whole lot of nothing. In fact, my guess is, that’s going to end up being a whole lot of confusing litigation.
Oh, please. I’m a compliance dork who used to work in environmental claims. I love watching this kind of thing unfold slowly. After all, as they taught us in law school “bad facts make bad law.” And we all know that there are going to be some bad factual cases leading to some bad precedent.
Now, let’s take a look at the CCPA, which is equally silent on what constitutes damages, providing only the following:
Any consumer whose nonencrypted or nonredacted personal information … is subject to an unauthorized access and exfiltration, theft, or disclosure as a result of the business’s violation of the duty to implement and maintain reasonable security procedures and practices appropriate to the nature of the information to protect the personal information may institute a civil action
While the CCPA is harsh, it’s also pretty limited. In short, a consumer can only sue (and note this is based on damages not greater than $750 per consumer per incident or actual damages, whichever is greater) if they can prove that the company did not implement and maintain “reasonable” security procedures. Basically, there needs to have been a breach that actually caused some kind of damages - but those damages aren’t defined anywhere.
Now, the NYPA? That goes an extra step and incorporates a lot of subjectivity.
- The proposed regulation talks about “privacy risk,” not just an actual breach. All a person has to do is show that there was a risk of their data being disclosed or accessed.
- It defines the damages pretty broadly including the consumer being “embarrassed” or experiencing a disruption or intrusion by unwanted commercial communications.
Starting with the risk aspect: this is the first time a law has focused on the risk of a breach rather than requiring an actual breach. What if, for example, someone decides to read a SOC 3 report and sue?
We don’t know. That would be an interesting challenge, don’t you think?
Now, the categories of damages are also interesting. After all, things like “embarrassment” are pretty subjective. What embarrasses me may not embarrass you. Moreover, it also incorporates all of those spam emails that I hate but never unsubscribe from because, as we all know in the industry, unsubscribing just tells people the address is real, which becomes a new threat vector. Also, “disruption or intrusion” is pretty broad. Am I disrupted from doing work every time I delete a marketing email?
Well, hot diggity. That’s going to lead to some interesting lawsuits right there.
What does all this mean?
That’s a good question. Sure, the NYPA may never get passed, but this shows the direction several states may be heading. But, if we’re going to be honest with ourselves in this industry? It means that we need to start thinking differently about data. We also need to start talking to our C-suites and Boards differently.
The idea of being a “data fiduciary” will change the way we collect, process, store, and transmit data. We need to start preparing ourselves. Being a data fiduciary will increase the duty of care to which organizations will be held. Best practices may not be enough. We need to start working on bestest practices - ones that are proactive rather than reactive.
Equally important, this law could be an interesting attempt at inserting the language of business (fiduciary) into cybersecurity. The C-suite and Boards will understand this term. They will realize the impact it has on their approach to security. The term “data fiduciary” creates a shift in the narrative. It’s no longer just about protecting against a data breach. Now, it’s equating personal information with money and focusing on privacy risk.
Consumers place trust in organizations who manage their data. Are we really at a point where we should be doing that? Are we really ready, as an industry and community, to take on that responsibility? Can we, really, keep our promises?