Lawmakers looking for a starting place on privacy legislation should pass on the Uniform Law Commission’s Uniform Personal Data Protection Act (UPDPA). The Uniform Law Commission (ULC) seeks to write model legislation that state legislatures across the country can adopt to set national standards. Sadly, the ULC has fumbled its consumer privacy bill and produced, in the UPDPA, a model bill that is weak, confusing, and toothless.
A strong privacy bill must place consumers first. EFF has laid out its top priorities for privacy laws, which include a full private right of action that allows people to act as their own privacy enforcers, and measures that prevent companies from discriminating—by charging more or offering less—against those who wish to protect their privacy by exercising their rights. EFF also advocates for an opt-in consent model that requires companies to obtain a person’s permission before they collect, share, or sell their data, rather than an opt-out model.
The UPDPA falls short on many of these fronts. And why? Because, despite years of evidence that companies will not protect consumer privacy on their own, the UPDPA defers to company complaints that respecting people’s privacy is a burden. In fact, UPDPA Committee Chairman Harvey Perlman openly admitted that one of the drafting committee’s main goals was to lower business compliance costs.
By seeking a middle path on some of the biggest disagreements between consumer advocates and companies looking to do as little as possible to change their practices, the UPDPA has come up with “compromises” that work for no one. Company advocates find its suggestions confusing, as it sets up yet another framework for compliance. Consumer advocates find the “protections” in the bill hollow. It’s no surprise that one Oklahoma legislator told the International Association of Privacy Professionals the bill was “empty.” “There appears to be nothing else substantive in this bill besides an obligation for the data company to provide a voluntary consent standard,” he said. “Essentially those in control of the data get to decide what their policies and procedures are going to be. So this law is empty because it’s saying [businesses] have to come up with something to address privacy, but we’re not telling you exactly what it is.”
Consumer Rights, But Defined by Companies
By lowering its standards to coax companies into compliance, the UPDPA leaves consumers twisting in the wind. At its core, the bill hinges on whether a company uses your information for purposes that are either “compatible” or “incompatible” with the reasons the company originally collected it. So, for example, you might allow a company to collect your location information so it can do something for you related to where you are, such as identify restaurants near you. This kind of guardrail might sound good at first blush; in fact, it’s in line with an important privacy principle: companies should only use a consumer’s information for the purposes the consumer gave permission for in the first place. However, the UPDPA undermines the meaning of “compatible purpose,” providing no real protections for ordinary people.
First, individuals have no say over whether the purposes companies ultimately use their data for are “compatible” with the original purpose of collection; that determination is left entirely up to companies. This gives a company wide latitude to process people’s information for whatever reason it may deem in keeping with the reason it collected it. That could include processing that a person wouldn’t want at all. For example, if the company collecting your location information to tell you about nearby restaurants decided it also wanted to use that data to track your regular travel patterns, it could unilaterally classify that new use as supposedly “compatible” with the original use, without asking you to approve it.
The UPDPA also defines targeted advertising as a “compatible purpose” that requires no extra consent—despite targeted ads being one of the most commonly derided uses of personal information. In fact, when consumers are given the choice, they overwhelmingly choose not to participate in advertising that tracks their behavior. This contorts a principle meant to protect privacy and lets an unwanted privacy invasion slip in under the lowest bar possible.
Furthermore, when a company uses a consumer’s data for an incompatible purpose, the bill only requires the company to give the consumer notice and an opportunity to opt out. In other words, if a weather app had your permission to collect your location information for the purpose of locally accurate forecasts, but then decided to share it with a bunch of advertisers, it wouldn’t have to ask for your permission first. It would simply have to give you a heads-up that “we share with advertisers” and the option to opt out—likely in a terms and conditions update that no one ever reads.
Other rights in this bill, including those EFF supports such as the right to access one’s data and the right to correct it, are severely limited. For example, the bill gives companies permission to ignore correction requests that they deem “inaccurate, unreasonable, or excessive.” They can decide which requests meet these criteria without providing justification. That gives companies far too much leeway to ignore what their customers want. And while the bill gives consumers the right to access their data, it does not give them the right to a machine-readable electronic copy—what is often called the right to data portability.
The UPDPA also comes up short on one of EFF’s most important privacy principles: making sure that consumers aren’t punished for exercising their privacy rights. Even where the bill does require a company to get permission before using data for an “incompatible data practice,” companies can offer a “reward or discount” in exchange for that permission. In other words, you can have your human right to privacy only if you’re willing and able to pay for it.
As we have said before, this type of practice frames our privacy as a commodity to be traded away, rather than a fundamental right to be protected. This is wrong. Someone who values their privacy but is struggling to make ends meet will feel pressured to surrender their rights for a very small gain—maybe $29 off a monthly phone bill. Privacy legislation should rebalance power in favor of consumers, not double down on a bad system of corporate overreach.
The UPDPA Has Big Blind Spots…
The UPDPA also fails to address how data flows between private companies and government. It’s not alone in this regard: while the European Union’s General Data Protection Regulation (GDPR) covers both government and private entities, many state privacy laws in the United States focus on just one or the other.
However, there is a growing need to address the ways that data flows from private entities to government, and the UPDPA largely turns a blind eye to this threat. For example, the bill considers data “publicly available”—and therefore exempt from its protections—if it is “observable from a publicly accessible location.” That would seem to exempt, for example, footage from Ring cameras that people place on their doors, which documents what is happening on adjacent public sidewalks. Information from Ring and other private cameras needs to be protected, particularly against indiscriminate sharing with law enforcement agencies. This is yet another example of how the model legislation ignores pressing privacy concerns.
The definition of publicly available information would also seemingly exempt information posted on limited-access social media sites such as Facebook entirely—including from the bill’s requirements to adhere to privacy policies and security practices. Specifically, the UPDPA exempts “a website or other forum with restricted access if the information is available to a broad audience.” That is far too broad, and it willfully ignores the ways information flows from social media and other private companies into the hands of government agencies.
…And No Teeth
Finally, the UPDPA has gaping holes in its enforcement provisions. Privacy laws are only as good as their teeth. That means strong public enforcement and a strong private right of action. This bill has neither.
Worst of all, it expressly creates no private right of action, cutting people off from the most obvious avenue for defending themselves against a company that abuses their privacy: a lawsuit. Many privacy statutes contain a private right of action, including federal laws on wiretaps, stored electronic communications, video rentals, driver’s licenses, credit reporting, and cable subscriptions. So do many other kinds of laws that protect the public, including federal laws on clean water, employment discrimination, and access to public records. There’s no reason consumer privacy should be any different.
By denying people this obvious and powerful tool to enforce the few protections they gain in this law, the UPDPA fails the most crucial test.
State attorneys general do have the power to enforce the bill, but they have broad discretion not to enforce it. That’s too big a gamble to take with privacy. Attorneys general might be understaffed or suffer regulatory capture—in those cases, consumers have no recourse whatsoever to be made whole for violations of the few privacy protections this bill provides.
Don’t Duplicate This Bill
While the UPDPA wrestles with many of the most controversial debates in privacy legislation today, it falls short of meaningfully resolving any of them. It grossly fails to address the privacy problems ordinary people face—invasive data collection, little control over how their information is used, no clear means to fight for themselves—that put data privacy on the agenda in the first place. Lawmakers, federal or state, should not duplicate this hollow bill and lower the bar on privacy.