Tuesday, August 30, 2022

Tuesday Talk*: Do It For The Children, Facial Recognition Edition

Over at Techdirt, Mike Masnick has been doing a series of posts about a stunningly dystopian scheme to protect children from “inappropriate” content, whatever that means in California, and which could very well end up affecting SJ (and hence you, dear reader) as it would the readers of Techdirt and pretty much every other site in existence.

Eric Goldman parsed the bill when it was little more than a twinkle in an unduly passionate eye.

First, the bill pretextually claims to protect children, but it will change the Internet for EVERYONE. In order to determine who is a child, websites and apps will have to authenticate the age of ALL consumers before they can use the service. NO ONE WANTS THIS.

It will erect barriers to roaming around the Internet. Bye bye casual browsing. To do the authentication, businesses will be forced to collect personal information they don’t want to collect and consumers don’t want to give, and that data collection creates extra privacy and security risks for everyone. Furthermore, age authentication usually also requires identity authentication, and that will end anonymous/unattributed online activity.

Second, even if businesses treated all consumers (i.e., adults) to the heightened obligations required for children, businesses still could not comply with this bill. That’s because this bill is based on the U.K. Age-Appropriate Design Code. European laws are often aspirational and standards-based (instead of rule-based), because European regulators and regulated businesses engage in dialogues, and the regulators reward good tries, even if they aren’t successful. We don’t do “A-for-Effort” laws in the U.S., and generally we rely on rules, not standards, to provide certainty to businesses and reduce regulatory overreach and censorship.

Third, this bill reaches topics well beyond children’s privacy. Instead, the bill repeatedly implicates general consumer protection concerns and, most troublingly, content moderation topics. This turns the bill into a trojan horse for comprehensive regulation of Internet services and would turn the privacy-centric California Privacy Protection Agency (CPPA) into the general purpose Internet regulator.

There have been some mods to the bill since it’s been embraced by the California lege, at the urging of a UK baroness no less, but even so, what it appears to mean is that I, as the proprietor of this here hotel, must make a sufficiently valid guesstimate of the age of each reader such that I can protect those of you under 18 from the harm of seeing inappropriate material (like Howl’s Broadway videos). The only way to know whether younguns are accessing inappropriate content is to scan everyone who accesses it to guess their age.

If you thought cookie pop-ups were an annoying nuisance, just wait until you have to scan your face for some third party to “verify your age” after California’s new design code becomes law.

On Friday, I wrote about the companies and organizations most likely to benefit from California’s AB 2273, the “Age Appropriate Design Code” bill that the California legislature seems eager to pass (and which they refer to as the “Kid’s Code” even though the details show it will impact everyone, and not just kids).

And the vendors selling this facial recognition software, supporting this law with every tear for the children they can muster, promise they will never collect, save and sell your data, because that would be wrong.

First, we want to reassure you and your readers generally about anonymity. The purpose of the online age verification sector is to allow users to prove their age to a website, WITHOUT disclosing their identity.

This can be achieved in a number of ways, but primarily through the use of independent, third-party AV providers who do not retain centrally any of your personal data. Once they have established your age or age-range, they have no need (and under EU GDPR law, therefore no legal basis) to retain your personal data.

In fact, the AV provider may not have needed to access your personal data at all. Age estimation based on facial analysis, for example, could take place on your own device, as can reading and validating your physical ID.

Does this “reassurance” leave you feeling all warm and fuzzy? On the one hand, there are many who sincerely believe that children need to be protected from the internet on many levels. On the other hand, if it’s left in the hands of lawmakers, will this be the future of browsing and privacy?

*Tuesday Talk rules apply.
