Meta Is Right. Apple and Google Can Do More to Protect Kids
The owner of Facebook and Instagram proposes the sensible idea of always requiring a parent’s approval whenever someone younger than 16 tries to download an app from the companies’ stores.
The question of age verification on the internet has always been fraught. Every so often, a usually well-intentioned politician sees mandatory age checks as a straightforward policy win, only to realize later that not nearly enough thought was given to how a practical system would actually work.
Privacy advocates argue users shouldn't need to confirm their age and identity to use apps or browse the web, even when (or especially when) they're looking for sexual or other content they would like to keep secret. Handing over credit card information, even if there's no charge, comes with obvious security and privacy implications. Another option, such as verifying users by having them upload a government ID, is even worse — not to mention exclusionary. Newer efforts to use computer camera technology to estimate a person's age show promise, though I wouldn't blame any user for not trusting such a process.
But that doesn't mean everyone should just throw their hands up in frustration. Options are not lacking to protect children — we just need to recognize the right ones when we see them. Meta Platforms Inc.'s call this week for federal regulation to add an age verification process to popular app stores is one such example. The Facebook and Instagram owner — which already goes to considerable (if partly futile) efforts to restrict young users on its platforms and keep those younger than 13 off its main apps — is arguing for Apple and Google to always obtain a parent's approval whenever a child younger than 16 tries to download an app from their stores. Both companies already offer opt-in features to require parental approval for app downloads and purchases, but Meta wants it to be the default, enforced by law.
“This way parents can oversee and approve their teen's online activity in one place,” wrote Antigone Davis, Meta's head of safety. “They can ensure their teens are not accessing adult content or apps, or apps they just don't want their teens to use. And where apps like ours offer age-appropriate features and settings, parents can help ensure their teens use them.”
This is a technically feasible, light-touch approach that doesn't intrude on a teenager's right to be a teen. It doesn't give parents access to everything their child does in an app, but it does give them sufficient visibility into the digital places where their children spend time. Like setting boundaries for where a child can go and play outside in the physical world, this simple measure would help keep kids away from the busy roads and dangerous rivers of the internet.
What it wouldn't do is prevent minors from using web browsers, rather than apps, to reach inappropriate sites. Indeed, apps specifically for sexual content are already prohibited in Apple's and Google's stores. But harm on the internet doesn't begin and end with naked flesh. It's YouTube videos that contain hateful content; Discord channels that might involve unsuitable conversations or material; bullying and harassment on Instagram. A parent knowing what apps a child has is at the very least a conversation starter. As Davis wrote, parents will differ, as is their prerogative, on the appropriate age for their children to have access to certain technologies.
In pushing for a law, Meta has something to gain, of course. It's trying to avoid the complexity of dealing with a patchwork of laws across several states. Meta is right to stress that these differing approaches mean children are protected inconsistently. And the measures often rely on sites using a hodgepodge of third-party verification services. In Texas, a federal judge blocked an age verification law, saying it violated the First Amendment; a judge in Arkansas said the same of a similar law in that state. Meta's understandable desire for simplicity in enforcement comes at no cost to the rights of users.
Encouragingly, there is a bipartisan bill on child protection online calling for a feasibility study into how age verification might be handled at the device or operating system level, as Meta is suggesting, though some civil liberties groups have broader concerns about the bill.
Making the app stores function as a centralized age verification mechanism reduces the risk of highly personal data being widely disseminated among several entities responsible for verifying ages on behalf of each and every app. Give those details once to either Apple or Google — something you've likely already done — and you're good to go. (Apple declined to comment on Meta's blog post; Google did not respond to requests for comment.)
Critics of Meta might at this point scoff at the company positioning itself as some kind of champion of child protection. Its blog post comes just days after a former employee testified to the Senate that Facebook ignored repeated warnings on harassment of teens through the app. Along with other leading apps, Meta faces hundreds of lawsuits that contend its apps are intentionally addictive to young people.
Those criticisms deserve to be heard but can be treated separately. Protecting young people on the internet will always be a delicate balancing act: the rights of parents to protect their children, the rights of teens to have their own private spaces, and the rights of everyone to use the internet freely and without any greater erosion of privacy than has already happened. It's complicated. But when one obvious step can be taken — one with, as far as I can see, no negative consequences — we shouldn't hesitate to take it.