From Meta and Google to Microsoft, tech titans face child exploitation lawsuits
Meta might be the latest tech major to be slapped with a lawsuit for failing to safeguard children, but it certainly is not the only one.
On Tuesday, Meta was sued by more than 30 US states, with eight more states filing individual lawsuits, over features in Instagram and Facebook that allegedly lure children to the platforms and get them hooked on harmful content. This fresh case has reopened the longstanding issue of companies exploiting children, one of the most vulnerable demographics of any digital user base, for profit. However, Meta is not the only one to blame here. For years, many tech firms, including Google, Microsoft, and Apple, have faced similar lawsuits for failing to protect underage users.
Protecting children online: The premise
So, if you have come across numerous news articles about Meta's lawsuit, you might be curious to know why there is a need to protect children online, and what dangers they are exposed to. The answer is a little complicated.
Unlike in real life, where the dangers are visible and often harm a person's physical well-being, in the digital space, there are invisible threats that damage a person's emotional and mental well-being.
For example, one of the charges against Meta is that its algorithm promotes harmful content to children on Facebook and Instagram. This content can be sexual or violent in nature, which affects the psychology of a growing child, but the harm can also be far more subliminal. Even when these platforms promote age-appropriate but addictive content, it can have harmful effects. One study found 53 percent higher odds of poor sleep quality among adolescents who consistently used social media at bedtime. It also found that “Social media increased use correlates to Emergency Department visits for mental illness, including depression, addiction, and anxiety”.
So, because even small triggers can affect the still-developing minds of children, many non-profit organizations as well as governments believe that there should be greater vigilance on these platforms and stronger protections for underage users.
So, who's guilty?
It turns out, just about everyone. Most social media platforms, as well as companies that make age-agnostic consumer-facing products, have faced complaints, petitions, or lawsuits over the issue. The following are some of the most notable cases in recent years, not counting the ongoing Meta lawsuit.
Meta in 2021: In March 2021, Russia filed cases against Facebook, alongside Twitter, Google, TikTok, and Telegram, after protests across Russia over the arrest of Alexei Navalny. All the companies were accused of failing to delete posts that urged children to join the protests.
Meta whistleblower incident in 2021: In 2021, former Meta employee Frances Haugen came forward as a whistleblower and produced internal documents showing that the company was knowingly preying on its younger user base for profit. Haugen revealed an internal Instagram study which found evidence that many adolescent girls using the photo-sharing app were suffering from depression and anxiety around body-image issues. Haugen's testimony to Congress is cited in Tuesday's complaint.
Microsoft in 2002: In early 2002, Microsoft proposed to settle private lawsuits by donating $1 billion in money, software, services, and training, including Windows licenses and refurbished PCs, to about 12,500 underprivileged public schools. The judge saw this as a potential windfall for Microsoft, not only in educating schoolchildren on Microsoft solutions but also in flooding the market with Microsoft products.
Twitter (now X) in 2023: This lawsuit alleged that Twitter knowingly possessed and broadly distributed child sex abuse material of two 13-year-old boys, which was viewed, shared, and downloaded hundreds of thousands of times through the Twitter platform.
Google in 2019: Google and its subsidiary YouTube paid $170 million after allegations by the Federal Trade Commission and the New York Attorney General that YouTube illegally collected personal information from children without the consent of their parents.
Google in 2014: In 2014, a parent filed a class action lawsuit against Google over "in-app" purchases, microtransactions that can be made within applications. The parent contended that once a credit card purchase is authorized, there is a 30-minute window during which further purchases can be made without re-authorization, that "free apps" are designed to entice children to make such purchases, and that Google should have been aware of the issue.
Google in 2006: The US Justice Department sought to compel Google to turn over one million web addresses from the company's database and one week's worth of search engine queries, stripped of any personal information. The request was intended to help fight Internet pornography and counter legal challenges to the US Child Online Protection Act.
Apple in 2011: In 2011, five parents filed a class action suit against Apple over "in-app" purchases, which are purchases that can be made within apps. The parents contended that Apple had not disclosed that the "free" apps intended for children had the potential to rack up fees without the parents' knowledge.
What these lawsuits show is that tech platforms need to improve the protections they offer users under the age of 18, and to ensure that psychological issues such as depression, anxiety, and eating disorders are not exacerbated by the content children are exposed to.