From Joan Donovan and Sarah Parker, writing for The Conversation…
“You have blood on your hands.”
“I’m sorry for everything you have all been through.”
These quotes, the first from Sen. Lindsey Graham, R-S.C., speaking to Meta CEO Mark Zuckerberg, and the second from Zuckerberg to families of victims of online child abuse in the audience, are highlights from an extraordinary day of testimony before the Senate Judiciary Committee about protecting children online.
But perhaps the most telling quote from the Jan. 31, 2024, hearing came not from the CEOs of Meta, TikTok, X, Discord or Snap but from Sen. Graham in his opening statement: Social media platforms “as they are currently designed and operate are dangerous products.”
We are university researchers who study how social media organizes news, information and communities. Whether or not social media apps meet the legal definition of “unreasonably dangerous products,” the social media companies’ business models do rely on having millions of young users. At the same time, we believe that the companies have not invested sufficient resources to effectively protect those users.
Mobile device use by children and teens skyrocketed during the pandemic and has stayed high. Naturally, teens want to be where their friends are, be it the skate park or social media. In 2022, there were an estimated 49.8 million users age 17 and under on YouTube, 19 million on TikTok, 18 million on Snapchat, 16.7 million on Instagram, 9.9 million on Facebook and 7 million on Twitter, according to a recent study by researchers at Harvard's Chan School of Public Health.
Teens are a significant revenue source for social media companies. Social media revenue from users age 17 and under was US$11 billion in 2022, according to the Chan School study. Instagram netted nearly $5 billion, while TikTok and YouTube each accrued over $2 billion. Teens mean green.
Social media poses a range of risks for teens, from exposing them to harassment, bullying and sexual exploitation to encouraging eating disorders and suicidal ideation. For Congress to take meaningful action on protecting children online, we identify three issues that need to be addressed: age, business model and content moderation.