In January, the CEOs of X, TikTok, Meta, Snap, and Discord testified in front of a congressional committee about child exploitation on their platforms. “Mr. Zuckerberg, you and the companies before us, I know you don’t mean it to be so, but you have blood on your hands,” Senator Lindsey Graham said at the time.
Despite confrontational questioning from Graham and others about how many underage users were on their platforms, and what safeguards protected them, Zuckerberg and the other executives weren't asked about the concerning practices of some parents who manage social media accounts on behalf of their young children. A New York Times investigation the month after the hearing found that some parents, mostly of girls, were amassing tens of thousands of followers for their children by posting suggestive images that can attract predators.
Now, Democratic Senator Maggie Hassan is demanding that tech companies explain the untold thousands of accounts, run by adult account-holders, that place girls at risk of exploitation on their platforms.
“These corporations must answer for how they are allowing young women and girls to be exploited on their platforms and what steps they will take in response,” Senator Hassan, who represents New Hampshire, told WIRED. “Young women should be able to express themselves online in safe environments that do not facilitate the monetization of potentially exploitative content.”
The Times investigation found that parents can readily bypass the age restrictions of social platforms that bar children under 13 from having accounts. Some parents use the accounts they set up for their children to essentially monetize their daughters by putting them to work as influencers, garnering discounts and sponsorship deals or pulling in advertising revenue.
More sinisterly, some of these accounts brought in money from people, including convicted sex offenders, seeking sexual or suggestive material featuring young girls. Some of these followers are willing to pay for extra photos beyond those shared on a girl’s social media account, or for private chats or used clothing. Times reporters examined some 5,000 accounts of young girls run by their parents.
While the Times found that some of the parents also operated TikTok accounts, the phenomenon was most prevalent on Meta’s Instagram. (X was not mentioned in the Times investigation, and the company claims that underage users make up less than 1 percent of its user base. WIRED has previously reported that the platform may not have the age verification systems needed to accurately make such a claim.)
“After the disturbing revelations about predators interacting with the posts of minors and even buying their worn clothing, it continues to be clear that social media companies are failing to keep our children safe,” says Senator Hassan.
Meta, TikTok, and X did not immediately respond to requests for comment.
In a statement to the Times about its earlier reporting, Meta spokesperson Andy Stone said that the company prevents “accounts exhibiting potentially suspicious behavior from using our monetization tools, and we plan to limit such accounts from accessing subscription content,” but that parents were ultimately responsible for the accounts.
In the letters sent to TikTok, X, and Meta, Hassan asks the companies to disclose whether they are aware of parents circumventing their age requirements, whether the platforms monetize or place ads on accounts featuring young girls, and what active measures they have in place to detect these kinds of accounts.
The platforms have until April 8 to offer their responses.
SOURCE: Wired