YouTube remains under fire for disturbing content made available in the Suggested Videos area of its YouTube Kids app, and for collecting data on young children. As recently as a couple of weeks ago, publications including Wired reported that the app was still awash in creepy and violent videos targeted at kids.
YouTube is under fire more broadly, too. News stories range from the relatively benign – like Tuesday’s hack that wiped out highly viewed music videos, including last year’s hit “Despacito” – to the more disturbing, like reports, also from Tuesday, that the platform was serving in-video ads linking to hardcore porn. For what it’s worth, Google’s advertising policy, which governs ads in YouTube videos, prohibits “adult content” in ads.
Now, a coalition of more than twenty consumer advocacy groups says that the platform is illegally mining personal data from young children. Josh Golin is executive director of the Campaign for a Commercial-Free Childhood, one of the groups filing a complaint against YouTube with the Federal Trade Commission.
“Even though YouTube’s terms of service say [the site] is for ages 13 and up,” says Golin, “the platform is loaded with cartoons, nursery rhyme videos, toy commercials and all sorts of content for young children.” With advertising sold to reach young children, YouTube “knows children are there. So they must comply with the Children’s Online Privacy Protection Act, which means they must get parental permission before collecting data on kids.”
Golin says his group’s complaint alleges that “YouTube is not getting parental permission and they’ve been collecting all sorts of data.” Getting permission to collect data usually involves reading terms of service and disclaimers, then checking a box. Golin says that instead, YouTube could simply “not collect data on videos” posted for young children. While the YouTube Kids app does not collect data, “more kids use the regular YouTube site than use the YouTube Kids app,” says Golin. “Google knows those kids are there and, under the law, they have a responsibility to protect those kids.”
“YouTube’s reliance on data and algorithms at the expense of human moderation…is flawed for kids. It’s been shown time and time again,” says Golin, “that inappropriate content is recommended to kids because those videos aren’t reviewed by an actual human being.”
Written by Christopher De Los Santos.