
FTC complaints filed by Campaign for a Commercial-Free Childhood and Center for Digital Democracy lead to major changes for kids on YouTube

Most parents can tell you the most popular website for kids is YouTube. But for years, while Google made millions luring children to YouTube, vacuuming up their sensitive information, and using it to target them with ads, Google told the Big Lie: “YouTube is not for kids. It says so right in our terms of service.”

That has now changed, thanks to the advocacy of Campaign for a Commercial-Free Childhood (CCFC) and the Center for Digital Democracy (CDD) and the support of a coalition of advocacy groups.

Google deliberately developed YouTube as the leading site for children, with programming and marketing strategies designed to appeal directly to kids. But it ignored the only federal law addressing commercial privacy online, the Children's Online Privacy Protection Act (COPPA). Its behavior sent a message that a corporation as powerful and well-connected as Google is above the law, even laws designed to protect young people.

CCFC, CDD, and our attorneys at the Institute for Public Representation (IPR) at Georgetown University Law Center, with a broad coalition of consumer, privacy, public health and child rights groups, began filing complaints with the Federal Trade Commission (FTC) in 2015 concerning Google's child-directed practices on YouTube and the YouTube Kids app. We kept up the pressure on the FTC, with the help of Congress and the news media. After we filed a complaint in April 2018 describing YouTube's ongoing violations of COPPA, the FTC, under the leadership of Chairman Joe Simons, finally decided to take action. The result was the FTC's September decision, which is in many ways historic and a major step toward protecting children online. Google was fined $170 million for its violations of children's privacy, a record financial sanction in a COPPA case.

The FTC's action also implemented important new policies protecting children, most of which will go into effect by January 2020:

  • Children will no longer be targeted with data-driven marketing and advertising on child-directed YouTube programming: This is the most important safeguard. Google will no longer conduct personalized "behavioral" marketing on YouTube programming that targets children. In other words, they will stop the insidious practice of using kids' sensitive information to target them with ads tailored for their eyes. Google will require video producers and distributors to self-identify that their content is aimed at kids, and will also employ its own technology to identify videos that target young audiences.
  • Google will substantially curtail the data they collect from children watching YouTube videos. Since the main YouTube site has no age gate, they will limit data collection and identity tracking for anyone watching child-directed content there to only the data "needed to support the operation of the service." The same limitation will apply to videos on YouTube Kids.
  • Google is taking steps to drive kids from the main YouTube site to YouTube Kids, where parental consent is required. Google launched the YouTube Kids app in 2015, but the app never rivaled the main YouTube platform's hold on children and was plagued with a number of problems, such as inadequate screening of harmful content. As a result of the FTC investigation, Google has launched a YouTube Kids website, and kids who watch children's content on the main YouTube site now get a pop-up suggesting they visit YouTube Kids. Google says it will more effectively curate programming that appeals to kids aged 4 through 12. This is a positive development: while a number of concerns remain about YouTube Kids, children are better off using the Kids site than the Wild West of the main YouTube platform.
  • Google created a $100 million fund for "quality kids, family and educational content." CCFC and CDD had proposed this fund, and we are gratified that Google has acknowledged it bears responsibility to support programming that enriches the lives of children. The fund is a three-year program to spur "the creation of thoughtful, original children's content."
  • Google has made changes to make YouTube a "safer platform for children": The company is proactively promoting "quality" children's programming on YouTube by revising the algorithm it uses to make recommendations. It is also disabling comments and notifications on child-directed content.

Google has told CCFC and CDD it will make these changes regarding data collection and targeted marketing worldwide. Other questions remain to be answered. Will programming classified as "family viewing" be exempted from the new data-targeting safeguards? (It should not be.) Will the new $100 million production fund commit to supporting child-directed non-commercial content, instead of serving as a venture for Google to expand its marketing to kids? Will Google ensure that its other child-directed commercial activities, such as its Play Store, also reflect the new safeguards the company has adopted for YouTube? Google also permits the targeting of young people via "influencers," including videos in which toys and other products are unboxed; when such videos are child-directed, Google should put an end to the practice.

CCFC, CDD, and our allies intend to play a proactive role in holding Google, its programmers, advertisers, and the FTC accountable to make sure that these new policies are implemented effectively. Our work in bringing about this change, and the work we will do to make other companies follow suit, are part of our commitment to ensuring that young people around the world grow up in a media environment that respects and promotes their health, privacy, and well-being.