CDD

Press Releases

  • Center for Digital Democracy

    Washington, DC — April 17, 2024

    Contact: Katharina Kopp, kkopp@democraticmedia.org

    Statement Regarding House Energy & Commerce Hearing: “Legislative Solutions to Protect Kids Online and Ensure Americans’ Data Privacy Rights”

    The following statement can be attributed to Katharina Kopp, Ph.D., Deputy Director, Center for Digital Democracy:

    The Center for Digital Democracy (CDD) welcomes the bicameral and bipartisan effort to come together and produce the American Privacy Rights Act (APRA) discussion draft. We have long advocated for comprehensive privacy legislation that would protect everyone’s privacy rights and provide default safeguards.

    The United States confronts a commercial surveillance crisis, in which digital giants invade our private lives, spy on our families, and exploit our most personal information for profit. Through a vast, opaque system of algorithms, we are manipulated, profiled, and sorted into winners and losers based on data about our health, finances, location, gender, race, and other personal information. The impacts of this commercial surveillance system are especially harmful for marginalized communities, fostering discrimination and inequities in employment, government services, health and healthcare, education, and other life necessities. The absence of a U.S. privacy law jeopardizes not only our individual autonomy but also our democracy.

    On reading the APRA draft, however, we have several questions and concerns, suggesting that the document needs substantial revision.
    While the legislation addresses many of our requirements for comprehensive privacy legislation, we oppose various provisions in their current form, including:

    - Insufficient limitations on targeted advertising and de facto sharing of consumer data: The current data-driven, targeted-ad-supported business model is the key driver of commercial exploitation, manipulation, and discrimination. APRA, however, allows the continuation and proliferation of “first party” targeted advertising without any recourse for individuals. Most targeted advertising today relies on first-party advertising and widely accepted de facto sharing practices such as “data clean rooms.” APRA should not provide any carve-out for first-party targeted advertising.

    - Overbroad preemption language: APRA’s preemption of state privacy laws prevents states from implementing stronger privacy protections. Considering that it has taken the U.S. three decades since the establishment of pervasive digital marketing practices to pass any comprehensive privacy legislation, it would be short-sighted and reckless to believe that the current form of APRA can adequately protect online privacy in the long run without pressure from states to innovate. Technology and data practices are rapidly evolving, and our understanding of their harms is evolving as well. The preemption language is particularly careless given that almost no provisions give the FTC the ability to update APRA through effective rulemaking.

    CDD strongly supports the Children and Teens' Online Privacy Protection Act (COPPA 2.0), HR 7890. Children and teens require additional privacy safeguards beyond those outlined in APRA. Digital marketers are increasingly employing manipulative and unfair data-driven marketing tactics to profile, target, discriminate against, and exploit children and teens on all the online platforms they use. This is leading to unacceptable invasions of privacy and public health harms.
    The Children and Teens' Online Privacy Protection Act (COPPA 2.0) is urgently needed to provide crucial safeguards and to update federal protections that were initially established almost 25 years ago. We commend Rep. Walberg (R-Mich.) and Rep. Castor (D-FL) for introducing the House COPPA 2.0 companion bill. The bill enjoys strong bipartisan support in the U.S. Senate. We urge Congress to promptly pass this legislation into law. Any delay in bringing HR 7890 to a vote would expose children, adolescents, and their families to greater harm.

    CDD strongly supports the Kids Online Safety Act (KOSA) and believes that children and teens require robust privacy safeguards and additional online protections. Social media platforms such as Meta, TikTok, YouTube, and Snapchat have prioritized their financial interests over the well-being of young users for too long. These companies should be held accountable for the safety of America's youth and take measures to prevent harms like eating disorders, violence, substance abuse, sexual exploitation, addiction-like behaviors, and the exploitation of privacy. We applaud the efforts of Reps. Gus Bilirakis (R-FL), Kathy Castor (D-FL), Erin Houchin (R-IN), and Kim Schrier (D-WA) on the introduction of KOSA. The Senate has shown overwhelming bipartisan support for this legislation, and we urge the House to vote on KOSA, adopt the Senate's knowledge standard, and make the following amendments to ensure its effectiveness:

    - Extend the duty of care to all covered platforms, including video gaming companies, rather than just the largest ones.

    - Define the "duty of care" to cover "patterns of use that indicate or encourage addiction-like behaviors" rather than simply “compulsive usage.” This will ensure a broader scope that addresses more addiction-like behaviors.

    - Retain the consideration of financial harms within the duty of care.

    We believe these adjustments will improve the much-needed safety of young internet users.

    ###
  • Press Release

    Children’s Advocates Urge the Federal Trade Commission to Enact 21st Century Privacy Protections for Children

    More than ten years since last review, organizations urge the FTC to update the Children’s Online Privacy Protection Act (COPPA)

    FOR IMMEDIATE RELEASE

    Contact:
    David Monahan, Fairplay: david@fairplayforkids.org
    Jeff Chester, Center for Digital Democracy: jeff@democraticmedia.org

    WASHINGTON, DC — Tuesday, March 12, 2024 — A coalition of eleven leading health, privacy, consumer protection, and child rights groups has filed comments at the Federal Trade Commission (FTC) offering a digital roadmap for stronger safeguards while also supporting many of the agency’s key proposals for updating its regulations implementing the bipartisan Children’s Online Privacy Protection Act (COPPA).

    Comments submitted by Fairplay, the Center for Digital Democracy, the American Academy of Pediatrics, and other advocacy groups supported many of the changes the commission proposed in its Notice of Proposed Rulemaking issued in December 2023. The groups, however, told the FTC that a range of additional protections is required to address the “Big Data” and Artificial Intelligence (AI) driven commercial surveillance marketplace operating today, in which children and their data are a highly prized and sought-after target across online platforms and applications.

    “The ever-expanding system of commercial surveillance marketing online that continually tracks and targets children must be reined in now,” said Katharina Kopp, Deputy Director, Director of Policy, Center for Digital Democracy. “The FTC and children’s advocates have offered a digital roadmap to ensure that data gathered from children have the much-needed safeguards.
With kids being a key and highly lucrative target desired by platforms, advertisers, and marketers, and with growing invasive tactics such as AI used to manipulate them, we call on the FTC to enact 21st-century rules that place the privacy and well-being of children first.” “In a world where streaming and gaming, AI-powered chatbots, and ed tech in classrooms are exploding, children's online privacy is as important as ever,” said Haley Hinkle, Policy Counsel, Fairplay.  “Fairplay and its partners support the FTC's efforts to ensure the COPPA Rule meets kids' and families' needs, and we lay out the ways in which the Rule addresses current industry practices. COPPA is a critical tool for keeping Big Tech in check, and we urge the Commission to adopt the strongest possible Rule in order to protect children into the next decade of technological advancement.” While generally supporting the commission’s proposal that provides parents or caregivers greater control over a child’s data collection via their consent, the groups told the commission that a number of improvements and clarifications are required to ensure that privacy protections for a child’s data are effectively implemented.  
    They include issues such as:

    ● The emerging risks posed to children by AI-powered chatbots and biometric data collection.

    ● The need to apply COPPA’s data minimization requirements to data collection, use, and retention, to reduce the amount of children’s data in the audience economy and to limit targeted marketing.

    ● The applicability of the Rule’s provisions – including notice, the separate parental consent for collection and disclosure, and data minimization requirements – to the vast networks of third parties that claim to share children’s data in privacy-safe ways, including “clean rooms,” but still utilize young users’ personal information for marketing.

    ● The threats posed to children by ed tech platforms and the necessity of strict limitations on any use authorized by schools.

    ● The need for clear notice, security program, and privacy program requirements in order to effectively realize COPPA’s limitations on the collection, use, and sharing of personal information.

    The 11 organizations that signed on to the comments are: the Center for Digital Democracy (CDD); Fairplay; American Academy of Pediatrics; Berkeley Media Studies Group; Children and Screens: Institute of Digital Media and Child Development; Consumer Federation of America; Center for Humane Technology; Eating Disorders Coalition for Research, Policy, & Action; Issue One; Parents Television and Media Council; and U.S. PIRG.

    ###
  • Washington, DC — February 15, 2024

    For too long, social media platforms have prioritized their financial interests over the well-being of young users. Meta, TikTok, YouTube, Snap, and other companies should be held accountable for the safety of America's youth. They must be required to prevent harms to kids—such as eating disorders, violence, substance abuse, sexual exploitation, and the exploitation of their privacy.

    The Kids Online Safety Act (KOSA) would require platforms to implement the most protective privacy and safety settings by default. It would help prevent the countless tragic results experienced by too many children and their parents. We support the updated language of the Kids Online Safety Act and urge Congress to pass the bill promptly.

    Katharina Kopp, Ph.D.
    Director of Policy, Center for Digital Democracy
  • Washington, DC — February 15, 2024

    Digital marketers are unleashing a powerful and pervasive set of unfair and manipulative tactics to target and exploit children and teens. Wherever they go online—using social media, viewing videos, listening to music, or playing games—they are stealthily “accompanied” by an array of marketing practices designed to profile and manipulate them.

    The Children and Teens’ Online Privacy Protection Act (COPPA 2.0) will provide urgently needed online privacy safeguards for children and teens and update legislation first enacted nearly 25 years ago. The proposed new law will deliver real accountability to the digital media industry as well as help limit harms now experienced by children and teens online. For example, by stopping data-targeted ads to young people under 16, the endless stream of information harvested by online companies will be significantly reduced. Other safeguards will limit the collection of personal information for other purposes. COPPA 2.0 will also extend the original COPPA law's protections for youth from age 12 to 16. The proposed law also provides the ability to delete children’s and teens’ data easily. Young people will also be better protected from the myriad methods used to profile them, which have unleashed numerous discriminatory and other harmful practices. An updated knowledge standard will make this legislation easier to enforce.

    We welcome the bipartisan updated text from co-sponsors Sen. Markey and Sen. Cassidy and new co-sponsors Chair Sen. Cantwell (D-WA) and Ranking Member Sen. Cruz (R-Texas).

    Katharina Kopp, Ph.D.
    Director of Policy, Center for Digital Democracy
  • Press Release

    Leading Advocates for Children Applaud FTC Update of COPPA Rule

    Fairplay and the Center for Digital Democracy see a crucial step toward creating safer online experiences for kids

    Contact:
    David Monahan, Fairplay, david@fairplayforkids.org
    Jeff Chester, CDD, jeff@democraticmedia.org

    WASHINGTON, DC — December 20, 2023 — The Federal Trade Commission has proposed its first update in a decade of the rules protecting children’s privacy. Under the bipartisan Children’s Online Privacy Protection Act of 1998 (COPPA), children under 13 currently have a set of core safeguards designed to control and limit how their data can be gathered and used. The COPPA Rules were last revised in 2012, and today’s proposed changes offer new protections to ensure young people can go online without losing control over their personal information. These new rules would help create a safer, more secure, and healthier online environment for them and their families—precisely at a time when they face growing threats to their wellbeing and safety. The provisions offered today are especially needed to address the emerging methods used by platforms and other digital marketers who target children to collect their data, including through the growing use of AI and other techniques.

    Haley Hinkle, Policy Counsel, Fairplay:

    “The FTC’s recent COPPA enforcement actions against Epic Games, Amazon, Microsoft, and Meta demonstrated that Big Tech does not have carte blanche with kids’ data. With this critical rule update, the FTC has further delineated what companies must do to minimize data collection and retention and ensure they are not profiting off of children’s information at the expense of their privacy and wellbeing.
    Anyone who believes that children deserve to explore and play online without being tracked and manipulated should support this update.”

    Katharina Kopp, Ph.D., Director of Policy, Center for Digital Democracy:

    “Children face growing threats as platforms, streaming and gaming companies, and other marketers pursue them for their data, attention, and profits. Today’s proposed FTC COPPA rule update provides urgently needed online safeguards to help stem the tidal wave of personal information gathered on kids. The commission’s plan will limit data uses involving children and help prevent companies from exploiting their information. These rules will also protect young people from being targeted through the increasing use of AI, which now further fuels data collection efforts. Young people 12 and under deserve a digital environment that is designed to be safer for them and that fosters their health and well-being. With this proposal, we should soon see less online manipulation, less purposefully addictive design, and fewer discriminatory marketing practices.”

    ###
  • Press Release

    BREAKING: Advocates Decry Meta’s Attempt to Shut Down the FTC

    In response to an order that would prohibit Meta from monetizing minors’ data, the social media company has filed a suit claiming the agency’s structure is unconstitutional. 

    Contact: David Monahan, Fairplay, david@fairplayforkids.org
    Contact: Jeff Chester, CDD, jeff@democraticmedia.org

    WASHINGTON, DC – THURSDAY, NOVEMBER 30, 2023 – Advocates for children and privacy condemned a lawsuit filed last evening by Meta against the Federal Trade Commission that seeks to shut the agency down by asserting that the Commission’s structure is unconstitutional. Meta’s suit comes in response to a proposed FTC order prohibiting Meta from monetizing children’s data for violating the Children’s Online Privacy Protection Act (COPPA) while already operating under a consent decree for multiple serious privacy violations. Earlier this week, Judge Timothy Kelly of the U.S. District Court for the District of Columbia denied a motion filed by Meta that claimed the FTC had no authority to modify its previous settlement. Now Meta is escalating its attacks on the Commission’s authority.

    Meta has posed a threat to the privacy and welfare of young people in the U.S. for many years, as it has targeted them to further its data-driven commercial surveillance advertising system. Scandal after scandal has exposed the company’s blatant disregard for children and youth, with nearly daily headlines about its irresponsible actions coming from former employees turned whistleblowers and from major multi-state, bipartisan investigations by state attorneys general. Despite multiple attempts by regulators to contain Meta’s ongoing undermining of its users’ privacy, including through multiple FTC consent decrees, it is evident that a substantive remedy is required to safeguard US youth.
    Fairplay, the Center for Digital Democracy, and the Electronic Privacy Information Center (EPIC) have issued these comments on today's announcement of Meta's lawsuit against the Federal Trade Commission:

    Josh Golin, Executive Director, Fairplay:

    “While many have noted social media’s role in fueling the mental health crisis, the Federal Trade Commission has taken actual meaningful action to protect young people online through its order prohibiting serial privacy offender Meta from monetizing minors’ data. So it’s not surprising that Meta is launching this brazen attack on the Commission, especially given that the company may have $200 billion in COPPA liability according to recently unsealed documents. Anyone who cares about the wellbeing of children – and the safety of American consumers – should rally to the defense of the Commission and be deeply concerned about the lengths Meta will go to preserve its ability to profit at the expense of young people.”

    Katharina Kopp, Director of Policy, Center for Digital Democracy:

    “For decades Meta has put the maximization of profits from so-called behavioral advertising above the best interests of children and teens. Meta’s repeated failure to comply with its 2012 and 2020 settlements with the FTC, including its non-compliance with the federal children’s privacy law (COPPA), together with the unique developmental vulnerability of minors, justifies the FTC’s proposed modifications of Meta’s consent decree and its requirement that Meta stop profiting from the data it gathers on children and teens. It should not surprise anybody, then, that Meta is now going after the FTC with its lawsuit.
    But this attack on the FTC is essentially an attack on common-sense regulation to curtail out-of-control commercial power, and an attack on our children, teenagers, and every one of us.”

    John Davisson, Director of Litigation, Electronic Privacy Information Center (EPIC):

    “It seems there's no legal theory, however far-fetched, that Meta won't deploy to avoid a full accounting of its harmful data practices. The reason is clear. A hearing before the FTC will confirm that Meta continues to mishandle personal data and put the privacy and safety of minors at risk, despite multiple orders not to do so. The changes the FTC is proposing to Meta's exploitative business model can't come soon enough. We hope the court will reject Meta's latest attempt to run out the clock, as another federal court did just this week.”

    ###
  • Contact:
    David Monahan, Fairplay: david@fairplayforkids.org

    Day of Action as Advocates for Youth Urge Hill: Pass the Kids Online Safety Act Now

    Momentum keeps growing: 217 organizations call on Congress to address the youth mental health crisis spurred by social media

    WASHINGTON, D.C. – Wednesday, November 8, 2023 – Today, a huge coalition of advocacy groups is conducting a day of action urging Congress to finally address the youth mental health crisis and pass the Kids Online Safety Act (KOSA, S. 1409). Momentum in support of the bill continues to build, and today 217 groups that advocate for children and teens across a myriad of areas–including mental health, privacy, suicide prevention, eating disorders, and child sexual abuse prevention–are sending a letter urging Senate Majority Leader Schumer and Senate Minority Leader McConnell to move KOSA to a floor vote by the end of this year.

    “After numerous hearings and abundant research findings,” the coalition writes, “the evidence is clear of the potential harms social media platforms can have on the brain development and mental health of our nation’s youth, including hazardous substance use, eating disorders, and self-harm.” “With this bipartisan legislation,” they write, “Congress has the potential to significantly improve young people’s wellbeing by transforming the digital environment for children and teens.”

    KOSA, authored by Senators Richard Blumenthal (D-CT) and Marsha Blackburn (R-TN), enjoys growing bipartisan support; it is endorsed by 48 US Senators–24 from each side of the aisle. Today’s day of action will see supporters of the 217 organizations calling senators to urge a floor vote and support for KOSA.
    The letter and day of action follow: a suit by 42 attorneys general against Meta for exploiting young users’ vulnerabilities on Instagram; persistent calls to pass KOSA from parents of children who died from social media harms; and Tuesday’s Senate Judiciary Committee hearing on Social Media and the Teen Mental Health Crisis.

    COMMENTS:

    Josh Golin, Executive Director of Fairplay:

    “Every day that Congress allows social media companies to self-regulate, children suffer, and even die, from preventable harms and abuses online. The Kids Online Safety Act would force companies like Meta, TikTok, and Snap to design their platforms in ways that reduce risks for children, and create a safer and less addictive internet for young people.”

    James P. Steyer, Founder and CEO of Common Sense Media:

    "With a new whistleblower and 42 states filing suit against Meta for its deceptive practices and dangerous platform design, the growing support and urgent need for KOSA is now too strong to ignore. Common Sense will continue to work with lawmakers and advocates on both sides of this bill to once and for all begin to curb the harms that online platforms are causing for young people."

    Sacha Haworth, Executive Director of the Tech Oversight Project:

    “The disturbing revelations sadly add to a mountain of evidence proving that tech companies are willfully negligent and even openly hostile to protecting minors from the harms their products bring. As a mother, my heart breaks for the kids who experienced pain and harassment online because tech executives were willing to sacrifice their physical and emotional health in pursuit of profit. Lives are on the line, and we cannot sit on the sidelines. We need to pass KOSA to force companies like Meta to protect children and teens and treat them with the dignity they deserve.”

    Katharina Kopp, Director of Policy, Center for Digital Democracy:

    “The public health crisis that children and teens experience online requires an urgent intervention from policymakers.
    We need platform accountability and an end to the exploitation of young people. Their well-being is more important than the ‘bottom-line’ interests of platforms. KOSA will prevent companies from taking advantage of the developmental vulnerabilities of children and teens. We urge the U.S. Senate to bring KOSA to a floor vote by the end of the year.”

    ###
  • FOR IMMEDIATE RELEASE
    Thursday, September 14, 2023

    Contacts:
    David Monahan, Fairplay, david@fairplayforkids.org
    Jeff Chester, CDD, jeff@democraticmedia.org

    Statement of Fairplay and the Center for Digital Democracy on FTC’s Announcement: Protecting Kids From Stealth Advertising in Digital Media

    BOSTON, MA, and WASHINGTON, DC — Today, the Federal Trade Commission released a new staff paper, “Protecting Kids from Stealth Advertising in Digital Media.” The paper’s first recommendation states: “Do not blur advertising. There should be a clear separation between kids’ entertainment/educational content and advertising, using formatting techniques and visual and verbal cues to signal to kids that they are about to see an ad.”

    This represents a major shift for the Commission. Prior guidance only encouraged marketers to disclose influencer and other stealth marketing to children. For years – including in filings last year and at last year’s FTC workshop – Fairplay and the Center for Digital Democracy had argued that disclosures are inadequate for children and that stealth marketing to young people should be declared an unfair practice.

    Below are Fairplay’s and CDD’s comments on today’s FTC staff report:

    Josh Golin, Executive Director, Fairplay:

    “Today is an important first step towards ending an exploitative practice that is all too common in digital media for children. Influencers—and the brands that deploy them—have been put on notice: do not disguise your ads for kids as entertainment or education.”

    Katharina Kopp, Deputy Director, Director of Policy, Center for Digital Democracy:

    “Online marketing and advertising targeted at children and teens is pervasive, sophisticated, and data-driven. Young people are regularly exposed to an integrated set of online marketing operations that are manipulative, unfair, and invasive. These commercial tactics can be especially harmful to the mental and physical health of youth.
    We call on the FTC to build upon its new report to address how marketers use the latest cutting-edge marketing tactics to influence young people—including neuro-testing, immersive ad formats, and ongoing data surveillance.”

    ###
  • Press Release

    Advocates demand Federal Trade Commission investigate Google for continued violations of children’s privacy law

    Following news of Google’s violations of COPPA and 2019 settlement, 4 advocates ask FTC for investigation

    Contact:
    Josh Golin, Fairplay: josh@fairplayforkids.org
    Jeff Chester, Center for Digital Democracy: jeff@democraticmedia.org

    BOSTON and WASHINGTON, DC – WEDNESDAY, August 23, 2023 – The organizations that alerted the Federal Trade Commission (FTC) to Google’s violations of the Children’s Online Privacy Protection Act (COPPA) are urging the Commission to investigate whether Google and YouTube are once again violating COPPA, as well as the companies’ 2019 settlement agreement and the FTC Act.

    In a Request for Investigation filed today, Fairplay and the Center for Digital Democracy (CDD) detail new research from Adalytics, as well as Fairplay’s own research, indicating that Google serves personalized ads on “made for kids” YouTube videos and tracks viewers of those videos, even though neither is permissible under COPPA. Common Sense Media and the Electronic Privacy Information Center (EPIC) joined Fairplay and CDD in calling on the Commission to investigate and sanction Google for its violations of children’s privacy. The advocates suggest that the FTC should seek penalties upwards of tens of billions of dollars.

    In 2018, Fairplay and the Center for Digital Democracy led a coalition asking the FTC to investigate YouTube for violating COPPA by collecting personal information from children on the platform without parental consent. As a result of the advocates’ complaint, Google and YouTube were required to pay a then-record $170 million fine in a 2019 settlement with the FTC and to comply with COPPA going forward.
    Rather than getting the required parental permission before collecting personally identifiable information from children on YouTube, Google claimed instead that it would comply with COPPA by limiting data collection and eliminating personalized advertising on “made for kids” videos. But an explosive new report released by Adalytics last week called into question Google’s assertions and its compliance with federal privacy law. The report detailed how Google appeared to be surreptitiously using cookies and identifiers to track viewers of “made for kids” videos. The report also documented how YouTube and Google appear to be serving personalized ads on “made for kids” videos and transmitting data about viewers to data brokers and ad tech companies.

    In response to the report, Google told the New York Times that ads on children’s videos are based on webpage content, not targeted to user profiles. But follow-up research conducted independently by both Fairplay and ad buyers suggests the ads are, in fact, personalized, and that Google is both violating COPPA and making deceptive statements about its targeting of children. Both Fairplay and the ad buyers ran test ad campaigns on YouTube in which they selected a series of user attributes and affinities for ad targeting and instructed Google to run the ads only on “made for kids” channels. In theory, these test campaigns should have resulted in zero placements, because under Google and YouTube’s stated policy, no personalized ads are supposed to run on “made for kids” videos. Yet Fairplay’s targeted $10 ad campaign resulted in over 1,400 impressions on “made for kids” channels, and the ad buyers reported similar results. Additionally, the reporting Google provided to Fairplay and the ad buyers to demonstrate the efficacy of the ad buys would not be possible if the ads were contextual, as Google claims.

    “If Google’s representations to its advertisers are accurate, it is violating COPPA,” said Josh Golin, Executive Director of Fairplay.
    “The FTC must launch an immediate and comprehensive investigation and use its subpoena authority to better understand Google’s black-box child-directed ad targeting. If Google and YouTube are violating COPPA and flouting their settlement agreement with the Commission, the FTC should seek the maximum fine for every single violation of COPPA and injunctive relief befitting a repeat offender.”

    The advocates’ letter urges the FTC to seek robust remedies for any violations, including but not limited to:

    · Civil penalties that demonstrate that continued violations of COPPA and Section 5 of the FTC Act are unacceptable. Under current law, online operators can be fined $50,120 per violation of COPPA. Given the immense popularity of many “made for kids” videos, it is likely millions of violations have occurred, suggesting the Commission should seek civil penalties upwards of tens of billions of dollars.

    · An injunction requiring relinquishment of all ill-gotten gains.

    · An injunction requiring disgorgement of all algorithms trained on impermissibly collected data.

    · A prohibition on the monetization of minors’ data.

    · An injunction requiring YouTube to move all “made for kids” videos to YouTube Kids and remove all such videos from the main YouTube platform. Given Google’s repeated failures to comply with COPPA on the main YouTube platform – even when operating under a consent decree – these videos should be cabined to a platform that has not been found to violate existing privacy law.

    · The appointment of an independent “special master” to oversee Google’s operations involving minors and provide the Commission, Congress, and the public semi-annual compliance reports for a period of at least five years.

    Katharina Kopp, Deputy Director of the Center for Digital Democracy, said: “The FTC must fully investigate what we believe are Google’s continuous violations of COPPA, its 2019 settlement with the FTC, and Section 5 of the FTC Act.
These violations place many millions of young viewers at risk. Google and its executives must be effectively sanctioned to stop its ‘repeat offender’ behaviors—including a ban on monetizing the personal data of minors, other financial penalties, and algorithmic disgorgement. The Commission’s investigation should also review how Google enables advertisers, data brokers, and leading online publisher partners to surreptitiously surveil the online activities of young people. The FTC should set into place a series of ‘fail-safe’ safeguards to ensure that these irresponsible behaviors will never happen again.”

Caitriona Fitzgerald, Deputy Director of the Electronic Privacy Information Center (EPIC), said "Google committed in 2019 that it would stop serving personalized ads on 'made for kids' YouTube videos, but Adalytics’ research shows that this harmful practice is still happening. The FTC should investigate this issue and Google should be prohibited from monetizing minors’ data."

Jim Steyer, President and CEO of Common Sense Media, said "The Adalytics findings are troubling but in no way surprising given YouTube’s history of violating kids’ privacy. Google denies doing anything wrong and the advertisers point to Google, a blame game that makes children the ultimate losers. The hard truth is, companies — whether it’s Big Tech or their advertisers — basically care only about their profits, and they will not take responsibility for acting against kids’ best interests. We strongly encourage the FTC to take action here to protect kids by hitting tech companies where it really hurts: their bottom line."

###
  • New research released today by Adalytics raises serious questions about whether Google is violating the Children's Online Privacy Protection Act (COPPA) by collecting data and serving personalized ads on child-directed videos on YouTube. In 2019, in response to a Request for Investigation by Fairplay and the Center for Digital Democracy, the Federal Trade Commission fined Google $170 million for violating COPPA on YouTube and required Google to change its data-collection and advertising practices on child-directed videos. As a result of that settlement, Google agreed to stop serving personalized ads and limit data collection on child-directed videos. Today's report - and subsequent reporting by The New York Times - call into question whether Google is complying with the settlement.

STATEMENTS FROM FAIRPLAY AND CDD:

Josh Golin, Executive Director, Fairplay: This report should be a wake-up call to parents, regulators and lawmakers, and anyone who cares about children -- or the rule of law, for that matter. Even after being caught red-handed in 2019 violating COPPA, Google continues to exploit young children and mislead parents and regulators about its data collection and advertising practices on YouTube. The FTC must launch an immediate and comprehensive investigation of Google and, if it confirms this report's explosive allegations, seek penalties and injunctive relief commensurate with the systematic disregard of the law by a repeat offender. Young children should be able to watch age-appropriate content on the world's biggest video platform with their right to privacy guaranteed, full stop.

Jeff Chester, Executive Director, Center for Digital Democracy: Google operates the leading online destination for kids’ video programming so it can reap enormous profits, including through commercial surveillance data and advertising tactics.
It must be held accountable by the FTC for what appear to be violations of the Children’s Online Privacy Protection Act and its own commitments. Leading advertisers, ad agencies, media companies and others partnering with Google appear to have been more interested in clicks than the safety of youth. There is a massive and systemic failure across the digital marketplace when it comes to protecting children’s privacy. Congress should finally stand up to the powerful “Big Data” ad lobby and enact long-overdue privacy legislation. Google’s operations must also be dealt with by antitrust regulators; it operates imperiously in the digital arena with no accountability. The Adalytics study should serve as a chilling reminder that our commercial surveillance system is running amok, placing even our most vulnerable at great risk.
  • Press Release

    Transatlantic Consumer Dialogue (TACD) Calling on White House and Administration to Take Immediate Action on Generative AI

    Transatlantic Consumer Dialogue (TACD), a coalition of the leading consumer organizations in North America and Europe, asking policymakers on both sides of the Atlantic for action

    The Honorable Joseph R. Biden
President of the United States
The White House
1600 Pennsylvania Avenue NW
Washington, DC 20500

June 20, 2023

Dear President Biden,

We are writing on behalf of the Transatlantic Consumer Dialogue (TACD), a coalition of the leading consumer organizations in North America and Europe, to ask you and your administration to take immediate action regarding the rapid development of Generative Artificial Intelligence in a growing number of applications, such as text generators like ChatGPT, and the risks these entail for consumers. We are calling on policymakers and regulators on both sides of the Atlantic to use existing laws and regulations to address the problematic uses of Generative Artificial Intelligence; adopt a cautious approach to deploying Generative Artificial Intelligence in the public sector; and adopt new legislative measures to directly address Generative Artificial Intelligence harms. As companies are rapidly developing and deploying this technology and outpacing legislative efforts, we cannot leave consumers unprotected in the meantime.

Generative Artificial Intelligence systems are already widely used by consumers in the U.S. and beyond. For example, chatbots are increasingly incorporated into products and services by businesses. Although these systems are presented as helpful, saving time, costs, and labor, we are worried about the serious downsides and harms they may bring about.

Generative Artificial Intelligence systems are incentivized to suck up as much data as possible to train the AI models, leading to the inclusion of personal data that may be irremovable once the sets have been established and the tools trained. Where training models include data that is biased or discriminatory, those biases become baked into the Generative Artificial Intelligence’s outputs, creating ever more biased and discriminatory content that is then disseminated.
The large companies making advances in this space are already establishing monopolistic market concentration. Running Generative Artificial Intelligence tools requires enormous amounts of water and electricity, leading to heightened carbon emissions. The speed and volume of information creation these technologies enable accelerates the generation and spread of misinformation and disinformation.

Three of our members (Public Citizen, the Electronic Privacy Information Center, and the Norwegian Consumer Council) have already published reports setting forth the specific harms of Generative Artificial Intelligence and proposing steps to counter these harms – we would be happy to discuss these with you. In addition, TACD has adopted policy principles which we believe are key to safely deploying Generative Artificial Intelligence. Our goal is to provide policymakers, lawmakers, enforcement agencies, and other relevant entities with a robust starting point to ensure that Generative Artificial Intelligence does not come at the expense of consumer, civil, and human rights.

If left unchecked, these harms will become permanently entrenched in the use and development of Generative Artificial Intelligence. We are calling for actions that insist upon transparency, accountability, and safety in these Generative Artificial Intelligence systems, including ensuring that discrimination, manipulation, and other serious harms are eliminated. Where uses of Generative Artificial Intelligence are clearly harmful or likely to be clearly harmful, they must be barred completely.

In order to combat the harms of Generative Artificial Intelligence, your administration must ensure that existing laws are enforced wherever they apply. New regulations must be passed that specifically address the serious risks and gaps in protection identified in the reports mentioned above.
Companies and other entities developing Generative Artificial Intelligence must adhere to transparent and reviewable obligations. Finally, once binding standards are in place, the Trade and Technology Council must not undermine those binding standards.

We welcome the administration’s efforts on AI to protect Americans’ rights and safety, particularly your efforts to center civil rights via executive action. Furthermore, we are encouraged to see the leading enforcement agencies underscore their collective commitment to leverage their existing legal authorities to protect the American people. But more must be done, and soon, especially for those already disadvantaged and the most vulnerable, including people of color and others who have been historically underserved and marginalized, as well as children and teenagers. We want to work with you to ensure that privacy and other consumer protections remain at the forefront of these discussions, even when new technology is involved.

Sincerely,

Finn Lützow-Holm Myrstad
Director of Digital Policy, Norwegian Consumer Council
European Co-Chair of TACD’s Digital Policy

Calli Schroeder
Senior Counsel and Global Privacy Counsel, EPIC
U.S. Co-Chair of TACD’s Digital Policy

Transatlantic Consumer Dialogue (TACD)
Rue d’Arlon 80, B-1040 Brussels
Tel. +32 (0)2 743 15 90
www.tacd.org  @TACD_Consumers
EC register for interest representatives: identification number 534385811072-96
  • Press Release

    Advocates call for FTC action to rein in Meta’s abusive practices targeting kids and teens

    Letter from 31 organizations in tech advocacy, children’s rights, and health supports FTC action to halt Meta’s profiting off of young users’ sensitive data

    Contact:
David Monahan, Fairplay: david@fairplayforkids.org
Katharina Kopp, Center for Digital Democracy: kkopp@democraticmedia.org

BOSTON/WASHINGTON, DC – June 13, 2023 – A coalition of leading advocacy organizations is standing up today to support the Federal Trade Commission’s recent order reining in Meta’s abusive practices aimed at kids and teens. Thirty-one groups, led by the Center for Digital Democracy, the Electronic Privacy Information Center (EPIC), Fairplay, and U.S. PIRG, sent a letter to the FTC saying “Meta has violated the law and its consent decrees with the Commission repeatedly and flagrantly for over a decade, putting the privacy of all users at risk. In particular, we support the proposal to prohibit Meta from profiting from the data of children and teens under 18. This measure is justified by Meta’s repeated offenses involving the personal data of minors and by the unique and alarming risks its practices pose to children and teens.”

Comments from advocates:

Katharina Kopp, Director of Policy, Center for Digital Democracy: “The FTC is fully justified in proposing the modifications to Meta’s consent decree and in requiring it to stop profiting from the data it gathers on children and teens. There are three key reasons why.
First, due to their developmental vulnerabilities, minors are uniquely harmed by Meta’s repeated failure to comply with its 2012 and 2020 settlements with the FTC, including its non-compliance with the federal children’s privacy law (COPPA); second, because Meta has failed for many years to comply even with the procedural safeguards required by the Commission, it is now time for structural remedies that will make it less likely that Meta can again disregard the terms of the consent decree; and third, the FTC must affirm its credibility and that of the rule of law and ensure that tech giants cannot evade regulation and meaningful accountability.”

John Davisson, Director of Litigation, Electronic Privacy Information Center (EPIC): "Meta has had two decades to clean up its privacy practices after many FTC warnings, but consistently chose not to. That's not 'tak[ing] the problem seriously,' as Meta claims—that's lawlessness. The FTC was right to take decisive action to protect Meta's most vulnerable users and ban Meta from profiting off kids and teens. It's no surprise to see Meta balk at the legal consequences of its many privacy violations, but this action is well within the Commission's power to take.”

Haley Hinkle, Policy Counsel, Fairplay: “Meta has been under the FTC's supervision in this case for over a decade now and has had countless opportunities to put user privacy over profit. The Commission's message that you cannot monetize minors' data if you can't or won't protect them is urgent and necessary in light of these repeated failures to follow the law. Kids and teens are uniquely vulnerable to the harms that result from Meta’s failure to run an effective privacy program, and they can’t wait for change any longer.”

R.J. Cross, Director of U.S. PIRG’s Don’t Sell My Data campaign: “The business model of social media is a recipe for unhappiness.
We’re all fed content about what we should like and how we should look, conveniently presented alongside products that will fix whatever problem with our lives the algorithm has just helped us discover. That’s a hard message to hear day in and day out, especially when you’re a teen. We’re damaging the self-confidence of some of our most impressionable citizens in the name of shopping. It’s absurd. It’s time to short-circuit the business model.”

###
  • “By clarifying what types of data constitute personal data under COPPA, the FTC ensures that COPPA keeps pace with the 21st century and the increasingly sophisticated practices of marketers,” said Katharina Kopp, Director of Policy at the Center for Digital Democracy.

“As interactive technologies evolve rapidly, COPPA must be kept up to date and reflect changes in the way children use and access these new media, including virtual and augmented realities. The metaverse typically involves a convergence of physical and digital lives, where avatars are digital extensions of our physical selves. We agree with the FTC that an avatar’s characteristics and its behavior constitute personal information. And as virtual and augmented reality interfaces allow for the collection of extensive sets of personal data, including sensitive and biometric data, this data must be considered personal information under COPPA. Without proper protections, this highly coveted data would be exploited by marketers and used to further manipulate and harm children online.”
  • Contact: Katharina Kopp, kkopp [at] democraticmedia.org

“We welcome the FTC’s action to address the rampant commercial surveillance of children via Internet of Things (IoT) devices, such as Amazon’s Echo, and to enforce existing law,” said Katharina Kopp, Director of Policy at the Center for Digital Democracy. “Children’s data is taken from them illegally and surreptitiously on a massive scale via IoT devices, including their voice recordings and data gleaned from kids’ viewing, reading, listening, and purchasing habits. These violations in turn lead to further exploitation and manipulation of children and teens: to violations of their privacy, to manipulating them into being interested in harmful products, to undermining their autonomy and hooking them on digital media, and to perpetuating discrimination and bias. As Commissioner Bedoya’s separate statement points out, with this proposed order the FTC warns companies that they cannot take data from children and teens (and others) illegitimately to develop even more sophisticated methods of taking advantage of them. Both the FTC and the Department of Justice must hold Amazon accountable.”
  • CDD urges Congress to adopt stronger online safeguards for kids and teens

Contact: Katharina Kopp, kkopp [at] democraticmedia.org

The Children’s Online Privacy Protection Act (COPPA 2.0), introduced by Senators Markey and Cassidy, will provide urgently needed online safeguards for children and teens. It will enact real platform accountability and limit the economic and psychological exploitation of children and teens online, and thus address the public health crisis they are experiencing.

By banning targeted ads to young people under 16, the endless streams of data collected by online companies to profile and track them will be significantly reduced. The ability of digital marketers and platforms to manipulate, discriminate against, and exploit children and teens will be curtailed. COPPA 2.0 will also extend the original COPPA law’s protections for youth from 12 to 16 years of age. The proposed law provides the ability to delete children’s and teens’ data with a click of an “eraser button.” With the creation of a new FTC “Youth Marketing and Privacy Division,” COPPA 2.0 will ensure young people’s privacy rights are enforced.
  • Reining In Meta’s Digital ‘Wild West’ as FTC protects young people’s safety, health and privacy

Contacts:
Jeff Chester, CDD, 202-494-7100
David Monahan, Fairplay, 781-315-2586

Children’s advocates Fairplay and Center for Digital Democracy respond to today’s announcement that the FTC proposes action to address Facebook’s privacy violations in practices impacting children and teens. And see important new information compiled by Fairplay and CDD, linked below.

Josh Golin, executive director, Fairplay: The action taken by the Federal Trade Commission against Meta is long overdue. For years, Meta has flouted the law and exploited millions of children and teens in its efforts to maximize profits, with little care as to the harms faced by young users on its platforms. The FTC has rightly recognized that Meta simply cannot be trusted with young people’s sensitive data and proposed a remedy in line with Meta’s long history of abuse of children. We applaud the Commission for its efforts to hold Meta accountable and for taking a huge step toward creating the safe online ecosystem every young American deserves.

Jeff Chester, executive director, Center for Digital Democracy: Today’s action by the Federal Trade Commission (FTC) is a long-overdue intervention into what has become a huge national crisis for young people. Meta and its platforms are at the center of a powerful commercialized social media system that has spiraled out of control, threatening the mental health and wellbeing of children and adolescents. The company has not done enough to address the problems caused by its unaccountable data-driven commercial platforms.
Amid a continuing rise in shocking incidents of suicide, self-harm and online abuse, as well as exposés from industry “whistleblowers,” Meta is unleashing even more powerful data gathering and targeting tactics fueled by immersive content, virtual reality and artificial intelligence, while pushing youth further into the metaverse with no meaningful safeguards. Parents and children urgently need the government to institute protections for the “digital generation” before it is too late. Today’s action by the FTC limiting how Meta can use the data it gathers will bring critical protections to both children and teens. It will require Meta/Facebook to engage in a proper “due diligence” process when launching new products targeting young people, rather than its current “release first and address problems later” approach. The FTC deserves the thanks of U.S. parents and others concerned about the privacy and welfare of our “digital generation.”

NEW REPORTS:
META HAS A LONG HISTORY OF FAILING TO PROTECT CHILDREN ONLINE (from Fairplay)
META’S VIRTUAL REALITY-BASED MARKETING APPARATUS POSES RISKS TO TEENS AND OTHERS (from CDD)
  • Advocates Fairplay, Eating Disorders Coalition, Center for Digital Democracy, and others announce support of the newly reintroduced Kids Online Safety Act

Contact: David Monahan, Fairplay (david@fairplayforkids.org)

Advocates pledge support for landmark bill requiring online platforms to protect kids, teens with “safety by design” approach

BOSTON, MA and WASHINGTON, DC — May 2, 2023 — Today, a coalition of leading advocates for children’s rights, health, and privacy lauded the introduction of the Kids Online Safety Act (KOSA), a landmark bill that would create robust online protections for children and teens. Among the advocates pledging support for KOSA are Fairplay, the Eating Disorders Coalition, the American Academy of Pediatrics, the American Psychological Association, and Common Sense.

KOSA, a bipartisan bill from Senators Richard Blumenthal (D-CT) and Marsha Blackburn (R-TN), would make online platforms and digital providers abide by a “duty of care” requiring them to eliminate or mitigate the impact of harmful content on their platforms. The bill would also require platforms to default to the most protective settings for minors and enable independent researchers to access “black box” algorithms to assist in research on algorithmic harms to children and teens.

The reintroduction of the Kids Online Safety Act coincides with a rising tide of bipartisan support for action to protect children and teens online amidst a growing youth mental health crisis.
A February report from the CDC showed that teen girls and LGBTQ+ youth are facing record levels of sadness and despair, and another report, from Amnesty International, indicated that 74% of youth check social media more than they’d like.

Fairplay Executive Director, Josh Golin: “For far too long, Big Tech have been allowed to play by their own rules in a relentless pursuit of profit, with little regard for the damage done to the children and teens left in their wake. Companies like Meta and TikTok have made billions from hooking kids on their products by any means necessary, even promoting dangerous challenges, pro-eating disorder content, violence, drugs, and bigotry to the kids on their platforms. The Kids Online Safety Act stands to change all that. Today marks an exciting step toward the internet every young person needs and deserves, where children and teens can explore, socialize and learn without being caught in Big Tech crossfire.”

National Alliance for Eating Disorders CEO and EDC Board Member, Johanna Kandel: “The Kids Online Safety Act is an integral first step in making social media platforms a safer place for our children. We need to hold these platforms accountable for their role in exposing our kids to harmful content, which is leading to declining mental health, higher rates of suicide, and eating disorders. As both the CEO of an eating disorders nonprofit and a mom of a young child, I believe these new laws would go a long way in safeguarding the experiences our children have online.”

Center for Digital Democracy Deputy Director, Katharina Kopp: “The Kids Online Safety Act (KOSA), co-sponsored by Senators Blumenthal and Blackburn, will hold social media companies accountable for their role in the public health crisis that children and teens experience today. It will require platforms to make better design choices that ensure the well-being of young people.
KOSA is urgently needed to stop online companies from operating in ways that encourage self-harm, suicide, eating disorders, substance use, sexual exploitation, patterns of addiction-like behaviors, and other mental and physical threats. It also provides safeguards to address unfair digital marketing tactics. Children and teens deserve an online environment that is safe. KOSA will significantly reduce the harms that children, teens, and their families experience online every day.”

Children and Screens: Institute of Digital Media and Child Development Executive Director, Kris Perry: “We appreciate the Senators’ efforts to protect children in this increasingly complicated digital world. KOSA will allow access to critical datasets from online platforms for academic and research organizations. This data will facilitate scientific research to better understand the overarching impact social media has on child development.”

###

kosa_reintro_pr.pdf
  • Statement from Children’s Advocacy Groups on New Social Media Bill by U.S. Senators Schatz and Cotton

Washington, D.C., April 26, 2023 – Several children’s advocacy groups expressed concern today with parts of a new bill intended to protect kids and teens from online harms. The bill, “The Protecting Kids on Social Media Act,” was introduced this morning by U.S. Sens. Brian Schatz (D-HI) and Tom Cotton (R-AR).

The groups, including Common Sense Media, Fairplay, and the Center for Digital Democracy, play a leading role on legislation in Congress to ensure that tech companies, and social media platforms in particular, are held accountable for the serious and sometimes deadly harms related to the design and operation of these platforms. They said the new bill is well-intentioned in the face of a youth mental health crisis and has some features that should be adopted, but that other aspects of the bill take the wrong approach to a serious problem.

The groups said they support the bill’s ban on algorithmic recommendation systems targeting minors, which would prevent platforms from using the personal data of minors to amplify harmful content to them. However, they object to the fact that the bill places too many new burdens on parents, creates unrealistic bans, and institutes potentially harmful parental control over minors’ access to social media. By requiring parental consent before a teen can use a social media platform, vulnerable minors, including LGBTQ+ kids and kids who live in unsupportive households, may be cut off from access to needed resources and community. At the same time, kids and teens could pressure their parents or guardians to provide consent. Once young users make it onto a platform, they will still be exposed to addictive or unsafe design features beyond algorithmic recommendation systems, such as endless scroll and autoplay.
The bill’s age verification measures also introduce troubling implications for the privacy of all users, given the requirement for covered companies to verify the age of both adult and minor users. Despite its importance, there is currently no consensus on how to implement age verification without compromising users’ privacy.

The groups said that they strongly support other legislation that establishes important guardrails on platforms and other tech companies to make the internet a healthier and safer place for kids and families, for example the Kids Online Safety Act (KOSA) and COPPA 2.0, bipartisan legislation that was approved last year by the Senate Commerce Committee and is expected to be reintroduced this year.

“We appreciate Senators Schatz and Cotton's effort to protect kids and teens online and we look forward to working with them as we have with many Senators and House members over the past several years. But this is a life or death issue for families and we have to be very careful about how to protect kids online. The truth is, some approaches to the problem of online harms to kids risk further harming kids and families,” said James P. Steyer, founder and CEO of Common Sense Media. “Congress should place the onus on companies to make the internet safer for kids and teens and avoid placing the government in the middle of the parent-child relationship. Congress has many good policy options already under consideration and should act on them now to make the internet healthier and safer for kids.”

“We are grateful to Senators Schatz, Cotton, Britt and Murphy for their efforts to improve the online environment for young people but are deeply concerned their bill is not the right approach,” said Josh Golin, Executive Director of Fairplay. “Young people deserve secure online spaces where they can safely and autonomously socialize, connect with peers, learn, and explore.
But the Protecting Kids on Social Media Act does not get us any closer to a safer internet for kids and teens. Instead, if this legislation passes, parents will face the same exact conundrum they face today: Do they allow their kids to use social media and be exposed to serious online harms, or do they isolate their children from their peers? We need legislative solutions that put the burden on companies to make their platforms safer, less exploitative, and less addictive, instead of putting even more on parents’ plates.”

“It’s critical that social media platforms are held accountable for the harmful impacts their practices have on children and teens. However, this bill’s approach is misguided. It places too much of a burden on parents, instead of focusing on platforms’ business practices that have produced the unprecedented public health crisis that harms our children’s physical and mental well-being. Kids and teens should not be locked out of our digital worlds, but be allowed online where they can be safe and develop in age-appropriate ways. One of the unintended consequences of this bill will likely be a two-tiered online system, where poor and otherwise disadvantaged parents and their children will be excluded from digital worlds. What we need are policies that hold social media companies truly accountable, so all young people can thrive,” said Katharina Kopp, Ph.D., Deputy Director of the Center for Digital Democracy.

schatz-cotton_bill_coalition_statement.pdf
  • Citing research that illustrates a number of serious risks to children and teens in the Metaverse, advocates say Meta must wait for more research and root out dangers before targeting youth in VR.

BOSTON, MA, WASHINGTON, DC and LONDON, UK — Friday, April 14, 2023 — Today, a coalition of over 70 leading experts and advocates for health, privacy, and children’s rights is urging Meta to abandon plans to allow minors between the ages of 13 and 17 into Horizon Worlds, Meta’s flagship virtual reality platform. Led by Fairplay, the Center for Digital Democracy (CDD), and the Center for Countering Digital Hate (CCDH), the advocates underscored the dearth of research on the impact of time spent in the Metaverse on the health and wellbeing of youth, as well as the company’s track record of putting profits ahead of children’s safety. The advocates’ letter maintained that the Metaverse is already unsuitable for use by children and teens, citing March 2023 research from CCDH which revealed that minors already using Horizon Worlds were routinely exposed to harassment and abuse—including sexually explicit insults and racist, misogynistic, and homophobic harassment—and other offensive content.
In addition to the existing risks present in Horizon Worlds, the advocates’ letter outlined a variety of potential risks facing underage users in the Metaverse, including magnified risks to privacy through the collection of biomarkers, risks to youth mental health and wellbeing, and the risk of discrimination, among others.

In addition to Fairplay, CDD, and CCDH, the 36 organizations signing on include Common Sense Media, the Electronic Privacy Information Center (EPIC), Public Citizen, and the Eating Disorders Coalition. The 37 individual signatories include Richard Gephardt of the Council for Responsible Social Media, former Member of Congress and House Majority Leader; Sherry Turkle, MIT Professor and author of Alone Together and Reclaiming Conversation; and social psychologist and author Jonathan Haidt.

Josh Golin, Executive Director, Fairplay: “It's beyond appalling that Mark Zuckerberg wants to save his failing Horizon Worlds platform by targeting teens. Already, children are being exposed to homophobia, racism, sexism, and other reprehensible content on Horizon Worlds. The fact that Mr. Zuckerberg is even considering such an ill-formed and dangerous idea speaks to why we need Congress to pass COPPA 2.0 and the Kids Online Safety Act.”

Katharina Kopp, PhD, Deputy Director, Center for Digital Democracy: “Meta is demonstrating once again that it doesn’t consider the best interest of young people when it develops plans to expand its business operations. Before it considers opening its Horizon Worlds metaverse operation to teens, it should first commit to fully exploring the potential consequences. That includes engaging in an independent and research-based effort addressing the impact of virtual experiences on young people’s mental and physical well-being, privacy, safety, and potential exposure to hate and other harmful content.
It should also ensure that minors don’t face forms of discrimination in the virtual world, which tends to perpetuate and exacerbate ‘real life’ inequities.”

Mark Bertin, MD, Assistant Professor of Pediatrics at New York Medical College, former Director of Developmental Behavioral Pediatrics at the Westchester Institute for Human Development, and author of The Family ADHD Solution, Mindful Parenting for ADHD, and How Children Thrive:

“This isn't like the panic over rock and roll, where a bunch of old folks freaked out over nothing. Countless studies already describe the harmful impact of Big Tech products on young people, and it’s worsening a teen mental health crisis. We can't afford to let profit-driven companies launch untested projects targeted at kids and teens and let families pick up the pieces after. It is crucial for the well-being of our children that we understand what is safe and healthy first.”

Imran Ahmed, CEO of the Center for Countering Digital Hate:

“Meta is making the same mistake with Horizon Worlds that it made with Facebook and Instagram. They have prioritized profit over safety in their design of the product, failed to provide meaningful transparency, and refused to take responsibility for ensuring worlds are safe, especially for children.

“Yet again, their aim is speed to market in order to achieve monopoly status – rather than building truly sustainable, productive and enjoyable environments in which people feel empowered and safe.

“Whereas, to some, ‘move fast and break things’ may have appeared swashbuckling from young startup entrepreneurs, it is a brazenly irresponsible strategy coming from Meta, one of the world’s richest companies. It should have learned lessons from the harms their earlier products imposed on society, our democracies and our citizens.”

horizonletter.pdf
by Jeff Chester
  • Reports indicate FTC plans to advance case against Amazon for violation of kids’ privacy after advocates’ 2019 complaint.

BOSTON, MA and WASHINGTON, DC — Friday, March 31, 2023 — Following a groundbreaking investigation of Amazon’s Echo Dot Kids by Fairplay and the Center for Digital Democracy (CDD), the Federal Trade Commission is preparing to advance a case against Amazon to the Department of Justice for the company’s violations of children’s privacy law. According to new reporting from Politico, the case centers on Amazon’s violations of the Children’s Online Privacy Protection Act (COPPA) through its Alexa voice assistant.

In 2019, privacy advocates Fairplay and CDD called for the FTC to take action against Amazon after an investigation of the company’s Echo Dot Kids smart home assistant, a candy-colored version of Amazon’s flagship home assistant with Alexa voice technology. The investigation revealed a number of shocking illegal privacy violations, including Amazon’s indefinite retention of kids’ sensitive data even after parents requested that it be deleted. Now, reports indicate that the FTC is acting on the advocates’ calls for investigation.

“We’re thrilled that the Federal Trade Commission and Department of Justice are close to taking action against Amazon for its egregious violations of children’s privacy,” said Josh Golin, Executive Director of Fairplay. “We know it’s not just social media platforms and apps that misuse children’s sensitive data. This landmark case would be the first time the FTC sanctioned the maker of a voice-enabled device for flouting COPPA. Amazon and its Big Tech peers must learn that COPPA violations are not just a cost of doing business.”

“It is time for the FTC to address the rampant commercial surveillance of children via Internet of Things (IoT) devices, such as Amazon’s Echo, and enforce existing law,” said Katharina Kopp, Director of Policy at the Center for Digital Democracy.
“Children are giving away sensitive personal data on a massive scale via IoT devices, including their voice recordings and data gleaned from kids’ viewing, reading, listening, and purchasing habits. These data practices violate children’s privacy, manipulate them into desiring harmful products, undermine their autonomy, and perpetuate discrimination and bias. Both the FTC and the Department of Justice must hold Amazon accountable.”

[see attached for additional comments] ftc_amazon_investigation_statement_fairplay_cdd.pdf
by Jeff Chester