CDD

Newsroom

  • Press Release

    Children’s Advocates Urge the Federal Trade Commission to Enact 21st Century Privacy Protections for Children

    More than ten years since last review, organizations urge the FTC to update the Children’s Online Privacy Protection Act (COPPA)

    FOR IMMEDIATE RELEASE

    Contact:
    David Monahan, Fairplay: david@fairplayforkids.org
    Jeff Chester, Center for Digital Democracy: jeff@democraticmedia.org

    WASHINGTON, DC — Tuesday, March 12, 2024 — A coalition of eleven leading health, privacy, consumer protection, and child rights groups has filed comments at the Federal Trade Commission (FTC) offering a digital roadmap for stronger safeguards, while also supporting many of the agency’s key proposals for updating its regulations implementing the bipartisan Children’s Online Privacy Protection Act (COPPA).

    Comments submitted by Fairplay, the Center for Digital Democracy, the American Academy of Pediatrics, and other advocacy groups supported many of the changes the commission proposed in its Notice of Proposed Rulemaking issued in December 2023. The groups, however, told the FTC that a range of additional protections is required to address the “Big Data” and Artificial Intelligence (AI) driven commercial surveillance marketplace operating today, in which children and their data are a highly prized and sought-after target across online platforms and applications.

    “The ever-expanding system of commercial surveillance marketing online that continually tracks and targets children must be reined in now,” said Katharina Kopp, Deputy Director and Director of Policy, Center for Digital Democracy. “The FTC and children’s advocates have offered a digital roadmap to ensure that data gathered from children have the much-needed safeguards.
    With kids being a key and highly lucrative target desired by platforms, advertisers, and marketers, and with growing invasive tactics, such as AI, used to manipulate them, we call on the FTC to enact 21st-century rules that place the privacy and well-being of children first.”

    “In a world where streaming and gaming, AI-powered chatbots, and ed tech in classrooms are exploding, children’s online privacy is as important as ever,” said Haley Hinkle, Policy Counsel, Fairplay. “Fairplay and its partners support the FTC’s efforts to ensure the COPPA Rule meets kids’ and families’ needs, and we lay out the ways in which the Rule addresses current industry practices. COPPA is a critical tool for keeping Big Tech in check, and we urge the Commission to adopt the strongest possible Rule in order to protect children into the next decade of technological advancement.”

    While generally supporting the commission’s proposal to give parents and caregivers greater control over a child’s data collection via their consent, the groups told the commission that a number of improvements and clarifications are required to ensure that privacy protections for a child’s data are effectively implemented.
    They include issues such as:

    ●  The emerging risks posed to children by AI-powered chatbots and biometric data collection.
    ●  The need to apply COPPA’s data minimization requirements to data collection, use, and retention, to reduce the amount of children’s data in the audience economy and to limit targeted marketing.
    ●  The applicability of the Rule’s provisions – including notice, the separate parental consent for collection and disclosure, and data minimization requirements – to the vast networks of third parties that claim to share children’s data in privacy-safe ways, including “clean rooms,” but still utilize young users’ personal information for marketing.
    ●  The threats posed to children by ed tech platforms and the necessity of strict limitations on any use authorized by schools.
    ●  The need for clear notice, security program, and privacy program requirements in order to effectively realize COPPA’s limitations on the collection, use, and sharing of personal information.

    The 11 organizations that signed on to the comments are: the Center for Digital Democracy (CDD); Fairplay; American Academy of Pediatrics; Berkeley Media Studies Group; Children and Screens: Institute of Digital Media and Child Development; Consumer Federation of America; Center for Humane Technology; Eating Disorders Coalition for Research, Policy, & Action; Issue One; Parents Television and Media Council; and U.S. PIRG.

    ###
  • Comments submitted by Fairplay, the Center for Digital Democracy, and the American Academy of Pediatrics, in response to the Notice of Proposed Rulemaking on COPPA issued in December 2023 by the Federal Trade Commission
  • Washington, DC – February 15, 2024

    For too long social media platforms have prioritized their financial interests over the well-being of young users. Meta, TikTok, YouTube, Snap, and other companies should be held accountable for the safety of America’s youth. They must be required to prevent harms to kids—such as eating disorders, violence, substance abuse, sexual exploitation, and the exploitation of their privacy. The Kids Online Safety Act (KOSA) would require platforms to implement the most protective privacy and safety settings by default. It would help prevent the countless tragic results experienced by too many children and their parents. We support the updated language of the Kids Online Safety Act and urge Congress to pass the bill promptly.

    Katharina Kopp, Ph.D.
    Director of Policy, Center for Digital Democracy
  • Washington, DC – February 15, 2024

    Digital marketers are unleashing a powerful and pervasive set of unfair and manipulative tactics to target and exploit children and teens. Wherever they go online—social media, viewing videos, listening to music, or playing games—they are stealthily “accompanied” by an array of marketing practices designed to profile and manipulate them. The Children and Teens’ Online Privacy Protection Act (COPPA 2.0) will provide urgently needed online privacy safeguards for children and teens and update legislation first enacted nearly 25 years ago. The proposed new law will deliver real accountability to digital media companies and help limit harms now experienced by children and teens online. For example, by stopping data-targeted ads to young people under 16, the endless stream of information harvested by online companies will be significantly reduced. Other safeguards will limit the collection of personal information for other purposes. COPPA 2.0 will also extend the original COPPA law’s protections for youth from age 12 to 16. The proposed law also provides the ability to delete children’s and teens’ data easily. Young people will also be better protected from the myriad methods used to profile them, which have unleashed numerous discriminatory and other harmful practices. An updated knowledge standard will make this legislation easier to enforce.

    We welcome the bipartisan updated text from co-sponsors Sen. Markey and Sen. Cassidy and new co-sponsors Chair Sen. Cantwell (D-WA) and Ranking Member Sen. Cruz (R-TX).

    Katharina Kopp, Ph.D.
    Director of Policy, Center for Digital Democracy
  • Regulating the Digital Obesogenic Ecosystem

    Lessons from the 20-year Effort to Pass the United Kingdom’s Online Ban on Unhealthy Food and Beverage Advertising

    Regulating the Global Online Junk Food Marketing System: The UK Experience

    In a March 2023 report, the World Obesity Federation issued a dire prognosis and warning: “The majority of the global population (51%, or over 4 billion people) will be living with either overweight or obesity by 2035 if current trends prevail,” based on the latest figures. The greatest and most rapid increase is expected among young people between the ages of 5 and 19. Yet, despite these alarming trends, food and beverage companies around the world continue to push ads for junk food, sugar-sweetened sodas, and other harmful products to young people, using increasingly sophisticated and intrusive digital marketing campaigns, such as this one in Indonesia by McDonald’s.

    The Center for Digital Democracy’s 2021 report, Big Food, Big Tech and the Global Childhood Obesity Pandemic, described the far-reaching global digital media and marketing system that now targets children and teens across social media, gaming platforms, and mobile devices, and called for international advocacy efforts to address this threat. The World Health Organization and other international health bodies have urged nations to adopt strong policies to curb digital food marketing, and governments around the world have responded with a host of new restrictions in countries such as Chile, Mexico, Argentina, and Norway.

    Amid this growing momentum for regulation, the UK stands out as the country where some of the most comprehensive efforts have been underway for more than two decades to develop food marketing safeguards. These include a recently passed ban on online junk food advertising, which has triggered a powerful backlash from the industry, along with attempts to derail its implementation.
    CDD’s latest report – Regulating the Obesogenic Ecosystem: Lessons from the 20-year Effort to Pass the United Kingdom’s Online Ban on Unhealthy Food and Beverage Advertising – offers a detailed case study of this campaign, chronicling the interplay among health advocates, researchers, government policymakers, and corporate lobbyists, and offering insights for other organizations around the world that are seeking to rein in the powerful global food/tech marketing complex.
    Kathryn C. Montgomery and Jeff Chester
  • Press Release

    Leading Advocates for Children Applaud FTC Update of COPPA Rule

    Fairplay and the Center for Digital Democracy see a crucial step toward creating safer online experiences for kids

    Contact:
    David Monahan, Fairplay: david@fairplayforkids.org
    Jeff Chester, CDD: jeff@democraticmedia.org

    WASHINGTON, DC — December 20, 2023 — The Federal Trade Commission has proposed its first update of rules protecting children’s privacy in a decade. Under the bipartisan Children’s Online Privacy Protection Act of 1998 (COPPA), children under 13 currently have a set of core safeguards designed to control and limit how their data can be gathered and used. The COPPA Rules were last revised in 2012, and today’s proposed changes offer new protections to ensure young people can go online without losing control over their personal information. These new rules would help create a safer, more secure, and healthier online environment for them and their families—precisely at a time when they face growing threats to their wellbeing and safety. The provisions offered today are especially needed to address the emerging methods used by platforms and other digital marketers who target children to collect their data, including through the growing use of AI and other techniques.

    Haley Hinkle, Policy Counsel, Fairplay: “The FTC’s recent COPPA enforcement actions against Epic Games, Amazon, Microsoft, and Meta demonstrated that Big Tech does not have carte blanche with kids’ data. With this critical rule update, the FTC has further delineated what companies must do to minimize data collection and retention and ensure they are not profiting off of children’s information at the expense of their privacy and wellbeing.
    Anyone who believes that children deserve to explore and play online without being tracked and manipulated should support this update.”

    Katharina Kopp, Ph.D., Director of Policy, Center for Digital Democracy: “Children face growing threats as platforms, streaming and gaming companies, and other marketers pursue them for their data, attention, and profits. The FTC’s proposed COPPA rule update provides urgently needed online safeguards to help stem the tidal wave of personal information gathered on kids. The commission’s plan will limit data uses involving children and help prevent companies from exploiting their information. These rules will also protect young people from being targeted through the increasing use of AI, which now further fuels data collection efforts. Young people 12 and under deserve a digital environment that is designed to be safer for them and that fosters their health and well-being. With this proposal, we should soon see less online manipulation, less purposefully addictive design, and fewer discriminatory marketing practices.”

    ###
  • Advocates File Amicus in Support of the California Age-Appropriate Design Code Act

    Groups explain to court in NetChoice case the ways commercial surveillance marketers track & target kids

    Today a coalition of groups and individuals filed an amicus brief in support of the CAADCA: Fairplay Inc., Center for Digital Democracy, Common Sense, 5Rights Foundation, Children’s Advocacy Institute, Accountable Tech, Beyond the Screen, Children & Screens, Design It For Us, The Tyler Clementi Foundation, Becca Schmill Foundation, Arturo Béjar, and Frances Haugen.
  • The insatiable quest to acquire more data has long been a force behind corporate mergers in the US—including the proposed combination of supermarket giants Albertsons and Kroger. Both grocery chains have amassed a powerful set of internal “Big Data” digital marketing assets, accompanied by alliances with data brokers, “identity” management firms, advertisers, streaming video networks, and social media platforms.

    Albertsons and Kroger are leaders in one of the fastest-growing sectors of the online surveillance economy, called “retail media.” With retail media expected to generate $85 billion in US ad spending by 2026, and with the success of Amazon as a model, there is a new digital “gold rush” as retailers move to cash in on loyalty programs, sales information, and other growing ways to target their customers. Albertsons, Kroger, and other retailers, including Walmart, CVS, Dollar General, and Target, find themselves in an enviable position in what’s being called the “post-cookie” era. As digital marketing abandons traditional user-tracking technologies, especially third-party cookies, in order to address privacy regulations, leading advertisers and platforms are lining up to access consumer information they believe comes with less regulatory risk. Supermarkets, drug stores, retailers, and video streaming networks have massive amounts of so-called “first-party” authenticated data on consumers, which they claim comes with consent to use for online marketing. That’s why retail media networks operated by Kroger and others, as well as data harvested from streaming companies, are among the hottest commodities in today’s commercial surveillance economy.
    It’s not surprising that Albertsons and Kroger now have digital marketing partnerships with companies like Disney, Comcast/NBCUniversal, Google, and Meta—to name just a few.

    The Federal Trade Commission (FTC) is currently reviewing this deal, which is a test case of how well antitrust regulators address the dominant role that data and the affordances of digital marketing play in the marketplace. The “Big Data” digital marketing era has upended many traditional marketplace structures; consolidation is accompanied by a string of deals that further coalesces power to incumbents and their allies. What’s called “collaboration”—in which multiple parties work together to extend individual and collective data capabilities—is now a key feature operating across the broader online economy, and is central to the Kroger/Albertsons transaction.

    Antitrust law has thus far failed to address one of the glaring threats arising from so many mergers today: their impact on privacy, consumer protection, and diversity of media ownership. Consider all the transactions that the FTC and Department of Justice have allowed in recent years, such as the scores of Google and Facebook acquisitions, and the deleterious impact they have had on competition, data protection, and other societal outcomes. Under Chair Lina Khan, the FTC has awakened from what I have called its long “digital slumber,” moving to the forefront in challenging proposed mergers and working to develop more effective privacy safeguards. My organization told the commission that its antitrust case must address the role data-driven marketing plays in the Albertsons and Kroger merger, and how consolidating the two companies’ digital operations is central to their goals for the deal.

    Kroger has been at the forefront of understanding how the sales and marketing of groceries and other consumer products have to operate simultaneously in-store and online.
    It acquired a leading “retail, data science, insights and media” company in 2015, which it renamed 84.51° after its geographic coordinates in Cincinnati. Today, 84.51° touts its capabilities to leverage “data from over 62 million households” in influencing consumer buying behavior “both in-store and online,” using “first party retail data from nearly 1 of 2 US households and more than two billion transactions.” Kroger’s retail media division, called “Precision Marketing,” draws on the prowess of 84.51° to sell a range of sophisticated data targeting opportunities to advertisers, including leading brands that stock its in-store and online shelves. For example, ads can be delivered to customers when they search for a product on the Kroger website or its app, when they view digital discount coupons, and when they visit non-Kroger-owned sites.

    These initiatives have created a number of opportunities for Kroger to make money from data. Last year, Precision Marketing opened its “Private Marketplace” service, which enables advertisers to access Kroger customers via targeting lists of what are known as “pre-optimized audiences” (groups of consumers who have been analyzed and identified as potential customers for various products). Like other retailers, Kroger has data and ad deals with video streaming companies, including Disney and Roku. Its alliance with Disney enables it to take advantage of that entertainment company’s major data-marketing assets, including AI tools and the ability to target consumers using its “250 million user IDs.”

    Likewise, the Albertsons “Media Collective” division promises advertisers that its retail media “platform” connects them to “over 100 million consumers.” It offers marketing opportunities for grocery brands similar to Kroger’s, including targeting on its website and app, and also when its customers are off-site.
    Albertsons has partnerships across the commercial surveillance advertising spectrum, including with Google, the Trade Desk, Pinterest, Criteo, and Meta/Facebook. It also has a video streaming data alliance involving global advertising agency giant Omnicom that expands its reach to viewers of Comcast’s NBCUniversal division, as well as of Paramount and Warner Bros. Discovery.

    Both Kroger and Albertsons partner with many of the same powerful identity-data companies, including data-marketing and cross-platform leaders LiveRamp and the Trade Desk. Through these relationships, the two grocery chains are connected to a vast network of data brokers that provide ready access to customer health, financial, and geolocation information, for example. The two grocery chains also work with the same “retail data cloud” company, which further extends their marketing impact. Further compounding the competitive and privacy threats from this deal is its role in providing ongoing “closed-loop” consumer tracking, which helps retailers and advertisers perfect how they measure the effectiveness of their marketing. They know precisely what you bought, browsed, and viewed—in store and at home.

    Antitrust NGOs, trade unions, and state attorneys general have sounded the alarm about the pending Albertsons/Kroger deal, including its impact on prices, worker rights, and consumer access to services. As the FTC nears a decision point on this merger, it should make clear that transactions that undermine competition and privacy and expand the country’s commercial surveillance apparatus should not be permitted.

    This article was originally published by Tech Policy Press.
  • Press Release

    BREAKING: Advocates Decry Meta’s Attempt to Shut Down the FTC

    In response to an order that would prohibit Meta from monetizing minors’ data, the social media company has filed a suit claiming the agency’s structure is unconstitutional. 

    Contact:
    David Monahan, Fairplay: david@fairplayforkids.org
    Jeff Chester, CDD: jeff@democraticmedia.org

    WASHINGTON, DC – THURSDAY, NOVEMBER 30, 2023 – Advocates for children and privacy condemned a lawsuit filed last evening by Meta against the Federal Trade Commission that seeks to shut the agency down by asserting that the Commission’s structure is unconstitutional. Meta’s suit comes in response to a proposed FTC order that would prohibit Meta from monetizing children’s data, after the company violated the Children’s Online Privacy Protection Act (COPPA) while already operating under a consent decree for multiple serious privacy violations. Earlier this week, Judge Timothy Kelly of the U.S. District Court for the District of Columbia denied a motion filed by Meta claiming the FTC had no authority to modify its previous settlement. Now Meta is escalating its attacks on the Commission’s authority.

    Meta has posed a threat to the privacy and welfare of young people in the U.S. for many years, as it has targeted them to further its data-driven commercial surveillance advertising system. Scandal after scandal has exposed the company’s blatant disregard for children and youth, with nearly daily headlines about its irresponsible actions coming from former employees turned whistleblowers and from major multi-state, bipartisan investigations by state attorneys general. Despite multiple attempts by regulators to contain Meta’s ongoing undermining of user privacy, including through multiple FTC consent decrees, it is evident that a substantive remedy is required to safeguard US youth.
    Fairplay, the Center for Digital Democracy, and the Electronic Privacy Information Center (EPIC) have issued these comments on today’s announcement of Meta’s lawsuit against the Federal Trade Commission:

    Josh Golin, Executive Director, Fairplay: “While many have noted social media’s role in fueling the mental health crisis, the Federal Trade Commission has taken actual meaningful action to protect young people online with its order prohibiting serial privacy offender Meta from monetizing minors’ data. So it’s not surprising that Meta is launching this brazen attack on the Commission, especially given that the company may have $200 billion in COPPA liability, according to recently unsealed documents. Anyone who cares about the wellbeing of children – and the safety of American consumers – should rally to the defense of the Commission and be deeply concerned about the lengths Meta will go to preserve its ability to profit at the expense of young people.”

    Katharina Kopp, Director of Policy, Center for Digital Democracy: “For decades Meta has put the maximization of profits from so-called behavioral advertising above the best interests of children and teens. Meta’s repeated failure to comply with its 2012 and 2020 settlements with the FTC, including its non-compliance with the federal children’s privacy law (COPPA), and the unique developmental vulnerability of minors, justify the FTC’s proposed modifications to Meta’s consent decree requiring it to stop profiting from the data it gathers on children and teens. It should not surprise anybody, then, that Meta is now going after the FTC with its lawsuit.
    But this attack on the FTC is essentially an attack on common-sense regulation to curtail out-of-control commercial power, and an attack on our children, teenagers, and every one of us.”

    John Davisson, Director of Litigation, Electronic Privacy Information Center (EPIC): “It seems there’s no legal theory, however far-fetched, that Meta won’t deploy to avoid a full accounting of its harmful data practices. The reason is clear. A hearing before the FTC will confirm that Meta continues to mishandle personal data and put the privacy and safety of minors at risk, despite multiple orders not to do so. The changes the FTC is proposing to Meta’s exploitative business model can’t come soon enough. We hope the court will reject Meta’s latest attempt to run out the clock, as another federal court did just this week.”

    ###
  • Common Sense Media (“Common Sense”), the Center for Digital Democracy (“CDD”), and Fairplay submit comments in response to the National Telecommunications and Information Administration’s (“NTIA’s”) Request for Comment regarding its Initiative to Protect Youth Mental Health, Safety & Privacy Online.
  • Contact:
    David Monahan, Fairplay: david@fairplayforkids.org

    Day of Action as Advocates for Youth Urge Hill: Pass the Kids Online Safety Act Now

    Momentum keeps growing: 217 organizations call on Congress to address the youth mental health crisis spurred by social media

    WASHINGTON, D.C. – Wednesday, November 8, 2023 – Today, a huge coalition of advocacy groups is conducting a day of action urging Congress to finally address the youth mental health crisis and pass the Kids Online Safety Act (KOSA, S. 1409). Momentum in support of the bill continues to build, and today 217 groups that advocate for children and teens across a myriad of areas–including mental health, privacy, suicide prevention, eating disorders, and child sexual abuse prevention–are sending a letter urging Senate Majority Leader Schumer and Senate Minority Leader McConnell to move KOSA to a floor vote by the end of this year.

    “After numerous hearings and abundant research findings,” the coalition writes, “the evidence is clear of the potential harms social media platforms can have on the brain development and mental health of our nation’s youth, including hazardous substance use, eating disorders, and self-harm.” “With this bipartisan legislation,” they write, “Congress has the potential to significantly improve young people’s wellbeing by transforming the digital environment for children and teens.”

    KOSA, authored by Senators Richard Blumenthal (D-CT) and Marsha Blackburn (R-TN), enjoys growing bipartisan support; it is endorsed by 48 US Senators–24 from each side of the aisle. Today’s day of action will see supporters of the 217 organizations calling senators to urge a floor vote and support for KOSA.
    The letter and day of action follow: a suit by 42 attorneys general against Meta for exploiting young users’ vulnerabilities on Instagram; persistent calls to pass KOSA from parents of children who died from social media harms; and Tuesday’s Senate Judiciary Committee hearing on Social Media and the Teen Mental Health Crisis.

    COMMENTS:

    Josh Golin, Executive Director of Fairplay: “Every day that Congress allows social media companies to self-regulate, children suffer, and even die, from preventable harms and abuses online. The Kids Online Safety Act would force companies like Meta, TikTok, and Snap to design their platforms in ways that reduce risks for children, and create a safer and less addictive internet for young people.”

    James P. Steyer, Founder and CEO of Common Sense Media: “With a new whistleblower and 42 states filing suit against Meta for its deceptive practices and dangerous platform design, the growing support and urgent need for KOSA is now too strong to ignore. Common Sense will continue to work with lawmakers and advocates on both sides of this bill to once and for all begin to curb the harms that online platforms are causing for young people.”

    Sacha Haworth, Executive Director of the Tech Oversight Project: “The disturbing revelations sadly add to a mountain of evidence proving that tech companies are willfully negligent and even openly hostile to protecting minors from the harms their products bring. As a mother, my heart breaks for the kids who experienced pain and harassment online because tech executives were willing to sacrifice their physical and emotional health in pursuit of profit. Lives are on the line, and we cannot sit on the sidelines. We need to pass KOSA to force companies like Meta to protect children and teens and treat them with the dignity they deserve.”

    Katharina Kopp, Director of Policy, Center for Digital Democracy: “The public health crisis that children and teens experience online requires an urgent intervention from policymakers.
    We need platform accountability and an end to the exploitation of young people. Their well-being is more important than the ‘bottom-line’ interests of platforms. KOSA will prevent companies from taking advantage of the developmental vulnerabilities of children and teens. We urge the U.S. Senate to bring KOSA to a floor vote by the end of the year.”

    ###
  • Blog

    Is So-called Contextual Advertising the Cure to Surveillance-based “Behavioral” Advertising?

    Contextual advertising might soon rival or even surpass behavioral advertising’s harms unless policy makers intervene

    Contextual advertising is said to be privacy-safe because it eliminates the need for cookies, third-party trackers, and the processing of other personal data. Marketers and policy makers are placing much stock in the future of contextual advertising, viewing it as the solution to privacy-invasive targeted advertising that relies heavily on personal data.

    However, the current state of contextual advertising does not look anything like our plain understanding of it in contrast to today’s dominant mode of behavioral advertising: placing ads next to preferred content, based on keyword inclusion or exclusion. Instead, industry practices are moving toward incorporating advanced AI analysis and classification of content, user-level data, and insights into the content preferences of online visitors, all while still referring to “contextual advertising.” It is crucial for policymakers to carefully examine this rapidly evolving space and establish a clear definition of what “contextual advertising” should entail. This will prevent toxic practices and outcomes, similar to what we have witnessed with surveillance-based behavioral marketing, from becoming the new normal.

    Let’s recall the reasons for the strong opposition to surveillance-based marketing practices so we can avoid those harms in contextual advertising. Simply put, the two main reasons are privacy harms and harms from manipulation. Behavioral advertising is deeply invasive when it comes to privacy, as it involves tracking users online and creating individual profiles based on their behavior over time, across different platforms, and across channels.
    These practices go beyond individual privacy violations and also harm groups of people, perpetuating or even exacerbating historical discrimination and social inequities. The second main reason why many oppose surveillance-based marketing practices is the manipulative nature of commercial messaging that aims to exploit users’ vulnerabilities. This becomes particularly concerning when vulnerable populations, like children, are targeted, as they may not have the ability to resist sophisticated influences on their decision-making. More generally, the behavioral advertising business heavily incentivizes companies to optimize their practices for monetizing attention and selling audiences to advertisers, leading to many associated harms.

    New and evolving practices in contextual advertising should raise questions for policy makers. They should consider whether the harms we sought to avoid with behavioral marketing may resurface in these new advertising practices as well. Today’s contextual advertising methods are taking advantage of the latest analytical technologies to interpret online content, so contextual ads will likely soon be able to manipulate us just as behavioral ads can. Artificial intelligence (AI), machine learning, natural language processing models for tone and sentiment analysis, computer vision, audio analysis, and more are being used to weigh a multitude of factors and in this way “dramatically improve the effectiveness of contextual targeting.” GumGum’s Verity, for example, “scans text, image, audio and video to derive human-like understandings.” Attention measures – the new performance metric that advertisers crave – indicate that contextual ads are more effective than non-contextual ads.
Moments.AI, a “real-time contextual targeting solution” by the Verve Group, for example, allows brands to move away from clicks and to “optimize towards consumer attention instead,” for “privacy-first” advertising solutions.

Rather than analyzing a single URL or one article at a time, marketers can analyze a vast range of URLs, “understand content clusters and topics that audiences are engaging with at that moment,” and so use contextual targeting at scale. The effectiveness and sophistication of contextual advertising allows marketers to use it not just for enhancing brand awareness but also for targeting prospects. In fact, the field of “neuroprogrammatic” advertising “goes beyond topical content matching to target the subconscious feelings that lead consumers to make purchasing decisions,” according to one industry observer. Marketers can take advantage of how consumers “are feeling and thinking, and what actions they may or may not be in the mood to take, and therefore how likely [they] are to respond to an ad. Neuroprogrammatic targeting uses AI to cater to precisely what makes us human.”

These sophisticated contextual targeting practices may, however, have negative effects similar to those of behavioral advertising. For instance, contextual ads for weight loss programs can be placed alongside content related to dieting and eating disorders because of that content's semantic, emotional, and visual characteristics. This may have disastrous consequences similar to those of targeted behavioral ads aimed at teenagers with eating disorders. It is therefore important to question how different these practices are from individual user tracking and ad targeting. If content can be analyzed and profiled along very finely tuned classification schemes, advertisers don't need to track users across the web.
They simply need to track the content that will deliver the relevant audience and engage individuals based on their interests and feelings.

Apart from the manipulative nature of contextual advertising, the use of personal data and the associated privacy violations are also concerning. Many contextual ad tech companies claim to engage in contextual targeting “without any user data.” But, in fact, so-called contextual ad tech companies often rely on session data such as browser and page-level data, device and app-level data, IP address, and “whatever other info they can get their hands on to model the potential user,” framing it as “contextual 2.0.” Until recently, this practice might have been more accurately referred to as device fingerprinting. The claim is that session data is not about tracking, but only about the active session and usage at one point in time. No doubt, however, the line between contextual and behavioral advertising becomes blurry when such data is involved.

Location-based targeting is another aspect of contextual advertising that raises privacy concerns. Should location-based targeting be considered contextual? Uber's “Journey Ads” let advertisers target users based on their destination. A trip to a restaurant might trigger alcohol ads; a trip to the movie theater might result in ads for sugary beverages. According to AdExchanger, Uber claims that it is not “doing any individual user-based targeting” and suggests that this is a form of contextual advertising.

Peer39 also includes location data in its ad-targeting capabilities and still refers to these practices as contextual advertising. The use of location data can reveal some of the most sensitive information about a person, including where she works, sleeps, socializes, worships, and seeks medical treatment.
When combined with session data, the information obtained from sentiment, image, video, and location analysis can be used to create sophisticated inferences about individuals, and ads placed in this context can easily clash with consumer expectations of privacy.

Furthermore, placing contextual ads next to user-generated content or within chat groups changes the parameters of contextual targeting. Instead of targeting the content itself, the ad becomes easily associated with an individual user. Reddit's “contextual keyword targeting,” for example, allows advertisers to target by community and interest, including communities discussing sensitive LGBTQ+ topics. This is similar to the personalized nature of targeted behavioral advertising and can thus raise privacy concerns.

Cohort targeting, also referred to as “affinity targeting” or “content affinity targeting,” further blurs the line between behavioral and contextual advertising by combining content analytics with audience insights. “This bridges the gap between Custom Cohorts and your contextual signals, by taking learning from consented users to targeted content where a given Customer Cohort shows more engagement than the site average,” claims Permutive.

Oracle uses various cohorts with demographic characteristics, including age, gender, and income, as well as “lifestyle” and “retail” interests, to understand what content individuals are more likely to consume.
While reputedly “designed for privacy from the ground up,” this approach allows Oracle to analyze what an audience cohort views and to “build a profile of the content types they're most likely to engage with,” allowing advertisers to find their “target customers wherever they are online.” Playground XYZ enhances contextual data with eye-tracking data from opt-in panels, which measures attention and helps to optimize which content is most “eye-catching,” “without the need for cookies or other identifiers.”

Although these practices may seem privacy neutral (relying on small samples of online users or “consented users”), they still allow advertisers to target and manipulate their desired audience. Message targeting based on the content preferences of finely tuned demographic segments (household income under $20K or over $500K, for example) can lead to discriminatory practices and disparate impact that can deepen social inequities, just like the personalized targeting of online users.

Hyper-contextual content analysis focused on measuring sentiment and attention, the use of session information, the placement of ads next to user-generated content and within interest-group chats, and the use of audience panels to profile content are emerging practices in contextual advertising that require critical examination. The touted privacy-first promise of contextual advertising is deceptive. Contextual advertising appears to be more manipulative, more privacy-invasive, and more likely to contribute to discrimination and perpetuate inequities among consumers than we all initially thought.

What's more, the convergence of highly sensitive content analytics with content profiling based on demographic characteristics (and potentially more) could result in even more potent digital marketing practices than those currently being deployed.
By merging contextual data with behavioral data, marketers might gain a more comprehensive understanding of their target audience and develop more effective messaging. Additionally, we can only speculate about how modifications to the incentive structure for delivering audiences to advertisers might affect content quality.

In the absence of policy intervention, these developments may lead to a surveillance system even more formidable than the one we currently have. Contextual advertising will not serve as a solution to surveillance-based “behavioral” marketing and its manipulative and privacy-invasive nature, let alone the numerous other negative consequences associated with it, including the addictive nature of social media, the promotion of disinformation, and threats to public health.

It is vital to formulate a comprehensive and up-to-date definition of contextual advertising that takes into consideration the adverse effects of surveillance advertising and strives to mitigate them. Industry self-regulation cannot be relied on, and legislative proposals do not adequately address the complexities of contextual advertising. The FTC's 2009 definition of contextual advertising is also outdated in light of the advancements and practices described here. Regulatory bodies like the FTC must assess contemporary practices and provide guidelines to safeguard consumer privacy and ensure fair marketing practices. The FTC's Children's Online Privacy Protection Act rule update and its Commercial Surveillance and Data Security rulemaking provide an opportunity to get it right.

Failure to intervene may ultimately result in the emergence of a surveillance system disguised as consumer-friendly marketing.

This article was originally published by Tech Policy Press.
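The contrast drawn in the article, between plain keyword-based contextual matching and “contextual 2.0” practices that fold in session-level signals, can be made concrete with a brief sketch. This is a hypothetical illustration only: the function names, keyword lists, and hashing scheme are assumptions for exposition, not any vendor's actual implementation.

```python
import hashlib

# Hypothetical sketch of "plain" contextual matching as the article describes it:
# an ad is eligible for a page only if the page's keywords hit the advertiser's
# inclusion list and miss its exclusion list. No user data is involved.
def plain_contextual_match(page_keywords, include, exclude):
    """Return True if an ad may run on this page under simple keyword rules."""
    words = set(page_keywords)
    return bool(words & include) and not (words & exclude)

# By contrast, "contextual 2.0" as described above folds in session data.
# Combining browser, device, and IP into one stable key is functionally a
# device fingerprint, which is why the line to behavioral targeting blurs.
def session_fingerprint(browser, device, ip):
    """Derive a stable pseudo-identifier from session signals (illustrative)."""
    return hashlib.sha256(f"{browser}|{device}|{ip}".encode()).hexdigest()[:16]

# Example: a running-shoe ad that wants sports pages but avoids injury content.
include = {"marathon", "running", "training"}
exclude = {"injury", "eating disorder"}

print(plain_contextual_match(["marathon", "training", "nutrition"], include, exclude))  # True
print(plain_contextual_match(["running", "injury"], include, exclude))                  # False
```

The point of the sketch is that the first function needs nothing about the visitor, while the second produces the same value for the same visitor across pages, which is exactly the tracking property that keyword matching was supposed to avoid.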
    Katharina Kopp
  •  September 18, 2023

Comment on the 2023 Merger Guidelines
Center for Digital Democracy
FTC-2023-0043

The Center for Digital Democracy (CDD) urges the U.S. Department of Justice (DoJ) and the Federal Trade Commission (FTC) to adopt the proposed merger guidelines. The guidelines are absolutely necessary to ensure the U.S. operates a 21st-century antitrust regime and does not keep repeating the mistakes of the last several decades. The failure to understand and address contemporary practices, especially those related to data assets, has brought us further consolidation in key markets, including digital media.

Over the decades, CDD has been at the forefront of NGOs sounding the alarm on the consolidation of the digital marketing and advertising industry, including our opposition to such transactions as Google/DoubleClick, Facebook/Instagram, Google/YouTube, Google/AdMob, and Oracle/BlueKai and Datalogix, among others. Regulatory approval for these deals has accelerated the consolidation of the online media marketplace, where a tiny handful of companies—Alphabet (Google), Meta, and Amazon—dominate the marketplace in terms of advertising revenues and online marketing applications. It has also helped deliver today's vast commercial surveillance marketplace, with its unrelenting collection and use of information from consumers, small businesses, and other potential competitors.

The failure to effectively address the role that data assets and processing capabilities play in merger transactions has had unfortunate systemic consequences for the U.S. public. Privacy has been largely lost as a result: by permitting these data-related deals, both agencies signaled that policymakers approved unfettered data-driven commercial surveillance operations.
It has also led the largest commercial entities and brands, across all market verticals, to adopt the “Big Data” and personalized digital marketing applications developed by Google, Meta, and Amazon—furthering the commercial surveillance stranglehold and helping fuel platform dominance. It has also had a profound and unfortunate impact on the structure of contemporary media, which have embraced the data-driven commercial surveillance paradigm with all its manipulative and discriminatory effects. In this regard, the failure to ensure meaningful antitrust policies has had consequences for the health of our democracy as well.

The proposed guidelines should help regulators better address specific transactions, their implications for specific markets, and the wider “network effects” that such digitally connected mergers trigger. An overall guideline for antitrust authorities should be an examination of the data assets assembled by each entity. Over the last half-decade or so, nearly every major company—regardless of the “vertical” market served—has become a “big data” company, using both internal and external assets to leverage a range of data and decision intelligence designed to gather, process, and make “actionable” data insights. Such affordances are regularly used for product development, supply, and marketing, among other purposes. Artificial intelligence and machine learning applications are also “baked in” to these processes, extending the affordances across multiple operations.

Antitrust regulators should inventory the data and digital assets of each proposed transaction entity, including data partnerships that extend capabilities; analyze them in terms of specific market capabilities and industry-wide standards; and review how a given combination might further anti-competitive effects (especially through leveraging data assets via cloud computing and other techniques).
As markets further converge in the digital era, where, for example, data-driven marketing operations affect multiple sectors, we suggest that regulators will need to be both creative and flexible in addressing potential harms arising from cross-sectoral impacts. This point relates to Guideline 10 and “multi-sided” platforms.

Regarding Guideline 3, we urge the agencies to review how Alphabet/Google and Meta in particular, as a result of prior merger approvals, have been able to determine how the broader online marketplace operates—creating a form of “coordination” problem. The advertising and data techniques developed by the two companies have had an inordinate influence over the development of online practices generally, in essence “dictating” formats, affordances, and market structures. By allowing Alphabet and Meta to grow unchecked, antitrust regulators have allowed the dog to wag the “long tail” of the digital marketplace.

We also want to raise the issue of partnerships, since they are a very significant feature of the online market today. In addition to consolidating through acquisitions, companies have assembled a range of data and marketing partners who provide significant resources to these entities. This leveraging of the market through affiliates undermines competition (and compounds related issues involving privacy and consumer protection).

The steady stream of acquisitions in rapidly evolving markets, such as “over-the-top” streaming video, which further entrenches dominant players and creates new hurdles for potential competitors, raises the issue addressed in Guideline 8. In digitally connected markets (such as media), acquisitions that clearly further consolidation occur almost daily. Today they go unchecked, something we hope will be reversed under the proposed paradigm.
Each proposed guideline is essential, in our view, to ensure that relevant information gathering and analysis are conducted for each proposed transaction. We are at a critical period of transition for markets, as data, digital media, and technological progress (AI especially) continue to challenge traditional perspectives on dominance and competition. Broader network effects on privacy, consumer protection, and democratic institutions should also be addressed by regulators moving forward. The proposed DoJ and FTC merger guidelines will provide critical guidance for the antitrust work to come.
  • FOR IMMEDIATE RELEASE
Thursday, September 14, 2023

Contacts:
David Monahan, Fairplay, david@fairplayforkids.org
Jeff Chester, CDD, jeff@democraticmedia.org

Statement of Fairplay and the Center for Digital Democracy on FTC's Announcement: Protecting Kids From Stealth Advertising in Digital Media

BOSTON, MA, and WASHINGTON, DC—Today, the Federal Trade Commission released a new staff paper, “Protecting Kids from Stealth Advertising in Digital Media.” The paper's first recommendation states:

“Do not blur advertising. There should be a clear separation between kids' entertainment/educational content and advertising, using formatting techniques and visual and verbal cues to signal to kids that they are about to see an ad.”

This represents a major shift for the Commission. Prior guidance only encouraged marketers to disclose influencer and other stealth marketing to children. For years—including in filings last year and at last year's FTC workshop—Fairplay and the Center for Digital Democracy had argued that disclosures are inadequate for children and that stealth marketing to young people should be declared an unfair practice. Below are Fairplay's and CDD's comments on today's FTC staff report:

Josh Golin, Executive Director, Fairplay:
“Today is an important first step towards ending an exploitative practice that is all too common in digital media for children. Influencers—and the brands that deploy them—have been put on notice: do not disguise your ads for kids as entertainment or education.”

Katharina Kopp, Deputy Director, Director of Policy, Center for Digital Democracy:
“Online marketing and advertising targeted at children and teens is pervasive, sophisticated, and data-driven. Young people are regularly exposed to an integrated set of online marketing operations that are manipulative, unfair, and invasive. These commercial tactics can be especially harmful to the mental and physical health of youth.
We call on the FTC to build upon its new report to address how marketers use the latest cutting-edge marketing tactics to influence young people—including neuro-testing, immersive ad formats, and ongoing data surveillance.”

###
  • Press Release

    Advocates demand Federal Trade Commission investigate Google for continued violations of children’s privacy law

    Following news of Google’s violations of COPPA and 2019 settlement, 4 advocates ask FTC for investigation

Contact:
Josh Golin, Fairplay: josh@fairplayforkids.org
Jeff Chester, Center for Digital Democracy: jeff@democraticmedia.org

BOSTON and WASHINGTON, DC – Wednesday, August 23, 2023 – The organizations that alerted the Federal Trade Commission (FTC) to Google's violations of the Children's Online Privacy Protection Act (COPPA) are urging the Commission to investigate whether Google and YouTube are once again violating COPPA, as well as the companies' 2019 settlement agreement and the FTC Act. In a Request for Investigation filed today, Fairplay and the Center for Digital Democracy (CDD) detail new research from Adalytics, as well as Fairplay's own research, indicating that Google serves personalized ads on “made for kids” YouTube videos and tracks viewers of those videos, even though neither is permissible under COPPA. Common Sense Media and the Electronic Privacy Information Center (EPIC) joined Fairplay and CDD in calling on the Commission to investigate and sanction Google for its violations of children's privacy. The advocates suggest that the FTC should seek penalties upwards of tens of billions of dollars.

In 2018, Fairplay and the Center for Digital Democracy led a coalition asking the FTC to investigate YouTube for violating COPPA by collecting personal information from children on the platform without parental consent. As a result of the advocates' complaint, Google and YouTube were required to pay a then-record $170 million fine in a 2019 settlement with the FTC and to comply with COPPA going forward.
Rather than getting the required parental permission before collecting personally identifiable information from children on YouTube, Google claimed instead that it would comply with COPPA by limiting data collection and eliminating personalized advertising on “made for kids” videos. But an explosive new report released by Adalytics last week called into question Google's assertions and its compliance with federal privacy law. The report detailed how Google appeared to be surreptitiously using cookies and identifiers to track viewers of “made for kids” videos. The report also documented how YouTube and Google appear to be serving personalized ads on “made for kids” videos and transmitting data about viewers to data brokers and ad tech companies.

In response to the report, Google told the New York Times that ads on children's videos are based on webpage content, not targeted to user profiles. But follow-up research conducted independently by both Fairplay and ad buyers suggests that the ads are, in fact, personalized and that Google is both violating COPPA and making deceptive statements about its targeting of children. Both Fairplay and the ad buyers ran test ad campaigns on YouTube in which they selected a series of user attributes and affinities for ad targeting and instructed Google to run the ads only on “made for kids” channels. In theory, these test campaigns should have resulted in zero placements, because under Google and YouTube's stated policy, no personalized ads are supposed to run on “made for kids” videos. Yet Fairplay's targeted $10 ad campaign resulted in over 1,400 impressions on “made for kids” channels, and the ad buyers reported similar results. Additionally, the reporting Google provided to Fairplay and the ad buyers to demonstrate the efficacy of the ad buys would not be possible if the ads were contextual, as Google claims. “If Google's representations to its advertisers are accurate, it is violating COPPA,” said Josh Golin, Executive Director of Fairplay.
“The FTC must launch an immediate and comprehensive investigation and use its subpoena authority to better understand Google's black-box child-directed ad targeting. If Google and YouTube are violating COPPA and flouting their settlement agreement with the Commission, the FTC should seek the maximum fine for every single violation of COPPA and injunctive relief befitting a repeat offender.”

The advocates' letter urges the FTC to seek robust remedies for any violations, including but not limited to:

·       Civil penalties that demonstrate that continued violations of COPPA and Section 5 of the FTC Act are unacceptable. Under current law, online operators can be fined $50,120 per violation of COPPA. Given the immense popularity of many “made for kids” videos, it is likely millions of violations have occurred, suggesting the Commission should seek civil penalties upwards of tens of billions of dollars.
·       An injunction requiring relinquishment of all ill-gotten gains
·       An injunction requiring disgorgement of all algorithms trained on impermissibly collected data
·       A prohibition on the monetization of minors' data
·       An injunction requiring YouTube to move all “made for kids” videos to YouTube Kids and remove all such videos from the main YouTube platform. Given Google's repeated failures to comply with COPPA on the main YouTube platform—even when operating under a consent decree—these videos should be cabined to a platform that has not been found to violate existing privacy law
·       The appointment of an independent “special master” to oversee Google's operations involving minors and provide the Commission, Congress, and the public semi-annual compliance reports for a period of at least five years

Katharina Kopp, Deputy Director of the Center for Digital Democracy, said: “The FTC must fully investigate what we believe are Google's continuous violations of COPPA, its 2019 settlement with the FTC, and Section 5 of the FTC Act.
These violations place many millions of young viewers at risk. Google and its executives must be effectively sanctioned to stop its ‘repeat offender’ behaviors—including a ban on monetizing the personal data of minors, other financial penalties, and algorithmic disgorgement. The Commission's investigation should also review how Google enables advertisers, data brokers, and leading online publisher partners to surreptitiously surveil the online activities of young people. The FTC should set into place a series of ‘fail-safe’ safeguards to ensure that these irresponsible behaviors never happen again.”

Caitriona Fitzgerald, Deputy Director of the Electronic Privacy Information Center (EPIC), said: “Google committed in 2019 that it would stop serving personalized ads on ‘made for kids’ YouTube videos, but Adalytics’ research shows that this harmful practice is still happening. The FTC should investigate this issue, and Google should be prohibited from monetizing minors’ data.”

Jim Steyer, President and CEO of Common Sense Media, said: “The Adalytics findings are troubling but in no way surprising given YouTube’s history of violating kids’ privacy. Google denies doing anything wrong and the advertisers point to Google, a blame game that makes children the ultimate losers. The hard truth is, companies — whether it’s Big Tech or their advertisers — basically care only about their profits, and they will not take responsibility for acting against kids’ best interests. We strongly encourage the FTC to take action here to protect kids by hitting tech companies where it really hurts: their bottom line.”

###
  • In comments to the Federal Trade Commission, EPIC, the Center for Digital Democracy, and Fairplay urged the FTC to center privacy and data security risks as it evaluates Yoti Inc.'s proposed face-scanning tool for obtaining verifiable parental consent under the Children's Online Privacy Protection Act (COPPA).

In a supplementary filing, CDD urges the Federal Trade Commission (FTC) to reject the parental-consent method proposed by the applicants, the Entertainment Software Rating Board (ESRB) and Epic Games' SuperAwesome division. Prior to any decision, the FTC must first engage in due diligence and investigate the contemporary issues involving the role and use of facial coding technology and its potential impact on children's privacy. The commission must have a robust understanding of the data flows and insight generation produced by facial coding technologies, including the debate over their role as a key source of “attention” metrics, which are a core advertising measurement modality. Since this proposal is designed to deliver a significant expansion of children's data collection—given the constellation of brands, advertisers, and publishers involved with the applicants and their child-directed market focus—a digital “cautionary” principle is especially required for this consent method. Moreover, one of the applicants, as well as several key affiliates of the ESRB—Epic Games, Amazon, and Microsoft—have recently been sanctioned for violating COPPA, and any approval in the absence of thorough fact-finding would be premature.
  • New research released today by Adalytics raises serious questions about whether Google is violating the Children's Online Privacy Protection Act (COPPA) by collecting data and serving personalized ads on child-directed videos on YouTube. In 2019, in response to a Request for Investigation by Fairplay and the Center for Digital Democracy, the Federal Trade Commission fined Google $170 million for violating COPPA on YouTube and required Google to change its data-collection and advertising practices on child-directed videos. As a result of that settlement, Google agreed to stop serving personalized ads and to limit data collection on child-directed videos. Today's report, and subsequent reporting by The New York Times, calls into question whether Google is complying with the settlement.

STATEMENTS FROM FAIRPLAY AND CDD:

Josh Golin, Executive Director, Fairplay:
This report should be a wake-up call to parents, regulators and lawmakers, and anyone who cares about children—or the rule of law, for that matter. Even after being caught red-handed in 2019 violating COPPA, Google continues to exploit young children and mislead parents and regulators about its data collection and advertising practices on YouTube. The FTC must launch an immediate and comprehensive investigation of Google and, if it confirms this report's explosive allegations, seek penalties and injunctive relief commensurate with the systematic disregard of the law by a repeat offender. Young children should be able to watch age-appropriate content on the world's biggest video platform with their right to privacy guaranteed, full stop.

Jeff Chester, Executive Director, Center for Digital Democracy:
Google operates the leading online destination for kids' video programming so it can reap enormous profits, including through commercial surveillance data and advertising tactics.
It must be held accountable by the FTC for what appear to be violations of the Children's Online Privacy Protection Act and its own commitments. Leading advertisers, ad agencies, media companies, and others partnering with Google appear to have been more interested in clicks than in the safety of youth. There is a massive and systemic failure across the digital marketplace when it comes to protecting children's privacy. Congress should finally stand up to the powerful “Big Data” ad lobby and enact long-overdue privacy legislation. Google's operations must also be dealt with by antitrust regulators; it operates imperiously in the digital arena with no accountability. The Adalytics study should serve as a chilling reminder that our commercial surveillance system is running amok, placing even our most vulnerable at great risk.
  • CDD tells FTC to apply strong data privacy and security rules for health data

    Filing also focuses on role commercial surveillance marketers play targeting physicians and patients

The Center for Digital Democracy (CDD) endorses the Federal Trade Commission's (FTC) proposal to better protect consumer and patient health information in the digital era. CDD warned the commission in 2010, as well as in its 2022 commercial surveillance comments, that health data—including information regarding serious medical conditions—are routinely (and cynically) gathered and used for online marketing. This has placed Americans at risk of losing their privacy, health-decision autonomy, and personal financial security. The commercial surveillance health data digital marketing system also places major strains on the fiscal well-being of federal and private health insurance systems, creating demand for products and services that can be unnecessary and costly.

The commission should “turn off the tap” of data flooding the commercial surveillance marketplace, including both direct and inferred health information. The commission can systemically address the multiple data flows—including those on Electronic Health Record (EHR) systems—that require a series of controls. EHRs, along with personal health record systems, have served as a digital “Achilles heel” of patient privacy, with numerous commercial entities seizing on these systems to influence physicians and other prescribers as well as to gain insights used for ongoing tracking. The commercialization of health-connected data is ubiquitous, harvested from mobile apps, online accounts, loyalty programs, social media posts, data brokers, marketing clouds, and elsewhere. Given the commercial data analytics capabilities operational today, information gathered for other purposes can readily be used to generate health-related data. Health information can be combined with numerous other datasets that can reveal ethnicity, location, media use, and more to create a robust target marketing profile.
As the programmatic advertising trade publication AdExchanger recently noted, “sensitive health data can be collected or revealed through dozens of noncovered entities, from location data providers to retail media companies. And these companies aren't prevented from sharing data, unless the data was sourced from a covered entity.”

The FTC's Health Breach Notification Rule (HBNR) proposal comes at an especially crucial time for health privacy in the U.S. A recent report on “The State of Patient Privacy,” as noted by Insider Intelligence/eMarketer in July 2023, shows that a majority of Americans “distrust” the role that “Big Tech Companies” play with their health data. A majority of patients surveyed explained that “they are worried about security and privacy protections offered by vendors that handle their health data.” Ninety-five percent of the patients in the survey “expressed concern about the possibility of data breaches affecting their medical records.”

These concerns, we suggest, reflect consumer unease regarding their reliance on online media to obtain health information. For example, “half of US consumers use at least one health monitoring tool,” and “healthcare journeys often start online,” according to the “Digital Healthcare Consumer 2023” report. There is also a generational shift underway in the U.S., where at least half of young adults (so-called Generation Z) now “turn to social media platforms for health-related purposes either all the time or often…via searches, hashtags, QR codes…[and] have the highest rate of mobile health app usage.” The Covid-19 pandemic triggered greater use of health-related apps by consumers. So-called telehealth services generate additional data as well, including for online “lead generation.” The growing use of “digital pharmacies,” attributed to the rising costs of medications, is another point where consumer health data is gathered.
The FTC should ensure the health data privacy of Americans who may be especially vulnerable, such as those confronting financial constraints or pre-existing or at-risk conditions, and those who have long been subjected to predatory and discriminatory marketing practices, all of whom are especially in need of stronger protections. These protections should address the health-data operations of the growing phalanx of retail, grocery, “dollar,” and drug store chains that are expanding their commercial surveillance marketing operations (so-called “retail media”) while providing direct-to-consumer health services.

Electronic Health Record systems are a key part of the health and commercial surveillance infrastructure: EHRs have long served as “prime real estate for marketers…[via] data collection, which makes advanced targeting a built-in benefit of EHR marketing.” EHRs are used to influence doctors and other prescribers through what is euphemistically called point-of-care marketing. Marketing services for pharmaceutical and other life science companies can be “contextually integrated into the EHR workflow [delivered] to the right provider at the right time within their EHR [using] awareness messaging targeted on de-identified real-time data specific to the patient encounter.” Such applications are claimed to operate as “ONC-certified and HIPAA-compliant” (ONC is the Office of the National Coordinator for Health Information Technology at HHS). The various, largely unaccountable, methods used to target and influence how physicians treat their patients via EHRs raise numerous privacy and consumer protection issues. For example, “EHR ads can appear in several places at all the stages along the point-of-care journey,” one company explained.
Through an “E-Prescribing Screen,” pharma companies are able to offer “co-pay coupons, patient savings offers and relevant condition brand messaging.” Data used to target physicians through EHR systems, including prescription information derived from a consumer, in turn trigger the collection of more information from and about that consumer (consider the subsequent role of drug stores, search engines and social media, the gathering of data for coupons, and so on). This “non-virtuous” circle of health surveillance should be subjected to meaningful health data breach and security safeguards. Patient records on EHRs must be safeguarded, and the methods used to influence healthcare professionals require major privacy reforms.

Contemporary health data systems reflect the structures that comprise the overall commercial surveillance apparatus, including data brokers, marketing clouds, and AI. The use of digital marketing to target U.S. health consumers has long been a key “vertical” for advertisers. For example, the leading global advertising agencies run numerous health-focused subsidiaries, all of which have extensive data-gathering and targeting capabilities. These include Publicis Health: “Our proprietary data and analytics community, paired with the unsurpassed strengths of Sapient and Epsilon allow us to deliver unmatched deterministic, behavioral, and transactional data, powered by AI.” IPG Health uses “a proprietary…media, tech and data engine [to] deliver personalized omnichannel experiences across touchpoints.” Its “comprehensive data stack [is] powered by Acxiom.” Ogilvy Health recently identified some of the key social media strategies used by pharmaceutical firms to generate consumer engagement with their brands, helping to generate invaluable data.
They include, for example, a “mobile-first creative and design approach,” including the use of “stickers, reels, filters, and subtitles” on Instagram, as well as “A/B testing” on Facebook and the use of “influencers.” A broad range of consumer-data-collecting partners also operates in this market, providing information and marketing facilitation. Google, Meta, Salesforce, IQVIA, and Adobe are just a few of the companies integrated into health marketing services designed to “activate customer journeys (healthcare professionals and patients) across physical and digital channels [using] real-time, unified data.” Machine learning and AI are increasingly embedded in the health data surveillance market, helping to “transform sales and marketing outcomes,” for example. The use of social media, AI, and machine learning, including for personalization, raises concerns that consent alone is insufficient for the release of patient and consumer health information.

The commission should adopt its proposed rule, but it should also address the system-wide affordances of commercial surveillance to ensure health data is truly protected in terms of privacy and security. The commission should endorse a definition of patient health record information that reflects not only the range and type of data collected but also the processes used to gather or generate it. The prompting and inducement of physicians, for example, to prescribe specific medications or treatments to a patient, based on real-time “point-of-care” information transmitted through EHRs, ultimately generates identifiable information. Any interaction and iterative process used to do so should therefore be covered under the rule, reflecting all the elements involved in that decision-making and treatment-determination process.
By ensuring that all the entities involved in this system—including health care services or suppliers—must comply with data privacy and security rules, the commission will critically advance data protection in the health marketplace. This should include health apps, which increasingly play a key role in the commercial data-driven marketing complex. All partnering organizations involved in the sharing, delivery, creation and facilitation of health record information should also be held accountable. We applaud the FTC’s work in the health data privacy area, including its important GoodRx case and its highlighting of the role that “dark patterns” play in “manipulating or deceiving consumers.” Far too much of the U.S. health data landscape operates as just such a “dark pattern.” The commission’s proposed HBNR rules will illuminate this sector and, in the process, help secure greater privacy and protection for Americans.
  • Blog

    Profits, Privacy and the Hollywood Strike

    Addressing commercial surveillance in streaming video is key to any deal for workers and viewers says Jeff Chester, the executive director of the Center for Digital Democracy.

    Leading studios, networks and production companies in Hollywood—such as Disney, Paramount, Comcast/NBCU, Warner Bros. Discovery and Amazon—know where their dollars will come from in the future. As streaming video becomes the dominant form of TV in the U.S., the biggest players in the entertainment industry are harvesting the cornucopia of data increasingly gathered from viewers. While some studio chiefs publicly chafe at the demands from striking actors and writers as unrealistic, they know that their heavy investments in “adtech” will drive greater profitability. Streaming video data not only generates higher advertising and commerce revenues, but also serves as a valuable commodity for the precise online tracking and targeting of consumers.

Streaming video is now a key part of what the Federal Trade Commission (FTC) calls the “commercial surveillance” marketplace. Data about our viewing behaviors, including any interactions with the content, is being gathered by connected and “smart” TVs, streaming devices such as Roku, in-house studio and network data mining operations, and numerous targeting and measurement entities that now serve the industry. For example, Comcast’s NBCUniversal “One Platform” uses what it calls “NBCU ID”—a “first-party identifier [that] provides a persistent indicator of who a consumer is to us over time and across audiences.” Last year it rolled out “200 million unique person-level NBCU IDs mapped to 80 million households.” Disney’s Select advertising system uses a “proprietary Audience Graph” incorporating “100,000 attributes” to help create “1800 turnkey” targeting segments. There are 235 million device IDs available to reach, says Disney, 110 million households. It also operates a “Disney Real-time Ad Exchange” (DRAX), a data clean room, and what it calls “Yoda,” a “yield optimized delivery allocation” system empowering its ad server. Warner Bros.
Discovery recently launched “WBD Stream,” providing marketers with “seamless access…to popular and premium content.” It also announced partnerships with several data and research companies designed to help “marketers to push consumers further down the path to purchase.” One such alliance involves “605,” which helps WBD track how effective its ads are in delivering actual sales from local retailers, including through the use of set-top box data from Comcast as well as geolocation tracking information. Amazon has long supported its video streaming advertising sales, including with its “Freevee” network, through its portfolio of cutting-edge data tools. Among the ad categories targeted by Amazon’s streaming service are financial services, candy and beauty products. One advantage it touts is that streaming marketers can get help from “Amazon’s Ads data science team,” including an analysis of “signals in [the] Amazon Marketing Cloud.”

Other major players in video streaming, including Roku, Paramount, and Samsung, have also supercharged their data technologies in order to target what are called “advanced audiences.” That’s the capability to have so much information available that a programmer can pinpoint a target for personalized marketing across a vast universe of media content. While subscriptions are a critical part of video revenues, programmers want to draw from multiple revenue streams, especially advertising. To help the TV business gain access to more thorough datasets, leading TV, advertising and measurement companies have formed the “U.S. Joint Industry Committee” (JIC). Warner Bros. Discovery, Fox, NBCU, TelevisaUnivision, Paramount, and AMC are among the programmers involved with JIC. They are joined by a powerhouse composed of the largest ad agencies (data holders as well), including Omnicom, WPP and Publicis.
One outcome of this alliance will be a set of standards to measure the impact of video and other ads on consumers, including through the use of “Big Data” and cross-platform measurement. Of course, today’s video and filmed entertainment business includes more than ad-supported services. There’s subscription revenue for streaming (said to surpass $50 billion in the U.S. this year) as well as theatrical release. But it’s very evident that the U.S. (as well as the global) entertainment business is in a major transition, where identifying, tracking and targeting an individual (or groups of people) online, and as much as possible offline, has become essential. For example, Netflix is said to be exploring ways to advance its own solution for personalized ad targeting, drawing its brief deal with Microsoft Advertising to a close. Leading retailers, including Walmart (NBCU) and Kroger (Disney), are also part of today’s streaming video advertising landscape. Making the connection between what we view on the screen and what we then buy at a store is a key selling point for today’s commercial surveillance-oriented streaming video apparatus. A growing part of the revenue from streaming will be commissions from the sale of a product after someone sees an ad and buys that product, including on the screen during a program. For example, as part of its plans to expand retail sales within its programming, NBCU’s “Checkout” service “identifies objects in video and makes them interactive and shoppable.”

Another key issue for the Hollywood unions is the role of AI. With that technology already a core part of the advertising industry’s arsenal, its use will likely be integrated into video programming—something that should be addressed in the SAG-AFTRA and WGA negotiations. The unions deserve to capture a piece of the data-driven “pie” that will further drive industry profits. But there’s more at stake than a fair contract and protections for workers.
Rather than unleashing the creativity of content providers as part of an environment promoting diversity, equity and the public interest, the new system will be highly commercialized, data driven, and controlled by a handful of dominant entities. Consider the growing popularity of what are called “FAST” channels—which stands for “free ad supported streaming television.” Dozens of these channels, owned by Comcast/NBCU, Paramount, Fox, and Amazon, are now available, filled with relatively low-cost content that can reap the profits from data and ads.

The same powerful forces that helped undermine broadcasting, cable TV, and the democratic potential of what was once called the “information superhighway”—the Internet—are now at work shaping the emerging online video landscape. Advertising and marketing, already the driving influence behind the structure and affordances of digital media, are fashioning video streaming into another—and critically important—component of surveillance marketing.

The FTC’s forthcoming proposed rulemaking on commercial surveillance must address the role of streaming video. And the FCC should open its own proceeding on streaming, one designed to bring structural changes to the industry in terms of ownership of content and distribution. There’s also a role for antitrust regulators in examining the data partnerships emerging from the growing collaboration among networks and studios to pool data resources. The fight for a fairer deal for writers and actors deserves the backing of regulators and the public. But a successful outcome for the strike should be just “Act One” of a comprehensive digital media reform effort. While the transformation of the U.S. TV system is significantly underway, it’s not too late to try to program “democracy” into its foundation.
Jeff Chester is the executive director of the Center for Digital Democracy, a DC-based NGO that works to ensure that digital technologies serve and strengthen democratic values and institutions. Its work on streaming video is supported, in part, by the Rose Foundation for Communities and the Environment. This op-ed was initially published by Tech Policy Press.
    Jeff Chester
  • CFPB Data Broker Filing – U.S. Public Interest Research Group (PIRG) and Center for Digital Democracy (CDD)

    In response to the Request for Information Regarding Data Brokers and Other Business Practices Involving the Collection and Sale of Consumer Information, Docket No. CFPB-2023-0020.