CDD

Newsroom

  • Government Needs to Step up its Efforts to Provide Meaningful and Effective Regulation
Under intensifying pressure from Congress and the public, top social media platforms popular with young people – Instagram, Snapchat, TikTok, Twitch, and YouTube – have launched dozens of new safety features for children and teens in the last year, according to a report from the Center for Digital Democracy (CDD). Researchers at CDD conducted an analysis of tech industry strategies to head off regulation in the wake of the 2021 Facebook whistleblower revelations and the rising tide of public criticism, Congressional hearings, and pressures from abroad. These companies have introduced a spate of new tools, default navigation systems, and AI software aimed at increasing safeguards against child sexual abuse material, problematic content, and disinformation, the report found. But tech platforms have been careful not to allow any new safety systems to interfere significantly with advertising practices and business models that target the lucrative youth demographic. As a consequence, while industry spokespersons tout their concerns for children, "their efforts to establish safeguards are, at best, fragmented and conflicted," the report concludes. "Most of the operations inside these social media companies remain hidden from public view, leaving many questions about how the various safety protocols and teen-friendly policies actually function."
More attention should also be paid to advertisers, the report suggests, which have become a much more powerful and influential force in the tech industry in recent years. Researchers offer a detailed description of the industry's "brand safety" system – an "expanding infrastructure of specialized companies, technological tools, software systems, and global consortia that now operate at the heart of the digital economy, creating a highly sophisticated surveillance system that can determine instantaneously which content can be monetized and which cannot." This system, which was set up to protect advertisers from having their ads associated with problematic content, could do much more to ensure better protections for children.
"The most effective way to ensure greater accountability and more meaningful transparency by the tech industry," the authors argue, "is through stronger public policies." Pointing out that protection of children online remains a strong bipartisan issue, researchers identify a number of current legislative vehicles and regulatory proceedings – including bills that are likely to be reintroduced in the next Congress – which could provide more comprehensive protections for young people and rein in some of the immense power of the tech industry. "Tech policies in the U.S. have traditionally followed a narrow, piecemeal approach to addressing children's needs in the online environment," the authors note, "providing limited safeguards for only the youngest children, and failing to take into account the holistic nature of young people's engagement with the digital media environment." What is needed is a more integrated approach that protects privacy for both children and teens, along with safeguards that cover advertising, commercial surveillance, and child safety.
Finally, the report calls for a strategic campaign that brings together the diverse constituencies working on behalf of youth in online media.
“Because the impacts of digital technologies on children are so widespread, efforts should also be made to broaden the coalition of organizations that have traditionally fought for children’s interests in the digital media to include groups representing the environment, civil rights, health, education, and other key stakeholder communities.”
    Jeff Chester
  • Commercial Surveillance Expands via the "Big" Screen in the Home
Televisions now view and analyze us—the programs we watch, what shows we click on to consider or save, and the content reflected on the "glass" of our screens. On "smart" or connected TVs, streaming TV applications have been engineered to fully deliver the forces of commercial surveillance. Operating stealthily inside digital television sets and streaming video devices is an array of sophisticated "adtech" software. These technologies enable programmers, advertisers and even TV set manufacturers to build profiles used to generate data-driven, tailored ads to specific individuals or households. These developments raise important questions for those concerned about the transparency and regulation of political advertising in the United States.
Also known as "OTT" ("over-the-top," since the video signal is delivered without relying on traditional set-top cable TV boxes), the streaming TV industry incorporates the same online advertising techniques employed by other digital marketers. This includes harvesting a cornucopia of information on viewers through alliances with leading data brokers. More than 80 percent of Americans now use some form of streaming or smart TV-connected video service. Given such penetration, it is no surprise that streaming TV advertising is playing an important role in the upcoming midterm elections. And streaming TV will be an especially critical channel for campaigns to vie for voters in 2024.
Unlike political advertising on broadcast television or much of cable TV, which is generally transmitted broadly to a defined geographic market area, "addressable" streaming video ads appear in programs advertisers know you actually watch (using technologies such as dynamic ad insertion). Messaging for these ads can also be fine-tuned as a campaign progresses, to make the message more relevant to the intended viewer. For example, if you watch a political ad and then sign up to receive campaign literature, the next TV commercial from a candidate or PAC can be crafted to reflect that action. Or, if your data profile says you are concerned about the costs of healthcare, you may see a different pitch than your next-door neighbor who has other interests. Given the abundance of data available on households, including demographic details such as race and ethnicity, there will also be finely tuned pitches aimed at distinct subcultures, produced in multiple languages.
An estimated $1.4 billion will be spent on streaming political ads for the midterms (part of an overall $9 billion in ad expenditures). With more people "cutting the cord" by signing up for cheaper, ad-supported streaming services, advances in TV technologies that enable personalized data-driven ad targeting, and the integration of streaming TV as a key component of the overall online marketing apparatus, it is evident that the TV business has changed. Even what's considered traditional broadcasting has been transformed by digital ad technologies. That's why it's time to enact policy safeguards to ensure integrity, fairness, transparency and privacy for political advertising on streaming TV. Today, streaming TV political ads already combine information from voter records with online and offline consumer profile data in order to generate highly targeted messages.
By harvesting information related to a person's race and ethnicity, finances, health concerns, behavior, geolocation, and overall digital media use, marketers can deliver ads tied to our needs and interests. In light of this unprecedented marketing power and precision, new regulations are needed to protect consumer privacy and civic discourse alike. In addition to ensuring voter privacy, so personal data can't be used as readily as it is today, the messaging and construction of streaming political ads must also be accountable. Merely requiring the disclosure of who is buying these ads is insufficient. The U.S. should enact a set of rules to ensure that the tens of thousands of one-to-one streaming TV ads don't promote misleading or false claims, or engage in voter suppression and other forms of manipulation. Journalists and campaign watchdogs must have the ability to review and analyze ads, and political campaigns need to identify how they were constructed—including the information provided by data brokers and how a potential voter's viewing behaviors were analyzed (such as with increasingly sophisticated machine learning and artificial intelligence algorithms). For example, data companies such as Acxiom, Experian, Ninth Decimal, Catalina and LiveRamp help fuel the digital video advertising surveillance apparatus.
Campaign-spending reform advocates should be concerned. Making targeted streaming TV advertising as effective as possible will likely require serious amounts of money—for the data, analytics, marketing and distribution. Increasingly, key gatekeepers control much of the streaming TV landscape, and purchasing rights to target the most "desirable" people could face obstacles. For example, smart TV makers – such as LG, Roku, Vizio and Samsung – have developed their own exclusive streaming advertising marketplaces. Their smart TVs use what's called ACR—"automated content recognition"—to collect data that enables them to analyze what appears on our screens "second by second." An "exclusive partnership to bring premium OTT inventory to political clients" was recently announced by LG and cable giant Altice's ad division. This partnership will enable political campaigns that qualify to access 30 million households via smart TVs, as well as the ability to reach millions of other screens in households known to Altice.
Connected TVs also provide online marketers with what is increasingly viewed as essential for contemporary digital advertising—access to a person's actual identity information (called "first-party" data). Streaming TV companies hope to gain permission to use subscriber information in many other ways. This practice illustrates why the Federal Trade Commission's (FTC) current initiative designed to regulate commercial surveillance, now in its initial stage, is so important. Many of the critical issues involving streaming political advertising could be addressed through strong rules on privacy and online consumer protection. For example, there is absolutely no reason why any marketer should be able to obtain so easily all the information used to target us, such as our ethnicity, income, purchase history, and education—to name only a few of the variables available for sale. Nor should the FTC allow online marketers to engage in unfair and largely stealth tactics when creating digital ads—including the use of neuroscience to test messages to ensure they respond directly to our subconscious.
The Federal Communications Commission (FCC), which has largely failed to address 21st-century video issues, should conduct its own inquiry "in the public interest." There is also a role here for the states, reflecting their laws on campaign advertising as well as ensuring the privacy of streaming TV viewers.
This is precisely the time for policies on streaming video, as the industry becomes much more reliant on advertising and data collection. Dozens of new ad-supported streaming TV networks are emerging—known as FAST channels (Free Ad-Supported TV)—which offer a slate of scheduled shows with commercials. Netflix and Disney+, as well as Amazon, have adopted or will soon adopt ad-supported viewing. There are also coordinated industry-wide efforts, involving advertisers, programmers and device companies, to perfect ways to more efficiently target and track streaming viewers. Without regulation, the U.S. streaming TV system will be a "rerun" of what we historically experienced with cable TV—dashed expectations of a medium that could have been truly diverse, instead of a monopoly, and that could have offered both programmers and viewers greater opportunities for creative expression and public service. Only those with the economic means will be able to afford to "opt out" of the advertising and some of the data surveillance on streaming networks. And political campaigns will be allowed to reach individual voters without worrying about privacy and the honesty of their messaging. Both the FTC and FCC, and Congress if it can muster the will, have an opportunity to make streaming TV a well-regulated, important channel for democracy. Now is the time for policymakers to tune in.
*** This essay was originally published by Tech Policy Press. Support for the Center for Digital Democracy's review of the streaming video market is provided by the Rose Foundation for Communities and the Environment.
    Jeff Chester
  • Discussion by Jeff Chester at the Global Alcohol Policy Alliance
Alcohol marketers are now big data companies. They are also commercial surveillance marketing enterprises, which is how data-driven digital marketing is increasingly described by regulators and critics. Like many other global industries, alcohol marketing uses an ever-expanding set of diverse and sophisticated online and offline techniques designed to identify and deeply influence its target audiences. Alcoholic beverage companies have broadly adopted the business model and tactics perfected by Google, Meta/Facebook, and Amazon. This includes "omnichannel" marketing operations that identify a single person and follow them across their various devices, such as gaming, mobile, and streaming.
The alcoholic beverage industry engages in cutting-edge digital marketing campaigns throughout the world. However, the use of contemporary marketing techniques for alcoholic beverages enables us to use various regulatory and other legal tools to protect public health and the public at large. That includes pursuing various privacy complaints with state, national or regional data protection regulators (as well as class actions where possible); developing related complaints for consumer protection regulators on the kinds of unfair advertising practices that embody digital marketing, such as the use of neuromarketing to influence subconscious and emotional processes; the reliance on "immersive" ad applications involving virtual and augmented reality (such as the metaverse), whose effects also impact non-rational processes; the role of influencers used to penetrate youth culture to promote the brand; and, on the data practices themselves, the widespread adoption of machine learning and artificial intelligence systems to generate predictive and personalized marketing plans on individuals, groups and communities. Another critical aspect of data marketing, as we know, is the gathering and use of a host of data on people—their race, ethnicity, income, health concerns, geolocation, etc.—that, when assembled in today's real-time online marketing machine, are used to reach us with a highly informed assessment of who we are and what we do. In addition to regulation and judicial recourse, there are also the public shaming aspects that can be generated through the news media and other informational campaigns.
I will summarize several of the troubling practices of the alcohol marketing industry today that could form the basis for potential regulatory interventions.
The use of Big Data operations: As leading advertisers, alcoholic beverage companies already hold a vast—and growing—array of data on their customers and targets. For example, AB InBev relies on [quote] 1000 different data sources and has more than 70.1 million unique customer records [unquote]. Its data sources include information gathered through mobile devices, social media, and e-commerce, among others. AB InBev has invested in the latest technologies to consolidate, manage and make actionable this information, including Data Management Platforms (DMPs—which integrate and analyze diverse data points) that help identify and target an individual. Through state-of-the-art online campaigns, companies like AB InBev collect huge amounts of key data.
For example, the company created a platform in Colombia not long ago—[quote] "a central online store where customers could share their location and place their order which was then sent via WhatsApp to their local grocer to be fulfilled…it digitized every (convenience) store, in every corner, in every block, in every neighborhood and connected them" [unquote] to its online store.
Pervasive surveillance on social media used for insight generation: Alcohol companies deploy abundant "social listening" strategies that use sentiment mining, AI-driven computer vision and other tools to understand what is being said, by whom and where, about the brand or topics that can be better leveraged for marketing; for example, to help pinpoint who are the most influential or useful voices to reach out to. Much of this work is conducted 24/7, with real-time capabilities to take advantage of what is identified.
E-commerce: Online is increasingly an environment that seamlessly merges content, sales, marketing, and payment. Alcoholic beverage companies are taking advantage of the powerful data-driven promotion engines that operate these online sales channels, to make sure you see their product, place it in the shopping cart, and buy it. Leading grocery and retail companies have also established their own highly developed online marketing operations that work with alcoholic beverage and other brands to showcase them on their e-commerce and online marketing sites—another source of privacy concern, as data sets merge.
The use of neuroscience and other emotional technologies: These are used to identify how to trigger non-rational responses to marketing, including measuring the emotional intensity of an ad as well as assessing how well a person's memory encodes that message. Alcohol companies (and many others) hook subjects up to EEGs and other similar technologies to map their brainwave responses to ads and content. Then an ad or message is honed and deployed. These tools are also used "in flight" [during a running ad campaign] to correct errors and fine-tune their impact.
Repositioning themselves as providers of economic opportunity and social good: A recent trend by alcohol marketers is to position themselves as generating economic opportunity for small businesses, as a strategy to deepen their connections for data. For example, in Brazil last year during Carnival, one alcoholic beverage company used emails, push notifications, text messaging, an app, an e-commerce platform, personalized QR codes and social media to support nearly 11,000 street vendors working out of their homes, who ended up selling 200,000 of the brand's products. It established a critical digital link between the vendors, the alcohol brand, and its customers.
Providers of technology: This is especially true with branded alcoholic beverage company mobile apps, which are a key source of data gathering, monitoring of consumer behaviors (including geolocation), and enrollment in loyalty programs, and which become an immediate influence and marketing channel. These apps are also used for sales and payments, creating another highly valuable data source.
Penetrating further into the community: Mobile and other digital marketing technology enables highly targeted, geo-aware campaigns.
For example, in South Africa one brand—as part of a wider social media effort—used what's known as DOOH (digital out-of-home advertising), giving away software while encouraging its targets to [quote] create a personalized shout out to someone special and then select a digital billboard at a specific location for their message to be displayed on [unquote].
Finally, creating impressive online experiences, such as music events, to connect with youth: In China, Jagermeister, which knew it was losing its youth demographic, created [quote] "two days' worth of performance lineups and subculture experiences" [unquote] with livestreaming music and other ways to engage and interact with its young audience. This event claimed to reach 200 million impressions. There are many more examples of such experiential virtual campaigns by alcoholic beverage companies.
Policy Options: This is an optimum time to seek safeguards regarding the marketing of alcoholic beverages, both to protect underage consumers and to address overall public health concerns about adult consumption. Concern over the loss of privacy and autonomy, as well as its impact on youth development and health, is fueling greater interest among policymakers in regulating digital marketing. For example, here in the U.S. we have a new proposed rulemaking on surveillance marketing by the Federal Trade Commission, which offers multiple opportunities for the public health community to call for safeguards. In the EU, there are the GDPR, the Digital Services Act and other consumer legislation at the national and EU level that can be considered. The UK's privacy commissioner has begun to enforce its new "Design Code" that governs how the online industry interacts with children and adolescents. There are data protection commissioners in many countries, as well as varying laws, that should be assessed. To advance these opportunities, public health advocates will likely find support from the global community of public interest privacy and consumer protection NGOs and scholars, who could be enlisted to identify potential remedies and develop the appropriate regulatory complaints. The WHO, of course, is at the forefront of documenting many of the practices we've discussed, including its recent work on digital marketing of unhealthy foods and beverages, breast milk substitutes, and alcohol.
As these reports show, and as this conference reflects, the significant advances by these producers and marketers into the digital sphere, which now operates as such a key force in our lives, should be challenged. Limits and expectations for this industry should be set, along with ongoing research into the effects of such marketing as well as analysis of its marketing operations. With timely action, we might be able to set a healthier course for the role that alcoholic beverages can play in our societies. Thank you.
    Jeff Chester
  • A coalition of more than 100 organizations is sending two letters to Congress urging action. A letter addressed to Senate Majority Leader Chuck Schumer and Minority Leader Mitch McConnell, from 145 organizations, urges them to advance KOSA and COPPA to full Senate votes. A letter addressed to House Energy and Commerce Chair Frank Pallone and Ranking Member Cathy McMorris Rodgers, from 158 organizations, urges them to introduce a House companion bill to KOSA. The advocates state in the letter to the Senate: “The enormity of the youth mental health crisis needs to be addressed as the very real harms of social media are impacting our children today. Taken together, the Kids Online Safety Act and the Children and Teens’ Online Privacy Protection Act would prevent online platforms from exploiting young users’ developmental vulnerabilities and targeting them in unfair and harmful ways.” kosa_coppa_senate_leadership_letter_final_9.12.22-1.pdf, eandc_leadership_kosa_letter_final_9.12.22-1.pdf, kosa_coppa_rally_press_release_embargo_to_9_13.pdf
  • Press Statement regarding today's FTC Notice of Proposed Rulemaking Regarding Commercial Surveillance and Data Security
Katharina Kopp, Deputy Director, Center for Digital Democracy:
Today, the Federal Trade Commission issued its long-overdue advance notice of proposed rulemaking (ANPRM) regarding a trade regulation rule on commercial surveillance and data security. The ANPRM aims to address the prevalent and increasingly unavoidable harms of commercial surveillance. Civil society groups, including civil rights groups, privacy and digital rights advocates, and children's advocates, had previously called on the commission to initiate this trade regulation rule to address the decades-long failure of the commission to rein in predatory corporate practices online. CDD has called on the commission repeatedly over the last two decades to address the out-of-control surveillance advertising apparatus that is the root cause of increasingly unfair, manipulative, and discriminatory practices harming children, teens, and adults, and which has a particularly negative impact on equal opportunity and equity.
The Center for Digital Democracy welcomes this important initial step by the commission and looks forward to working with the FTC. CDD urges the commission to move forward expeditiously with the rulemaking and to ensure fair participation of stakeholders, particularly those who are disproportionately harmed by commercial surveillance.
press_statement_8-11fin.pdf
  • CDD Comments to FTC for "Stealth" Marketing Inquiry
The Center for Digital Democracy (CDD) urges the FTC to develop and implement a set of policies designed to protect minors under 18 from being subjected to a host of pervasive, sophisticated and data-driven digital marketing practices. Children and teens are targeted by an integrated set of online marketing operations that are manipulative, unfair, and invasive, and that can be especially harmful to their mental and physical health. The commission should make abundantly clear at the forthcoming October workshop that it understands that the many problems generated by contemporary digital marketing to youth transcend narrow categories such as "stealth advertising" and "blurred content." Nor should it propose "disclosures" as a serious remedy, given the ways advertising is designed using data science, biometrics, social relationships and other tactics. Much of today's commercially supported online system is purposefully developed to operate as "stealth"—from product development, to deployment, to targeting, tracking and measurement. Age-based cognitive development capacities to deal with advertising, largely based on pre-digital (especially TV) research, simply don't correspond to the methods used today to market to young people. CDD calls on the commission to acknowledge that children and teenagers have been swept into a far-reaching commercial surveillance apparatus.
The commission should propose a range of safeguards to protect young people from the current "wild west" of omnichannel marketing directed at them. These safeguards should address, for example, the role that market research and testing of child- and teen-directed commercial applications and messaging play in the development of advertising; how neuromarketing practices designed to leverage a young person's emotions and subconscious are used to deliver "implicit persuasion"; the integration by marketers and platforms of "immersive" applications, including augmented and virtual reality, designed to imprint brand and other commercial messages; the array of influencer-based strategies, including the extensive infrastructure used by platforms and advertisers to deliver, track and measure their impact; the integration of online marketing with Internet of Things objects, including product packaging and the role of QR codes (experiential marketing) and digital out-of-home advertising screens; as well as contemporary data marketing operations that use machine learning and artificial intelligence to open up new ways for advertisers to reach young people online. AI services increasingly deliver personalized content online, further automating the advertising process to respond in real time.
It is also long overdue for the FTC to investigate and address how online marketing targets youth of color, who are subjected to a variety of advertising practices little examined by privacy and other regulators. The FTC should use all its authority and power to stop data-driven surveillance marketing to young people under 18; end the role sponsored influencers play; enact rules designed to protect the online privacy of teens 13-17, who are now subjected to ongoing tracking by marketers; and propose policies to redress the core methods employed by digital advertisers and online platforms to lure both children and teens. For more than 20 years, CDD and its allies have urged the FTC to address the ways digital marketing has undermined consumer protection and privacy, especially for children and adolescents.
Since the earliest years of the commercial internet, online marketers have focused on young people, both for the revenues they deliver and to secure loyalty from what the commercial marketing industry referred to as "native" users. The threat to their privacy, as well as to their security and well-being, led to the complaint our predecessor organization filed in 1996, which spurred the passage of the Children's Online Privacy Protection Act (COPPA) in 1998. COPPA has played a modest role protecting some younger children from experiencing the totality of the commercial surveillance marketing system. However, persistent failures of the commission to enforce COPPA; the lack of protections for adolescents (despite decades-long calls by advocates for the agency to act on this issue); and a risk-averse approach to addressing the methods employed by the digital advertising industry, even when applied to young people, have created ongoing threats to their privacy, consumer protection and public health. In this regard, we urge the commission to closely review the comments submitted in this proceeding by our colleague Fairplay and allies. We are pleased Fairplay supports these comments.
If the FTC is to confront how the forces of commercial digital surveillance impact the general public, the building blocks to help do so can be found in this proceeding. Young people are exposed to the same unaccountable forces that are everywhere online: a largely invisible, ubiquitous, and machine-intelligence-driven system that tracks and assesses our every move, using an array of direct and indirect techniques to influence behaviors. If done correctly, this proceeding can help inform a larger policy blueprint for what policy safeguards are needed—for young people and for everyone else.
The commission should start by reviewing how digital marketing and data-gathering advertising applications are "baked in" at the earliest stages of online content and device development. These design and testing practices have a direct impact on young people. Interactive advertising standards groups assess and certify a host of approved ad formats, including for gaming, mobile, native advertising, and streaming video. Data practices for digital advertising, including the ways that ads are delivered through behavioral/programmatic surveillance engines, as well as their measurement, are developed through collaborative work involving trade organizations and leading companies. Platforms such as Meta, as well as ad agencies, adtech companies, and brands, also have their own variations of these widely adopted formats and approaches. The industry-operated standards process for identifying new methods for digital advertising, including the real-world deployment of applications such as "playable" ads or the ways advertisers can change their personalized messaging in real time, has never been seriously investigated by the commission. A review of the companies involved shows that many are engaged in digital marketing to young people.
Another critical building block of contemporary digital marketing to address when dealing with youth-directed advertising is the role of "engagement." As far back as 2006, the Interactive Advertising Bureau (IAB) recognized that to effectively secure the involvement of individuals with marketing communications, at both the subconscious and conscious levels, it was necessary to define and measure the concept of engagement.
IAB initially defined "Engagement… [as] turning on a prospect to a brand idea enhanced by the surrounding context." By 2012, there were more elaborate definitions identifying "three major forms of engagement… cognitive, physical and emotional." A set of corresponding metrics, or measurement tools, were used, including those tracking "attention" ("awareness, interest, intention"); emotional and motor functioning identified through biometrics ("heart palpitations, pupil dilation, eye tracking"); and omnipresent tracking of online behaviors ("viewability and dwell time, user initiated interaction, clicks, conversions, video play rate, game play"). Today, research and corresponding implementation strategies for engagement are an ongoing feature of the surveillance-marketing economy. This includes conducting research and implementing data-driven and other ad strategies targeting children—known as "Generation Alpha" (children 11 and younger)—and teens ("Generation Z").
We will briefly highlight some crucial areas this proceeding should address:
Marketing and product research on children and adolescents: An extensive system designed to ensure that commercial online content, including advertising and marketing, effectively solicits the interest and participation of young people is a core feature of the surveillance economy. A host of companies are engaged in multi-dimensional market research, including panels, labs, platforms, streaming media companies, studios and networks, that have a direct impact on the methods used to advertise and market to youth. CDD believes that such product testing, which can rely on a range of measures designed to promote "implicit persuasion," should be considered an unfair practice generally. Since CDD and U.S. PIRG first urged the commission to investigate neuromarketing more than a decade ago, this practice has expanded in ways that enable it to play a greater role in influencing how content and advertising are delivered to young people.
For example, MediaScience (which began as the Disney Media and Advertising Lab) serves major clients including Disney, Google, Warner Media, TikTok, Paramount, Fox and Mars. It conducts research for platforms and brands using such tools as neurometrics (skin conductivity and heart rate), eye tracking, facial coding, and EEGs, among others, that assess a person's responses across devices. Research is also conducted outside of the lab setting, such as directly through a subject's "actual Facebook feed." It has a panel of 80,000 households in the U.S., where it can deliver digital testing applications using a "variety of experimental designs… facilitated in the comfort of people's homes." The company operates a "Kids" and "Teens" media research panel. Emblematic of the far-reaching research conducted by platforms, agencies and brands, in 2021 TikTok's "Marketing Science team" commissioned MediaScience to use neuromarketing research to test "strong brand recall and positive sentiment across various view durations." The findings indicated that "ads on TikTok see strong brand recall regardless of view duration…. Regardless of how long an ad stays on screen, TikTok draws early attention and physiological engagement in the first few seconds."
NBCUniversal is one of the companies leveraging the growing field of "emotional analytics" to help advance advertising for streaming and other video outlets.
Comcast's NBCU is using "facial coding and eye-tracking AI to learn an audience's emotional response to a specific ad." Candy company Mars just won a "Best Use of Artificial Intelligence" award for its "Agile Creative Expertise" (ACE) tool, which "tracks attentional and emotional response to digital video ads." Mars is partnering with neuromarketer Realeyes to "measure how audience's attention levels respond as they view Mars' ads. Knowing what captures and retains attention or even what causes distraction generated intelligence that enabled Mars to optimize the creative itself or the selection of the best performing ads across platforms including TikTok, Facebook, Instagram and YouTube." TikTok, Meta/Facebook, and Google have all used a variety of neuromarketing measures. The Neuromarketing Science and Business Association (NMSBA) includes many of the leading companies in this field as members. There is also an "Attention Council" within the digital marketing industry to help advance these practices, involving Microsoft, Mars, Coca-Cola, AB InBev, and others.
A commercial research infrastructure provides a steady drumbeat of insights so that marketers can better target young people on digital devices. Children's streaming video company WildBrain, for example, partnered with Ipsos for its 2021 research report, "The Streaming Generation," which explained that "Generation Alpha [is] the most influential digital generation yet…. They have never known a world without digital devices at their fingertips, and for Generation Alpha (Gen A), these tech-first habits are now a defining aspect of their daily lives." More than 2,000 U.S. parents and guardians of children 2-12 were interviewed for the study, which found that "digital advertising to Gen A influences the purchasing decisions of their parents…. Their purchasing choices, for everything from toys to the family car, are heavily influenced by the content kids are watching and the ads they see." The report explains that among the "most popular requests" are toys, digital games, clothing, tech products and "in-game currencies" for Roblox and Fortnite.
Determining the levels of "brand love" held by children and teens, such as through the use of "Kidfinity" and "Teenfinity" scores—"proprietary measures of brand awareness, popularity and love"—is a service regularly provided to advertisers. Other market researchers, such as Beano Studios, offer a "COPPA-compliant" "Beano Brain Omnibus" website that, through "games, quizzes, and bespoke questions" for children and teens, "allows brands to access answers to their burning questions." These tools help marketers better identify, for example, the sites—such as TikTok—where young people spend time. Among the other services Beano provides, which reflect many other market-research companies' capabilities, are "Real-time UX/UI and content testing—in the moment, digital experience exploration and evaluation of brands websites and apps with kids and teens in strawman, beta or live stages," and "Beano at home—observing and speaking to kids in their own homes. Learning how and what content they watch."
Adtech and other data marketing applications: In order to conduct any "stealth" advertising inquiry, the FTC should review the operations of contemporary "Big Data"-driven ad systems that can impact young people. For example, Disney has an extensive and cutting-edge programmatic apparatus called DRAX (Disney Real-Time Ad Exchange) that is delivering thousands of video-based campaigns.
DRAX supports "Disney Select," a "suite of ad tech solutions, providing access to an extensive library of first-party segments that span the Disney portfolio, including streaming, entertainment and sports properties…. Continuously refined and enhanced based on the countless ways Disney connects with consumers daily. Millions of data inputs validated through data science…. Advertisers can reach their intended audiences by tapping into Disney's proprietary Audience Graph, which unifies Disney's first party data and audience modeling capabilities…." As of March 2022, Disney Select contained more than 1,800 "audience segments built from more than 100,000 audience attributes that fuel Disney's audience graph." According to Disney Advertising, its "Audience Graph" includes 100 million households, 160 million connected TV devices and 190 million device IDs, which enables modeling to target households and families. Children and teens are a core audience for Disney, and millions of their households receive its digital advertising. Many other leading youth-directed brands have developed extensive internal adtech applications designed to deliver ongoing and personalized campaigns. For example, Pepsi, Coca-Cola, McDonald's, and Mondelez have in-house capabilities and extensive partnerships that create targeted marketing to youth and others. The ways that "Big Data" analytics affect marketing, especially how insights can be used to target youth, should be reviewed. Marketers will tell the FTC that they are only targeting those 18 and over, but an examination of their actual targets, and a request for the child-related brand-safety data they collect, should provide the agency with a robust response to such claims.
New methods to leverage a person's informational details and then target them, especially without "cookies," require the FTC to address how they are being used to market to children and teens. This review should also be extended to "contextual" advertising, since that method has been transformed through the use of machine learning and other advanced tactics—called "Contextual 2.0."
Targeting youth of color: Black, Hispanic, Asian-American and other "multicultural" youth, as the ad industry has termed them, are key targets for digital advertising. An array of research, techniques, and services is focused on these young people, whose behaviors online are closely monitored by advertisers. A recent case study to consider is the McDonald's U.S. advertising campaign designed to reverse its "decline with multicultural youth." The goal of its campaign involving musician Travis Scott was to "drive penetration by bringing younger, multicultural customers to the brands… and drive immediate behavior too." As a case study explains, "To attract multicultural youth, a brand… must have cultural cachet. Traditional marketing doesn't work with them. They don't watch cable TV; they live online and on social media, and if you are not present there you're out of sight, out of mind."
It's extremely valuable to identify some of the elements involved in this case, which are emblematic of the integrated set of marketing and advertising practices that accompany so many campaigns aimed at young people.
These included working with a celebrity/influencer who is able to "galvanize youth and activate pop culture"; offering "coveted content—keepsakes and experiences to fuel the star's fanbase, driving participation and sales"; employing digital strategies through a proprietary (and data-collecting) "app to bring fans something extra and drive digital adoption"; and focusing on "affordability"—to ensure "youth with smaller wallets" would participate. To illustrate how expenditures for paid advertising are much less relevant with digital marketing, McDonald's explains that "Before a single dollar had been spent on paid media, purely on the strength of a few social posts by McDonald's and Travis Scott, and reporting in the press, youth were turning up at restaurants across the country, asking for the Travis Scott meal." This campaign was a significant financial success for McDonald's. Its partnership with this influencer was effective as well in terms of "cultural response: hundreds of thousands of social media mentions and posts, fan-art and memes, unboxing videos of the meal…, fans selling food and stolen POS posters on eBay…, the multi merch drops that sold out in seconds, the framed receipts." Online ads targeted to America's diverse communities of young people, who may also be members of groups at risk (due to finances, health, and the like), have long required an FTC investigation. The commission should examine the data-privacy and marketing practices on these sites, including those that communicate via languages other than English.
Video and Video Games: Each of these applications has developed an array of targeted advertising strategies to reach young people. Streaming video is now a part of the integrated surveillance-marketing system, creating a pivotal new place to reach young people, as well as to generate data for further targeting. Children and teens are viewing video content on smart TVs, other streaming devices, mobile phones, and tablets, as well as computers. Data on households where young people reside, amplified through the use of a growing number of "identity" tools that permit cross-device tracking, enables an array of marketing practices to flourish. The commission should review the data-gathering, ad-formatting, and other business practices that have been identified for these "OTT" services and how they impact children and teens. There are industry-approved ad-format guidelines for digital video and connected TV. Digital video ads can use "dynamic overlays," "shoppable and actionable video," "voice-integrated video ads," "sequential CTV creative," and "creative extensions," for example. Such ad formats and preferred practices are generally not vetted in terms of how they impact the interests of young people.
Advertisers have strategically embedded themselves within the video game system, recognizing that it's a key vantage point from which to surveil and entice young people. One leading quick-service restaurant chain that used video games to "reach the next generation of fast-food fans" explained that "gaming has become the primary source of entertainment for the younger generation. Whether playing video games or watching others play games on social platforms, the gaming industry has become bigger than the sports and music industries combined. And lockdowns during the global pandemic accelerated the trend.
Gaming is a vital part of youth culture." Illustrating that marketers understand that traditional paid advertising strategies aren't the most effective way to reach young people, the fast-food company decided to "approach gaming less like an advertising channel and more like an earned social and PR platform…. [V]ideo games are designed as social experiences." As Insider Intelligence/eMarketer reported in June 2022, "there's an ad format for every brand" in gaming today, including interstitial ads, rewarded ads, offerwalls, programmatic in-game ads, product placement, advergames, and "loot boxes." There is also an "in-game advertising measurement" framework, recently released for public comment by the IAB and the Media Rating Council. This is another example where leading advertisers, including Google, Microsoft, PepsiCo and Publicis, are determining how "ads that appear within gameplay" operate. These guidelines will impact youth, as they will help determine the operations of such ad formats as "Dynamic In-Game Advertising (DIGA)—Appear inside a 3D game environment, on virtual objects such as billboards, posters, etc. and combine the customization of web banners where ads rotate throughout the play session"; and "Hardcoded In-Game Ad Objects: Ads that have not been served by an ad server and can include custom 3D objects or static banners. These ads are planned and integrated into a video game during its design and development stage." Leading advertising platforms such as Amazon sell video ads reaching both streaming TV and gaming audiences as a package. The role of gaming and streaming should be a major focus in October, as well as in any commission follow-up report.
Influencers: What was once largely celebrity-based or word-of-mouth-style endorsement has evolved into a complex system including nano-influencers (between 1,000 and 10,000 followers); micro-influencers (between 10,000 and 100,000); macro-influencers (between 100,000 and a million); and mega or celebrity influencers (1 million-plus followers). According to a recent report in the Journal of Advertising Research, "75 percent of marketers are now including social-media influencers in their marketing plans, with a worldwide market size of $2.3 billion in 2020." Influencer marketing is also connected to social media marketing generally, where advertisers and others have long relied on a host of surveillance-related systems to "listen," analyze and respond to people's social online communications.
Today, a generation of "content creators" (aka influencers) is lured into becoming part of the integrated digital sales force that sells to young people and others. From "unboxing videos" and "virtual product placement" in popular content, to "kidfluencers" like Ryan's World and "brand ambassadors" lurking in video games, to favorite TikTok creators pushing fast food, this form of digital "payola" is endemic online.
Take Ryan's World. Leveraging "more than one billion views" on YouTube, as well as a Nickelodeon show, has "catapulted him... to a global multi-category force," notes his production and licensing firm. The deals include a "preschool product line" in multiple categories, "best in class" partnerships, and a "Tag with Ryan" app that garnered 16 million downloads.
Brands seeking help selling products, says Ryan's media agency, "can connect with its kid fanbase of millions that leverages our world-class portfolio of kid-star partners to authentically and seamlessly connect your brand with Generation Alpha across YouTube, social media, mobile games, and OTT channels—everywhere kids tune in!... a Generation Alpha focused agency that delivers more than 8 BILLION views and 100 MILLION unique viewers every month!" (its emphasis). Also available is a "custom content and integrations" feature that can "create unique brand experiences with top-tier kid stars." Ryan's success is not unique, as more and more marketers create platforms and content, as well as merge companies, to deliver ads and marketing to children and teens. An array of influencer marketing platforms offering "one-stop" shopping for brands to employ influencers, including through the use of programmatic-marketing-like data practices (to hire people to place endorsements, for example), is a core feature of the influencer economy. There are also software programs so brands and marketers can automate their social influencer operations, as well as social media "dashboards" that help track and analyze social online conversations, brand mentions and other communications. The impact of influencers is being measured through a variety of services, including neuromarketing. Influencers are playing a key role in "social commerce," where they promote the real-time sales of products and services on "shoppable media." U.S. social commerce sales are predicted to grow to almost $80 billion in 2025 from an estimated total of $45.74 billion in 2022. Google, Meta, TikTok, Amazon/Twitch and Snapchat all have significant influencer marketing operations. As Meta/Facebook recently documented, there is also a growing role for "virtual" influencers that are unleashed to promote products and services. While there may be claims that many promotions and endorsements should be classified as "user generated content" (UGC), we believe the commission will find that the myriad influencer marketing techniques often play a role in spurring such product promotion.
The "Metaverse": The same forces of digital marketing that have shaped today's online experience for young people are already at work organizing the structure of the "metaverse." There are virtual brand placements, advertisements, and industry initiatives on ad formats and marketing experiences. Building on work done for gaming and esports, this rapidly emerging marketing environment poses additional threats to young people and requires timely commission intervention.
Global Standards: Young people in the U.S. have fewer protections than they do in other countries and regions, including the European Union and the United Kingdom. In the EU, for example, protections are required for young people until they are 18 years of age. The impact of the GDPR, the UK's Design Code, the forthcoming Digital Services Act (and even some self-regulatory EU initiatives by companies such as Google) should be assessed. In what ways do U.S.-based platforms and companies provide higher or more thorough safeguards for children when they are required to do so outside of this country? The FTC has a unique role to ensure that U.S. companies operating online are at the forefront—not the rear—of protecting the privacy and interests of children.
The October Workshop: Our review of the youth marketing landscape is just a partial snapshot of the marketplace.
We have not discussed "apps" and mobile devices, which pose many concerns, including those related to location, for example. But CDD hopes this comment will help inform the commission about the operations of contemporary marketing and its relationship to young people. We call on the FTC to ensure that this October we are presented with an informed and candid discussion of the nature and impact of today's marketing system on America's youth.
ftcyouthmarketing071822.pdf
    Jeff Chester
  • Considering Privacy Legislation in the context of contemporary digital data marketing practices
Last week, the leading global advertisers, online platforms and data marketers gathered for the most important awards given by the ad industry—the "Cannes Lions." Reviewing the winners and the "shortlist" of runners-up—competing in categories such as "Creative Data," "Social and Influencer," "Brand Experience & Activation," "Creative Commerce" and "Mobile"—is essential to learn where the data-driven marketing business—and ultimately much of our digital experience—is headed. An analysis of the entries reveals a growing role for machine learning and artificial intelligence in the creation of online marketing, along with geolocation tracking, immersive content and other "engagement" technologies. One takeaway, not surprisingly, is that the online ad industry continues to perfect techniques to secure our interest in its content so it can gather more data from us.
A U.S.-based company that also generated news during Cannes was The Trade Desk, a relatively unknown data marketing service that is playing a major role assisting advertisers and content providers in overcoming any new privacy challenges posed by emerging or future legislation. The Trade Desk announced last week a further integration of its data and ad-targeting service with Amazon's AWS cloud division, as well as a key role assisting grocer Albertsons' new digital ad division. The Trade Desk has brokered a series of alliances and partnerships with Walmart, the Washington Post, the Los Angeles Times, Gannett, NBCUniversal, and Disney—to name only a few.
There are several reasons these marketers and content publishing companies are aligning themselves with The Trade Desk. One of the most important is the company's leadership in developing a method to collect and monetize a person's identity for ongoing online marketing. "Unified ID 2.0" is touted as a privacy-focused method that enables surveillance and effective ad targeting. The marketing industry refers to these identity approaches as "currencies" that enable the buying and selling of individuals for advertising. There are now dozens of identity "graph" or "identity spine" services, in addition to Unified ID 2.0, which reflect far-reaching partnerships among data brokers, publishers, adtech specialists, advertisers and marketing agencies. Many of these approaches are interoperable, such as the one involving Acxiom spin-off LiveRamp and The Trade Desk. A key goal, when you listen to what these identity brokers say, is to establish a universal identifier for each of us, to directly capture our attention, reap our data, and monetize our behavior.
For the last several years, as a result of the enactment of the GDPR in the EU, the passage of privacy legislation in California, and the potential of federal privacy legislation, Google, Apple, Firefox and others have made changes or announced plans related to their online data practices. So-called "third-party cookies," which have long enabled commercial surveillance, are being abandoned—especially since their role has repeatedly raised concerns from data-protection regulators. Taking their place are what the surveillance marketing business believes are privacy-regulation-proof strategies. There are basically two major, but related, efforts that have been underway—here in the U.S.
and globally.
The first tactic is for a platform or online publisher to secure the use of our information through an affirmative consent process—called a "first-party" data relationship in the industry. The reasoning goes that an individual wants an ongoing interaction with the site—for news, videos, groceries, drugs and other services, etc. Under this rationale, we are said to understand and approve how platforms and publishers will use our information as part of the value exchange. First-party data is becoming the most valuable asset in the global digital marketing business, enabling ongoing collection, generating insights, and helping maintain the surveillance model. It is considered to have few privacy problems. All the major platforms that raise so many troubling issues—including Google, Amazon, Meta/Facebook—operate through extensive first-party data relationships. It's informative to see how the lead digital marketing trade group—the Interactive Advertising Bureau (IAB)—explains it: "first party data is your data… presents the least privacy concerns because you have full control over its collection, ownership and use."
The second tactic is a variation on the first, but also relies on various forms of identity-resolution strategies. It's a response in part to the challenges posed by the dominance of the "walled garden" digital behemoths (Google, etc.) as well as the need to overcome the impact of privacy regulation. These identity services are the replacement for cookies. Some form of first-party data is captured (and streaming video services are seen as a gold mine here to secure consent), along with additional information, using machine learning to crunch data from public sources and other "signals." Multimillion-member panels of consumers who provide ongoing feedback to marketers, including information about their online behaviors, also help better determine how to effectively fashion the digital targeting elements. The Trade Desk-led Unified ID 2.0 is one such identity framework. Another is TransUnion's "Fabrick," which "provides marketers with a sustainable, privacy-first foundation for all their data management, marketing and measurement needs." Such rhetoric is typical of how the adtech/data broker/digital marketing sectors are trying to reframe how they conduct surveillance.
Another related development, as part of the restructuring of the commercial surveillance economy, is the role of "data clean rooms." Clean rooms enable data to be processed under specific rules set up by a marketer. As Advertising Age recently explained, clean rooms enable first-party and other marketers to provide "access to their troves of data." For Comcast's NBCU division and Disney, this treasure chest of information comes from "set-top boxes, streaming platforms, theme parks and movie studios." Various privacy rules are supposed to be applied; in some cases where they have consent, two or more parties will exchange their first-party data. In other cases, where they may not have such open permission, they will be able to "create really interesting ad products; whether it's a certain audience slice, or audience taxonomy, or different types of ad units…." As an NBCU executive explained about its clean room activity, "we match the data, we build custom audiences…we plan, activate and we measure.
The clean room is now the safe neutral sandbox where all the parties can feel good sharing first party data without concerns of data leakage.”We currently have at least one major privacy bill in Congress that includes important protections for civil rights and restricts data targeting of children and teens, among other key provisions. It’s also important when examining these proposals to see how effective they will be in dealing with the surveillance marketing industry’s current tactics. If they don’t effectively curtail what is continuous and profound surveillance and manipulation by the major digital marketers, and also fail to rein in the power of the most dominant platforms, will such a federal privacy promise really deliver? We owe it to the public to determine whether such bills will really “clean up” the surveillance system at the core of our online lives.
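To make the identity and clean-room mechanics described above more concrete, here is a minimal illustrative sketch in Python. It is not The Trade Desk's, LiveRamp's, or any vendor's actual specification; the hashing scheme, field names, and segment label are all hypothetical, chosen only to show how a hashed email can function as a cross-site "currency" and how two first-party datasets can be matched in a clean-room-style environment.

```python
# Illustrative sketch only: hypothetical logic for an email-based ad identifier
# and a "clean room" style match. Not any vendor's actual specification.
import hashlib


def pseudonymous_id(email: str) -> str:
    """Derive a stable, pseudonymous identifier from a normalized email address."""
    normalized = email.strip().lower()
    return hashlib.sha256(normalized.encode("utf-8")).hexdigest()


# Each party hashes its own first-party records before any matching takes place.
publisher_records = {pseudonymous_id(e): {"watched_sports": True}
                     for e in ["fan@example.com", "viewer@example.org"]}
advertiser_records = {pseudonymous_id(e): {"bought_cleats": True}
                      for e in ["fan@example.com", "shopper@example.net"]}


def clean_room_match(publisher: dict, advertiser: dict) -> dict:
    """Join the two datasets on the shared identifier. In practice only
    aggregates or audience segments would leave the environment."""
    overlap = publisher.keys() & advertiser.keys()
    return {
        "matched_users": len(overlap),
        "segment": "sports_fans_who_bought_cleats" if overlap else None,
    }


print(clean_room_match(publisher_records, advertiser_records))
```

The point of the sketch is that the "privacy-first" framing rests on replacing the raw email with a hash: the identifier is pseudonymous, but it is still persistent and shared, which is exactly what makes it useful for cross-site tracking and monetization.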
    Jeff Chester
  • Groups say FIFA: Ultimate Team preys on children’s vulnerability with loot boxes, “funny money” Contact: David Monahan, Fairplay, david@fairplayforkids.org; Jeff Chester, CDD, jeff@democraticmedia.org, 202-494-7100. Advocates call on FTC to investigate manipulative design abuses in popular FIFA game. Groups say FIFA: Ultimate Team preys on children’s vulnerability with loot boxes, “funny money.” BOSTON and WASHINGTON, DC – Thursday, June 2, 2022 – Today, advocacy groups Fairplay and the Center for Digital Democracy (CDD) led a coalition of 15 advocacy groups in calling on the Federal Trade Commission (FTC) to investigate video game company Electronic Arts (EA) for unfairly exploiting young users in EA’s massively popular game, FIFA: Ultimate Team. In a letter sent to the FTC, the advocates described how the use of loot boxes and virtual currency in FIFA: Ultimate Team exploits the many children who play the game, especially given their undeveloped financial literacy skills and poor understanding of the odds of receiving the most desirable loot box items. Citing the Norwegian Consumer Council’s recent report, Insert Coin: How the Gaming Industry Exploits Consumers Using Lootboxes, the advocates’ letter details how FIFA: Ultimate Team encourages gamers to engage in a constant stream of microtransactions as they play the game. Users are able to buy FIFA points, a virtual in-game currency, which can then be used to purchase loot boxes called FIFA packs, containing mystery team kits, badges, and player cards for soccer players who can be added to a gamer’s team. In their letter, the advocates noted the game’s use of manipulative design abuses, such as “lightning round” sales of premium packs, to which children are particularly vulnerable. The advocates also cite the use of virtual currency in the game, which obscures the actual cost of FIFA packs even to adult users, let alone children. Additionally, the actual probability of unlocking the best loot box prizes in FIFA: Ultimate Team is practically inscrutable to anyone who is not an expert in statistics, according to the advocates and the NCC report. In order to unlock a specific desirable player in the game, users would have to pay around $14,000 or spend three years continuously playing the game. “By relentlessly marketing pay-to-win loot boxes, EA is exploiting children’s desire to compete with their friends, despite the fact that most adults, let alone kids, could not determine their odds of receiving a highly coveted card or what cards cost in real money. The FTC must use its power to investigate these design abuses and determine just how many kids and teens are being fleeced by EA.” Josh Golin, Executive Director, Fairplay. “Lootboxes, virtual currencies, and other gaming features are often designed deceptively, aiming to exploit players’ known vulnerabilities. Due to their unique developmental needs, children and teens are particularly harmed. Their time and attention are stolen from them, they are financially exploited, and they are purposely socialized to adopt gambling-like behaviors. Online gaming is a key online space where children and teens gather in the millions, and regulators must act to protect them from these harmful practices.” Katharina Kopp, Deputy Director, Center for Digital Democracy. “As illustrated in our report, FIFA: Ultimate Team uses aggressive in-game marketing and exploits gamers’ cognitive biases, adults and children alike, to manipulate them into spending large sums of money. 
Children especially are vulnerable to EA’s distortion of the real-world value of its loot boxes and the complex, misleading probabilities given to describe the odds of receiving top prizes. We join our US partners in urging the Federal Trade Commission to investigate these troubling practices.” Finn Lützow-Holm Myrstad, Digital Policy Director, Norwegian Consumer Council. "The greed of these video game companies is a key reason why we're seeing a new epidemic of child gambling in our families. Thanks to this report, the FTC has more than enough facts to take decisive action to protect our kids from these predatory business practices." Les Bernal, National Director of Stop Predatory Gambling and the Campaign for Gambling-Free Kids. “Exploiting consumers, especially children, by manipulating them into buying loot boxes that, in reality, rarely contain the coveted items they are seeking, is a deceptive marketing practice that causes real harm and needs to stop. TINA.org strongly urges the FTC to take action.” Laura Smith, Legal Director at TINA.org. Advocacy groups signing today's FTC complaint include Fairplay; the Center for Digital Democracy; Campaign for Accountability; Children and Screens: Institute of Digital Media and Child Development; Common Sense Media; Consumer Federation of America; Electronic Privacy Information Center (EPIC); Florida Council on Compulsive Gambling, Inc.; Massachusetts Council on Gaming and Health; National Council on Problem Gambling; Parent Coalition for Student Privacy; Public Citizen; Stop Predatory Gambling and the Campaign for Gambling-Free Kids; TINA.org (Truth in Advertising, Inc.); and U.S. PIRG. ### lootboxletter_pr.pdf, lootboxletterfull.pdf
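The letter's point about opaque odds and "funny money" can be made concrete with a small worked example. The numbers below are entirely hypothetical (EA does not publish pack odds in this form, and point prices vary); the sketch only shows how a low per-pack probability combined with a virtual-currency exchange rate hides a large real-money cost, of the kind the NCC report estimated at roughly $14,000 for a specific top player.

```python
# Hypothetical figures, for illustration only; actual FIFA pack odds and
# FIFA-point prices are set by EA and are not disclosed in this form.
PACK_PRICE_POINTS = 1_000        # FIFA points per premium pack (assumed)
POINTS_PER_DOLLAR = 100          # points received per dollar spent (assumed)
P_TOP_PLAYER_PER_PACK = 0.002    # chance a pack contains the coveted card (assumed)

expected_packs = 1 / P_TOP_PLAYER_PER_PACK          # expected packs to open
expected_points = expected_packs * PACK_PRICE_POINTS
expected_dollars = expected_points / POINTS_PER_DOLLAR

print(f"Expected packs to open: {expected_packs:.0f}")
print(f"Expected real-money cost: ${expected_dollars:,.0f}")
# With these assumed odds, the expected cost is roughly $5,000 -- a figure that
# is invisible when prices are displayed only in "FIFA points."
```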
  • Press Statement regarding today’s FTC Policy Statement on Education Technology and the Children’s Online Privacy Protection ActJeff Chester, Executive Director, Center for Digital Democracy:Today, the Federal Trade Commission adopts a long overdue policy designed to protect children’s privacy. By shielding school children from the pervasive forces of commercial surveillance, which gathers their data for ads and marketing, the FTC is expressly using a critical safeguard from the bipartisan Children’s Online Privacy Protection Act (COPPA). Fairplay, Center for Digital Democracy, and a coalition of privacy, children’s health, civil and consumer rights groups had previously called on the commission to enact policies that make this very Edtech safeguard possible.   We look forward to working with the FTC to ensure that parents can be confident that their child’s online privacy and security is protected in—or out of-the classroom.  However, the Commission must also ensure that adolescents receive protections from what is now an omniscient and manipulative data-driven complex that profoundly threatens their privacy and well-being.
    boy in red hoodie wearing black headphones by Compare Fibre
  • 60 leading advocacy organizations say unregulated Big Tech business model is “fundamentally at odds with children’s wellbeing” Contact: David Monahan, Fairplay, david@fairplayforkids.org; Jeff Chester, Center for Digital Democracy, jeff@democraticmedia.org, 202-494-7100. Diverse coalition of advocates urges Congress to pass legislation to protect kids and teens online. 60 leading advocacy organizations say unregulated Big Tech business model is “fundamentally at odds with children’s wellbeing.” BOSTON, MA and WASHINGTON, DC – March 22, 2022 – Congressional leaders in the House and Senate were urged today to enact much-needed protections for children and teens online. In a letter to Senate Majority Leader Chuck Schumer, Senate Minority Leader Mitch McConnell, House Speaker Nancy Pelosi and House Minority Leader Kevin McCarthy, a broad coalition of health, safety, privacy and education groups said it was time to ensure that Big Tech can no longer undermine the wellbeing of America’s youth. The letter reiterated President Biden’s State of the Union address call for increased online protections for young people. In their letter, the advocates outlined how the prevailing business model of Big Tech creates a number of serious risks facing young people on the internet today, including mental health struggles, loss of privacy, manipulation, predation, and cyberbullying. The advocates underscored the dangers posed by rampant data collection on popular platforms, including algorithmic discrimination and the targeting of children at particularly vulnerable moments. The reforms called for by the advocates include: protections for children and teens wherever they are online, not just on “child-directed” sites; privacy protections for all minors; a ban on targeted advertising to young people; prohibition of algorithmic discrimination against children and teens; establishment of a duty of care that requires digital service providers to make the best interests of children a primary design consideration and to prevent and mitigate harms to minors; requiring platforms to turn on the most protective settings for minors by default; and greater resources for enforcement by the Federal Trade Commission. United by the desire to see Big Tech’s harmful business model regulated, the advocates’ letter represents a landmark moment for the movement to increase privacy protections for children and teenagers online, especially due to the wide-ranging fields and focus areas represented by signatories. Among the 60 signatories to the advocates’ letter are: Fairplay, Center for Digital Democracy, Accountable Tech, American Academy of Pediatrics, American Association of Child and Adolescent Psychiatry, American Psychological Association, Center for Humane Technology, Common Sense, Darkness to Light, ECPAT-USA, Electronic Privacy Information Center (EPIC), National Alliance to Advance Adolescent Health, National Center on Sexual Exploitation, National Eating Disorders Association, Network for Public Education, ParentsTogether, Public Citizen, Society for Adolescent Health and Medicine, and Exposure Labs, creators of The Social Dilemma. Signatories on the need for legislation to protect young people online: “Congress last passed legislation to protect children online 24 years ago – nearly a decade before the most popular social media platforms even existed. Big Tech's unregulated business model has led to a race to the bottom to collect data and maximize profits, no matter the harm to young people. 
We agree with the president that the time is now to update COPPA, expand privacy protections to teens, and put an end to the design abuses that manipulate young people into spending too much time online and expose them to harmful content.” – Josh Golin, Executive Director, Fairplay.“It’s long past time for Congress to put a check on Big Tech’s pervasive manipulation of young people’s attention and exploitation of their personal data. We applaud President Biden’s call to ban surveillance advertising targeting young people and are heartened by the momentum to rein in Big Tech and establish critical safeguards for minors engaging with their products.” – Nicole Gill, Co-Founder and Executive Director, Accountable Tech.“Digital technology plays an outsized role in the lives of today’s children and adolescents, exacerbated by the dramatic changes to daily life experienced during the pandemic. Pediatricians see the impact of these platforms on our patients and recognize the growing alarm about the role of digital platforms, in particular social media, in contributing to the youth mental health crisis. It has become clear that, from infancy through the teen years, children’s well-being is an afterthought in developing digital technologies. Strengthening privacy, design, and safety protections for children and adolescents online is one of many needed steps to create healthier environments that are more supportive of their mental health and well-being.”– Moira Szilagyi, MD, PhD, FAAP, President, American Academy of Pediatrics.“Children and teens are at the epicenter of a pervasive data-driven marketing system that takes advantage of their inherent developmental vulnerabilities. We agree with President Biden: now is the time for Congress to act and enact safeguards that protect children and teens.  It’s also long overdue for Congress to enact comprehensive legislation that protects parents and other adults from unfair, manipulative, discriminatory and privacy invasive commercial surveillance practices.”  – Katharina Kopp, Ph.D. Policy Director, Center for Digital Democracy."President Biden's powerful State of the Union plea to Congress to hold social media platforms accountable for the ‘national experiment’ they're conducting on our kids and teens could not be more important. It is clear that young people are being harmed by these platforms that continue to prioritize profits over the wellbeing of its youngest users. Children and teens' mental health is at stake. Congress and the Administration must act now to pass legislation to protect children’s and teens' privacy and well-being online." – Jim Steyer, Founder and CEO, Common Sense.“Online protections for children are woefully outdated and it's clear tech companies are more interested in profiting off of vulnerable children than taking steps to prevent them from getting hurt on their platforms. American kids are facing a mental health crisis partly fueled by social media and parents are unable to go it alone against these billion dollar companies. We need Congress to update COPPA, end predatory data collection on children, and regulate design practices that are contributing to social media addiction, mental health disorders, and even death.”– Justin Ruben, Co-Founder and Co-Director, ParentsTogether."A business model built on extracting our attention at the cost of our well being is bad for everyone, but especially bad for children. 
No one knows this better than young people themselves, many of whom write to us daily about the ways in which Big Social is degrading their mental health. Left unregulated, Big Social will put profits over people every time. It's time to put our kids first. We urge Congress to act swiftly and enact reforms like strengthening privacy, banning surveillance advertising, and ending algorithmic discrimination for kids so we can begin to build a digital world that supports, rather than demotes child wellbeing." – Julia Hoppock, Partnerships Director, The Social Dilemma, Exposure Labs.# # #press_release_letter_to_congress_updated_embargo_to_3_22.pdf, letter_to_congress_re_children_online_3_22_22.pdf
  • Deal reflects Big Tech move to grab more data for omnipresent tracking & targeting Microsoft is rapidly expanding its surveillance advertising complex—first acquiring AT&T’s powerful Xandr targeting system last December, and adding a few weeks later the online gaming and eSports giant Activision Blizzard. The combination of Microsoft, AT&T and Activision assets raises a set of concerns regarding competition in the gaming and eSports marketplaces; privacy/surveillance protections, given the pervasive data gathering on users; and consumer protection, such as the methods that Microsoft and Activision (and other gaming services) implement to monetize players (including youth) through in-stream advertising and other marketing efforts. It also has implications for the ways we protect privacy in streaming media as well as in the evolving “metaverse.” The FTC must review this proposed deal, with the agency’s privacy and consumer-protection roles at the fore. This proposed Microsoft/Activision combination is emblematic of the ongoing transformation of how Big Tech companies track and target people across all their devices and applications. In order to continue its surveillance-advertising-based model, the online industry is undergoing a massive shift in tactics. It is pivoting to what’s called a “First-Party” data use strategy, claiming that it is obtaining our permission to continue to follow us online and deliver personalized ads and marketing. Getting our consent is the Big Tech plan to undermine any privacy legislation in the U.S. and elsewhere. For example, if this merger goes through, users of Activision games will likely be asked to consent to data collection and tracking on all of Microsoft’s services—such as Bing and LinkedIn. Given that Microsoft and Activision have already built relationships with Google and Meta/Facebook into their ad services, this acquisition also illustrates how the deals among these digital giants are intertwined. Owning Xandr will bring a host of additional surveillance advertising resources to Microsoft’s already robust consumer-profiling and marketing infrastructure (including information contributed by AT&T’s own data practices). As explained in the data marketing newsletter AdExchanger, the Xandr and Activision acquisitions, if approved, will enable Microsoft to leverage its already “strong first-party data set and monetize inventory across its wide portfolio of platforms, including its video game business, LinkedIn, Bing, Edge, Office 365, Skype and more.” Microsoft was already working with AT&T’s Xandr surveillance ad targeting apparatus, including for its gaming division. For example, Xandr explains that it enables marketers to “Access real users in immersive and engaging environments” through its current ability to target people via the Microsoft Advertising Exchange. Microsoft’s data targeting currently involves its “Microsoft Search Network,” which “sees 14.6 billion monthly searches globally across nearly 700 million users.” Its Audience Network engages in an array of targeting tactics, including leveraging a person’s identity, location, and use of LinkedIn or other sites, along with a variety of “custom” approaches. Advertisers are able to “target audiences…across more than 1 billion Windows devices.” Microsoft also offers its “Dynamics 365 Customer Insights” data platform to help marketers package their own data for use on its ad platform. 
Activision engages in an array of ad practices that raise concerns about unfairness and privacy, from in-stream ads to “rewarded videos” to product placement. As it explains, “Activision Blizzard Media connects brands and players with fan-first integrated advertising experiences across gaming and esports…. We create user-initiated in-game advertising experiences that allow brands to reward 245M+ players at key moments of gameplay to drive reach, frequency and engagement…. In-game User-initiated video ads allow brands to reward players at key moments of gameplay.”In this context, the FTC needs to review all the third-party tracking YouTube-related companies serving ads to Activision Blizzard Esports, including Google Campaign Manager 360, Flashtalking, Adform, Innovid and Extreme Reach. For example, Flashtalking explains that it helps gaming services “drive customer lifetime value…, understand who bought your games, how they interact with your brand, and which touch points drove engagement.” Innovid helps marketers create “accurate, persistent identity across devices.” For measurement, which is also a privacy issue long overlooked by previous FTC commissions, we have Activision partners that include Oracle’s MOAT, Kantor, Google Campaign Manager, and others. Everything from potato chips, candy and toilet paper is pushed via its gaming services. Activision uses neuromarketing and other research-related online ad industry tactics to figure out how best to deliver marketing to its users (including teens)—all of which have privacy and consumer protection implications. For decades, the Federal Trade Commission has approved Big Tech mergers without examining their impact on consumer protection and privacy (and also on competition—think of all the Google and Facebook takeovers the commission has okayed). This is unacceptable. Gaming is a hugely important market, with a set of data-gathering tactics that impact both consumers and competition. We expect this FTC to do much better than what we have witnessed for the past several decades. 
    Jeff Chester
  • Microsoft's further expansion into gaming, data gathering, digital marketing must trigger close scrutiny, inc. impacts on gamers, youth Microsoft’s proposed purchase of Activision-Blizzard raises serious red flags, Public Citizen, the Center for Digital Democracy, the Repair Association, the Communications Workers of America, and 11 additional groups said today in a letter to the Federal Trade Commission (FTC). The merger could give Microsoft an unfair amount of market power, threaten data privacy and security, limit consumers’ and independent business’ right to repair game consoles, and lead to union busting and wage suppression, the groups said.“If the FTC clears this merger, Microsoft will become the third largest gaming company in the world,” the letter reads. “The proposed merger fits an alarming pattern of concentration in the gaming industry over the past several years. Microsoft’s expanding role in the gaming market may result in the company using its leverage to raise subscription prices and limit options, among other possible consumer harms.”In January, Microsoft announced its deal to buy game publisher and developer Activision-Blizzard, subject to FTC approval. Activision is a titan of the gaming world, boasting 400 million monthly active users and incredibly popular titles like Call of Duty. Microsoft already is a major player in gaming as a hardware producer, platform provider, and game distributor. Combining the two companies could lessen competition in a market that’s seen a rash of consolidation in recent years.Workers at Activision have mobilized over the past year to shine a light on an abusive workplace culture. Now, as these workers seek to form a union to address their collective interests, the potential takeover by Microsoft threatens to further undermine workers’ rights and suppress wages.Microsoft’s move also has negative implications for data privacy and surveillance advertising. Adding Activision’s roster of game titles opens opportunities for advanced data collection, including the use of AI, influencers, neuromarketing, and other practices now used for its gaming operations.Additionally, the merger could strengthen Microsoft’s power to impinge on consumers’ right to repair their own video game equipment or to have it repaired by a service provider of their choice. Microsoft’s Xbox platform is already notoriously difficult to independently repair.
    Jeff Chester
  • The European Union’s efforts to regulate digital markets, specifically through its Digital Markets Act (DMA), would make those markets fairer and more open, and would benefit consumers. See PDF of full letter below. 20220203-letter-digital_markets_act-president-joe-r-biden.pdf
  • Documents 25 years of failures by agency to rein in practices that have eviscerated privacy and consumer protection in the U.S. and globallyThe Center for Digital Democracy (CDD) urges the Federal Trade Commission to develop a comprehensive set of rules to address a problem largely of its own making—the unfettered growth of commercial “surveillance marketing.” We submit this comment based on the nearly 25-year record of CDD and its key consumer-protection and privacy colleagues, providing detailed documentation and analysis of the need for the commission to regulate what is known as behavioral, programmatic and surveillance-based advertising.[1]The systemic and multiple failures of the FTC over the decades to respond meaningfully to the role and nature of online marketing—which has eviscerated the privacy rights of Americans (and consumers worldwide)—have enabled data-driven surveillance to thrive ubiquitously. Nearly every platform, application, device and experience in which Americans engage has been shaped by the commercial spying and manipulation apparatus that the commission has allowed to evolve and expand without constraint. In addition, by long ignoring the impact that the approval of countless mergers and acquisitions involving leading digital marketing companies had on commercial surveillance operations, the commission and Department of Justice have helped foster an online marketplace that is dominated by a few giants. There is no real competition in terms of how Americans are treated in the online surveillance marketing economy. Google, Meta/Facebook, Amazon and their partners set the global standards for how everyone else has to conduct data and digital marketing operations. FTC inaction on commercial surveillance practices has perversely promoted the widespread adoption of these practices. Today, nearly every major company is a big-data-driven information broker, surveillance advertiser, and real-time targeter of consumers.[2]At each critical moment—the expansion of behavioral advertising; the emergence of mobile marketing; the widespread adoption of programmatic, real-time, algorithmic-driven buying and selling of people for targeting ads; the deployment of omnichannel (cross-device) tracking and targeting; and the widespread integration of artificial intelligence and machine learning to deliver enhanced predictive targeting—the failure of the FTC to challenge the online data-driven model sent a message to the commercial surveillance industry, that it faced no serious regulatory or political consequences for its actions. This included regulatory immunity for the host of manipulative elements that are within the foundation of commercial surveillance, such as the deep analysis of a person’s emotions, interests, relationships, location, income, race, and ethnicity. By acting as an “enabler” to the forces that have shaped our online platform and experiences, the commission has done more than harm consumer protection, privacy and competition. It is also responsible for allowing the online platform marketplace to grow in ways that have undermined democracy, at once diminishing civic discourse, enabling efforts to promote voter suppression, and facilitating the communication and spread of hate speech and uncivil acts, among other major harms. 
FTC—“Eyes Wide Shut”: The commission has been engaged in risk-averse behaviors since the early 1980s, as a reaction to the successful attacks on it by the advertising lobby, which was able to convince Congress that the agency had engaged in regulatory excess when it tried to protect children from the harmful impact of marketing. The legacy of what is known as the “kidvid” episode, which resulted in a significant loss of its rulemaking authority, has permeated the agency’s operations for decades. It unleashed a “don’t ask, don’t tell” approach at the commission when it came to seriously confronting the impact of the digital marketplace on the public. Even when it came to children’s privacy—the one area where advocates had successfully convinced Congress to give the agency rulemaking authority—the agency repeatedly failed to enforce the law (allowing Google’s YouTube, for example, to openly violate COPPA for years, despite hearing repeatedly from advocates that it was doing so). The failure of the FTC to seriously implement its only congressionally mandated data-privacy law also sent a loud message to the data surveillance business that the commission wasn’t to be taken seriously.The commission has never condemned the online surveillance model developed by the digital marketing industry. It had countless opportunities to challenge behavioral ads, mobile and geo-location surveillance, social media profiling, and real-time buying and selling of individual profiles for the purposes of micro-targeted advertising.[3] In this comment, we will briefly highlight how the FTC has been an “enabler” of the unfettered operations of surveillance advertising, despite the many calls by CDD and its allies for the agency to act. The FTC and the Information Superhighway: At the earliest stages of what was then called the “Information Superhighway,” cyberspace, or the “National Information Infrastructure,” the FTC convened multiple “workshops” focused on privacy and related ecommerce issues. Reflecting the priorities of the then-Clinton administration, the commission spent several years imploring marketers to implement a set of “fair information practice” principles.[4]During this time, a number of consumer and privacy advocates urged the commission to call on Congress to regulate online data marketing practices. It was evident even then—especially to those who had tracked the online marketing business during the “dial-up” era, that privacy was not a priority at all for the marketing industry. The self-regulatory system was a total sham.[5]The lone exception to self-regulation was a data privacy law covering children 12 and under, an issue that this NGO’s predecessor group, the Center for Media Education (along with the Consumer Federation of America and the Institute for Public Representation, Georgetown University Law Center) championed—which led to the enactment of the Children’s Online Privacy Protection Act in 1998.[6]By the time the commission finally recommended (in 2000) that Congress enact privacy legislation “to supplement self-regulatory efforts and guarantee basic consumer protections,” the political winds had changed. There would be no further progress from Congress on privacy, given the clout of the big data marketing lobby.[7]Enabling behavioral advertising: In 2006, CDD and U.S. PIRG filed a complaint with the commission calling on it to use its Section 5 power to protect consumer privacy online. 
Specifically, we asked the commission to conduct an investigation of online advertising practices, focusing on five areas of concern: User Tracking/Web Analytics, Behavioral Targeting, Audience Segmentation, Data Gathering/Mining, and Industry Consolidation. As we explained in our petition, “Collectively, these five areas represented the foundations of an entirely new online environment, one in which engagement gives way to entrapment, in which personalization impinges on privacy.” The complaint discussed in detail all the methods used to track and target consumers, via their personal data, mobile phone use, and much more. It also urged action on the growing consolidation of what is now called the platform or ad-tech industry, explaining thatThe past few years have witnessed an alarming degree of consolidation in the Web analysis, advertising, and Internet data collection industries. The result of these transactions is not only the concentration of power in fewer hands, but also an increased ability, as our complaint has shown, for these companies to use their massive compilations of user data to violate consumer privacy in the U.S. Such consolidation within the core of the online marketing infrastructure also requires the FTC to conduct an anti-trust analysis to determine whether there is undue market power in this sector.[8]Ignoring the structure and consequences of behavioral advertising: In part due to the opposition to the proposed Google acquisition of DoubleClick that CDD, EPIC, U.S. PIRG and its allies in the U.S. and in the EU generated (discussed below), the commission convened several workshops, town halls and other forums focused on privacy and online data marketing. As CDD’s executive director, at the FTC’s 2007 “Ehavioral Advertising: Tracking, Targeting & Technology” event, warned, I just want to underscore that the future of online advertising has profound consequences for the future of our democracy and democracies everywhere. The kind of society we are creating right now for ourselves, and particularly our children, in many ways, is being shaped by the forces of advertising and marketing.... [W]e’ve watched since 2000 the ever-growing sophisticated array of techniques that had been deployed to track our every move, not just on individual websites, but through the development of new approaches called re-targeting where we were becoming digitally shadowed wherever we went, site to site…. [T]he time for fact-finding is over. The Commission is the designated Federal agency which is supposed to safeguard consumer privacy. It must act now to protect Americans from the unfair and deceptive practices that have evolved as part of what the industry calls the digital interactive marketing system.[9]Endorsing the monopolistic “Surveillance Marketplace”: In 2007, EPIC, U.S. PIRG, and CDD filed a complaint opposing plans by Google to acquire DoubleClick. As our initial filing explained, the acquisition, if approved, “will give one company access to more information about the Internet activities of consumers than any other company in the world.” In a supplemental petition, we explained that “the massive quantity of user information collected by Google coupled with DoubleClick’s business model of consumer profiling will enable the merged company to construct extremely intimate portraits of its users’ behavior.” We also identified a major conflict of interest at the commission regarding this deal. 
Needless to say, it was approved anyway, paving the way for the unprecedented role that Google now plays in our lives, with its domination of the commercial surveillance marketplace.[10] (CDD also raised objections to the Google/AdMob, Facebook/Instagram/WhatsApp, and other big-data-driven mergers that the FTC failed to address, again paving the way for the contemporary commercial surveillance apparatus).[11]Mobile surveillance: In 2009, CDD and U.S. PIRG urged the commission to “to protect consumers from a growing number of deceptive and unfair marketing practices and the resultant threats to consumer privacy that are a part of the rapidly growing U.S. mobile advertising landscape…. [M]obile devices, which know our location and other intimate details of our lives, are being turned into portable behavioral tracking and targeting tools….” The group’s FTC filing cited a Google official who called the mobile phone “the ultimate ad vehicle. It’s the first one ever in the history of the planet that people go to bed with. It’s ubiquitous across the world, across demographics, across age groups. People are giving these things to ever-younger children for safety and communication…. [I]t can know where you’ve been, where you’ve lingered, what store you stopped in, what car dealership you visited. It goes beyond any traditional advertising....” The complaint also discussed the myriad techniques, tactics, mergers and other critical issues to support the commission’s investigation and action.[12]Yet the FTC did nothing, and geolocation-based surveillance marketing has thrived, including via the leading platforms. Nor has the commission challenged cross-device tracking, a component of the surveillance marketing industry that financially benefited from the agency’s inability to protect consumer privacy, as unique identifiers are used to track and target the public.[13]Real-time programmatic, behavioral and algorithmic-based targeting: Also in 2009, CDD and U.S. PIRG submitted to the commission a comment as part of the agency’s “Privacy Roundtable” process, which noted thatToday, consumers online face the rapid growth and ever-increasing sophistication of the various techniques advertisers employ for data collection, profiling, and targeting across all online platforms. The growth of ad and other optimization services for targeting, involving real-time bidding on ad exchanges; the expansion of data collection capabilities from the largest advertising agencies (with the participation of leading digital media content and marketing companies); the increasing capabilities of mobile marketers to target users via enhanced data collection; and a disturbing growth of social media surveillance practices for targeted marketing are just a few of the developments the commission must address. But despite technical innovation and what may appear to be dramatic changes in the online data collection/profiling/targeting market, the commission must recognize that the underlying paradigm threatening consumer privacy online has been constant since the early 1990’s. So-called “one-to-one marketing,” where advertisers collect as much as possible on individual consumers so they can be targeted online, remains the fundamental approach. 
…Advertisers and marketers have developed an array of sophisticated and ever-evolving data collection and profiling applications, honed from the latest developments in such fields as semantics, artificial intelligence, auction theory, social network analysis, data-mining, and statistical modeling. Behavioral targeting is just one tool in the interactive advertisers’ arsenal.... We are being intensively tracked on many individual websites and across the Internet.[14]The filing called for action to address the buying and selling of individuals via online ad exchanges and giants such as Google; identified many other leading companies and practices; and explained how all of this was affecting mobile-device and social media users. It documented how self-regulation had been a failure, and how the “self-learning of contemporary interactive ad systems” threaten privacy and consumer welfare. Again, the FTC ignored these issues, enabling today’s programmatic (surveillance marketing) system to evolve unchallenged. Surveillance marketing of health behaviors, including through social media: In 2010, CDD and allied consumer and privacy groups filed a petition on the role that behavioral advertising, as well as manipulative ad tactics such as “neuromarketing,” play in the promotion of health and medical products. Google, Microsoft, and others were the subjects of this complaint. As we explained,A far-reaching complex of health marketers has unleashed an arsenal of techniques to track and profile consumers, including so-called medical “condition targeting,” to eavesdrop on their online discussions via social media data mining; to collect data on their actions through behavioral targeting; to use viral and so-called “word-of-mouth” techniques online to drive interest in prescriptions, over-the counter drugs, and health remedies; and to influence their subconscious perceptions via pharma-focused “neuromarketing…. Digital marketing raises many distinct consumer protection and privacy issues, including an overall lack of transparency, accountability and personal control, which consumers should have over data collection and the various interactive applications used to track, target, and influence them online (including on mobile devices). The use of these technologies by pharmaceutical, health product, and medical information providers that directly affect the public health and welfare of consumers requires immediate action.[15]As before, the FTC did nothing. Failures with Google and Facebook consent-decree enforcement: EPIC, CDD and allied consumer and privacy organizations, which helped bring cases against Google and Facebook, repeatedly told high-level commission staff and commissioners that these entities were routinely violating their respective consent decrees.[16] The failure of the commission to enforce its own decrees—reflecting the inability of the agency to analyze contemporary digital data and online marketing practices—permitted these companies, and the industry as a whole, to expand their surveillance capabilities still further.Neglecting communities of color: CDD also has repeatedly urged the commission to investigate and address how racial and ethnic data are used to target individuals and groups. For years, such data have been used to subject these communities to unfair treatment through predatory online marketing and other harmful practices. 
At best, the agency has given lip-service to these issues in the past, but has yet to take any meaningful action.[17]Failing our children, COPPA enforcement, and teens: CDD (along with Fair Play and Common Sense Media) is also filing comments in this docket on these issues. But we want to underscore that despite our repeated calls for action, the commission has never done anything to protect adolescents. Consequently, when someone turns 13 in the U.S., they are swept into the commercial surveillance marketing system that negatively affects every adult in the U.S.[18]Where we are today: Every day brings advances in the capabilities of commercial surveillance, led by the giant entities that dominate the marketplace, along with their affiliates. As we noted earlier, AI and machine-learning-based data analytic and targeting operations are routine for the commercial surveillance apparatus. And now the industry is poised to add what is known as “emotional intelligence,” a sophisticated new enhancement to ascertain and “understand how people feel in order to make AI more emotionally aware. There will be a shift from passive and grey interaction with AI, to an understanding of not only the cognitive, but also the emotive, channels of human interaction.”[19] Surveillance applications are also shaping the Internet of Things, the metaverse, and “over-the-top” streaming video as well.[20]We urge the commission to act on this petition, as well as calls by civil rights, consumer and privacy groups that it engage in a comprehensive rulemaking that will help promote competition, data protection, fairness and civil rights online. [1] We especially want to single out two individuals whose leadership role in all these years has been so critical, including with the FTC. Marc Rotenberg, who created and led the Electronic Privacy Information Center for decades, has been at the forefront of these and other key digital democracy issues since the earliest days of the internet. Ed Mierzwinski, now senior director, Federal Consumer Program, U.S. PIRG, understood that the same commercial forces that had undermined consumer rights in the “analog” world was doing so online as well. [2] So-called consumer data platforms and similar technologies permeate the corporate environment. See, for example, Mariah Cooper, “PepsiCo Launches Data Practice to Help Food and Beverage Retailers Grow,” Campaign, 9 Sept. 2021, /article/pepsico-launches-data-practice-to-help-food-and-beverage-retailers-grow/472436; Josh Wolf, “Where Does a Customer Data Platform Fit in With My AWS Data Lake?” AWS Blog, 13 May 2021, /blogs/apn/where-does-a-customer-data-platform-fit-in-with-my-aws-data-lake/; Wavicle Data Solutions, “Global QSR Uses Micro-segmentation to Improve Customer Engagement and Sales,” Dec. 2020, /wp-content/uploads/2020/12/Quick_Service_Restaurant_Customer_360_032421.pdf. [3] Federal Trade Commission, “Privacy in the Electronic Age,” The Privacy & American Business Conference, Washington, D.C., 1 Nov. 1995, /public-statements/1995/11/privacy-electronic-age.[4] Federal Trade Commission, “Staff Report: Public Workshop on Consumer Privacy on the Global Information Infrastructure” Dec. 1996, /reports/staff-report-public-workshop-consumer-privacy-global-information-infrastructure; Federal Trade Commission, About Privacy: Protecting the Consumer on the Global Information Infrastructure,” 8 Dec. 
1998, /public-statements/1998/12/about-privacy-protecting-consumer-global-information-infrastructure; “Privacy in the Electronic Age.” See also Jeff Chester’s comments in U.S. Department of Commerce and Federal Trade Commission, “Public Workshop on Online Profiling,” Washington, D.C., 8 Nov. 1999, /sites/default/files/documents/public_events/online-profiling-public-workshop/online.pdf. [5] See especially the digital marketing industry’s fundamental paradigm laid out in Don Peppers and Martha Rogers, The One to One Future (New York: Crown Business, 1993); see also Jeff Chester, Digital Destiny: New Media and the Future of Democracy (New York: The New Press, 2008). [6] See, for example, Federal Trade Commission, “Privacy Online: A Report to Congress,” June 1998, /sites/default/files/documents/reports/privacy-online-report-congress/priv-23a.pdf; Federal Trade Commission, “FTC Staff Sets Forth Principles For Online Information Collection From Children,” 16 July 1997, /news-events/press-releases/1997/07/ftc-staff-sets-forth-principles-online-information-collection. To better understand the campaign developed to enact COPPA, including the industry pushback on teens, see Kathryn C. Montgomery, Generation Digital: Politics, Commerce, and Childhood in the Age of the Internet (Cambridge, MA: MIT Press, 2007). CDD pressed to have “cookies” and other identifiers included as personal information under COPPA. Jeff Chester, “Leading Consumer, Privacy, Child Advocacy & Public Health Groups Call on FTC for Stronger Children's Privacy Safeguards Under COPPA,” Center for Digital Democracy, 25 Sept. 2012, /content/leading-consumer-privacy-child-advocacy-public-health-groups-call-ftc-stronger-childrens. [7] See, for example, Marc Rotenberg, letter to Sen. Jay Rockefeller, Chairman of the Senate Committee on Commerce, Science and Transportation, et al., 5 May 2010, /wp-content/uploads/privacy/facebook/EPIC_FB_FTC_Complaint_Letter.pdf. [8] Jeff Chester and Ed Mierzwinski, “Complaint and Request for Inquiry and Injunctive Relief Concerning Unfair and Deceptive Online Marketing Practices,” Federal Trade Commission, 1 Nov. 2006, /sites/default/files/FTCadprivacy_0_0.pdf. CDD and U.S. PIRG filed a supplemental petition a year later, which included an analysis of advances in behavioral marketing, including through the DoubleClick Advertising Exchange, among others. While the commission staff and commissioners made various proposals, there was no real attempt to address the surveillance ad system. See, for example, “A Preliminary FTC Staff Report on Protecting Consumer Privacy in an Era of Rapid Change: A Proposed Framework for Businesses and Policymakers,” Dec. 2010, /reports/preliminary-ftc-staff-report-protecting-consumer-privacy-era-rapid-change-proposed-framework. [9] Federal Trade Commission, “Ehavioral Advertising: Tracking, Targeting & Technology,” meeting transcript, 1 Nov. 2007, pp. 35-36, /sites/default/files/documents/public_events/ehavioral-advertising-tracking-targeting-and-technology/71101wor.pdf. See also Chester’s comment that “A very sophisticated commercial surveillance system has been put in place,” in Louise Story, “F.T.C. to Review Online Ads and Privacy,” New York Times, 1 Nov. 2007, /2007/11/01/technology/01Privacy.html. [10] Jeff Chester, “CDD, EPIC, USPIRG Opposition to Google/Doubleclick ‘Big Data’ Merger,” Center for Digital Democracy, 11 Sept. 2019, /article/cdd-epic-uspirg-opposition-googledoubleclick-big-data-merger; Roy Mark, “FTC Chair’s Impartiality Questioned,” eWeek, 13 Dec. 
2007, /news/ftc-chair-s-impartiality-questioned/; “Conflict of Interest in Google-Doubleclick Merger Review,” EPIC.org, /documents/epic-v-federal-trade-commission/. [11] Tom Krazit, “Consumer Groups Urge Block of Google-AdMob Deal,” CNET, 28 Dec. 2009, /news/consumer-groups-urge-block-of-google-admob-deal/; Jeff Chester, “EPIC and CDD file ‘Unfair and Deceptive’ Practices Complaint at FTC on Facebook/WhatsApp Deal: WhatsApp Users Were Promised Privacy/Now They Will Have Facebook,” Center for Digital Democracy, 6 Mar. 2014, /content/epic-and-cdd-file-unfair-and-deceptive-practices-complaint-ftc-facebookwhatsapp-deal; Jeff Chester, “Big Data Gets Bigger: Consumer and Privacy Groups Call on FTC to Play Greater Role in Data Mergers/Investigation and Public Workshop Needed,” Center for Digital Democracy, 6 Feb. 2015, /content/big-data-gets-bigger-consumer-and-privacy-groups-call-ftc-play-greater-role-data-mergers.[12] “Consumer Groups Petition Federal Trade Commission to Protect Consumers from Mobile Marketing Practices Harmful to Privacy: Complaint Documents the Migration of Data Tracking, Profiling and Targeting to Mobile Phone Devices,” Center for Digital Democracy, 13 Jan. 2009, /mobile-marketing-harmful.[13] Center for Digital Democracy, “Ten Questions that the Federal Trade Commission Should Answer on Cross­ Device Online Tracking of Individuals,” /system/files/documents/public_comments/2015/11/00061-99851.pdf. For a current example of such tracking, see, for example, LiveRamp, “Measurement: Omnichannel Identity Linking,” /our-platform/cross-channel-measurement/omnichannel-identity-linking/. [14] Center for Digital Democracy and U.S. PIRG, “Cookie Wars, Real-Time Targeting, and Proprietary Self Learning Algorithms: Why the FTC Must Act Swiftly to Protect Consumer Privacy,” FTC Privacy Roundtables – Comment, Project No. P095416, 4 Nov. 2009, /sites/default/files/documents/public_comments/privacy-roundtables-comment-project-no.p095416-544506-00013/544506-00013.pdf. [15] Center for Digital Democracy, U.S. PIRG, Consumer Watchdog, and the World Privacy Forum, “Complaint, Request for Investigation, Public Disclosure, Injunction, and Other Relief: Google, Microsoft, QualityHealth, WebMD, Yahoo, AOL, HealthCentral, Healthline, Everyday Health, and Others Named Below,” FTC filing, 23 Nov. 2010, /sites/default/files/public/2015/101123publiccmptdigitaldemocracy.pdf. This complaint was one of several where CDD also placed a spotlight on social media marketing—another area in which the commission has repeatedly failed. For example, the complaint noted that “new surveillance tools have been developed to monitor conversations among social network users to identify what is being said about a particular issue or product. Marketers then work to insert brand-related messages into the social dialogue, often by identifying and targeting individuals considered brand ‘loyalists’ or ‘influencers….’ Increasingly, advertisers are using Facebook’s marketing apparatus—which is largely invisible to its users—…to … connect to the social communications of a very large pool of consumers.” [16] See, for example, Center for Digital Democracy, “Facebook’s Misleading Data and Marketing Policies and Practices,” Oct. 2013, /sites/default/files/field/public-files/2019/ftcfacebookdatapracticesfinal1013.pdf.[17] See for example, Jeff Chester, “Digital Target Marketing to African Americans, Hispanics and Asian Americans: A New Report,” Center for Digital Democracy,18 Feb. 
2013, /content/digital-target-marketing-african-americans-hispanics-and-asian-americans-new-report; Center for Digital Democracy, “In the Matter of ‘Privacy and Security Implications of the Internet of Things,” FTC public workshop filing, 1 June 2013, /sites/default/files/documents/public_comments/2013/07/00006-86145.pdf; Jeff Chester, Kathryn Montgomery, and Lori Dorfman, “Alcohol Marketing in the Digital Age,” May 2010, /sites/default/files/documents/public_comments/alcohol-reports-project-no.p114503-00014%C2%A0/00014-58260.pdf.[18] “Children's Online Privacy.” C-Span, 17 Oct. 2018, /video/?453170-1/childrens-online-privacy; Center for Digital Democracy, “Digital Youth,” /projects/focus/digital-youth.[19] Yasmin Borain, “Marketing Trends for 2022: Technology, Artificial Intelligence and Internet of Things,” WARC, Nov. 2021, /content/article/warc-exclusive/marketing-trends-for-2022-technology-artificial-intelligence-and-internet-of-things/141212 (subscription required).[20] Hannah Murphy, “Facebook Patents Reveal How It Intends to Cash in on Metaverse: Meta Hopes to Use Tiny Human Expressions to Create Virtual World of Personalised Ads,” Financial Times, 17 Jan. 2022, /content/76d40aac-034e-4e0b-95eb-c5d34146f647 (subscription required).surveillanceadvertisingftccdd012622b.pdf
    Jeff Chester
  • CDD and Advocates Call on the FTC to Begin Rulemaking to Prohibit Surveillance Advertising. January 26, 2022. Federal Trade Commission, Office of the Secretary, 600 Pennsylvania Avenue NW, Washington, DC 20580. Re: Comment on Petition for Rulemaking by Accountable Tech, FTC-2021-0070. INTRODUCTION: Center for Digital Democracy, Common Sense, Fairplay, Parent Coalition for Student Privacy and ParentsTogether strongly support the Petition for Rulemaking to Prohibit Surveillance Advertising filed by Accountable Tech.[1] We agree that this action is necessary to stop the exploitation of children and teens.[2] Surveillance advertising, also known as behavioral or targeted advertising, has become the standard business model for a wide array of online platforms, with companies utilizing this practice to micro-target all consumers, including children and teens. Surveillance advertising involves the collection of vast amounts of personal data about online users (their demographics, behaviors, preferences, and characteristics) and the production of inferences from that data. To create detailed advertising profiles, users are tracked across websites and devices; they are classified, sorted, and even discriminated against via targeting and exclusion; and ultimately they are left vulnerable to manipulation and exploitation. Young people are especially susceptible to the risks posed by surveillance advertising, which is why leading public health advocates like the American Academy of Pediatrics have called for a ban on surveillance advertising to children under 18 years old.[3] Children’s and teens’ online experiences are shaped by the affordances of surveillance marketing, which entrap them in a complex system purposefully designed to manipulate their behaviors and emotions, while leveraging their data in the process. Young people are a significant audience for the real-time ad profiling and targeting apparatus operated through programmatic platforms and technologies, which poses fundamental risks to their privacy, safety and well-being. Surveillance advertising is harmful to young people in several ways. First, young people are already more susceptible to advertising’s negative effects, and surveillance advertising allows marketers to manipulate children and teens even more effectively. Second, surveillance advertising allows advertisers to target children’s individual vulnerabilities. Third, surveillance advertising can exacerbate inequities by allowing advertisers to target (or abstain from targeting) marginalized communities. Fourth, behavioral advertising is the driving force behind a complex system of data collection and surveillance that tracks all of children’s online activity, undermining young people’s privacy and wellbeing. Finally, the Children’s Online Privacy Protection Act has failed to effectively protect children under thirteen from surveillance advertising, and a more expansive prohibition is needed to protect the youngest and most vulnerable users online. For these reasons, we urge the Commission to protect children and teens by prohibiting surveillance advertising…. Please read the full petition; see PDF below. [1] 86 Fed. Reg. 73206 (Dec. 27, 2021). [2] Pet’n for Rulemaking at 32-33. [3] Jenny Radesky, Yolanda (Linda) Reid Chassiakos, Nusheen Ameenuddin, Dipesh Navsaria, Council on Communications and Media, “Digital Advertising to Children,” Pediatrics, July 2020; 146 (1): e20201681, 10.1542/peds.2020-1681. childrens_coalition_survadv_1-26-22.pdf
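The pipeline the comment describes (tracking, classification into profiles, and targeting or exclusion) can be illustrated with a minimal sketch. All event fields, user IDs, and segment names below are hypothetical; this is not any ad platform's actual taxonomy, only a simplified model of how cross-site events become an inferred segment that then drives an include/exclude decision.

```python
# Illustrative sketch of profile-based targeting; all fields and segment
# names are hypothetical, not any platform's real taxonomy.
from collections import defaultdict

events = [
    {"user": "u1", "site": "game-reviews.example", "topic": "gaming"},
    {"user": "u1", "site": "snack-shop.example", "topic": "junk_food"},
    {"user": "u2", "site": "news.example", "topic": "politics"},
]

# 1. Fold cross-site events into per-user interest profiles.
profiles: dict[str, set[str]] = defaultdict(set)
for e in events:
    profiles[e["user"]].add(e["topic"])

# 2. Classify each profile into inferred advertising segments.
def segments(interests: set[str]) -> set[str]:
    segs = set()
    if {"gaming", "junk_food"} <= interests:
        segs.add("gamer_snack_buyer")  # the kind of inferred segment at issue
    return segs

# 3. Target or exclude an ad request on the basis of the inferred segment.
for user, interests in profiles.items():
    if "gamer_snack_buyer" in segments(interests):
        print(f"{user}: serve snack-brand ad")
    else:
        print(f"{user}: excluded from this campaign")
```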
  • Congresswomen Anna G. Eshoo (D-CA) and Jan Schakowsky (D-IL) and Senator Cory Booker (D-NJ) Introduce Bill to Ban Surveillance AdvertisingWashington, DC 1-18-2022“Identifying, tracking, discriminating, sorting, targeting, and manipulating online users lies at the heart of all that is toxic about today’s digital world. Surveillance advertising drives discrimination and compounds inequities, it destroys democratic institutions and rights, strengthens monopoly power of Big Tech platforms, and is harmful to children, teens, families, and communities. If enacted, the Banning Surveillance Advertising Act would put a stop to surveillance advertising and would be an important first step in building a digital world that is less toxic to our democracy, economy, and collective well-being,” said Katharina Kopp, Ph.D., Director of Policy for the Center for Digital Democracy.Click here for statements of support.Click here for bill text.Click here for a section-by-section summary.Click here for additional background.
  • "Surveillance" Marketing meets what Google calls "embedded finance" online [excerpt]USPIRG and CDD believe the U.S. is at an especially critical inflection point regarding digital platforms, digital payment services and online consumer protection: the pervasive tracking of data on individuals, families and groups, online and off; the nearly real-time ability to target a consumer with financial and other product offers regardless of where they are or device they use; and the development of a highly sophisticated and now machine-driven apparatus to deliver personalized marketing and communications, have all led to a largely unaccountable digital marketplace. A handful of digital platform giants and their partners stealthily operate what is known as a “surveillance marketing” system, which now pervades every aspect of our lives—increasingly affecting how the public engages with the financial services sector. As the Bureau’s Request for Comment illustrates, it is aware of the serious ramifications to consumers and small businesses as the U.S. accelerates its transition to what Google calls “embedded finance.” The leading platforms and online services, as they accelerate their roles as America’s new bankers and lenders, bring with them a host of critical issues that the Bureau must address. Moreover, the industry’s “closed-loop” business model, where platforms and online data and ad practices are able to operate in a non-transparent manner, which has already caused an uproar from global marketers, is poised to have even greater consequences as it assumes greater control over our daily financial experiences.  The growing role of platforms to leverage their market positions to shape the digital payment system is now disintermediating “banks and credit card companies from consumers.” These platforms are poised to dominate consumer and small business financial markets as much as they now do ecommerce, entertainment, and communications. In the process, these platforms and their online financial and other service partners will pose a series of threats. Their operating model, as we discuss below, engages in far-reaching forms of consumer manipulation, relying on a host of online marketing tactics designed to trigger a range of responses. There is a very real risk that without Bureau action, the digital payments and platform complex will aggressively push Americans to new levels of debt, as the Big Data and artificial intelligence (AI) apparatus now at the core of the consumer digital economy encourages impulse buying and other potentially consequential practices. These entities have so much information on individuals, communities and commerce, they easily dislodge smaller and locally-based businesses. Since data analyzed by the platforms is used to identify commercial opportunities across the range of their product offerings—which is basically making everything available for sale—a consumer will not be aware, let alone control, how this information can be used to target them with other products and services. As Alphabet/Google highlighted in a recent report on “embedded finance,” “online finance” has altered what people think is banking and managing finances: “now, most financial transactions happen via mobile apps, websites, email, text messages and other digital communications.” Google’s “white paper” suggests that embedded finance may be “the new gold rush” for financial services. 
Among the competitive benefits claimed by Google is that embedded finance “enables business to reach new customers at the moment when they need your services.” There is also an added “bonus”: “the data you collect from each transaction….”
bigtechpaymentplatforms121921final.pdf
    Jeff Chester
  • CDD's Jeff Chester contributed to the report's focus on online marketing practices, including the use of big data analytics, by alcoholic beverage companies. (Excerpt from WHO release): Just as with tobacco, a global and comprehensive approach is required to restrict digital marketing of alcohol. Children and young people are especially at risk from the invasion of their social spaces by communication promoting alcohol consumption, normalising alcohol in all social contexts, and linked to the development of adult identities. “Current policies across the WHO European Region are insufficient to protect people from new formats of alcohol marketing. Age verification schemes, where they exist, are usually inadequate to protect minors from exposure to alcohol marketing. The fact that the vast majority of alcohol advertising online is ‘dark’, in the sense that it is only visible to the consumer to whom it is marketed, is challenging for policy makers, thus requiring new mechanisms and a new approach,” said Dr Carina Ferreira-Borges, Acting Director for Noncommunicable Diseases and Programme Manager for Alcohol and Illicit Drugs at WHO/Europe. Link to release. Link to report.
    Jeff Chester
  • Groups urge Congress to stop Big Tech’s manipulation of young people. BOSTON – Thursday, December 2, 2021 – Today a coalition of leading advocacy groups launched Designed With Kids in Mind, a campaign demanding a design code in the US to protect young people from online manipulation and harm. The campaign seeks to secure protections for US children and teens similar to the UK’s groundbreaking Age-Appropriate Design Code (AADC), which went into effect earlier this year. The campaign brings together leading advocates for child development, privacy, and a healthier digital media environment, including Fairplay, Accountable Tech, American Academy of Pediatrics, Center for Digital Democracy, Center for Humane Technology, Common Sense, ParentsTogether, RAINN, and Exposure Labs, creators of The Social Dilemma. The coalition will advocate for legislation and new Federal Trade Commission rules that protect children and teens from a business model that puts young people at risk by prioritizing data collection and engagement. The coalition has launched a website that explains how many of the most pressing problems faced by young people online are directly linked to platforms’ design choices. They cite features that benefit platforms at the expense of young people’s wellbeing, such as:
    • Autoplay: increases time on platforms; excessive time on screens is linked to mental health challenges and physical risks like less sleep, and promotes family conflict.
    • Algorithmic recommendations: risk exposure to self-harm, racist content, pornography, and mis/disinformation.
    • Location tracking: makes it easier for strangers to track and contact children.
    • Nudges to share: lead to loss of privacy and risks of sexual predation and identity theft.
The coalition is promoting three bills which would represent a big step forward in protecting US children and teens online: the Children and Teens’ Online Privacy Protection Act (S. 1628); the Kids Internet Design and Safety (KIDS) Act (S. 2918); and the Protecting the Information of our Vulnerable Children and Youth (PRIVCY) Act (H.R. 4801). Taken together, these bills would expand privacy protections to teens for the first time and incorporate key elements of the UK’s AADC, such as requiring the best interests of children to be a primary design consideration for services likely to be accessed by young people. The legislation backed by the coalition would also protect children and teens from manipulative design features and harmful data processing. Members of the coalition comment on the urgent need for a US design code to protect children and teens:
Josh Golin, Executive Director, Fairplay: We need an internet that helps children learn, connect, and play without exploiting their developmental vulnerabilities; respects their need for privacy and safety; helps young children disconnect at the appropriate time rather than manipulating them into spending even more time online; and prioritizes surfacing high-quality content instead of maximizing engagement. The UK’s Age-Appropriate Design Code took an important step towards creating that internet, and children and teens in the US deserve the same protections and opportunities.
It’s time for Congress and regulators to insist that children come before Big Tech’s profits.
Nicole Gill, Co-Founder and Executive Director of Accountable Tech: You would never put your child in a car seat that wasn't designed for them and met all safety standards, but that's what we do every day when our children go online using a network of apps and websites that were never designed with them in mind. Our children should be free to learn, play, and connect online without manipulative platforms like Facebook and Google's YouTube influencing their every choice. We need an age-appropriate design code that puts kids and families first and protects young people from the exploitative practices and the perverse incentives of social media.
Lee Savio Beers, MD, FAAP, President of the American Academy of Pediatrics: The American Academy of Pediatrics is proud to join this effort to ensure digital spaces are safe for children and supportive of their healthy development. It is in our power to create a digital ecosystem that works better for children and families; legislative change to protect children is long overdue. We must be bold in our thinking and ensure that government action on technology addresses the most concerning industry practices while preserving the positive aspects of technology for young people.
Jeff Chester, Executive Director, Center for Digital Democracy: The “Big Tech” companies have long treated young people as just a means to generate vast profits – creating apps, videos and games designed to hook them to an online world designed to surveil and manipulate them. It’s time to stop children and teens from being victimized by the digital media industry. Congress and the Federal Trade Commission should adopt commonsense safeguards that ensure America’s youth reap all the benefits of the online world without having to constantly expose themselves to the risks.
Randima Fernando, Executive Director, Center for Humane Technology: We need technology that respects the incredible potential – and the incredible vulnerability – of our kids' minds. And that should guide technology for adults, who can benefit from those same improvements.
Irene Ly, Policy Counsel, Common Sense: This campaign acknowledges harmful features of online platforms and apps like autoplay, algorithms amplifying harmful content, and location tracking for what they are: intentional design choices. For too long, online platforms and apps have chosen to exploit children’s vulnerabilities through these manipulative design features. Common Sense has long supported designing online spaces with kids in mind, and strongly supports US rules that would finally require companies to put kids’ well-being first.
Julia Hoppock, The Social Dilemma Partnerships Director, Exposure Labs: For too long, Big Social has put profits over people. It's time to put our kids first and build an online world that works for them.
Dalia Hashad, Online Safety Director, ParentsTogether: From depression to bullying to sexual exploitation, tech companies knowingly expose children to unacceptable harms because it makes the platforms billions in profit. It's time to put kids first.
Scott Berkowitz, President of RAINN (Rape, Abuse & Incest National Network): Child exploitation has reached crisis levels, and our reliance on technology has left children increasingly vulnerable. On our hotline, we hear from children every day who have been victimized through technology.
An age-appropriate design code will provide overdue safeguards for children across the U.S.
launch_-_design_code_to_protect_kids_online.pdf