In its first formal move to enter the children’s digital marketplace, Facebook has taken a responsible approach to this sensitive age group. It has created a “walled garden” messenger service designed exclusively for younger children; established strong parental controls; kept the service free of advertising; and restricted the use of many data collection and targeting practices that are employed routinely in its other services.
The Children’s Online Privacy Protection Act (COPPA), which we helped pass in 1998 and which was updated in 2012, established a strong framework for protecting children 12 and under from unfair data collection and targeting. However, additional safeguards are necessary to protect young people from powerful new forms of commercial surveillance in the era of Big Data and the Internet of Things. By designing an ad-free and safe environment for children, Facebook is playing a leadership role in developing responsible corporate practices that could be the basis for industry-wide guidelines.
But it is too early to understand fully how young people’s engagement with this new generation of digital interactive platforms will affect their psychosocial development. All stakeholders, including health professionals, educators, scholars, advocates, policymakers, and corporations, will need to monitor closely how these services evolve.