How to protect children online
The digital world was not built for children. That’s why age-appropriate design should be at the forefront of Utah’s Social Media Act
Written by: Krystina Sorensen
With the increase of digital tools at one’s fingertips, young people quickly adopt modern technologies for entertainment, education, and socialization. The digital environment was not originally designed for children, yet it plays a significant role in their lives.
However, this presents a double-edged sword: widespread use of technology also exposes youth to risks stemming from deliberate design choices. Companies often prioritize maximizing user engagement, fostering excessive online time and making it difficult to disengage. These risks include exposure to harmful or age-inappropriate content, unsolicited contact from adults, cyberbullying, and concerns related to data harvesting, commercial pressures, and encouragement to gamble.
For instance, studies have revealed that large social media platforms, such as TikTok, can surface harmful content to new teen accounts within minutes, recommending potentially detrimental body image and mental health content as often as every 39 seconds. Considering this, state, federal, and international lawmakers are frantically focusing their attention on the online privacy and safety of minors.
At the federal level in the United States, the Children’s Online Privacy Protection Act (COPPA) governs children’s online privacy, ensuring fair information practices for collecting personal data from children under 13. COPPA operates on a consent-based, data-focused approach, providing guidelines for parental consent and data collection to empower informed decisions by parents regarding their children's online activities.
However, it faces challenges such as reliance on self-reporting mechanisms, which can be bypassed by children, and it lacks comprehensive measures to address broader design elements crucial for ensuring child safety in the digital realm.
At the state level, an uneven patchwork of emerging policies does not consistently align on methods for “protecting” minors. Amid growing concern across states about online platforms’ addictive features and targeted design, Utah enacted the Social Media Regulation Act (SMRA).
Utah Governor Spencer Cox applauded the Republican lawmakers behind the new laws for combatting what he considers a plague on the mental health of the state’s youth. While the SMRA offers stronger parental control features than COPPA, potentially limiting children’s exposure to harmful content, it raises other rights-based concerns.
There are numerous legal risks posed by Utah's regulatory approach, extending beyond privacy concerns to encompass potential infringements on free speech rights for children, teens, and adults. Excessive enforcement could inadvertently lead to censorship, restrictions on legitimate content, and encroachments on individual liberties. This has given rise to legal challenges, exemplified by a lawsuit filed by NetChoice, a lobbyist group representing big tech platforms.
The suit alleges violations of First Amendment free speech protections and of equal protection under the 14th Amendment. Further complicating matters, the Foundation for Individual Rights and Expression (FIRE) has filed a separate lawsuit asserting the unconstitutionality of the age verification requirement and seeking a preliminary injunction to block the law’s enforcement.
Amid these concerns, Utah’s governor acted on March 13, 2024, approving an overhaul of the state’s social media laws even as Utah faced numerous lawsuits challenging their constitutionality. Under the revised laws, social media companies bear the burden of proving that their algorithmically curated content did not contribute, fully or partially, to a child’s depression, anxiety, or self-harm behaviours.
However, the laws also offer greater legal protection to companies that adhere to certain restrictions, such as limiting minors in Utah to three hours of app usage per day, mandating parental permission for account creation, and implementing a statewide blackout for youths on social media between 10:30 p.m. and 6:30 a.m.
However, the law misses the mark on some important considerations: a child’s voice and their evolving capacity. The U.N. Committee on the Rights of the Child offers thoughtful guidance on this point in General Comment No. 25 on the Convention on the Rights of the Child: “Parents’ and caregivers’ monitoring of a child’s digital activity should be proportionate and in accordance with the child’s evolving capacities.”
The comment also stresses that promoting the rights of every child is imperative given the profound impact of technological innovation: digital access helps children realize a spectrum of civil, political, cultural, economic, and social rights, and should neither exacerbate existing disparities nor create new ones.
Child rights are holistic, guiding duty bearers, including states and businesses, to balance the opportunities and risks of digital play and to consider children’s age, maturity, evolving capacities, and best interests in real-world circumstances.
Rights also matter in design. Research on children aged 6–17 in the United Kingdom emphasized the contribution of rights-respecting design, including age-appropriate play, privacy, safe contact, and easy onboarding, to children’s enjoyment of digital play, enhancing their participation and evolving capacity. This underscores a notable gap in current digital products and suggests a business incentive to design rights-respecting alternatives that deliver satisfying experiences without undermining the qualities of children’s digital play.
Some proposed measures to consider for Utah’s Social Media Act are:
Age-appropriate Design as the centrepiece:
The UK’s Information Commissioner’s Office (ICO) has established 15 standards in its Age-Appropriate Design Code to create a secure digital space for children. Building on the code’s success in the UK, the ICO contends that age-appropriate design codes can coexist harmoniously with freedom of expression.
This reflects the delicate balance between upholding free expression and safeguarding children’s privacy in a secure digital environment. The code prioritizes the child’s best interests when they conflict with commercial interests, recognizing the potential for harmonization. California has since embraced these principles in its own child online safety regulations, recognizing the benefits of shared experience.
The ICO’s Age-Appropriate Design Code (2020) provides a comprehensive code of practice for online services likely to be accessed by children, serving as a tool for evaluating compliance with regulations such as the Data Protection Act 2018 and the General Data Protection Regulation. Additionally, the ICO’s data protection impact assessment (DPIA) guidance offers a structured risk-management process for developers of child-targeted online services.
This approach aligns seamlessly with the first segment of the STAR framework, where Safety by Design principles advocate for a preventative systems approach to mitigate harm. It involves integrating safety considerations through thorough risk assessments during the design, implementation, and modification phases of products and services. Safety by Design sets a fundamental consumer standard, reflecting expectations across various sectors.
STAR Framework:
Aligning with the STAR Framework, Utah's Social Media Regulation Act should prioritize transparency in three critical areas: Algorithms, Rules Enforcement, and Advertising Economics. This involves compelling platforms to provide detailed insights into their algorithms, establishing clear protocols for communicating and enforcing platform rules, and implementing direct measures to reveal the economic aspects of advertising, minimizing potential manipulations.
To fortify regulatory governance and ensure effective oversight, recommended measures include: robust accountability systems to monitor statutory duties; mechanisms to counteract profit-driven inaction; and independent pathways for stakeholders to challenge regulatory decisions. Together, these foster transparency, accountability, and fairness within the regulatory framework.
For accountability and responsible corporate behaviour within the digital landscape, explicit legislative frameworks should define clear responsibilities for social media and search engine companies. This entails outlining specific duties to ensure transparency and adherence to established norms. Additionally, a comprehensive framework should be introduced to hold companies and senior executives accountable for any resulting harm, aligning with established practices for fostering responsible corporate behaviour and contributing to a safer online environment.
Youth participation – Utah Youth Online Safety Council
Establishing the Utah Youth Online Safety Council is crucial to address the underrepresentation of youth in decision-making processes and the inherent biases within existing policies. Findings from reports, including the 2023 Youth Assembly on Digital Rights and Safety in Canada, highlight the need for diverse inputs to ensure policies meet the needs of varied communities and youth.
Incorporating youth in both policy development and platform design is imperative. Studies, such as the one conducted at Oxford University, indicate that community moderation models based on youth-centric approaches to governance are more effective than deterrence-based moderation.
The Utah Youth Online Safety Council can play a pivotal role in bridging this gap by providing a platform for youth voices and fostering a collaborative environment for inclusive policymaking.
Children pave the way for the digital space they want.
Although the digital space was not originally created with their best interests in mind, Utah must take these measures into account to build the digital environment that young people deserve.
Krystina, an MPP candidate at the Max Bell School of Public Policy, also serves as Assistant Editor at The Bell. With a strong academic foundation in Criminology and Psychology, she brings over three years of diverse experience gained in both the non-profit sector and the Federal Public Service. Krystina is passionate about championing human rights causes, promoting gender equality, and advocating for meaningful law and legislative reform.