Social Media Giants in UK and US Implement Significant Changes to Protect Children Online, Report Reveals


Social media companies in the United Kingdom have implemented nearly 100 modifications to their platforms to adhere to new standards aimed at enhancing online safety for children, as outlined in a recent report by the U.S.-based nonprofit Children and Screens: Institute of Digital Media and Child Development.

The U.K.’s Children’s Code, also known as the Age Appropriate Design Code, took effect in 2020, giving social media firms a year to comply with the new regulations. The changes spotlighted in the report are those that major social media companies, including platforms popular with young people such as TikTok, YouTube, Instagram, and Snapchat, have publicly announced. Many of these changes also apply to the platforms as used in the United States.

Notably, these companies are part of the industry consortium NetChoice, which has actively opposed U.S. legislation concerning online safety by resorting to legal challenges.

Kris Perry, executive director of Children and Screens, welcomed the findings, noting that they show platforms can and do respond to regulatory pressure.

Similarly, child and adolescent psychologist Mary Alvord, co-author of The Action Mindset Workbook for Teens, acknowledges the companies’ responsiveness to feedback, noting, “It’s promising that despite the protests of the various platforms, they are actually taking the feedback from [researchers] and, obviously, policymakers.”

The modifications in platform design target four principal areas: youth safety and well-being; privacy, security, and data management; age-appropriate design; and time management.

To enhance youth safety and well-being, platforms have made 44 adjustments. Instagram, for instance, has introduced features such as filtering out bullying comments and employing machine learning to detect bullying in photos. YouTube now notifies users about offensive comments and actively removes hate speech.

In terms of privacy, security, and data management, platforms have made 31 changes. Instagram now notifies minors when they interact with adults flagged for suspicious behavior and restricts adults from messaging minors who are more than two years younger than they are.

Moreover, 11 changes across platforms aim to improve time management among minors. For instance, YouTube Kids has disabled autoplay by default, and users aged 13 to 17 now receive regular reminders to take breaks.

Mitch Prinstein, a neuroscientist at the University of North Carolina at Chapel Hill and chief science adviser at the American Psychological Association, expresses optimism about the adjustments, remarking, “From what we know about the brain and what we know about adolescent development, many of these are the right steps to take to try and reduce harms.”

However, he highlights the lack of empirical evidence regarding the effectiveness of these measures in ensuring children’s safety and well-being on social media platforms.

Prinstein emphasizes the detrimental impact of addictive platform designs on children’s developing brains, particularly citing features like infinite scrolling, which aim to prolong user engagement but can be harmful to children’s ability to regulate their behaviors.

Furthermore, he commends the focus on eliminating hazardous or hateful content from platforms, emphasizing the importance of removing content that promotes disordered behaviors.

The report underscores that several U.S. states are considering legislation modeled after the U.K.’s Children’s Code. California, for instance, passed its own Age-Appropriate Design Code, although it faces temporary injunctions.

At the federal level, the U.S. Senate is poised to vote on the Kids Online Safety Act, a bipartisan initiative sponsored by Senators Richard Blumenthal and Marsha Blackburn. The bill seeks to compel social media platforms to mitigate harm to children and prioritize their privacy.

Despite legislative efforts, many families feel powerless as they await changes from both lawmakers and social media companies. Prinstein urges parents to engage in conversations with their children about their online activities, fostering digital literacy and awareness to promote safer usage of social media platforms.

Prinstein acknowledges the challenges ahead in enacting legislation but stresses the urgency of addressing children’s safety online without further delay.
