US Senate’s ‘Kids Online Safety Act’ arrives – here’s what you should know
- The Kids Online Safety Act applies to any app or online service that could be used by kids 16 and younger
- Under the bill, platforms would have a duty to prevent the promotion of certain harmful behaviors
- It also gives parents and users under 16 the ability to opt out of algorithmic recommendations, prevent third parties from viewing a minor’s data, and limit the time kids spend on the platform
In 2021, Facebook whistleblower Frances Haugen came forward with internal research on Instagram’s impact on teens’ mental health. Leaders of other social media giants such as Facebook, TikTok, Snap and YouTube were also called to testify on those findings. If anything, the fiasco created momentum for an online safety act aimed especially at protecting kids.
Haugen, a former Facebook employee, leaked a trove of internal company documents illuminating just how harmful apps like Instagram can be for teens. Those revelations sparked five Senate subcommittee hearings on children’s internet safety, featuring testimony from executives at TikTok, Snap, YouTube, Instagram, and Facebook.
Five months later, the hearings resulted in Senator Richard Blumenthal and Senator Marsha Blackburn introducing the Kids Online Safety Act (KOSA) yesterday.
What does the Kids Online Safety Act comprise?
To put it simply, the Kids Online Safety Act applies to any app or online service that could be used by kids 16 and younger. Under the bill, those platforms would have a duty to prevent the promotion of certain harmful behaviors, including suicide and self-harm, eating disorders, substance abuse and more.
According to the bill, platforms would also have to give parents and users under 16 the ability to opt out of algorithmic recommendations, prevent third parties from viewing a minor’s data and limit the time kids spend on the platform, among other things. Those controls would have to be turned on by default.
The bill also includes provisions regarding platforms’ disclosure policies and advertising systems, and would require social media platforms to conduct a yearly independent audit assessing their risk to minors.
Additionally, it would require the National Telecommunications and Information Administration to set up a program under which researchers interested in studying kids’ safety on a given platform could apply for data sets that companies would then have to hand over. The legislation would also give safe harbor to researchers who collect data on potential harms to minors on their own.
Blumenthal in a statement said, “Big Tech has brazenly failed children and betrayed its trust, putting profits above safety. The Kids Online Safety Act would finally give kids and their parents the tools and safeguards they need to protect against toxic content — and hold Big Tech accountable for deeply dangerous algorithms. Algorithms driven by eyeballs and dollars will no longer hold sway.”
To recap, before the whistleblower’s revelations in September last year, the UK had introduced an Age Appropriate Design Code (AADC) in mid-2021. It led to a domino effect whereby social media platforms began raising privacy standards for teen users.
The AADC applies to services in the UK and outlines robust protections for young children that can be relaxed as users age into their later teens. The AADC also prompted US lawmakers to lament the lack of comparable online protections for teens in the US.
30 March 2023