By Jason Kelley
Putting children under surveillance and limiting their access to information doesn’t make them safer—in fact, research suggests just the opposite. Unfortunately, those tactics are the ones endorsed by the Kids Online Safety Act of 2022 (KOSA), introduced by Sens. Blumenthal and Blackburn. The bill deserves credit for attempting to improve online data privacy for young people, and for attempting to update 1998’s Children’s Online Privacy Protection Act (COPPA). But its plan to require surveillance and censorship of anyone under sixteen would greatly endanger the rights, and safety, of young people online.
KOSA would require the following:
- A new legal duty for platforms to prevent certain harms: KOSA outlines a wide collection of content that platforms can be sued over if young people encounter it, including “promotion of self-harm, suicide, eating disorders, substance abuse, and other matters that pose a risk to physical and mental health of a minor.”
- A requirement that platforms provide data to researchers
- An elaborate age-verification system, likely run by a third-party provider
- Parental controls, turned on by default and set to their highest settings, to block or filter a wide array of content
There are numerous concerns with this plan. The parental controls would in effect require a vast number of online platforms to create systems that let parents spy on—and control—the conversations young people are able to have online, and would require that those systems be turned on by default. It would also likely result in further tracking of all users.
To avoid liability for causing the listed harms, nearly every online platform would hide or remove huge swaths of content. And because each of the listed areas of concern involves significant gray areas, platforms would over-censor to steer clear of the new liability risks.
These requirements would apply far more broadly than COPPA, the law KOSA hopes to update. Whereas COPPA applies to anyone under thirteen, KOSA would apply to anyone under sixteen—an age group that child rights organizations agree has a greater need for privacy and independence than younger teens and kids. And in contrast to COPPA’s age self-verification scheme, KOSA would authorize a federal study of “the most technologically feasible options for developing systems to verify age at the device or operating system level.” Age-verification systems are troubling: requiring them could hand significant power, and private data, to third-party identity verification companies like Clear or ID.me. Such a mandate would also likely lead platforms to set up elaborate age-verification systems for everyone, meaning that all users would have to submit personal data.
Lastly, KOSA’s incredibly broad definition of a covered platform would include any “commercial software application or electronic service that connects to the internet and that is used, or is reasonably likely to be used, by a minor.” That would likely encompass everything from Apple’s iMessage and Signal to web browsers, email applications, and VPN software, as well as platforms like Facebook and TikTok—platforms with wildly different user bases and uses. It’s also unclear how deep into the ‘tech stack’ such a requirement would reach: web hosts and domain registries likely aren’t the platforms KOSA intends to cover, but depending on interpretation, they could be subject to its requirements. The bill also raises concerns about how providers of end-to-end encrypted messaging platforms like iMessage, Signal, and WhatsApp would interpret a duty to monitor minors’ communications, with the potential that companies will simply compromise encryption to avoid litigation.
Censorship Isn’t the Answer
KOSA would force sites to use filters to block content—filters that we’ve seen, time and time again, fail to properly distinguish “good” speech from “bad” speech. The types of content targeted by KOSA are complex, and often dangerous—but discussing them is not harmful by default. It’s very hard to tell the difference between minors discussing these topics in a way that encourages them and discussing them in a way that discourages them. Under this bill, all discussion and viewing of these topics by minors would be blocked.
Research already exists showing bans like these don’t work: when Tumblr banned discussions of anorexia, it discovered that the keywords used in pro-anorexia content were the same ones used to discourage anorexia. Other research has shown that bans like these actually make the content easier to find by forcing people to create new keywords to discuss it (for example, “thinspiration” became “thynsperation”).
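To make that failure mode concrete, here is a minimal sketch of the kind of naive keyword filter this liability regime would encourage. The keywords, posts, and matching logic are hypothetical illustrations, not any platform’s actual system:

```python
# Illustrative sketch of a naive keyword filter, the kind of blunt tool
# a content-liability regime encourages. Keywords and posts are hypothetical.

BLOCKED_KEYWORDS = {"thinspiration", "thinspo"}

def is_blocked(post: str) -> bool:
    """Flag a post if it contains any blocked keyword, regardless of intent."""
    text = post.lower()
    return any(keyword in text for keyword in BLOCKED_KEYWORDS)

posts = [
    "Thinspiration content destroyed my health. Please get help.",  # recovery post
    "new thynsperation board, check it out",                        # evasive respelling
]

for post in posts:
    print(is_blocked(post), "-", post)

# Prints:
#   True  - the recovery post is blocked (same keyword, opposite intent)
#   False - the pro-anorexia post slips through behind a new spelling
```

The filter blocks the recovery post and passes the respelled pro-anorexia post: exactly the pattern the Tumblr research describes.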
The law also requires platforms to ban the potentially infinite category of “other matters that pose a risk to physical and mental health of a minor.” As we’ve seen in the past, whenever the legality of material is up for interpretation, it is far more likely to be banned outright, leaving huge holes in what information is accessible online. The law would seriously endanger access to information for teenagers, who may want to explore ideas without their parents’ knowledge or approval. For example, they might have questions about sexual health that they do not feel safe asking their parents, or they may want to help a friend with an eating disorder or a substance abuse problem. (Research has shown that a large majority of young people have used the internet for health-related research.)
KOSA would allow individual state attorneys general to bring actions against platforms when the state’s residents are “threatened or adversely affected by the engagement of any person in a practice that violates this Act.” This leaves it to each state’s attorney general to decide which topics pose a risk to the physical and mental health of a minor. A co-author of this bill, Sen. Blackburn of Tennessee, has referred to education about race discrimination as “dangerous for kids.” Many states have agreed, and have recently moved to limit public education about the history of race, gender, and sexuality discrimination.
Recently, Texas’ governor directed the state’s Department of Family and Protective Services to investigate gender-affirming care as child abuse. KOSA would empower the Texas attorney general to define material that is harmful to children, and under the state’s current position, that definition would include resources for trans youth. The state could then force online services to remove and block access to that material everywhere—not only in Texas. That’s not to mention tech platforms’ frequent conflation of LGBTQ content with dangerous “sexually explicit” material. KOSA could result in loss of access to information that a vast majority of people would agree is not dangerous, but that is under political attack.
Surveillance Isn’t the Answer
Some legitimate concerns are driving KOSA. Data collection is a scourge for every internet user, regardless of age. Invasive tracking of young people by online platforms is particularly pernicious—EFF has long pushed back against remote proctoring, for example.
But the answer to our lack of privacy isn’t more tracking. Despite the growing ubiquity of technology that makes it easy, surveillance of young people is actually bad for them, even in the healthiest household, and it is not a solution to helping young people navigate the internet. Parents have an interest in deciding what their children can view online, but no one could argue that this interest is the same for a five-year-old as for a fifteen-year-old. KOSA would put all children under sixteen in the same group, require that specific types of content be hidden from them, and require that other content be tracked and logged by parental tools. This would force platforms to watch what all users do more closely.
KOSA’s parental controls would give parents, by default, the ability to monitor and control a young person’s online use. While a tool like Apple’s Screen Time allows parents to restrict access to certain apps or limit their usage to certain times, platforms would need to do much more under KOSA. They would have to offer parents the ability to modify the results of any algorithmic recommendation system, “including the right to opt-out or down-rank types or categories of recommendations,” effectively deciding for young people what they see, or don’t see, online. It would also give parents the ability to delete their child’s account entirely if they’re unhappy with how the child uses the platform.
The bill tackles algorithmic systems by requiring that platforms provide “an overview of how algorithmic recommendation systems are used … to provide information to users of the platform who are minors, including how such systems use personal data belonging to minors.” Transparency about how a platform’s algorithms work, and tools that allow users to open up and create their own feeds, are critical for wider understanding of algorithmic curation, the kind of content it can incentivize, and the consequences it can have. EFF has also supported giving users more control over the content they see online. But KOSA requires that parents be able to opt out of, or down-rank, types or categories of recommendations, without the consent or knowledge of the user, including teenage users.
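For a sense of what a category-level down-rank control amounts to in practice, here is a minimal sketch. The categories, scores, and penalty weight are hypothetical assumptions, not anything the bill or any platform specifies:

```python
# Illustrative sketch of a parental "down-rank by category" control.
# Categories, scores, and the penalty weight are hypothetical assumptions.

def rank_feed(items, downranked_categories, penalty=0.5):
    """Sort feed items by score, penalizing categories a parent down-ranked."""
    def adjusted(item):
        score = item["score"]
        if item["category"] in downranked_categories:
            score *= penalty  # applied without the teen's knowledge or consent
        return score
    return sorted(items, key=adjusted, reverse=True)

feed = [
    {"title": "LGBTQ health resources", "category": "health", "score": 0.9},
    {"title": "Sports highlights", "category": "sports", "score": 0.6},
]

# A parent down-ranks "health": the support resource falls below sports clips.
for item in rank_feed(feed, downranked_categories={"health"}):
    print(item["title"])
```

A single weight applied to a whole category silently reshapes the feed, which is why the lack of the teen’s consent or knowledge matters.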
Lastly, under KOSA, platforms would be required to prevent patterns of use that indicate addiction, and to offer parents the ability to limit features that “increase, sustain, or extend use of the covered platform by a minor, such as automatic playing of media, rewards for time spent on the platform, and notifications.” While minimizing dark patterns that can trick users into giving up personal information is a laudable goal, determining which features “cause addiction” is highly fraught. If a sixteen-year-old spends three hours a day on Discord working through schoolwork or discussing music with their friends, would that qualify as “addictive” behavior? KOSA would likely cover features as different as Netflix’s auto-playing of episodes and iMessage’s new-message notifications. Lumping these features together under the heading of “addictive” misunderstands which dark patterns actually harm users, including young people.
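As a rough illustration of how arbitrary such a determination would be, consider a naive time-on-platform heuristic. The threshold and session data below are hypothetical assumptions; the bill specifies no such numbers:

```python
# Illustrative sketch of a naive time-on-platform "addiction" heuristic.
# The threshold and session data are hypothetical; the bill defines neither.

DAILY_MINUTES_THRESHOLD = 120  # hypothetical cutoff for "addictive" use

def flags_addiction(daily_minutes: list[int]) -> bool:
    """Flag a user whose average daily use exceeds the threshold."""
    return sum(daily_minutes) / len(daily_minutes) > DAILY_MINUTES_THRESHOLD

# A teen spending ~3 hours a day on Discord doing homework with friends
homework_sessions = [180, 175, 190, 160, 185]
print(flags_addiction(homework_sessions))  # True: benign use gets flagged
```

A crude threshold like this cannot distinguish compulsive use from schoolwork or hobby discussion, which is the bill’s central definitional problem.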
EFF has long supported comprehensive data privacy legislation for all users. But the Kids Online Safety Act would not protect the privacy of children or adults. It is a heavy-handed plan to force technology companies to spy on young people and stop them from accessing content that is “not in their best interest,” as defined by the government, and interpreted by tech platforms.
Source: EFF