A New Angle On Social Media Harms
UK and US warn about new Russian threat; Australia unveils the second half of critical infrastructure legislation
The Wavelength is now a paid subscription newsletter. Subscribe today to access all the material that used to be free.
Photo by Senjuti Kundu on Unsplash
Senators Richard Blumenthal (D-CT) and Marsha Blackburn (R-TN) introduced their bill, the “Kids Online Safety Act” (KOSA) (S.3663), which purports to address the harm and danger children and teens are exposed to on platforms like Instagram, Snapchat, TikTok, and others. In their press release, they characterized the bill as “comprehensive bipartisan legislation to enhance children’s safety online.” They expanded on this claim:
The Kids Online Safety Act provides kids and parents with tools, safeguards, and transparency they need to protect against threats to children’s health and well-being online. The legislation requires social media platforms to put the interests of children first, providing an environment that is safe by default and help prevent these destructive impacts. The legislation also requires independent audits and supports public scrutiny from experts and academic researchers to ensure that parents and policymakers know whether social media platforms are taking meaningful steps to address risks to kids.
Blumenthal chairs, and Blackburn is the ranking member of, the Senate Commerce, Science, and Transportation Committee subcommittee with jurisdiction over online platforms and social media. In December, this subcommittee held a hearing titled “Protecting Kids Online: Instagram and Reforms for Young Users,” which followed its early-fall hearings on the risks and harms children and teens experience online (here, here, and here). These hearings were fueled in large part by the documents and data former Facebook employee Frances Haugen made public, partly through the Wall Street Journal’s series of articles on Meta/Facebook’s own research showing that Instagram negatively affected some children and teens, among other deleterious effects.
In the House, yet another bill was introduced to address the harm allegedly wrought by online platforms and algorithms: the “Digital Services Oversight and Safety Act” (DSOSA) (H.R.6796), which the sponsors claimed is “comprehensive transparency legislation to establish a Bureau of Digital Services Oversight and Safety at the Federal Trade Commission (FTC) that would have the authority and resources necessary to hold powerful online companies accountable for the promises they make to users, parents, advertisers, and enforcers.” But today’s edition will not address this or the other bills looking to address the algorithms used by social media.
Big picture, KOSA would require online platforms to provide enhanced protection for those 16 years old and under through default settings and other settings minors and parents can use. However, should a minor disable those defaults and a parent not object, he or she would be subject to the same risks and harms as in a world without this or similar legislation.
KOSA takes a somewhat new approach to the legal duties of online platforms. The bill establishes a duty of care for these entities, obligating them “to act in the best interests of a minor that uses the platform's products or services.” This concept follows from legislation in the U.S. and elsewhere. In the U.S., the highest-profile bill that uses the duty of care construct is the “Data Care Act of 2021” (S.919), which establishes duties of care for online service providers with respect to people’s data, much like how doctors and lawyers must protect their patients’ and clients’ information. The closer analogue is the United Kingdom’s draft “Online Safety Bill,” currently being revised by Parliament, which would impose a duty-of-care framework for online platforms that a reviewing committee noted witnesses found “particularly complex and confusing.”
However, unlike other duties of care in law or court precedent, this duty would not allow people to sue. Traditionally, duties of care are relevant in tort law where a failure to meet one through negligence makes a party potentially liable for monetary damages. KOSA’s duty of care does not operate in the same fashion. In this bill, online platforms have “a duty to prevent and mitigate the heightened risks of physical, emotional, developmental, or material harms to minors posed by materials on, or engagement with, the platform, including—
§ promotion of self-harm, suicide, eating disorders, substance abuse, and other matters that pose a risk to physical and mental health of a minor;
§ patterns of use that indicate or encourage addiction-like behaviors;
§ physical harm, online bullying, and harassment of a minor;
§ sexual exploitation, including enticement, grooming, sex trafficking, and sexual abuse of minors and trafficking of online child sexual abuse material;
§ promotion and marketing of products or services that are unlawful for minors, such as illegal drugs, tobacco, gambling, or alcohol; and
§ predatory, unfair, or deceptive marketing practices.
This covers a lot of real estate. The first category of risks and harms from which online platforms must protect minors comprises the ones that have gotten the most attention since Frances Haugen exposed the research Meta/Facebook had on the effects of its products on people. One of the Wall Street Journal articles highlighted a study showing that some underage female users of Instagram developed eating disorders, or saw existing ones worsen, because of the company’s algorithm that pushes images of idealized female bodies. The second category bleeds a bit into the first, for substance abuse and addiction often go hand in hand, but it may sweep more widely and aim at the qualities of apps and platforms that encourage as much use as possible, often through dark patterns or other tactics. The third category is another oft-discussed group of harms children and teens face, with “cyber bullying” long being a concern. The fourth category steps into the same territory as legislation to peel back Section 230 protection for online platforms regarding child sexual abuse material (CSAM). Much of the fifth category overlaps with abuses of which online platforms have already been accused. Finally, with respect to the last category, the FTC Act and state consumer protection statutes already protect people, including minors, from unfair and deceptive marketing, which should include online marketing, so its inclusion is a bit of a headscratcher. Predatory marketing, on the other hand, may not be covered by those statutes.
It bears mention that some of the above harms may be very difficult to construe in ways that put minors, parents, and online platforms on notice about the types of harm that trigger this duty of care. For example, “other matters that pose a risk to physical and mental health of a minor” strikes me as incredibly broad and would leave online platforms open to significant liability. How significant must the risk be? Could a remote risk of grave injury meet this standard? Or how about a risk of greater certainty but less harm? Moving from the abstract to the practical, as some may recall, during the ice bucket challenge that went around social media a few years ago, some people accidentally injured themselves or others. If KOSA were law, and something like the ice bucket challenge went viral and some minors got hurt, would the FTC and state attorneys general be able to go after Instagram? Or, let’s say the FTC caught this new challenge before it went viral, but all the major platforms had some content encouraging minors to follow suit. Would this breach the duty of care even though no one was hurt? Or could the FTC act in a case where someone might be at physical risk even though the chances are remote?
Likewise, the “harassment of a minor” could be construed very broadly, and platforms would be charged with policing the interactions of minors, not a group of people always known for restraint and good decision making in terms of handling conflict. It is fairly easy to imagine instances where a minor or his or her parents would allege harassment that others would not see the same way. As a result, this strikes me as another facet of the duty of care, however well-intended, that may be problematic in practice.
The bill is not clear on exactly how online platforms would meet the “duty to prevent and mitigate the heightened risks of physical, emotional, developmental, or material harms to minors posed by materials on, or engagement with, the platform.” Should one read this language to mean that online platforms have an affirmative responsibility to ensure that minors are shielded from all such content? If so, what would happen if a company like Snapchat successfully screened 99% of the aforementioned material? Would it be failing to meet its duty of care? How these duties would work in practice is not clear.
The FTC would get authority to promulgate regulations to effectuate this act through the normal notice-and-comment process. Consequently, the FTC may try to elucidate how online platforms would meet all these duties through regulations, policy statements, and guidance.
Moreover, there may be an outside chance U.S. courts would allow the parents of children and teens harmed by an online platform’s failure to meet its duty of care to sue under an implied right of action, but this is far from certain. That said, this is likely not the intent of at least one of the bill’s authors, Blackburn, who has opposed the right of people to sue for privacy violations under a national data protection statute. She has also sponsored data privacy legislation that did not give people the right to sue for violations. It would be a strange development if Blackburn implicitly went along with an implied right of action for families to sue online platforms for failing to meet the requirements of this bill.
Setting those issues to the side, KOSA further requires online platforms to “provide a minor, or a parent acting on a minor's behalf, with readily accessible and easy-to-use safeguards to control their experience and personal data on the covered platform.” The bill spells out what these safeguards must be, which are to include settings to:
§ limit the ability of other individuals to contact or find a minor, in particular adults with no relationship to the minor;
§ prevent other individuals from viewing the minor’s personal data collected by or shared on the covered platform, in particular restricting public access to personal data;
§ limit features that increase, sustain, or extend use of the covered platform by a minor, such as automatic playing of media, rewards for time spent on the platform, and notifications;
§ opt out of algorithmic recommendation systems that use a minor’s personal data;
§ delete the minor's account and request removal of personal data;
§ restrict the sharing of the geolocation of a minor and to provide notice regarding the tracking of a minor’s geolocation; and
§ limit time spent by a minor on the covered platform.
Some of these safeguards seem designed to address well-known harms teens and children face online, notably from adults looking to prey on them. And so limits on who can connect with minors and access their personal information seem sound. In a different vein, there would be settings to allow minors and their parents to address the addictive nature of many social media apps and platforms by curbing endless scrolling and by imposing time limits.
It should be stated that these safeguards would not stop online platforms from continuing to collect and process the personal, sensitive data of minors in the same way they currently do, save for a few limits on these activities. Minors and parents could opt to have accounts and personal data deleted. Another limit would be the setting that would allow minors or their parents to opt out of algorithmic recommendation systems that use a minor’s personal data. However, algorithmic recommendation systems that do not work on a minor’s personal data would still be allowed, it would seem. Finally, there would be a setting to provide notice of tracking a minor based on their geolocation and a bar on sharing the minor’s geolocation. So, it appears that if these latter two settings are activated, TikTok could not share a minor’s geolocation, but if the platform decides to track the minor, it merely needs to provide notice. There does not seem to be a way a minor could stop TikTok from tracking their location. And so, apart from these limits, an online platform could collect, process, and disclose the personal data of minors so long as it does not run afoul of its duty of care to protect the minor from risks and harms.
Nonetheless, online platforms must make these settings the default for minors, and there is a reasonable belief threshold for determining when users are minors, which is lower than actual knowledge. This raises the question of how online platforms are to handle situations in which it is not immediately apparent that a new user is a minor. Should a platform have a function that automatically switches to the default settings? And, in this example, a minor may have gotten used to the platform without any such settings and may, depending on parental knowledge and supervision, turn off the defaults. Of course, this example points to a weakness in the bill: only informed and proactive parents and minors will avail themselves of these safeguards, which could easily lead to a digital divide in a new sense between the minors not exposed to some of the excesses of the online world and those who are fully immersed and possibly harmed.
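To make the mechanics concrete, here is a minimal sketch of how the default-on safeguards might be wired up. The data structure, field names, and the reasonable-belief signal are my own illustrations; the bill names the outcomes platforms must offer, not an implementation.

```python
from dataclasses import dataclass
from typing import Optional

@dataclass
class MinorSafeguards:
    """Hypothetical settings tracking KOSA's enumerated safeguards."""
    limit_contact_by_strangers: bool = True        # especially unrelated adults
    hide_personal_data_from_public: bool = True
    curb_engagement_features: bool = True          # autoplay, rewards, notifications
    opt_out_of_personalized_recommendations: bool = True
    restrict_geolocation_sharing: bool = True      # sharing barred; tracking only triggers notice
    daily_time_limit_minutes: Optional[int] = 60   # illustrative value; the bill sets none

def initial_settings(reasonably_believed_minor: bool) -> Optional[MinorSafeguards]:
    """Apply protective defaults on a 'reasonable belief' that a user is a minor.

    That threshold is lower than actual knowledge, and the bill leaves open
    how a platform forms the belief (declared age, inference, etc.).
    """
    if reasonably_believed_minor:
        return MinorSafeguards()  # all protections on by default
    return None  # adults get the platform's ordinary settings
```

Nothing in this sketch answers the harder question raised above: what a platform should do when its belief that a user is a minor forms only after the account is already active.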
In any event, online platforms must “provide information and control options in a manner that is age appropriate and does not encourage minors to weaken or turn off safeguards.” One wonders if these will be like the warnings on packs of cigarettes and alcohol, and just as ineffective.
In order to give parents control over their children’s use of these services, online platforms must “provide readily accessible and easy-to-use parental tools for parents to appropriately supervise the use of the covered platform by a minor.” These tools will be enabled by default if a platform has a reasonable belief a user is a minor. Parental control tools must feature:
§ the ability to control privacy and account settings, including the safeguards…
§ the ability to restrict purchases and financial transactions by a minor;
§ the ability to track total time spent on the platform;
§ a clear and conspicuous mechanism for parents to opt out of or turn off any default parental tools put in place by the covered platform; and
§ access to other information regarding a minor's use of a covered platform and control options necessary to a parent's ability to address the harms…
In short, these parental tools would allow parents to supervise their children’s and teens’ activities on social media platforms. Of course, as noted earlier, this system will depend on parents taking the time and care to investigate each online platform and to manage their child’s usage.
Online platforms must provide parents and minors with conspicuous electronic means to report any of the above-mentioned harms. Platforms must also “establish an internal process to receive and respond to reports in a reasonable and timely manner.”
Before a minor registers, uses, or buys a social media app, the company must “provide clear, accessible, and easy-to-understand—
§ notice of the policies and practices of the covered platform with respect to personal data and safeguards for minors;
§ information about how to access the safeguards and parental tools…; and
§ notice about whether the covered platform, including any algorithmic recommendation systems used by the platform, pose any heightened risks of harm to a minor…
Online platforms also have a responsibility to provide this notice to parents if they reasonably believe the new user is a minor. I suppose the only way any such system would work is if a minor were required to furnish the contact information of a parent. Would the online platform need to collect additional information from a parent in order to be sure a minor is not trying to access an online platform without permission?
Regarding the notice about whether the online platform poses “heightened risks of harm” to minors, before a minor can use the platform, he or she must acknowledge receipt of information about these risks. Parents may also acknowledge the disclosed risks so a minor can then proceed to use an online platform.
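In code terms, this acknowledgment requirement amounts to a simple gate on a minor's use of the platform. The sketch below is one hypothetical reading of the provision, not anything the bill spells out:

```python
def minor_may_proceed(minor_acknowledged_risks: bool,
                      parent_acknowledged_risks: bool) -> bool:
    """One reading of KOSA's acknowledgment gate for users believed to be minors.

    The minor must acknowledge receipt of the platform's disclosed
    "heightened risks of harm," or a parent may acknowledge those risks
    so the minor can proceed.
    """
    return minor_acknowledged_risks or parent_acknowledged_risks
```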
Moreover, online platforms would need to provide notice about their algorithms that recommend content and their marketing practices. Any platforms using algorithmic recommendation systems must disclose in their terms and conditions “in a clear, accessible, and easy-to-understand manner” an overview of the program, including how the personal data of minors is used and any options minors and parents have to manage the algorithms, including opting out or downranking types and categories of algorithms. With respect to advertising and marketing, online platforms will need to “provide clear, accessible, and easy-to-understand information and labels regarding—
§ the name of the product, service, or brand and the subject matter of an advertisement or marketing material;
§ why the minor is being targeted for a particular advertisement or marketing material if the covered platform engages in targeted advertising, including meaningful information about how the personal data of the minor was used to target the advertisement or marketing material; and
§ whether particular media displayed to a user is an advertisement or marketing material, including disclosure of endorsements of products, services, or brands made for commercial consideration by other users of the platform.
These provisions seem aimed at the influencer advertising model where people with large followings online advertise products for pay without disclosing what they are doing.
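For a rough sense of what such labeling could look like in practice, here is a sketch; the record format and field names are my own invention, as the bill prescribes the disclosures, not a schema.

```python
from dataclasses import dataclass
from typing import Optional

@dataclass
class AdLabel:
    """Hypothetical label carrying KOSA's required advertising disclosures."""
    brand: str                     # name of the product, service, or brand
    subject_matter: str            # what the ad or marketing material concerns
    is_advertisement: bool         # flags ads and paid endorsements as such
    paid_endorser: Optional[str] = None          # the compensated user, if an influencer post
    targeting_explanation: Optional[str] = None  # why this minor was targeted and what
                                                 # personal data was used, if targeted

# An influencer post shown to a minor might carry a label like this:
label = AdLabel(
    brand="Acme Energy Drink",
    subject_matter="beverage marketing",
    is_advertisement=True,
    paid_endorser="@popular_creator",
)
```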
Online platforms must “issue a public report identifying the foreseeable risks of harm to minors based on an independent, third-party audit conducted through reasonable inspection of the covered platform and describe the prevention and mitigation measures taken to address such risks.” These reports must include, among other content:
§ an assessment of whether the covered platform is reasonably likely to be accessed by minors;
§ a description of the commercial interests of the covered platform in use by minors;
§ an accounting, disaggregated by category of harm, of—
o the total number of reports of the dissemination of illegal or harmful content involving minors; and
o the prevalence of content that is illegal or harmful to minors; and
§ a description of any material breaches of parental tools or assurances regarding minors, unexpected use of the personal data of minors, and other matters regarding non-compliance.
Furthermore, these reports must include assessments of systemic risks spanning a wide range of harms to minors. Online platforms must also detail their plans to mitigate the identified risks, provide an update on plans to mitigate risks identified in previous reports, and describe the controls and measures minors and parents can take. Moreover, online platforms will have an affirmative responsibility to consult with parents, experts, and civil society on the risks to minors and to conduct research on the harms minors have faced on the platform.
KOSA would require online platforms to allow researchers vetted by a federal agency to look into the harms they pose to minors. These provisions are intended to address the lack of visibility researchers and policymakers have into platforms like Facebook. The National Telecommunications and Information Administration (NTIA) would be tasked with establishing “a program under which an eligible researcher may apply for, and a covered platform shall provide, access to data assets from the covered platform for the sole purpose of conducting public interest research regarding harms to the safety and well-being of minors.” Online platforms would need to comply, and researchers would be immunized against all legal claims related to their research under this program.
The FTC must “establish guidelines for covered platforms seeking to conduct market- and product-focused research on minors or individuals it reasonably believes to be minors.” The National Institute of Standards and Technology (NIST) must “conduct a study evaluating the most technologically feasible options for developing systems to verify age at the device or operating system level.” Finally, the Secretary of Commerce “shall establish and convene the Kids Online Safety Council for the purpose of providing advice on the implementation of this Act.”
The FTC would enforce the new regime in the usual way by treating offenses as violations of an existing rule banning an unfair or deceptive practice, allowing the agency to seek civil fines in court of more than $46,000 per violation among other relief. Like many other bills in the data protection and privacy space, state attorneys general could also enforce KOSA. There is not, however, a private right of action, and so parents could not sue on behalf of their children or teens.
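To put the potential exposure in perspective: if, hypothetically, the FTC treated each affected minor as a separate violation, an action involving just 1,000 minors could mean fines exceeding $46 million. The bill does not say how violations would be counted.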
Other Developments
Photo by sippakorn yamkasikorn on Unsplash
§ The United Kingdom’s (UK) National Cyber Security Centre (NCSC) and the United States’ (U.S.) Cybersecurity and Infrastructure Security Agency (CISA), National Security Agency (NSA), and Federal Bureau of Investigation (FBI) “have identified that the actor known as Sandworm or Voodoo Bear is using a new malware, referred to here as Cyclops Blink.”
§ The United Kingdom’s (UK) Foreign, Commonwealth & Development Office and National Cyber Security Centre “attributed the distributed denial of service (DDoS) attacks against the Ukrainian banking sector on 15 and 16 February 2022 to have involved the Russian Main Intelligence Directorate (GRU).”
§ The “Security Legislation Amendment (Critical Infrastructure Protection) Bill 2022” (SLACIP Bill) was introduced in Australia’s Parliament and “proposes further amendments to the Security of Critical Infrastructure Act 2018 to enact a framework for risk management programs, declarations of systems of national significance and enhanced cyber security obligations.” Australia’s Cyber and Infrastructure Security Centre added “[t]he SLACIP Bill is the second of two Bills, with the first being the Security Legislation Amendment (Critical Infrastructure) Act 2021, which received Royal Assent in December last year.” This bill has been referred to the Parliamentary Joint Committee on Intelligence and Security for inquiry and report.
§ The European Data Protection Board (EDPB) kicked off its “first coordinated enforcement action” and “[i]n the coming months, 22 supervisory authorities across the EEA (including EDPS) will launch investigations into the use of cloud-based services by the public sector.”
§ United States (U.S.) Deputy Attorney General Lisa Monaco made remarks at the Annual Munich Cyber Security Conference that “outline[d] new steps the Department of Justice is taking to combat this unprecedented [cyber] threat.”
§ The Future of Tech Commission issued its report, “A Blueprint for Action,” that “propose[s] muscular congressional and executive actions that will strengthen protections for all Americans; require transparency from tech companies; bolster our nation’s ability to respond to and prevent cyberattacks; and foster innovation, competition, and consumer choice.”
§ Estonia’s Foreign Intelligence Service issued its annual public report that discusses the nation’s security issues with respect to Russia, the People’s Republic of China, and other threats online.
§ The United States (U.S.) Federal Communications Commission (FCC) and National Telecommunications and Information Administration (NTIA) “announced a new initiative to improve U.S. government coordination on spectrum management.”
§ The European Data Protection Supervisor (EDPS) made preliminary remarks on modern spyware.
§ Texas Attorney General Ken Paxton has filed suit against Meta/Facebook “for capturing and using the biometric data of millions of Texans without properly obtaining their informed consent to do so, in violation of Texas law” per his statement.
§ The United States Department of Justice announced that a Mexican businessman pleaded guilty, “admitting that he conspired to sell and use hacking tools manufactured by private companies in Italy, Israel and elsewhere.”
§ The United States (U.S.) Cybersecurity and Infrastructure Security Agency (CISA) released a new CISA Insight, Preparing for and Mitigating Foreign Influence Operations Targeting Critical Infrastructure, “which provides critical infrastructure owners and operators with guidance on how to identify and mitigate the risks of influence operations that use mis-, dis-, and malinformation (MDM) narratives.”
§ The European Commission (EC) stated that “the EU is acting on its space ambitions by tabling two initiatives - a proposal for a Regulation on a space-based secure connectivity and a Joint Communication on an EU approach on Space Traffic Management (STM).”
§ The United States (U.S.) National Institute of Standards and Technology (NIST) issued a request for information “to assist in evaluating and improving its cybersecurity resources, including the “Framework for Improving Critical Infrastructure Cybersecurity” (the “NIST Cybersecurity Framework,” “CSF” or “Framework”) and a variety of existing and potential standards, guidelines, and other information, including those relating to improving cybersecurity in supply chains.”
§ United States (U.S.) Senators Ed Markey (D-MA) and Jeff Merkley (D-OR), and Representatives Pramila Jayapal (D-WA) and Ayanna Pressley (D-MA) “sent letters to the Department of Homeland Security (DHS), Department of Justice (DOJ), Department of Defense (DOD), Department of Interior (DOI), and Department of Health and Human Services (HHS), urging the agencies to end their use of Clearview AI’s facial recognition technology” per their statement.
§ The United States (U.S.) General Services Administration (GSA) amended its acquisition regulations “(GSAR) to streamline and update requirements for contracts that involve GSA information systems.”
§ United States (U.S.) Senators Amy Klobuchar (D-MN) and Cynthia Lummis (R-WY) introduced bipartisan legislation to address negative impacts of social media, “The Nudging Users to Drive Good Experiences on Social Media (Social Media NUDGE) Act” (S.3608) that “would establish studies to examine and recommend interventions to reduce addiction and the amplification of harmful content on social media platforms…[and] [f]ollowing the initial study, the legislation would hold platforms accountable for following through on recommendations.”
§ The Democratic and Republican leadership of the United States (U.S.) House Energy and Commerce Committee introduced “bipartisan draft legislation seeking to promote competition, innovation, national security, the interests of consumers, and American leadership in the thriving commercial satellite communications industry.”
§ The United States (U.S.) Cybersecurity and Infrastructure Security Agency (CISA) published the “Free Cybersecurity Services and Tools” webpage, “intended to be a one-stop resource where organizations of all sizes can find free public and private sector resources to reduce their cybersecurity risk.”
§ A bill has been introduced in the California legislature, “The California Age-Appropriate Design Code Act” (AB 2273) that would implement a design code along the lines of the one in force in the United Kingdom.
§ Amnesty International launched a campaign for a ban of the New York City Police Department’s (NYPD) use of facial recognition technology, which “disproportionally threatens the rights of non-white New Yorkers.”
Further Reading
Photo by Robert Anasch on Unsplash
§ “EU to mobilize cyber team to help Ukraine fight Russian cyberattacks” By Laurens Cerulus — Politico EU.
§ “Fed Up With Google, Conspiracy Theorists Turn to DuckDuckGo” By Stuart Thompson — The New York Times.
§ “Tech Giants to Be Forced to Share More Data Under EU Proposal” By Sam Schechner and Kim Mackrael — The Wall Street Journal.
§ “Facebook Scammers Are Shilling Fake Cryptocurrency Using Big Tech’s Biggest Names” By Colin Lecher and Surya Mattu — The Markup.
§ “Behind the stalkerware network spilling the private phone data of hundreds of thousands” By Zack Whittaker — TechCrunch.
§ “Trump’s Truth Social’s disastrous launch raises doubts about its long-term viability” By Drew Harwell — The Washington Post.
§ “Trump's Truth Social app censored an account that poked fun at the app's CEO, former Rep. Devin Nunes” By Isobel Asher Hamilton — yahoo! news
§ “Russia Could Use Cryptocurrency to Blunt the Force of U.S. Sanctions” By Emily Flitter and David Yaffe-Bellany — The New York Times.
§ “Big Tech Makes a Big Bet: Offices Are Still the Future” By Kellen Browning — The New York Times.
§ “Meta’s Facebook Escalates TikTok Rivalry, Launches Reels Globally” By Salvador Rodriguez — The Wall Street Journal.
§ “US and UK expose new Russian malware targeting network devices” By Catalin Cimpanu — The Record.
Coming Events
Photo by Noiseporn on Unsplash
§ 1 March
o The United Kingdom’s (UK) House of Commons Foreign Affairs Committee will hold a Formal meeting (oral evidence session): Tech and the future of UK foreign policy.
§ 9-10 March
o The Information Security and Privacy Advisory Board (ISPAB) will hold a quarterly open meeting and the agenda is expected to include the following items:
§ Briefing from NIST on recent activities from the Information Technology Laboratory,
§ Presentation from NIST on the Artificial Intelligence Risk Management Framework,
§ Discussion on Cryptographic Brittleness and issues in implementations,
§ Presentation from NIST on Open Source Cybersecurity Assessment Language (OSCAL),
§ Discussion on the United States Government participation in National and International Standards Development Organizations,
§ Briefing on NIST Cybersecurity Updates,
§ Public Comments.
§ 16 March
o The United States Federal Communications Commission (FCC) will hold an open meeting.
§ 15-16 May
o The United States-European Union Trade and Technology Council will reportedly meet in France.
§ 16-17 June
o The European Data Protection Supervisor will hold a conference titled “The future of data protection: effective enforcement in the digital world.”