Congress Considers Alternatives To Amending Section 230, Part II
U.S. President calls for privacy legislation and legislation to protect children from social media; U.S. Senate passes package of cybersecurity measures, including cyber incident reporting
In the last edition, we looked at the "Algorithmic Accountability Act of 2022" (H.R. 6580), a bill that would regulate automated and algorithmic decision making, as one of the avenues Congress could use to impose new restrictions and requirements on "Big Tech." Yesterday, the United States (U.S.) House Energy and Commerce Committee's Consumer Protection and Commerce Subcommittee held the third in a series of hearings "focused on holding Big Tech accountable," following a December Subcommittee hearing on several legislative proposals intended to build a safer, more transparent, and accountable internet ecosystem. H.R. 6580 was one of the bills on the agenda, and in this post, we will examine another of the bills, the "Digital Services Oversight and Safety Act of 2022" (DSOSA) (H.R. 6796). Introduced by Representatives Lori Trahan (D-MA), Adam Schiff (D-CA), and Sean Casten (D-IL), DSOSA is, they claim in a press release, "comprehensive transparency legislation to establish a Bureau of Digital Services Oversight and Safety at the Federal Trade Commission (FTC) that would have the authority and resources necessary to hold powerful online companies accountable for the promises they make to users, parents, advertisers, and enforcers."
As a threshold matter, the bill envisions the FTC having the resources and authority to police much of the online world through a set of rulemakings. Whether this is politically feasible is one question, and I have my doubts, but at the operational level, this bill may be asking too much of an agency allegedly unable to shoulder its current load of responsibilities. True, DSOSA would authorize appropriations that would more than double the agency's budget, but, as always, an authorization does not obligate appropriators. It is easy to envision a rearguard action in which opponents inside and outside of Congress work appropriators to undercut the FTC and hamstring its efforts to draft and issue rules and to enforce the new regime.
Having said all that, DSOSA, like many other content moderation bills that do not seek to alter Section 230 liability protection, looks to ensure online platforms have clear, consistent rules that users can understand and that are enforced fairly and transparently. This would include rights to appeal content moderation decisions. The new system would be overseen by the FTC, and failure to comply would expose platforms to civil fines and other relief. The package also folds in language aimed at nudging platforms to better find and remove illegal content, activity, and goods. Moreover, this bill has a much wider scope than just social media platforms, for it would focus, in large part, on hosting services, which would encompass the other large technology companies. DSOSA also contains language that would give users some control over whether their experience on a platform is driven entirely by algorithmic recommendations. Finally, there are provisions establishing an FTC-overseen program under which non-profits and universities could access and study platform data to examine how well platforms are managing a variety of risks to users.