Two House Democrats’ Legislation Would Revamp Online Content Moderation and Online Marketplaces
British tribunal rules that EU law applies to bulk data collection; the FTC drops its case against AbbVie after a Supreme Court ruling moots a $487 million judgment
The House Energy and Commerce Committee’s Consumer Protection and Commerce Subcommittee Chair Jan Schakowsky (D-IL) and Representative Kathy Castor (D-FL) introduced their “Online Consumer Protection Act” (OCPA) (H.R.3067) in mid-May 2021. However, the House Energy and Commerce Committee has not yet acted upon the bill, and there may not be any action on Section 230 soon given the apparent stalemate between Republicans and Democrats on the types of changes they want. Moreover, other legislative priorities may occupy this committee for the immediate future.
Nonetheless, this bill may represent the approach some House Democrats take in addressing some of the problems they see with how 47 U.S.C. 230 (aka Section 230) is used to shield social media platforms and online marketplaces from oversight, regulation of content moderation, and consumer protection laws.
As such, OCPA aims to address both the lack of transparency in how social media platforms remove content and suspend or block users and the trouble consumers have had holding anyone responsible for counterfeit and unsafe goods sold online. Both classes of entities have been using Section 230 to fend off litigation and calls for what Schakowsky, Castor, and other Democrats would consider the proper and safe moderation of content and sale and distribution of goods online.
OCPA would change U.S. regulation of both “social media platforms” such as Facebook, Twitter, TikTok, and Instagram and “online marketplaces” such as Amazon and eBay. Interestingly, this second category would seem to encompass Best Buy, Walmart, and other such marketplaces with third-party sellers; a Macy’s or CVS selling items online only for itself would not qualify. The Federal Trade Commission (FTC) would receive additional authority to regulate both classes of entities, and, unlike a number of bills, there is no exception for businesses and entities below a certain size. All entities that are either social media platforms or online marketplaces would experience new oversight and scrutiny from the FTC, to the extent the already stretched agency can take on new tasks and missions.
Unlike a number of other Section 230 bills, H.R.3067 features some of the process-oriented reforms for how online platforms moderate and remove content like those found in the “Platform Accountability and Consumer Transparency Act” (PACT Act) (S.797). The text of the liability shield in Section 230(c) would not be altered, as some have proposed in other legislation, and instead OCPA enshrines in law a functional curtailing of Section 230 through specifying that it does not protect online marketplaces and social media platforms for violating the new statute.
Online marketplaces and social media platforms would need to publish new terms of service explaining how they deal with a range of issues and the rights users and sellers would have, new consumer protection policies, and new consumer protection programs (which are different from the aforementioned policies) to establish the procedures by which they would meet the requirements of the new law. The FTC would regulate both classes of entities and could seek civil penalties and injunctive relief, as could state governments. Moreover, people could sue social media platforms and online marketplaces (the so-called private right of action).
OCPA would require each social media platform and online marketplace to “establish, maintain, and make publicly available at all times and in a machine-readable format, terms of service in a manner that is clear, easily understood, and written in plain and concise language.” These terms of service must include:
§ any terms or conditions of use of any service provided by such person to a consumer;
§ any policies of such person with regard to such service or use of such service by a consumer; and
§ a consumer protection policy (more on this below).
Additionally, all terms of service must at least detail the following:
§ payment methods;
§ content ownership, including content generated by a user;
§ policies related to sharing user content with third parties;
§ any disclaimers, limitations, notices of nonliability, or the consequences of not agreeing to or complying with the terms of service; and
§ any other topic the [FTC] deems appropriate.
Of course, per the last item, the FTC may add other required items for disclosure in these new terms of service. Some of this information may cause heartburn among some of the larger companies. For example, Facebook would need to detail all the third parties with whom it shares user content, and unlike many of the data privacy bills, actual entities would need to be named as opposed to categories of entities.
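The bill does not specify what its “machine-readable format” for terms of service would look like; any schema would presumably come from FTC rulemaking. Purely as a hypothetical sketch (all field names and entities below are invented for illustration), such a disclosure could be as simple as a published JSON document:

```python
import json

# Hypothetical sketch only: OCPA does not define a schema for the
# "machine-readable" terms of service it would require. This merely
# illustrates what such a disclosure could look like as JSON.
terms_of_service = {
    "payment_methods": ["credit_card", "gift_card"],
    "content_ownership": "Users retain ownership of content they generate.",
    "third_party_sharing": [
        # The bill would seem to require naming actual entities,
        # not just categories of entities.
        {"entity": "ExampleAnalyticsCo", "purpose": "usage analytics"},
    ],
    "disclaimers": "Service provided as-is; see full terms.",
    "consumer_protection_policy": {
        "prohibited_content": ["counterfeit goods", "recalled products"],
        "appeal_process": "https://example.com/appeals",
    },
}

# Serialize into a document a platform could publish "at all times"
# alongside its human-readable, plain-language terms.
print(json.dumps(terms_of_service, indent=2))
```

A structured format like this would let regulators, researchers, and comparison tools parse disclosures automatically, while the plain-language version satisfies the readability requirement.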
As mentioned, social media platforms and online marketplaces would need to draft and issue consumer protection policies, which have different elements for each class of entities. One must keep in mind that these are subsections of the new legal requirement for terms of service under OCPA. Social media platforms’ consumer protection policies must include:
§ a description of the content and behavior permitted or prohibited on its service both by the platform and by users;
§ whether content may be blocked, removed, or modified, or if service to users may be terminated and the grounds upon which such actions will be taken;
§ whether a person can request that content be blocked, removed, or modified, or that a user’s service be terminated, and how to make such a request;
§ a description of how a user will be notified of and can respond to a request that his or her content be blocked, removed, or modified, or service be terminated, if such actions are taken;
§ how a person can appeal a decision to block, remove, or modify content, allow content to remain, or terminate or not terminate service to a user, if such actions are taken; and
§ any other topic the Commission deems appropriate.
A number of these new requirements would address claims that Facebook, Twitter, and others act opaquely and arbitrarily in punishing some users and removing some content while other, similar conduct and content goes unaddressed. Social media platforms would need to explain what is allowed and what is not, the conditions under which content may be moderated, whether users can be banned and under what circumstances, and how users can ask that content be moderated; provide an appeals process for those whose content is moderated; and address any other topic the FTC deems necessary.
For online marketplaces, the new consumer protection policies must include:
§ a description of the products, product descriptions, and marketing material, allowed or disallowed on the marketplace;
§ whether a product, product descriptions, and marketing material may be blocked, removed, or modified, or if service to a user may be terminated and the grounds upon which such actions will be taken;
§ whether users will be notified of products that have been recalled or are dangerous, and how they will be notified.
This information would go to the problem of online marketplaces (predominantly Amazon) operating as black boxes, leaving consumers unable to determine the company’s policies about what can and cannot be sold, if or when products can be removed, and whether the marketplace will tell them about recalled or dangerous products.
There are additional policies that must be divulged for both sellers and users of online marketplaces. For sellers, these companies must detail in their consumer protection policies:
§ how sellers are notified of a report by a user or a violation of the terms of service or consumer protection policy;
§ how to contest a report by a user;
§ how a seller who is the subject of a report will be notified of what action will be or must be taken as a result of the report and the justification for such action;
§ how to appeal a decision of the online marketplace to take an action in response to a user report or for a violation of the terms of service or consumer protection policy; and
§ the policy regarding refunds, repairs, replacements, or other remedies as a result of a user report or a violation of the terms of service or consumer protection policy.
These disclosures seem designed to clarify when a seller may have her products removed for violating terms of service or consumer protection policies, how she would be notified, how she may appeal, and the marketplace’s policies about redress for users regarding products that fail to meet standards.
The consumer protection policies of online marketplaces would also address how users can report violations, including dangerous, fraudulent, and deceptive products; whether a person who complains will be notified of subsequent action; how to file an appeal of an adverse decision; and when and if refunds, repairs, or other remedies are available. Specifically, the user section of consumer protection policies should explain:
§ whether a user can report suspected fraud, deception, dangerous products, or violations of the online marketplace’s terms of service, and how to make such report;
§ whether a user who submitted a report will be notified of whether action was taken as a result of the report, the action that was taken and the reason why action was taken or not taken, and how the user will be notified;
§ how to appeal the result of a report; and
§ under what circumstances a user is entitled to refund, repair, or other remedy and the remedy to which the user may be entitled, how the user will be notified of such entitlement, and how the user may claim such remedy.
The FTC would have six months after enactment to “conduct a study to determine the most effective method of communicating common consumer protection practices in short-form consumer disclosure statements or graphic icons that disclose the consumer protection and content moderation practices of social media platforms and online marketplaces.” The FTC would then need to draft regulations to “require social media platforms and online marketplaces to communicate their consumer protection and content moderation practices, and any other information as the Commission may determine, in a clear and conspicuous manner.” The FTC would then need to determine if “such rules would advance consumer understanding of consumer protection and content moderation practices of social media platforms and online marketplaces,” and if so, then it would finalize them. If not, the agency would scrap the regulations.
Covered entities would also need to “establish and implement a consumer protection program that includes policies, practices, and procedures regarding consumer protection and content moderation.” Such consumer protection programs would:
§ ensure compliance with applicable Federal, State, and local consumer protection laws;
§ develop, implement, and ensure compliance with the terms of service…;
§ develop and implement policies regarding the content and behavior permitted on its service both by the platform and users, and ensure compliance with such policies, practices and procedures;
§ mitigate risks that could be harmful to the safety, well-being, and reasonable expectations of users of the social media platform or online marketplace;
§ implement reasonable safeguards within, and training and education of employees and contractors of, the social media platform or online marketplace to promote compliance with all consumer protection laws and the consumer protection program; and
§ disclose any other requirement the Commission deems appropriate.
These requirements would mandate that social media platforms and online marketplaces take action to meet all applicable consumer protection laws, comply with their terms of service, spell out what content and behavior are allowed, address and reduce risks to the safety, well-being, and reasonable expectations of users, and meet any other requirement the FTC deems necessary. All of this would need to be done on a sliding scale considering the size and complexity of the company, the costs, and the types of activities in which users are engaging.
In executing the consumer protection programs, OCPA gets specific about how online marketplaces and social media platforms would need to take steps to achieve the goals listed above:
§ establish processes to monitor, manage, and enforce the social media platform’s or online marketplace’s consumer protection program, and demonstrate the covered entity’s compliance with Federal, State, and local consumer protection laws;
§ establish processes to assess and mitigate the risks to individuals resulting from the social media platform’s or online marketplace’s amplification of content or products not in compliance with its terms of service;
§ establish a process to periodically review and update the consumer protection program;
§ appoint a consumer protection officer, who reports directly to the chief executive officer; and
§ establish and implement controls to monitor and mitigate known or reasonably foreseeable risks to consumers resulting from hosting content or products.
And while OCPA would not exclude smaller social media platforms and online marketplaces, there would be filing requirements incumbent only upon entities with more than $250,000 in revenue or more than 10,000 monthly users. These annual filings would need to detail:
§ a detailed and granular description of [terms of service and consumer protection policies] and [consumer protection programs];
§ the name and contact information of the consumer protection officer…; and
§ a description of any material changes in the consumer protection program or the terms of service since the most recent prior disclosure to the Commission.
Each company required to make such filings must have its “principal executive officer” and “consumer protection officer” sign the filings and be responsible for any and all material misrepresentations or omissions. Moreover, these filings must be based on these officers’ actual knowledge of “the consumer protection practices of the social media platform or online marketplace.” The FTC must make public all such filings but may withhold information that, in the agency’s view, should not be public.
The FTC could treat all violations of OCPA as violations of a trade regulation rule, allowing the agency to seek over $43,000 per violation even for first offenses, along with any injunctive relief it can seek. The FTC would receive the power to carry out normal notice-and-comment rulemakings to implement OCPA instead of the onerous, almost impossible to use Magnuson-Moss rulemaking procedures.
State attorneys general or other designated state officials may enforce OCPA through filing suit in federal court in order to:
§ enjoin further such violation by such person;
§ enforce compliance with this Act;
§ obtain civil penalties; and
§ obtain damages, restitution, or other compensation on behalf of residents of the State.
State governments are not barred from suing in their courts under state statutes for conduct that also violates OCPA.
People would be allowed to sue under a private right of action in either federal or state court. Moreover, no “pre-dispute arbitration agreement or pre-dispute joint action waiver” could block a person’s right to sue under OCPA because such provisions would be deemed invalid. People could sue for actual damages, reasonable attorney’s fees and litigation costs, and “any other relief, including equitable or declaratory relief, that the court determines appropriate.”
OCPA clearly states that Section 230 has no relevance to violations of this act and does not preempt state statutes. Section 230 would be amended to clarify that it does not impinge upon the FTC’s ability to enforce any law in its purview.
§ The United Kingdom’s (UK) Investigatory Powers Tribunal (IPT) ruled that a Court of Justice of the European Union (CJEU) ruling is binding on the UK and that a now repealed section of British law allowing for bulk collection of data is incompatible with European Union (EU) law. This decision likely clears the way for more litigation on the UK’s data collection and surveillance practices. Privacy International brought the action and asserted in its statement:
o The IPT’s declaration is a welcome milestone in the bulk communications data litigation saga, but the fight is far from over.
o We have already asked the IPT to reopen this case following new information that came to light. In parallel, we are seeking disclosure of the judicial dissents given in ‘closed’ in the judgment of 23 July 2018 by way of judicial review proceedings.
§ The Federal Trade Commission (FTC) announced that it would drop the remaining charges against AbbVie after the Supreme Court of the United States “declined to review a ruling from the Third Circuit that AbbVie used sham litigation to illegally maintain a monopoly.” The FTC cited the recent Supreme Court ruling that limited its Section 13(b) powers and said it could not pursue $487 million in restitution and disgorgement a lower court had awarded. The agency stated:
o In 2014, the FTC filed a complaint in federal district court, charging that AbbVie and its partner Besins Healthcare Inc. illegally blocked patients’ access to lower-cost alternatives to AndroGel by filing baseless patent infringement lawsuits against potential generic competitors. The complaint also alleged that AbbVie settled one of its infringement lawsuits with an illegal reverse-payment agreement. The district court dismissed the reverse-payment claim, but in June 2018 found AbbVie and Besins liable for filing sham litigation in violation of the antitrust laws, and awarded the FTC $493.7 million in equitable monetary relief to return to consumers.
o In September 2020, the Third Circuit affirmed the district court’s finding of liability on the FTC’s sham litigation claim, and reinstated the reverse payment claim, two important legal victories that protect competition in pharmaceutical markets. Last month, the Supreme Court denied AbbVie and Besins’s petition for certiorari on the sham litigation claim, exhausting the companies’ options for appeal and allowing the liability ruling to stand.
o Since the initial filing of the lawsuit, generic AndroGel products have entered the market, so that patients now benefit from competition among multiple suppliers. In addition, AbbVie and Teva are now subject to Commission orders preventing them from entering into certain reverse-payment settlements. Today, the Commission has withdrawn its reverse-payment claim from federal district court, ending its litigation against AbbVie.
o While handing the Commission important legal victories, the Third Circuit reversed the district court’s nearly half-billion dollar monetary judgment for consumers, holding that the FTC is not entitled to disgorgement under 13(b) of the FTC Act. This determination was effectively affirmed by the Supreme Court’s decision in AMG Capital Management v. FTC.
§ The Senate Homeland Security and Governmental Affairs Committee held a markup and sent a number of technology bills to the Senate. Various press releases described the bills as follows:
o The “Deepfake Task Force Act” (S. 2559) that “will create a task force, led by the Department of Homeland Security, charged with producing a coordinated plan to explore how a “digital content provenance” standard could assist with reducing the spread of deepfakes, develop tools for content creators to authenticate their content and its origin, and increase the ability of civil society and industry leaders to relay trust and information about the source of the deepfakes to consumers. The task force will be comprised of experts from academia, government, civil society, and industry.”
o The “AI Training Act” (S. 2551) that would “create a training program to help federal employees responsible for purchasing AI technologies better understand the risks and benefits it poses to the American people…[and] help ensure the United States maintains a global leadership role in rapidly-developing technologies as foreign competitors like the Chinese government continue to prioritize investments in AI technologies.”
o The “State and Local Government Cybersecurity Act” (S. 2520) that would:
§ facilitate coordination between DHS and state and local governments in several key areas.
§ permit the National Cybersecurity and Communications Integration Center (NCCIC) to provide state and local actors with access to improved security tools, policies and procedures, while also encouraging collaboration for the effective implementation of those resources, including joint cybersecurity exercises.
§ build on previous efforts by the Multi-State Information Sharing and Analysis Center (MS-ISAC) to prevent, protect, and respond to future cybersecurity incidents. These changes would also ensure that government officials and their staffs have access to the hardware and software products needed to bolster their cyber defenses.
o The “Cybersecurity Opportunity Act” (S. 2305)
o The “DHS Industrial Control Systems Capabilities Enhancement Act of 2021” (S. 2439)
o The “CISA Technical Corrections and Improvements Act of 2021” (S. 2540)
o The “Domains Critical to Homeland Security Act” (S. 2525) that “would require DHS to conduct an analysis of critical domains – defined in the bill as industries critical to the economic and national security of the United States – to determine whether there is a present or future national security threat in the event their supply chains are disrupted.”
§ The Senate Commerce, Science, and Transportation Committee marked up the “Secure Equipment Act of 2021” (S.1790) introduced by Senators Marco Rubio (R-FL) and Ed Markey (D-MA) “which closes a loophole by directing the Federal Communications Commission (FCC) to clarify that it will no longer review or approve applications from companies on the Commission’s “Covered List”…[and] would also prevent further integration and sales of Huawei, ZTE, Hytera, Hikvision, and Dahua – all Chinese state-backed or directed firms – in the U.S. regardless of whether federal funds are involved” per their press release.
§ Representative Kathy Castor (D-FL) introduced “an updated “Protecting the Information of our Vulnerable Children and Youth Act” or the “Kids PRIVCY Act” to strengthen the Children’s Online Privacy Protection Act (COPPA).” She stated that:
o The bill builds on COPPA's strengths and expands privacy protections for children and teenagers, and incorporates key elements of the UK's Age-Appropriate Design Code, including expansion of coverage to sites likely to be accessed by children and teenagers, a requirement for a Privacy and Security Impact Assessment, and direction to operators to make the best interests of children and teenagers a primary design consideration.
o The legislation specifically strengthens privacy protections for children and teenagers by:
§ Banning Companies from Providing Targeted Advertisements to Children and Teenagers: Prohibits companies from targeting children and teenagers based on their personal information and behavior.
§ Considering Best Interests of Children and Teenagers: Requires an operator to make the best interests of children and teenagers a primary design consideration when designing its service.
§ Requiring Opt-In Consent for all Individuals Under 18: Companies must obtain specific, informed, and unambiguous opt-in consent before collecting, retaining, selling, sharing, or using a young consumer or child’s personal information.
§ Creating a Right to Access, Correct, and Delete Personal Information: Companies must provide individuals the opportunity to access, correct, or delete their personal information at any time.
§ Protecting Additional Types of Information: Expands the type of information explicitly covered to include physical characteristics, biometric information, health information, education information, contents of messages and calls, browsing and search history, geolocation information, and latent audio or visual recordings.
§ Requiring User-Friendly Privacy Policies: Companies must make publicly available privacy policies that are clear, easily understood, and written in plain and concise language.
§ Creating a Protected Class of “Teenagers” Ages 13-17: For the first time in statute, the bill provides protection for teenagers 13-17, allowing them to control who collects their personal information and what companies can do with it.
§ Expanding Coverage of Companies: Applies to all sites likely to be accessed by children and teens, not just child-directed services.
§ Limiting Disclosure to Third Parties: The bill prohibits companies from sharing personal information without consent. Furthermore, it creates additional duties companies must comply with before disclosing any personal information with third parties.
§ Requiring Reasonable Data Security Policies, Practices, and Procedures: Requires companies to have a written security policy, point of contact for information security management and processes to identify, assess, and mitigate vulnerabilities.
§ Prohibiting Industry Self-Regulation: Repeals dangerous safe harbor provisions that allow for lax enforcement and rubberstamping of potentially unlawful practices.
§ Strengthening FTC Enforcement: Raises the maximum allowable civil penalty per violation by 50 percent and allows the FTC to pursue punitive damages. Also establishes a Youth Privacy and Marketing Division at the FTC.
§ Providing for Parental Enforcement: Parents will be able to bring civil actions to help enforce the bill and any resulting regulations.
§ Banning Forced Arbitration: In a much-needed reversal of current law, companies will no longer be able to force their consumers to waive their right to sue.
§ The National Institute of Standards and Technology (NIST) issued for comment NIST Special Publication 800-53A Revision 5, "Assessing Security and Privacy Controls in Information Systems and Organizations," that “provides organizations with a flexible, scalable, and repeatable assessment methodology and assessment procedures that correspond with the controls in NIST SP 800-53, Revision 5.” NIST stated:
o Like previous revisions of SP 800-53A, the generalized assessment procedures provide a framework and starting point to assess the enhanced security requirements and can be tailored to the needs of organizations and assessors. The assessment procedures can be employed in self-assessments or independent third-party assessments.
o In addition to the update of the assessment procedures to correspond with the controls in SP 800-53, Revision 5, a new format for assessment procedures in this revision to SP 800-53A is introduced to:
§ Improve the efficiency of conducting control assessments,
§ Provide better traceability between assessment procedures and controls, and
§ Better support the use of automated tools, continuous monitoring, and ongoing authorization programs.
§ The White House’s Scientific Integrity Task Force announced that it “hosted four roundtables with Federal agency scientific integrity experts, three public listening sessions with members of the public who conduct, communicate, or consume Federal science, and issued a public request for information (RFI) that sought input from the general public.” The Task Force continued:
o The listening sessions gave hundreds of individuals from across the country an opportunity to share their views on the effectiveness of Federal scientific integrity policies and their role in promoting trust in Federal science. Members of the public also shared concerns related to the role of scientific integrity in the equitable delivery of Federal government programs.
o The roundtables, which convened more than 175 participants across four events, focused on the challenges and best practices in scientific communication. Participants also discussed policies and practices to support the professional development of scientists and researchers of diverse backgrounds. Finally, the roundtables reviewed new challenges posed by emerging technologies, such as artificial intelligence and machine learning, as well as by evolving scientific practices, like community-engaged research.
o In addition to its event-based public engagement, the Task Force received feedback from more than 200 individuals and organizations in response to its RFI that was issued by the Office of Science and Technology Policy. The Scientific Integrity Task Force will now turn its attention to analyzing the wealth of information gathered throughout the past several weeks. This input will inform the Task Force’s efforts to prepare a report summarizing recommendations for improving scientific integrity and restoring trust in government.
§ In a blog post, the United Kingdom’s Information Commissioner’s Office (ICO) Deputy Commissioner - Chief Regulatory Officer James Dipple-Johnstone explained the ICO has revised its regulatory approach during the COVID-19 pandemic. He said the ICO “wanted to clearly explain what our commitment to being a pragmatic and empathetic regulator would look like in practice, while reiterating the important role that people’s information right would continue to have.” Dipple-Johnstone asserted:
o As we anticipated at the beginning of the pandemic, some organisations we regulate have faced significant difficulties supporting people’s information rights. NHS organisations, police and local and central government have all faced particular challenges, especially in responding to subject access and freedom of information requests.
o We have today published an updated version of our regulatory approach document. It states our commitment to continue taking into account the challenges organisations we regulate face, but also makes clear the value of information rights. We expect organisations should be able to deal with complaints they receive from members of the public, for instance, and we expect organisations to have robust recovery plans in place to reduce any backlogs.
o We will continue to update on our regulatory approach, to provide clarity to organisations both during the pandemic and beyond. This will include updating our Regulatory Action Policy, which we will consult on later this year.
o Data protection has played a central role in the UK’s response to the pandemic, but the effectiveness of data-driven innovation relies in part on public trust. Likewise, people’s trust in decisions made by government and public authorities relies on transparency. A respect for people’s information rights is central to both, and the ICO will continue to work to protect and support those rights.
§ “‘If You’re Not A Criminal, Don’t Be Afraid’—NSO CEO On ‘Insane’ Hacking Allegations Facing $1 Billion Spyware Business” By Thomas Brewster — Forbes. Shalev Hulio, 39, is the CEO and cofounder of NSO Group, one of Israel's most successful cybersurveillance companies valued at over $1 billion, and the man ultimately responsible for smartphone hacks of high-profile journalists and world leaders, according to allegations made this week. Though he’s coming out of the shadows to deal with those allegations, as well as some apparent contradictions in NSO’s own response, in a rare interview with Forbes, Hulio was in good spirits as he attacked the research that underpinned the so-called Pegasus Project, a coalition of nonprofit and media organizations trying to shine a light on NSO’s operations. The project’s reporting follows years of stories alleging that NSO’s tools were used to infect the iPhones of civil rights defenders, reporters and lawyers.
§ “Govt releases highly redacted COVIDSafe report” By Denham Sadler — Innovation Aus. The federal government has been forced to release a report on the effectiveness of its controversial contact tracing app COVIDSafe, but has removed all parts relating to this and left only basic information and positive comments. The report, released this week following a Freedom of Information request, has wholly redacted all of the parts relevant to its effectiveness and comes nearly a year after the government was required to release it.
§ “Chinese hacking group APT31 uses mesh of home routers to disguise attacks” By Catalin Cimpanu — The Record. A Chinese cyber-espionage group known as APT31 (or Zirconium) has been seen hijacking home routers to form a proxy mesh around its server infrastructure in order to relay and disguise the origins of their attacks. In a security alert published today, the French National Cybersecurity Agency, also known as ANSSI (Agence Nationale de la Sécurité des Systèmes d’Information), published a list of 161 IP addresses that have been hijacked by APT31 in recent attacks against French organizations.
§ “Canada’s domestic spy agency said it countered foreign threats to 2019 election, document reveals” By Alex Boutilier — Toronto Star. Canada’s domestic intelligence agency said it intervened to counter perceived foreign threats to the 2019 federal election, a newly unearthed government document reveals. In August 2019, the Canadian Security Intelligence Service (CSIS) told then-public safety minister Ralph Goodale that the agency had used its broad “threat-reduction” powers “to reduce the threat posed by foreign interference activities to Canada’s democratic institutions and processes.”
§ “Biden Team Plans for Chips Funding Even Before Congress Acts” By Jenny Leonard — Bloomberg. The Biden administration is laying the groundwork to spend roughly $52 billion on semiconductor research and manufacturing even as it’s awaiting congressional approval of the funding, Commerce Secretary Gina Raimondo said. “We’re putting plans in place right now already on the team to invest the $52 billion,” she said Thursday during a White House press briefing. “We need to incentivize the manufacturing of chips in America and so we are very focused on putting the pieces in place so that can happen.”
§ “China sanctions Wilbur Ross, others over U.S. warning about Hong Kong business conditions” By Annika Kim Constantino — CNBC. China said Friday it has sanctioned seven people, including former Trump Commerce Secretary Wilbur Ross, in response to U.S. penalties imposed on Chinese officials over Beijing’s clampdown on democracy in Hong Kong. The reciprocal sanctions were imposed under China’s new Anti-Foreign Sanctions Law, which was passed in June. The sanctions are a response to the United States’ recent warning to companies about the risks of doing business in Hong Kong.
§ “Facebook’s Super Spreaders” By Katrina Northrop — Wire China. Last Friday afternoon, just as President Joe Biden was boarding Marine One for a weekend at Camp David, a reporter yelled out a question: “On Covid misinformation, what is your message to platforms like Facebook?” Above the din of the helicopter, Biden responded with his quintessential frankness: “They’re killing people.” His comment didn’t come out of nowhere. Just the day before, the Surgeon General, in his first formal advisory of the Biden administration, issued a stark warning about Covid-19 related misinformation, specifically calling out social media companies for providing a platform for the dangerous inaccuracies.
§ “Disinformation for Hire, a Shadow Industry, Is Quietly Booming” By Max Fisher — The New York Times. In May, several French and German social media influencers received a strange proposal. A London-based public relations agency wanted to pay them to promote messages on behalf of a client. A polished three-page document detailed what to say and on which platforms to say it. But it asked the influencers to push not beauty products or vacation packages, as is typical, but falsehoods tarring Pfizer-BioNTech’s Covid-19 vaccine. Stranger still, the agency, Fazze, claimed a London address where there is no evidence any such company exists.
§ “Facebook forced to limit misinformation spread via WhatsApp amid Sydney lockdown” By Josh Taylor — The Guardian. New South Wales Health has issued a warning about misinformation circulating on WhatsApp that claims Sydney supermarkets will close as part of the ongoing lockdown, with owner Facebook saying it is working to limit the spread of misinformation on its private messaging app. On Thursday, it was reported that a screenshot purporting to be from NSW Health saying that supermarkets would close for four days as part of the Covid-19 response was circulating on WhatsApp.
§ 12 August
o The Senate Judiciary Committee will mark up the “State Antitrust Enforcement Venue Act of 2021” (S.1787), “a bill to ensure state attorneys general are able to remain in the court they select rather than having their cases moved to a court the defendant prefers” per a May 2021 press release issued by Senators Mike Lee (R-UT) and Amy Klobuchar (D-MN).
§ 1 September
o The House Armed Services Committee will mark up the FY 2022 National Defense Authorization Act (H.R.4395).
§ 30 September
o The Federal Communications Commission (FCC) will hold an open meeting. No agenda has been announced yet.