UK Unveils Online Safety Bill
Irish court rules DPC can proceed in implementing Schrems II against Facebook and others; Senate considers Endless Frontier Act
The United Kingdom (UK) has followed Australia and the European Union with legislation to address “online harms.”
The British are coming. Does that make Mark Zuckerberg Paul Revere? Or would he rather be George Washington?
Another major bill to regulate online harm debuts. The British bill would cover both online platforms where users post content and search engines, imposing similar duties of care on each. London is taking an expansive view of who counts as a child: everyone 17 and under falls into this class of users, which will receive greater protection than adults. Moreover, the bill has provisions to protect freedom of expression, privacy, political content, and “recognised” news outlets. Failures to meet the new duties could result in fines as high as 10% of a company’s annual worldwide turnover.
The UK’s Department for Digital, Culture, Media & Sport (DCMS) published its long-awaited online harms bill that sets out the framework by which the UK proposes to regulate harmful and illegal online content. The UK follows Australia and the European Union in proposing legislation to regulate the online world. The Australian Parliament is currently considering the “Online Safety Bill 2021” and the “Online Safety (Transitional Provisions and Consequential Amendments) Bill 2021” (see here for more detail and analysis.) The European Commission (EC) rolled out its Digital Services Act in December 2020 and is currently negotiating a final bill with other EU stakeholders (see here for more detail and analysis.) And, of course, in the United States (U.S.), there have been calls from both political parties and many stakeholders to revise 47 U.S.C. 230 (aka Section 230), the liability shield that protects many technology companies from litigation arising from content they allow others to post. However, to date, no such legislation has advanced beyond mere introduction.
The British bill leaves many details to be settled later, when the regulator and the government will have to work out key parts of the law. And so, implementation will prove crucial and will likely be another front on which online platforms can make their cases.
The Online Safety Bill follows the publication of the Online Harms White Paper in April 2019. An initial Government response to the consultation was published in February 2020, and a full Government response in December 2020. The full government response set out in detail the regulatory framework, which will be taken forward through this bill.
The opposition party, as is to be expected, did not laud the legislation. Jo Stevens MP, Labour’s Shadow Secretary of State for Digital, Culture, Media and Sport asserted:
§ Over two years ago the Conservatives promised ‘world leading’ legislation in their White Paper. Instead we have watered down and incomplete proposals which lag behind the rest of the world. Even the Government’s press release admits that its proposals will only tackle “some of the worst abuses on social media.”
§ Labour backs criminal sanctions for senior tech executives to bring about a change of culture in these companies who for too long have been given a completely free rein.
§ As the NSPCC has identified these proposals do very little to ensure children are safe online. There is little to incentivise companies to prevent their platforms from being used for harmful practices.
§ The Bill, which will have taken the Government more than five years from its first promise to act to be published, is a wasted opportunity to put into place future proofed legislation to provide an effective and all-encompassing regulatory framework to keep people safe online.
In the accompanying Explanatory notes, the DCMS provided an overview of the bill:
The Online Safety Bill establishes a new regulatory regime to address illegal and harmful content online, with the aim of preventing harm to individuals in the United Kingdom. It imposes duties of care in relation to illegal content and content that is harmful to children on providers of internet services which allow users to upload and share user-generated content (“user-to-user services”) and on providers of search engines which enable users to search multiple websites and databases (“search services”).
The Bill also imposes duties on such providers in relation to the protection of users’ rights to freedom of expression and privacy. Providers of user-to-user services which meet specified thresholds (“Category 1 services”) are subject to additional duties in relation to content that is harmful to adults, content of democratic importance and journalistic content.
The Bill confers powers on the Office of Communications (OFCOM) to oversee and enforce the new regulatory regime (including dedicated powers in relation to terrorism content and child sexual exploitation and abuse (CSEA) content), and requires OFCOM to prepare codes of practice to assist providers in complying with their duties of care. The Bill also expands OFCOM’s existing duties in relation to promoting the media literacy of members of the public.
The DCMS explained “[t]he Bill is divided into seven parts:
Part 1 contains definitions of the services to which the Bill applies.
Part 2 sets out the duties of care that apply to providers of user-to-user and search services. These are duties to undertake risk assessments, and also duties with regards to content on their services that is illegal, harmful to children and harmful to adults.
Part 3 sets out further obligations on services in relation to transparency reporting and the payment of fees.
Part 4 sets out OFCOM’s powers and duties as the online safety regulator. There are specific provisions on OFCOM’s duties to carry out risk assessments and to maintain a register of categories of services. Part 4 also establishes OFCOM’s functions and powers with respect to the use of technology in relation to terrorism content and child sexual exploitation and abuse (CSEA) content, information-gathering, enforcement, research, and media literacy.
Part 5 provides for the grounds and avenues for appeals against decisions by OFCOM, and for designated bodies to make super-complaints to the regulator.
Part 6 provides for the powers of the Secretary of State to issue a statement of strategic priorities and guidance to OFCOM, and to review the regulatory framework established by the Bill.
Part 7 contains miscellaneous and general provisions. In particular, it defines key concepts such as providers of regulated services, users, and internet services.”
In the bill it is explained “’user-to-user service’ means an internet service by means of which content that is generated by a user of the service, or uploaded to or shared on the service by a user of the service, may be encountered by another user, or other users, of the service.” Moreover, a “regulated service” means—
(a) a regulated user-to-user service, or
(b) a regulated search service.
And so, such services would obviously include Facebook, Google, YouTube, Twitter, and other platforms on which people can post. These platforms would be subject to the requirements and enforcement mechanisms of the bill. However, a number of services are exempted. Schedule 1 identifies those entities exempted from the Online Safety Bill, such as email providers, voice communications providers, and others.
The bill would establish a duty of care for user-to-user services and another duty of care for search services. The overview of Part 2 nicely summarizes the operative sections of the Online Safety Bill:
§ This Part imposes duties of care on providers of regulated services and requires OFCOM to issue codes of practice relating to those duties.
§ Chapter 2 imposes duties of care on providers of regulated user-to-user services.
§ Chapter 3 imposes duties of care on providers of regulated search services.
§ Chapter 4 imposes duties on providers of regulated services to assess whether a service is likely to be accessed by children.
§ Chapter 5 requires OFCOM to issue codes of practice relating to particular duties and explains what effects the codes of practice have.
Chapter 2 of the bill details the duties regulated user-to-user services must meet:
§ The illegal content risk assessment duty (see section 7(1)),
§ Each of the illegal content duties (see section 9),
§ The duty about rights to freedom of expression and privacy set out in section 12(2),
§ The duties about reporting and redress set out in section 15(2)(a), and section 15(3) and (5)
§ Each of the record-keeping and review duties (see section 16).
Those regulated user-to-user services children are likely to access (e.g., TikTok, Snapchat, Instagram) must meet additional duties:
§ Each of the children’s risk assessment duties (see section 7(3) and (4)),
§ Each of the duties to protect children’s online safety (see section 10),
Section 7 explains the illegal content risk assessment duty, a term defined as, in relevant part:
an assessment to identify, assess and understand such of the following as appear to be appropriate, taking into account the risk profile that relates to services of that kind—
(a) the user base;
(b) the level of risk of individuals who are users of the service encountering the following by means of the service—
(i) terrorism content,
(ii) Child Sexual Exploitation and Abuse (CSEA) content,
(iii) priority illegal content, and
(iv) other illegal content,
Under the bill, regulated user-to-user services must perform these assessments. When they must do so hinges on when OFCOM completes its “risk assessment to identify, assess and understand the risks of harm to individuals presented by regulated services” and issues its “guidance for providers of regulated services to assist them in complying with their duties to carry out risk assessments.” Once OFCOM completes both of these tasks, regulated user-to-user services already operating in the UK have three months to conduct an illegal content risk assessment. Those regulated user-to-user services that want to begin operating must perform an illegal content risk assessment before they can.
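The timing rules above can be sketched roughly in code. This is an illustrative sketch only; the function names and the 91-day approximation of “three months” are our assumptions, not terms from the bill:

```python
from datetime import date, timedelta

def illegal_content_assessment_deadline(ofcom_risk_assessment_done: date,
                                        ofcom_guidance_published: date) -> date:
    """Existing services must assess within roughly three months of the later
    of OFCOM completing its sector risk assessment and issuing its guidance."""
    start = max(ofcom_risk_assessment_done, ofcom_guidance_published)
    return start + timedelta(days=91)  # rough stand-in for "three months"

def may_begin_operating(assessment_completed: bool) -> bool:
    # New services must complete an illegal content risk assessment first.
    return assessment_completed
```

For example, if OFCOM finished its risk assessment on 1 January 2022 and issued its guidance on 1 February 2022, the clock would run from the later date.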
Moreover, there are also definitions of “children’s risk assessment” and “adults’ risk assessment.”
Regulated user-to-user services will also have safety duties regarding illegal content that will require affirmative action in some cases and reactive action in others. These platforms will have
A duty, in relation to a service, to take proportionate steps to mitigate and effectively manage the risks of harm to individuals, as identified in the most recent illegal content risk assessment of the service.
Additionally, these platforms will also have
A duty to operate a service using proportionate systems and processes designed to—
(a) minimise the presence of priority illegal content;
(b) minimise the length of time for which priority illegal content is present;
(c) minimise the dissemination of priority illegal content;
(d) where the provider is alerted by a person to the presence of any illegal content, or becomes aware of it in any other way, swiftly take down such content.
The British government is not requiring that all illegal content be taken down immediately; instead, it is imposing a duty to minimize such content, with the caveat that once a person has alerted the provider to illegal content (or the provider otherwise becomes aware of it), the provider must act “swiftly” in taking down this content.
Under the Online Safety Bill, regulated user-to-user services likely to be accessed by children have “duties to protect children’s online safety.” These platforms have a duty to “mitigate and effectively manage the risks of harm” as identified in their children’s risk assessment and also generally according to different age groups of children. Likewise, these platforms have “[a] duty to operate a service using proportionate systems and processes designed to—
(a) prevent children of any age from encountering, by means of the service, primary priority content that is harmful to children;
(b) protect children in age groups judged to be at risk of harm from other content that is harmful to children (or from a particular kind of such content) from encountering it by means of the service.
The Online Safety Bill defines children as those 17 and younger.
Some regulated user-to-user services have additional duties. The so-called Category 1 services also have “duties to protect adults’ online safety,” including but not limited to informing users how harmful priority content is dealt with and how harmful content identified during the risk assessment is managed. OFCOM will have to set the criteria by which regulated user-to-user services are split into Category 1, Category 2A, and Category 2B.
Regulated user-to-user services also have duties regarding the freedom of expression and privacy. All such entities will have
A duty to have regard to the importance of—
(a) protecting users’ right to freedom of expression within the law, and
(b) protecting users from unwarranted infringements of privacy, when deciding on, and implementing, safety policies and procedures.
Category 1 services will have additional duties. In “deciding on safety policies and procedures,” these platforms must assess the impact the policies and procedures might have on the rights to freedom of expression and privacy.
This class of entities also have “duties to protect content of democratic importance.” Category 1 platforms also have “[a] duty to operate a service using systems and processes designed to ensure that the importance of the free expression of content of democratic importance is taken into account when making decisions about—
(a) how to treat such content (especially decisions about whether to take it down or restrict users’ access to it), and
(b) whether to take action against a user generating, uploading or sharing such content.
Category 1 services also have a duty to protect journalistic content that shall come into play when they moderate this sort of content, especially when posted by users. This section of the bill spells out a more defined process for handling complaints about content moderation in terms of the duties this class of regulated user-to-user services must meet.
Regulated user-to-user services will need to meet their new duties of reporting and redress. The platforms must have processes that allow users to report content that is illegal, harmful to children, or harmful to adults.
Providers of search services would have similar but distinct duties and “must comply with the following duties in relation to each such service—
(a) the illegal content risk assessment duty (see section 19(1)),
(b) each of the illegal content duties (see section 21),
(c) the duty to protect rights to freedom of expression and privacy (see section 23),
(d) the duties about reporting and redress set out in—
(i) section 24(2)(a), and
(ii) section 24(3) and (5) so far as relating to subsection (4)(a)(i), (b) or (c)(i) of that section, and
(e) each of the record-keeping and review duties (see section 25).
There would also be additional duties for any provider of search services children are likely to access.
However, the new duties providers of search services must heed do not extend to the content of “recognised news publisher[s].”
The illegal content risk assessments for providers of search services are similar to, but less extensive than, those for regulated user-to-user services. In the same vein, the duty to conduct a children’s risk assessment tracks closely with that of the regulated user-to-user services. The point at which a regulated search service must conduct an illegal content risk assessment is the same as for regulated user-to-user services: three months after OFCOM completes its risk assessment and issues its guidance about risk assessments.
Regulated search providers will have “illegal content duties,” including
A duty, in relation to a service, to take proportionate steps to mitigate and effectively manage the risks of harm to individuals, as identified in the most recent illegal content risk assessment of the service.
These entities will also have “[a] duty to operate a service using proportionate systems and processes designed to minimise the risk of individuals encountering the following in or via search results—
(a) priority illegal content;
(b) other illegal content that the provider knows about (having been alerted to it by another person or become aware of it in any other way).
Regulated search providers will also have “duties to protect children’s online safety” that entail taking “proportionate steps to—
(a) mitigate and effectively manage the risks of harm to children in different age groups, as identified in the most recent children’s risk assessment of the service, and
(b) mitigate the impact of harm arising to children in different age groups from content that is harmful to children encountered in or via search results of the service.”
These platforms would also have “A duty to operate a service using proportionate systems and processes designed to—
(a) minimise the risk of children of any age encountering primary priority content that is harmful to children in or via search results;
(b) minimise the risk of children in age groups judged to be at risk of harm from other content that is harmful to children (or from a particular kind of such content) encountering it in or via search results.”
Regulated search services would have to meet the same sorts of duties relating to freedom of expression, privacy, and reporting and redress that regulated user-to-user services must heed.
All regulated services (user-to-user services and providers of search services) must assess whether children are likely to access their services, “a key requirement of the safety duties to be imposed on all providers of regulated services in relation to children.” An assessment must be performed for each different regulated service an entity offers in the UK (e.g., Facebook would need to perform assessments for both Facebook and Instagram.) The bill also provides “OFCOM must prepare guidance for providers of regulated services to assist them in complying with their duties to carry out assessments.”
Moreover, OFCOM must prepare a code of practice for providers of regulated services “describing recommended steps for the purposes of compliance with duties set out in section 9 or 21 (safety duties about illegal content) so far as relating to terrorism content” and CSEA content. OFCOM must also prepare codes of practice for the other duties regulated services must meet. OFCOM must submit these codes of practice to the DCMS, which may approve them; if the Department opts not to, it may still present them to Parliament for approval. The codes of practice are important for regulated services because following them will functionally mean compliance with their new duties under the Online Safety Bill.
The bill specifies that “Illegal content” means—
(a) in relation to a regulated user-to-user service, content—
(i) that is regulated content in relation to that service, and
(ii) that amounts to a relevant offence;
(b) in relation to a regulated search service, content that amounts to a relevant offence.”
The DCMS will draft and issue regulations specifying which other offences shall be deemed relevant offences. The bill makes clear a “[r]elevant offence” means—
(a) a terrorism offence (see section 42),
(b) a CSEA offence (see section 43),
(c) an offence that is specified in, or is of a description specified in, regulations made by the Secretary of State (see section 44), or
(d) an offence, not within paragraph (a), (b) or (c), of which the victim or intended victim is an individual (or individuals).
Certain platforms would have to pay annual fees to the British government. OFCOM would annually set a threshold above which platforms would need to pay a certain fee. The DCMS explained that “providers with qualifying worldwide revenue at or above a specified threshold will have an obligation to notify OFCOM and pay an annual fee” and “[w]here providers whose qualifying worldwide revenue is at or above the threshold do not notify or pay a fee, then enforcement action may be taken against them.”
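A minimal sketch of that fee obligation, with hypothetical names (the actual threshold will be set by OFCOM and is not in the bill):

```python
def must_notify_and_pay(qualifying_worldwide_revenue: float,
                        ofcom_threshold: float) -> bool:
    """Providers at or above the OFCOM-set revenue threshold must notify
    OFCOM and pay an annual fee; failure to do so risks enforcement action."""
    return qualifying_worldwide_revenue >= ofcom_threshold
```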
OFCOM would receive a suite of new enforcement powers, including:
§ The authority to issue provisional notices of enforcement action that trigger a period of representations (i.e., a period in which the service may argue why it has not violated the law)
§ The power to issue a confirmation decision directing action after the period for representations has expired
§ The power to levy financial penalties for violations of up to “the greater of £18 million and 10% of the person’s qualifying worldwide revenue”
§ The power to ask a court for a service restriction order that would stop other entities from doing business with a non-compliant regulated service in some cases, or for an access restriction order “to impede access to a non-compliant regulated service”
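The penalty ceiling quoted above is a simple maximum of two quantities; sketched in Python (the function name is ours, not the bill's):

```python
def max_penalty(qualifying_worldwide_revenue: float) -> float:
    """Maximum fine: the greater of GBP 18 million and 10% of the
    provider's qualifying worldwide revenue."""
    return max(18_000_000, 0.10 * qualifying_worldwide_revenue)
```

So a provider with £1 billion in qualifying worldwide revenue would face a ceiling of £100 million, while a small provider's ceiling would still be £18 million.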
§ The Senate is considering the “Endless Frontier Act” (S. 1260) this week and invoked cloture on the motion to proceed by an 86-11 vote, thus setting up a vote to end debate on this stage of consideration. The Senate Commerce, Science, and Transportation Committee released the bill as amended, but there will be other parts added, notably, the Senate Homeland Security and Governmental Affairs Committee’s “Securing America’s Future Act,” among others.
§ On the Senate floor, one of the bill’s sponsors, Senate Majority Leader Chuck Schumer (D-NY), stated:
o We are now one step closer to passing a bill that will keep our country one step ahead in science and technology for decades to come. It is my intention for the Senate to take up the Endless Frontier Act next week in a package with legislation to strengthen our alliances and partnerships; invest in the American semiconductor industry; ensure that China pays a price for its predatory actions; and boost advanced manufacturing, innovation, and critical supply chains.
o For decades, American prosperity has been anchored by our unmatched capacity for innovation and invention in science and in technology. American innovation propelled American industry, and an American workforce brought those innovations to the global economy. But, unfortunately, Federal under-investment in sciences has seen our country slip, exposing critical weak spots in our economy. If we don’t fix them, we will no longer be the No. 1 economic leader in the world in the decade to come.
o So it is an imperative that we do this. This is for our future— our future for jobs, our future for economic leadership, and our future for world leadership. It all boils down to science, something that was ignored, unfortunately, by the last administration, but, fortunately, bipartisan unity in this Senate is bringing us back on the page that we need to do science.
o The Endless Frontier Act would right the ship by making one of the largest investments in American innovation in generations, allowing the United States to outcompete countries like China, create more good-paying jobs, and harden our economic and our national security as well because this bill is vital to national security as well as to economic security.
§ President Joe Biden signed an executive order that revoked a few of President Donald Trump’s executive orders, including Executive Order 13925, Preventing Online Censorship, that had directed the National Telecommunications and Information Administration (NTIA) to file a petition with the Federal Communications Commission (FCC) to initiate a rulemaking to construe portions of 47 USC 230 (aka Section 230). (see here for more detail on the Trump executive order and subsequent action.)
§ Ireland’s High Court turned aside Facebook’s challenge to the Data Protection Commission’s effort to determine whether, in light of the Court of Justice of the European Union’s decision striking down the adequacy decision for the United States (U.S.), Facebook’s transfers of personal data out of the European Union to the U.S. are still legal and whether they should end. The Court decided Facebook’s objections lacked merit, and the DPC may proceed.
o On 16th July, 2020, the Court of Justice of the European Union (“CJEU”) delivered its landmark judgment in Case C-311/18 Data Protection Commissioner v. Facebook Ireland Ltd and Maximilian Schrems (commonly now referred to as “Schrems II”) on a reference from the High Court (Costello J.). This case is about what happened after that judgment.
o Following the judgment in Schrems II, the Data Protection Commission (“DPC”) decided to commence an “own volition” inquiry under s. 110 of the Data Protection Act, 2018 (the “2018 Act”) to consider whether the actions of Facebook Ireland Ltd (“FBI”) in making transfers of personal data relating to individuals in the European Union/European Economic Area are lawful and whether any corrective power should be exercised by the DPC in that regard. The DPC decided to commence the inquiry by issuing a “Preliminary Draft Decision” (“PDD”) to FBI on 28th August, 2020.
o FBI took issue, on several grounds, with the decision by the DPC to commence the inquiry by means of the PDD and with the procedures adopted by the DPC. Mr. Schrems, who had made a complaint and a reformulated complaint to the statutory predecessor of the DPC, the Data Protection Commissioner, under the Data Protection Act, 1988 (the “1988 Act”), which had ultimately led to the reference by the High Court to the CJEU leading to the judgment in Schrems II, also took issue with the DPC’s decision and procedures on a number of grounds, some of which overlapped to an extent with the grounds advanced by FBI.
§ The Data Protection Commission (DPC) and Maximilian Schrems reached a settlement that requires, in the words of none of your business (noyb), the DPC to “investigate an original 2013 complaint that led to the Court of Justice of the European Union (CJEU) decision...[that] will run in parallel with the new "own volition" investigation.” noyb continued:
o Today's decision brings the seventh court case in a long-standing battle between Mr Schrems, the DPC and Facebook to an end. In 2013, Mr Schrems brought a complaint following the Snowden disclosures against Facebook, arguing that Facebook may not transfer his personal data to the United States, where surveillance laws require the sharing of personal data with the US government. The case was referred to the Court of Justice of the European Union (CJEU) twice, leading to the so-called "Schrems I" and "Schrems II" judgments, in which the CJEU decided that the DPC had to investigate Facebook and had a duty to stop the data transfers.
o Instead of swiftly implementing the CJEU decisions, the DPC produced three "branches" off the main complaints procedure (see graphic below). In 2020 it started the third detour by opening an "own volition" procedure on exactly the same subject matter as the existing complaints procedure, while indefinitely "pausing" Mr Schrems' complaints procedure. This would have ultimately removed Mr Schrems from his own case.
o Settlement between DPC and Mr Schrems ensures swift decision, independent of High Court decision. Shortly before the Judicial Review of Mr Schrems would have been heard, the DPC gave in and settled the case. In the settlement, the DPC pledged to run the complaints procedure swiftly once the High Court decided on Facebook's Judicial Review. In addition, if the second "own volition" procedure is allowed by the High Court, Mr Schrems would be able to participate in it.
o In short: The settlement assured that the DPC would decide on Facebook's data transfers in either one or two procedures.
o Next Steps: Irish Decision and EDPB Procedure. After today's judgment the DPC will have to swiftly implement the CJEU decision and prohibit Facebook's EU-US transfers. In fall 2020 the DPC foresaw 21 days to hear from parties and another 21 days to finalise its decision, similar timelines are agreed in the settlement with Mr Schrems. Any national decision by the Irish DPC would likely have to get approved by the European Data Protection Board (EDPB), where the data protection authorities of all 28 EU member states can review the decision and object to it, if they disagree with the DPC's findings. The deadline for an objection is four weeks and triggers a vote on the European level.
§ The United States (U.S.) Department of Homeland Security’s Cybersecurity and Infrastructure Security Agency (CISA), “in coordination with the National Security Agency, and the Office of the Director of National Intelligence, as part of the Enduring Security Framework (ESF)—a cross-sector, public-private working group—released a Potential Threat Vectors to 5G Infrastructure paper.” CISA asserted:
o The fifth-generation (5G) of wireless technology represents a complete transformation of telecommunication networks, introducing a vast array of new connections, capabilities, and services. These advancements will provide the connection for billions of devices and will pave the way for applications that will enable new innovation, new markets, and economic growth around the world. However, these developments also introduce significant risks that threaten national security, economic security, and impact other national and global interests. Given these threats, 5G networks will be an attractive target for criminals and foreign adversaries to exploit for valuable information and intelligence.
o To address these concerns, the United States National Telecommunications and Information Administration (NTIA) developed the National Strategy to Secure 5G, a strategic document that expands on how the United States Government will secure 5G infrastructure domestically and abroad. The National Strategy to Secure 5G aligns to the National Cyber Strategy and establishes four lines of effort: (1) facilitating the rollout of 5G domestically; (2) assessing the cybersecurity risks to and identifying core security principles of 5G capabilities and infrastructure; (3) addressing risks to United States economic and national security during development and deployment of 5G infrastructure worldwide; and (4) promoting responsible global development and deployment of secure and reliable 5G infrastructure.
o In alignment with Line of Effort 2 in the National Strategy to Secure 5G, the Enduring Security Framework (ESF) was identified to assist with assessing risks and vulnerabilities to 5G infrastructure. This included building on existing capabilities in assessing and managing supply chain risk. As a result, the ESF 5G Threat Model Working Panel was established.
o The preliminary focus of the 5G Threat Model Working Panel was to explore and prioritize potential threat vectors that may be associated with the use of 5G non-standalone (NSA) networks. The working panel reviewed existing bodies of work to identify and generate an aggregated list of known and potential threats to the 5G environment, determined and developed sample scenarios of where 5G may be adopted, and assessed risks to 5G core technologies. This analysis paper represents the beginning of the Working Panel’s thinking on the types of risks introduced by 5G adoption in the United States, and not the culmination of it. This product is not an exhaustive risk summary or technical review of attack methodologies and is derived from the considerable amount of analysis that already exists on this topic, to include public and private research and analysis.
§ The United Kingdom’s Office for Artificial Intelligence issued guidance titled “Ethics, Transparency and Accountability Framework for Automated Decision-Making.” The Office explained:
o This 7 point framework will help government departments with the safe, sustainable and ethical use of automated or algorithmic decision-making systems.
o It has been developed in line with guidance from government (such as the Data Ethics Framework) and industry, as well as relevant legislation. It supports the priorities of the Central Digital and Data Office, and aligns with wider cross-government strategies in the digital, data and technology space.
o Departments should use the framework with existing organisational guidance and processes.
§ South Africa’s Information Regulator reiterated its position on WhatsApp’s privacy policy:
o It remains the Regulator’s assertion that the Protection of Personal Information Act (POPIA) has a privacy regime which is very similar to the EU regime, and therefore believes that WhatsApp should adopt the EU policy in South Africa and other countries in the global south that have similar regimes. The Regulator remains of the view that, despite WhatsApp operating in different legal and regulatory environments, there are effectively two privacy policies for the users of WhatsApp, with substantial differences between the policy for users living in Europe and the one for users living outside of Europe.
§ Australia’s Parliamentary Joint Committee on Intelligence and Security has published its “Advisory Report on the Telecommunications Legislation Amendment (International Production Orders) Bill 2020,” legislation the country must pass if it is to enter into an agreement with the United States under the “Clarifying Lawful Overseas Use of Data Act” (CLOUD Act). As explained on the committee’s site, “[t]he Telecommunications Legislation Amendment (International Production Orders) Bill 2020 is drafted to amend the Telecommunications (Interception and Access) Act 1979 to:
o provide a framework for Australian agencies to obtain independently-authorised international production orders for interception, stored communications and telecommunications data directly to designated communications providers in foreign countries with which Australia has a designated international agreement
o amend the regulatory framework to allow Australian communications providers to intercept and disclose electronic information in response to an incoming order or request from a foreign country with which Australia has an agreement
o make amendments contingent on the commencement of the proposed Federal Circuit and Family Court of Australia Act 2020; and
o remove the ability for nominated Administrative Appeals Tribunal members to issue certain warrants.
o The Bill intends to provide for the legislative framework for Australia to give effect to future bilateral and multilateral agreements for cross-border access to electronic information and communications data, such as that being negotiated with the United States for the purposes of the US Clarifying Lawful Overseas Use of Data Act (CLOUD Act).
§ “They Hacked McDonald’s Ice Cream Machines—and Started a Cold War” By Andy Greenberg — WIRED. Of all the mysteries and injustices of the McDonald’s ice cream machine, the one that Jeremy O’Sullivan insists you understand first is its secret passcode.
§ “Progressive Groups Fight AT&T and T-Mobile’s New Texting Rules” By Rachel Cohen — The Intercept. About a month out from the 2020 presidential election, an app called RoboKiller published the first glimpse into how campaigns and advocacy groups were leveraging political texts and calls to shape the race. The app, designed to block automated calls and spam texts, found that after June 2020, robocalls declined, but political text messaging picked up. By the end of September, Republicans had sent 1.8 billion texts to voters, and Democrats had sent 902 million.
§ “Israeli-Palestinian fight spills over into social media” by Sara Fischer and Ashley Gold — Axios. As outrage about the conflict in Gaza and misinformation about clashes between Palestinians and Israelis snowball online, social media companies face yet another test of their capacity to manage their platforms.
§ “Censorship, Surveillance and Profits: A Hard Bargain for Apple in China” By Jack Nicas, Raymond Zhong and Daisuke Wakabayashi — The New York Times. On the outskirts of this city in a poor, mountainous province in southwestern China, men in hard hats recently put the finishing touches on a white building a quarter-mile long with few windows and a tall surrounding wall. There was little sign of its purpose, apart from the flags of Apple and China flying out front, side by side.
§ “Your tech devices want to read your brain. What could go wrong?” By Dalvin Brown — The Washington Post. Ramses Alcaide has spent over a decade thinking about thinking. As a PhD student at the University of Michigan in 2015, he developed a brain-computer interface that would allow people to control software and physical objects with their thoughts. Today, that interface is behind plans by a Boston-based start-up, Neurable, to begin shipping a set of brain-sensing headphones to let you know when you’re poised for peak productivity.
§ “Groups say gunshot detection systems unreliable, seek review” By Don Babwin and Sara Burnett — Associated Press. The gunshot detection system that set in motion the recent fatal police shooting of a 13-year-old boy in Chicago routinely reports gunshots where there are none, sending officers into predominantly Black and Latino neighborhoods for “unnecessary and hostile” encounters, community groups argued in a court filing Monday.
§ “Intel Community Needs Next-Gen Microelectronics for Future of AI” By Aaron Boyd — Nextgov. The intelligence community wants to take advantage of upcoming advances in machine learning and artificial intelligence but needs smaller, more powerful hardware to run those algorithms. The Intelligence Advanced Research Programs Activity—the advanced research arm of the intelligence community—released a broad agency announcement in support of research into the next generation of microelectronics, including processors, semiconductors and other hardware technologies.
§ “Your Car Is Spying On You, And A CBP Contract Shows The Risks” By Sam Biddle — The Intercept. U.S. Customs and Border Protection purchased technology that vacuums up reams of personal information stored inside cars, according to a federal contract reviewed by The Intercept, illustrating the serious risks in connecting your vehicle and your smartphone. The contract, shared with The Intercept by Latinx advocacy organization Mijente, shows that CBP paid Swedish data extraction firm MSAB $456,073 for a bundle of hardware including five iVe “vehicle forensics kits” manufactured by Berla, an American company. A related document indicates that CBP believed the kit would be “critical in CBP investigations as it can provide evidence [not only] regarding the vehicle’s use, but also information obtained through mobile devices paired with the infotainment system.” The document went on to say that iVe was the only tool available for purchase that could tap into such systems.
§ On 18 May, the House Homeland Security Committee will markup technology related legislation, including bills to address the recent Colonial Pipeline ransomware attack:
§ The Senate Armed Services Committee’s Cybersecurity Subcommittee will hold a hearing on 18 May titled “Cybersecurity of the Defense Industrial Base” with these witnesses:
o Rear Admiral William Chase III, Deputy Principal Cyber Advisor to the Secretary of Defense and Director of Protecting Critical Technology Task Force
o Mr. Jesse Salazar, Deputy Assistant Secretary of Defense for Industrial Policy
§ The House Energy and Commerce Committee’s Consumer Protection and Commerce Subcommittee will hold an 18 May hearing titled “Promises and Perils: The Potential of Automobile Technologies” a week after the National Transportation Safety Board (NTSB) issued its preliminary report “for its ongoing investigation of the fatal, April 17, 2021, crash of a 2019 Tesla Model S near Spring, Texas” with these witnesses:
o Jason Levine, Executive Director, Center for Auto Safety
o Greg Regan, President, Transportation Trades Department, AFL-CIO
o Professor Ragunathan Rajkumar, Department of Electrical and Computer Engineering, Carnegie Mellon University
§ On 18 May, two House Appropriations Committee subcommittees will hold hearings:
o The Financial Services and General Government Subcommittee will hold a hearing titled “The Need for Universal Broadband: Lessons from the COVID-19 Pandemic” with these witnesses:
§ Joi Chaney, National Urban League
§ Matt Dunne, Center on Rural Innovation
§ Max Stier, Partnership for Public Service
o The Defense Subcommittee will hold a closed hearing titled “National Security Agency and Cyber Command FY 2022 Posture” with General Paul Nakasone, head of the National Security Agency and United States Cyber Command, testifying.
§ The Senate Commerce, Science, and Transportation Committee’s Consumer Protection, Product Safety, and Data Security Subcommittee will hold a hearing on 18 May titled “Protecting Kids Online: Internet Privacy and Manipulative Marketing” with these witnesses:
o Ms. Angela Campbell, Professor of Law and Co-Director, Institute for Public Representation
o Mr. Serge Egelman, Research Director, Usable Security and Privacy, International Computer Science Institute, University of California Berkeley
o Ms. Beeban Kidron, Founder and Chair, 5Rights
§ On 18 May, the Senate Finance Committee will hold a hearing titled “Funding and Financing Options to Bolster American Infrastructure” with these witnesses:
o Joseph Kile, Ph.D., Director of Microeconomic Analysis, Congressional Budget Office
o Victoria F. Sheehan, President, American Association of State Highway and Transportation Officials
o Heather Buch, Subcommittee Chair, Transportation Steering Committee, National Association of Counties
o Shirley Bloomfield, Chief Executive Officer, NTCA - The Rural Broadband Association
§ The Senate Homeland Security and Governmental Affairs Committee will hold a hearing titled “Examining the Role of the Department of Homeland Security’s Office of Intelligence and Analysis” on 18 May with these witnesses:
o The Honorable Francis X. Taylor, Former Under Secretary for Intelligence and Analysis (2014-2017), U.S. Department of Homeland Security
o Patricia Cogswell, Former Deputy Administrator (2018-2020), Transportation Security Administration, U.S. Department of Homeland Security
o Mike Sena, President, National Fusion Center Association
o Faiza Patel, Director, Liberty & National Security Program, Brennan Center for Justice, New York University School of Law
§ On 19 May, the House Ways and Means Committee will hold a hearing titled “Leveraging the Tax Code for Infrastructure Investment” but witnesses have not yet been announced.
§ The Senate Judiciary Committee’s Competition Policy, Antitrust, and Consumer Rights Subcommittee will hold a 19 May hearing titled “Antitrust Applied: Hospital Consolidation Concerns and Solutions” but no witnesses have been announced.
§ On 20 May, the House Appropriations Committee’s Defense Subcommittee will hold a closed hearing on the Intelligence Community’s World Wide Threat Assessment and the FY 2022 National Intelligence Program/Military Intelligence Program Posture with these witnesses:
o The Honorable Avril Haines, the Director of National Intelligence
o The Honorable David M. Taylor, Performing the Duties of the Under Secretary of Defense for Intelligence & Security, Department of Defense
§ The Senate Commerce, Science, and Transportation Committee will consider Eric Lander’s nomination to be the Director of the Office of Science and Technology Policy (OSTP) on 20 May.
§ The House Select Committee on the Climate Crisis will hold a 20 May hearing titled “Powering Up Clean Energy: Investments to Modernize and Expand the Electric Grid” with these witnesses:
o Linda Apsey, President and CEO, ITC Holdings Corp. Apsey is responsible for the strategic vision and overall business operations of ITC, the largest independent electricity transmission company in the United States. Based in Michigan, the company owns and operates high-voltage transmission infrastructure in Michigan, Iowa, Minnesota, Illinois, Missouri, Kansas and Oklahoma, with plans underway to expand to Wisconsin.
o Donnie Colston, Director, Utility Department, International Brotherhood of Electrical Workers (IBEW). Colston manages issues related to collective bargaining agreements, working conditions, safety-related work practices, and apprenticeship training. A utility lineman, he started his career in transmission and distribution construction before working as an electric troubleman. He has been a member of the IBEW Local Union 2100, which represents the employees of Louisville Gas and Electric Company (LG&E) and Kentucky Utilities Company (KU), for more than four decades.
o Michael Skelly, Founder and President, Grid United. Skelly is a renewable energy entrepreneur and pioneer in the U.S. wind industry who currently leads Grid United, an early-stage transmission development company. He was previously the founder and president of Clean Line Energy, a company that successfully permitted some of the longest transmission lines in the United States in the last 50 years.
o Emily Sanford Fisher, General Counsel, Corporate Secretary & Senior Vice President, Clean Energy, Edison Electric Institute (EEI). Sanford Fisher manages litigation and legal affairs at EEI, an association that represents all investor-owned electric companies in the United States. She also oversees and coordinates strategic clean energy engagement across EEI and the federal government.
§ The House Armed Services Committee’s Cyber, Innovative Technologies, and Information Systems Subcommittee will hold a 20 May hearing titled “Reviewing Department of Defense Science and Technology Strategy, Policy, and Programs for Fiscal Year 2022: Fostering a Robust Ecosystem for Our Technological Edge” with these witnesses:
o Ms. Barbara McQuiston, Acting Under Secretary of Defense for Research and Engineering (USD(R&E)), Office of the Secretary of Defense
o Dr. Philip Perconti, Deputy Assistant Secretary of the Army for Research and Technology (DASA R&T), Department of the Army
o Ms. Joan “JJ” Johnson, Deputy Assistant Secretary of the Navy for Research, Development, Test, and Engineering (DASN RDTE), Department of the Navy
o Ms. Kristin Baldwin, Acting Deputy Assistant Secretary of the Air Force for Science, Technology, and Engineering (SAF/AQR), Department of the Air Force
§ On 20 May, the House Veterans Affairs Committee’s Technology Modernization Subcommittee will hold a hearing titled “Cybersecurity and Risk Management at VA: Addressing Ongoing Challenges and Moving Forward” but no witnesses have been announced.
o Reducing Interstate Rates and Charges for Incarcerated People – The Commission will consider a Third Report and Order, Order on Reconsideration, and Fifth Notice of Proposed Rulemaking that, among other actions, will lower interstate rates and charges for the vast majority of incarcerated people, limit international rates for the first time, and seek comment on further reforms to the Commission’s calling services rules, including for incarcerated people with disabilities. (WC Docket No. 12-375)
o Strengthening Support for Video Relay Service – The Commission will consider a Notice of Proposed Rulemaking and Order to set Telecommunications Relay Services (TRS) Fund compensation rates for video relay service (VRS). (CG Docket Nos. 03-123, 10-51)
o Shortening STIR/SHAKEN Extension for Small Providers Likely to Originate Robocalls – The Commission will consider a Further Notice of Proposed Rulemaking to fight illegal robocalls by proposing to accelerate the date by which small voice service providers that originate an especially large amount of call traffic must implement the STIR/SHAKEN caller ID authentication framework. (WC Docket No. 17-97)
o Section 214 Petition for Partial Reconsideration for Mixed USF Support Companies – The Commission will consider an Order on Reconsideration to relieve certain affiliates of merging companies that receive model-based and rate-of-return universal service support from a “mixed support” merger condition cap. (WC Docket No. 20-389)
o Enforcement Bureau Action – The Commission will consider an enforcement action.
o Enforcement Bureau Action – The Commission will consider an enforcement action.
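The STIR/SHAKEN framework on the Commission’s agenda works by having the originating voice provider attach a digitally signed “PASSporT” token (RFC 8225, with the SHAKEN extension in RFC 8588) to each call, attesting to how confident it is that the caller is entitled to use the originating number. The sketch below assembles only the unsigned header-and-payload portion of such a token; the telephone numbers and certificate URL are hypothetical, and a real provider would sign this string with its ES256 private key before transmission.

```python
import base64
import json
import time
import uuid


def b64url(data: bytes) -> str:
    """Base64url-encode without padding, as JWTs require."""
    return base64.urlsafe_b64encode(data).rstrip(b"=").decode("ascii")


def build_passport(orig_tn: str, dest_tn: str, attest: str, cert_url: str) -> str:
    """Assemble the unsigned header.payload portion of a SHAKEN PASSporT.

    A real implementation appends an ECDSA P-256 signature over this
    string; signing is omitted here to keep the sketch self-contained.
    """
    header = {
        "alg": "ES256",      # SHAKEN mandates ECDSA P-256 with SHA-256
        "ppt": "shaken",     # PASSporT extension type
        "typ": "passport",
        "x5u": cert_url,     # URL of the provider's signing certificate
    }
    payload = {
        "attest": attest,            # "A", "B", or "C" attestation level
        "dest": {"tn": [dest_tn]},   # called number(s)
        "iat": int(time.time()),     # issued-at timestamp
        "orig": {"tn": orig_tn},     # calling number
        "origid": str(uuid.uuid4()), # opaque origination identifier
    }
    enc = lambda obj: b64url(
        json.dumps(obj, separators=(",", ":"), sort_keys=True).encode()
    )
    return f"{enc(header)}.{enc(payload)}"
```

An “A” attestation signals full confidence in the caller’s right to the number, which is why the Commission’s robocall rules hinge on which providers actually implement signing: a terminating carrier can only trust the attestation as far as it trusts the signer.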
§ On 27 July, the Federal Trade Commission (FTC) will hold PrivacyCon 2021.